CN111163213A - Terminal control method and device and terminal equipment

Publication number: CN111163213A
Authority: CN (China)
Prior art keywords: area, sensing data, terminal equipment, screen, preset
Legal status: Pending
Application number: CN201911139005.1A
Other languages: Chinese (zh)
Inventors: 卢彩娇, 席迎军, 陈宏达
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201911139005.1A
Publication of CN111163213A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 1/00 Substation equipment, e.g. for use by subscribers
    • H04M 1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M 1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 52/00 Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W 52/02 Power saving arrangements
    • H04W 52/0209 Power saving arrangements in terminal devices
    • H04W 52/0261 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level
    • H04W 52/0267 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level by controlling user interface components
    • H04W 52/027 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level by controlling a display operation or backlight unit
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00 Reducing energy consumption in communication networks
    • Y02D 30/70 Reducing energy consumption in communication networks in wireless communication networks

Abstract

The embodiments of the application provide a terminal control method, a terminal control apparatus and a terminal device. The method is applied to a terminal device that includes an acceleration sensor and comprises the following steps: when the terminal device is in a call answering state, acquiring sensing data collected by the acceleration sensor within a preset time period; when it is determined that the sensing data meet a preset condition, acquiring the capacitance value of each area of the screen of the terminal device, where the preset condition is a condition met by the sensing data collected by the acceleration sensor when the terminal device performs an effective movement whose movement distance is greater than a first threshold and whose movement speed is greater than a second threshold; and lighting up or turning off the screen of the terminal device according to the capacitance value of each area of the screen. This solves the problem of screen control for a full-screen terminal device in a call state.

Description

Terminal control method and device and terminal equipment
Technical Field
The present application relates to the technical field of terminal devices, and in particular, to a terminal control method and apparatus, and a terminal device.
Background
At present, a smartphone can turn off its screen when an object approaches the phone during a call and turn the screen back on when the object moves away.
In the prior art this function is implemented by a proximity light sensor. The proximity light sensor is arranged above the phone screen; when the smartphone is in a call state, it emits infrared light outwards to measure the distance between an external object and the screen. When the measured distance between the object and the screen is smaller than a certain value, the screen is turned off, which prevents false triggering during the call; when the measured distance exceeds that value, no object is close to the phone and the screen is turned on again. However, with the development of smartphones, full screens have become the trend, and there is no longer enough space around the screen to place a proximity light sensor.
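For illustration only, the conventional proximity-sensor approach described above can be sketched as follows. This is a minimal sketch, not the method claimed in this application: read_proximity_distance_cm, set_screen and in_call are hypothetical platform hooks, and the 5 cm threshold is an assumed value.

```python
import time

PROXIMITY_THRESHOLD_CM = 5.0  # assumed distance below which the screen is turned off

def proximity_screen_control(read_proximity_distance_cm, set_screen, in_call):
    """Poll the proximity sensor during a call and gate the screen state."""
    while in_call():
        distance = read_proximity_distance_cm()   # infrared distance reading
        set_screen(on=(distance >= PROXIMITY_THRESHOLD_CM))
        time.sleep(0.1)                           # poll at roughly 10 Hz
```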
Disclosure of Invention
The embodiments of the application provide a terminal control method, a terminal control apparatus and a terminal device, and aim to solve the problem of screen control for a full-screen terminal device in a call state.
In a first aspect, an embodiment of the present application provides a terminal control method, which is applied to a terminal device, where the terminal device includes an acceleration sensor, and the method includes:
when the terminal device is in a call answering state, acquiring sensing data collected by the acceleration sensor within a preset time period;
when it is determined that the sensing data meet a preset condition, acquiring the capacitance value of each area of the screen of the terminal device, where the preset condition is a condition met by the sensing data collected by the acceleration sensor when the terminal device performs an effective movement whose movement distance is greater than a first threshold and whose movement speed is greater than a second threshold;
and lighting up or turning off the screen of the terminal device according to the capacitance value of each area of the screen of the terminal device.
In the above process, when the terminal device is in a call answering state, the acceleration sensor in the terminal device can be started to collect sensing data within a preset time period, so as to obtain the acceleration data of the terminal device in that period; whether the movement of the terminal device in the preset time period is an effective movement is then judged by checking whether the acceleration data meet a preset condition. When the movement of the terminal device in the preset time period is one effective movement, the capacitance value of each area of the screen of the terminal device is acquired, it is further judged whether the effective movement is an approaching movement or a moving-away movement, and the type of the effective movement is then used to control lighting up or turning off the screen of the terminal device.
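As an informal illustration of this flow (not part of the claimed method), the following Python sketch strings the steps together. read_accel_window, read_capacitance_map and set_screen are hypothetical device hooks, is_valid_movement and is_pick_up stand for the condition checks and capacitance classification sketched later in this description, and the one-second window is an assumed preset time period.

```python
def on_call_answered(read_accel_window, read_capacitance_map, set_screen,
                     is_valid_movement, is_pick_up):
    """Run once when the device enters the call answering state."""
    # 1. Collect acceleration samples over the preset time period (assumed ~1 s).
    samples = read_accel_window(duration_s=1.0)      # e.g. a list of (ax, ay, az) tuples

    # 2. Proceed only if the samples correspond to one effective movement
    #    (movement distance > first threshold, movement speed > second threshold).
    if not is_valid_movement(samples):
        return

    # 3. Read the capacitance value of each area of the screen.
    cap_map = read_capacitance_map()                 # 2-D grid of per-area capacitance values

    # 4. Classify the effective movement and switch the screen accordingly.
    if is_pick_up(cap_map):
        set_screen(on=False)   # raised to the ear: turn the screen off
    else:
        set_screen(on=True)    # moved away from the ear: light the screen up
```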
In a possible implementation manner, the preset conditions include a first preset condition and a second preset condition, and when it is determined that the sensing data meets the preset conditions, acquiring the capacitance values of the regions in the screen of the terminal device includes:
judging whether the sensing data meet the first preset condition or not, wherein the first preset condition is a condition met by the sensing data acquired by the acceleration sensor when the terminal equipment executes non-shaking movement;
if so, judging whether the sensing data meeting the first preset condition meets the second preset condition, wherein the second preset condition is a condition met by the sensing data acquired by the acceleration sensor when the non-shaking movement is the effective movement;
and if so, acquiring the capacitance value of each area of the screen of the terminal device.
In the above process, whether the movement of the terminal device in the preset time period is a shaking movement or a non-shaking movement is judged by checking whether the collected sensing data meet the first preset condition. When the movement is determined to be a shaking movement, the capacitance values of the screen areas are not acquired; when the movement is determined to be a non-shaking movement, whether it is an effective movement is further judged through the second preset condition. The capacitance values of the screen areas are acquired only for an effective movement, and the screen of the terminal device is lit up or turned off according to the acquired values, which avoids detecting the screen capacitance, and lighting up or turning off the screen, in response to a slight shake of the hand.
In one possible embodiment, the acceleration sensor includes an X-axis acceleration sensor, a Y-axis acceleration sensor, and a Z-axis acceleration sensor; the first preset condition comprises the following conditions:
any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection point with the other two curves;
the number of wave crests and wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is less than a third threshold value;
the X-axis curve, the Y-axis curve and the Z-axis curve all have monotonic characteristics;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor in the preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor in the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor in the preset time period.
In the above process, the content of the first preset condition is specifically defined. The acceleration data are divided into data in the three directions of the X axis, the Y axis and the Z axis. Within the preset time period, when one of the axis curves has an intersection point with the other two, the number of wave crests and wave troughs of the three axis curves is smaller than a certain value, and the three axis curves are monotonic over a certain period, it can be determined that the sensing data of the terminal device in that period meet the first preset condition, and the movement of the terminal device in the preset time period can be determined to be one non-shaking movement. Through the judgment of the first preset condition, subsequent detection is performed only when the first preset condition is met, and sensing data that do not meet it are not responded to, which avoids the screen of the terminal device being turned on and off abruptly due to slight shaking in the call state.
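A minimal Python sketch of such a first-condition check is given below. The third threshold (maximum number of wave crests and troughs) and the monotonicity tolerance are illustrative assumptions, and the exact tests a real implementation would use may differ.

```python
import numpy as np

MAX_PEAKS_AND_TROUGHS = 4   # "third threshold" (assumed value)

def curves_intersect(a, b):
    """True if curves a and b cross or touch at least once (sign change of their difference)."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return bool(np.any(np.sign(diff[:-1]) * np.sign(diff[1:]) <= 0))

def count_peaks_and_troughs(curve):
    """Count local maxima and minima of a sampled curve."""
    d = np.diff(np.asarray(curve, dtype=float))
    return int(np.sum(np.sign(d[:-1]) * np.sign(d[1:]) < 0))

def is_roughly_monotonic(curve, tolerance=0.9):
    """True if at least `tolerance` of the sample-to-sample steps share one direction."""
    d = np.diff(np.asarray(curve, dtype=float))
    if len(d) == 0:
        return True
    frac_up = np.mean(d >= 0)
    return max(frac_up, 1.0 - frac_up) >= tolerance

def meets_first_condition(x_curve, y_curve, z_curve):
    curves = [np.asarray(c, dtype=float) for c in (x_curve, y_curve, z_curve)]
    # One of the three axis curves intersects both of the other two.
    has_intersection = any(
        all(curves_intersect(curves[i], curves[j]) for j in range(3) if j != i)
        for i in range(3)
    )
    few_extrema = all(count_peaks_and_troughs(c) < MAX_PEAKS_AND_TROUGHS for c in curves)
    monotonic = all(is_roughly_monotonic(c) for c in curves)
    return has_intersection and few_extrema and monotonic
```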
In a possible implementation manner, the determining whether the sensing data after meeting the first preset condition meets the second preset condition includes:
performing first processing on the sensing data meeting the first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets the second preset condition.
In the above process, for the sensing data meeting the first preset condition, the second preset condition is further judged to determine whether the non-shaking movement is an effective movement. The judgment of the second preset condition is realized by inputting the sensing data meeting the first preset condition into a trained model and checking the output result of the trained model. For sensing data satisfying the second preset condition, it can be determined that the corresponding movement of the terminal device is one effective movement. The judgment of the second preset condition filters out the non-shaking movements that are not effective movements, so that the subsequent capacitance detection is performed only for effective movements, which also improves detection efficiency.
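The "first processing" named above can be illustrated with the following sketch. Treating whitening as per-axis zero-mean, unit-variance normalization and using simple decimation for the down-sampling are assumptions made for illustration only; the patent does not fix these details.

```python
import numpy as np

def first_processing(samples, downsample_factor=4):
    """samples: (N, 3) array of X/Y/Z acceleration readings from the preset time period."""
    data = np.asarray(samples, dtype=float)

    # Whitening (assumed form): remove the per-axis mean and scale each axis to unit
    # variance, so the later model is not dominated by gravity offsets or one noisy axis.
    mean = data.mean(axis=0)
    std = data.std(axis=0)
    std[std == 0] = 1.0            # guard against a perfectly flat axis
    whitened = (data - mean) / std

    # Down-sampling: keep every k-th sample to shrink the model input size.
    return whitened[::downsample_factor]
```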
In a possible implementation manner, the determining whether the sensed data after the first processing meets the second preset condition includes:
inputting the sensing data subjected to the first processing into a preset model to obtain an output result of the preset model, and judging whether the sensing data subjected to the first processing meets the second preset condition or not according to the output result of the preset model;
the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is sensing data acquired by an acceleration sensor in sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and a second sample output result, the second sample sensing data is sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output result is used for indicating the invalid movement.
In the above process, the specific content of the second preset condition is further defined: the sensing data are first subjected to the first processing, the processed sensing data are then input into the preset model, and whether they meet the second preset condition is judged according to the output result of the preset model. Before this operation, the model is first trained; the training samples include sensing data collected when the sample terminal device performs effective movements and invalid movements, and the preset model is obtained through multiple rounds of training. Performing the judgment in this way improves the efficiency of the subsequent capacitance detection.
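The following sketch illustrates how such a preset model could be trained and queried. scikit-learn's LogisticRegression is used purely as a stand-in classifier, since the patent does not specify the model type; the fixed window length is an assumption, and every processed window is assumed to contain exactly WINDOW rows.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

WINDOW = 25  # number of down-sampled (x, y, z) rows per example (assumed)

def to_feature_vector(processed_window):
    """Flatten a (WINDOW, 3) processed window into one feature vector."""
    return np.asarray(processed_window, dtype=float)[:WINDOW].reshape(-1)

def train_preset_model(first_samples, second_samples):
    """first_samples: windows recorded during effective movements (label 1);
       second_samples: windows recorded during invalid movements (label 0)."""
    X = np.stack([to_feature_vector(w) for w in list(first_samples) + list(second_samples)])
    y = np.array([1] * len(first_samples) + [0] * len(second_samples))
    return LogisticRegression(max_iter=1000).fit(X, y)

def meets_second_condition(model, processed_window):
    """True if the model classifies the processed window as an effective movement."""
    return bool(model.predict(to_feature_vector(processed_window)[None, :])[0])
```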
In one possible embodiment, the effective movement comprises:
moving the terminal device from a first position to an ear, a distance between the first position and the ear being greater than the first threshold; and
moving the terminal device from an ear to a second position, the distance between the second position and the ear being greater than the first threshold.
In the above process, the types of effective movement are further defined: when the terminal device moves from the first position to the ear and the movement distance is greater than the first threshold, the effective movement is an approaching movement; when the terminal device moves from the ear to the second position, it is a moving-away movement. By determining the type of the effective movement, it can be decided whether the screen of the terminal device should be turned on or off.
In a possible implementation manner, lighting up or turning off the screen of the terminal device according to the capacitance value of each area of the screen of the terminal device includes:
determining the effective movement type of the terminal device according to the capacitance value of each area of the screen of the terminal device, where the effective movement type is a pick-up type or a put-down type;
when it is determined that the effective movement type of the terminal device is the pick-up type, turning off the screen of the terminal device;
and when it is determined that the effective movement type of the terminal device is the put-down type, lighting up the screen of the terminal device.
In the above process, the effective movement type of the terminal device is determined from the capacitance value of each area of the screen, and whether the screen is lit up or turned off is decided by the effective movement type. When the effective movement type is the pick-up type, the screen of the terminal device is turned off, which prevents possible false touch operations during the call and saves power; when the effective movement type is the put-down type, the screen of the terminal device is lit up, which makes it convenient for the user to operate the terminal device.
In a possible implementation manner, determining the effective movement type of the terminal device according to the capacitance value of each area of the screen of the terminal device includes:
judging whether the capacitance values of the areas of the screen of the terminal device meet at least one of third preset conditions; if so, determining that the effective movement type is the pick-up type, and if not, determining that the effective movement type is the put-down type;
wherein the third preset conditions include at least one of the following:
the area of a region of the screen of the terminal device in which the capacitance value is greater than a first capacitance value is greater than a first area; or,
the capacitance value of a first area of the terminal device is greater than a second capacitance value, the first area being an area of a first preset size where the receiver of the terminal device is located; or,
the average capacitance value of a second area of the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the receiver of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a part of that area; or,
a fourth area of the terminal device includes a plurality of sub-areas whose capacitance values are greater than a third capacitance value, the fourth area being an area of a third preset size where the receiver of the terminal device is located.
In the above process, the conditions satisfied by the capacitance values of the screen areas when the effective movement type is the pick-up type are specifically defined. When the capacitance values of the screen areas meet any one of the above conditions, the effective movement type is the pick-up type; if none of the conditions is met, the effective movement type is the put-down type, so that whether to turn the screen on or off is determined by explicitly judging the effective movement type.
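For illustration, these checks can be sketched as follows. All thresholds, the ratio N and the receiver-region bounds are assumed values, and cap_map stands for the per-region capacitance grid read from the touch controller; a real implementation would tune these per device.

```python
import numpy as np

CAP_1, CAP_2, CAP_3 = 120.0, 150.0, 100.0        # first/second/third capacitance values (assumed)
MIN_AREA = 30                                     # "first area", in number of regions (assumed)
N = 3.0                                           # ratio between receiver region and the rest (assumed)
EAR_ROWS, EAR_COLS = slice(0, 4), slice(4, 12)    # assumed region around the receiver

def movement_type(cap_map):
    """Classify an effective movement as pick-up or put-down from region capacitance values."""
    cap = np.asarray(cap_map, dtype=float)
    ear = cap[EAR_ROWS, EAR_COLS]

    mask = np.zeros_like(cap, dtype=bool)
    mask[EAR_ROWS, EAR_COLS] = True
    rest = cap[~mask]

    cond_area  = np.sum(cap > CAP_1) > MIN_AREA     # large high-capacitance contact area anywhere
    cond_peak  = np.max(ear) > CAP_2                # strong response near the receiver
    cond_ratio = ear.mean() > N * rest.mean()       # receiver region N times the rest, on average
    cond_count = np.sum(ear > CAP_3) >= 2           # several receiver sub-regions above threshold

    return "pick_up" if (cond_area or cond_peak or cond_ratio or cond_count) else "put_down"
```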
In a second aspect, an embodiment of the present application provides a terminal control apparatus, including:
the acquisition module is configured to acquire, when the terminal device is in a call answering state, sensing data collected by the acceleration sensor within a preset time period;
the processing module is configured to acquire the capacitance value of each area of the screen of the terminal device when it is determined that the sensing data meet a preset condition, where the preset condition is a condition met by the sensing data collected by the acceleration sensor when the terminal device performs an effective movement whose movement distance is greater than a first threshold and whose movement speed is greater than a second threshold;
and the control module is configured to light up or turn off the screen of the terminal device according to the capacitance value of each area of the screen of the terminal device.
In a possible implementation manner, the preset conditions include a first preset condition and a second preset condition, and the processing module is specifically configured to:
judging whether the sensing data meet the first preset condition or not, wherein the first preset condition is a condition met by the sensing data acquired by the acceleration sensor when the terminal equipment executes non-shaking movement;
if so, judging whether the sensing data meeting the first preset condition meets the second preset condition, wherein the second preset condition is a condition met by the sensing data acquired by the acceleration sensor when the non-shaking movement is the effective movement;
and if so, acquiring the capacitance value of each area of the screen of the terminal device.
In one possible embodiment, the acceleration sensor includes an X-axis acceleration sensor, a Y-axis acceleration sensor, and a Z-axis acceleration sensor; the first preset condition comprises the following conditions:
any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection point with the other two curves;
the number of wave crests and wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is less than a third threshold value;
the X-axis curve, the Y-axis curve and the Z-axis curve all have monotonic characteristics;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor in the preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor in the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor in the preset time period.
In a possible implementation, the processing module is specifically configured to:
performing first processing on the sensing data meeting the first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets the second preset condition.
In a possible implementation, the processing module is specifically configured to:
inputting the sensing data subjected to the first processing into a preset model to obtain an output result of the preset model, and judging whether the sensing data subjected to the first processing meets the second preset condition or not according to the output result of the preset model;
the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is sensing data acquired by an acceleration sensor in sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and a second sample output result, the second sample sensing data is sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output result is used for indicating the invalid movement.
In one possible embodiment, the effective movement comprises:
moving the terminal device from a first position to an ear, a distance between the first position and the ear being greater than the first threshold; and
moving the terminal device from an ear to a second position, the distance between the second position and the ear being greater than the first threshold.
In a possible implementation, the control module is specifically configured to:
determining the effective movement type of the terminal device according to the capacitance value of each area of the screen of the terminal device, where the effective movement type is a pick-up type or a put-down type;
when it is determined that the effective movement type of the terminal device is the pick-up type, turning off the screen of the terminal device;
and when it is determined that the effective movement type of the terminal device is the put-down type, lighting up the screen of the terminal device.
In a possible implementation, the control module is specifically configured to:
judging whether the capacitance values of the areas of the screen of the terminal device meet at least one of third preset conditions; if so, determining that the effective movement type is the pick-up type, and if not, determining that the effective movement type is the put-down type;
wherein the third preset conditions include at least one of the following:
the area of a region of the screen of the terminal device in which the capacitance value is greater than a first capacitance value is greater than a first area; or,
the capacitance value of a first area of the terminal device is greater than a second capacitance value, the first area being an area of a first preset size where the receiver of the terminal device is located; or,
the average capacitance value of a second area of the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the receiver of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a part of that area; or,
a fourth area of the terminal device includes a plurality of sub-areas whose capacitance values are greater than a third capacitance value, the fourth area being an area of a third preset size where the receiver of the terminal device is located.
In a third aspect, an embodiment of the present application provides a terminal device, including: a memory, a processor, and a computer program, the computer program being stored in the memory, the processor running the computer program and performing the steps of:
when the terminal device is in a call answering state, acquiring sensing data collected by an acceleration sensor within a preset time period;
when it is determined that the sensing data meet a preset condition, acquiring the capacitance value of each area of the screen of the terminal device, where the preset condition is a condition met by the sensing data collected by the acceleration sensor when the terminal device performs an effective movement whose movement distance is greater than a first threshold and whose movement speed is greater than a second threshold;
and lighting up or turning off the screen of the terminal device according to the capacitance value of each area of the screen of the terminal device.
In a possible implementation manner, the preset conditions include a first preset condition and a second preset condition, and when it is determined that the sensing data meets the preset conditions, acquiring the capacitance values of the regions in the screen of the terminal device includes:
judging whether the sensing data meet the first preset condition or not, wherein the first preset condition is a condition met by the sensing data acquired by the acceleration sensor when the terminal equipment executes non-shaking movement;
if so, judging whether the sensing data meeting the first preset condition meets the second preset condition, wherein the second preset condition is a condition met by the sensing data acquired by the acceleration sensor when the non-shaking movement is the effective movement;
and if so, acquiring the capacitance value of each area of the screen of the terminal device.
In one possible embodiment, the acceleration sensor includes an X-axis acceleration sensor, a Y-axis acceleration sensor, and a Z-axis acceleration sensor; the first preset condition comprises the following conditions:
any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection point with one or two other curves;
the number of wave crests and wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is less than a third threshold value;
the X-axis curve, the Y-axis curve and the Z-axis curve all have monotonic characteristics;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor in the preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor in the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor in the preset time period.
In one possible implementation, the processor is specifically configured to:
performing first processing on the sensing data meeting the first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets the second preset condition.
In one possible implementation, the processor is specifically configured to:
inputting the sensing data subjected to the first processing into a preset model to obtain an output result of the preset model, and judging whether the sensing data subjected to the first processing meets the second preset condition or not according to the output result of the preset model;
the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is sensing data acquired by an acceleration sensor in sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and a second sample output result, the second sample sensing data is sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output result is used for indicating the invalid movement.
In one possible embodiment, the effective movement comprises:
moving the terminal device from a first position to an ear, a distance between the first position and the ear being greater than the first threshold; and
moving the terminal device from an ear to a second position, the distance between the second position and the ear being greater than the first threshold.
In one possible implementation, the processor is specifically configured to:
determining the effective movement type of the terminal device according to the capacitance value of each area of the screen of the terminal device, where the effective movement type is a pick-up type or a put-down type;
when it is determined that the effective movement type of the terminal device is the pick-up type, turning off the screen of the terminal device;
and when it is determined that the effective movement type of the terminal device is the put-down type, lighting up the screen of the terminal device.
In one possible implementation, the processor is specifically configured to:
judging whether the capacitance values of the areas of the screen of the terminal device meet at least one of third preset conditions; if so, determining that the effective movement type is the pick-up type, and if not, determining that the effective movement type is the put-down type;
wherein the third preset conditions include at least one of the following:
the area of a region of the screen of the terminal device in which the capacitance value is greater than a first capacitance value is greater than a first area; or,
the capacitance value of a first area of the terminal device is greater than a second capacitance value, the first area being an area of a first preset size where the receiver of the terminal device is located; or,
the average capacitance value of a second area of the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the receiver of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a part of that area; or,
a fourth area of the terminal device includes a plurality of sub-areas whose capacitance values are greater than a third capacitance value, the fourth area being an area of a third preset size where the receiver of the terminal device is located.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium includes a computer program, and the computer program is used to implement the terminal control method according to any one of the first aspect.
In a fifth aspect, an embodiment of the present application further provides a chip or an integrated circuit, including: an interface circuit and a processor. The processor is configured to call program instructions through the interface circuit to implement the terminal control method according to any one of the first aspect.
In a sixth aspect, the present application further provides a computer program or a computer program product, where the computer program or the computer program product includes computer readable instructions, and the computer readable instructions, when read by one or more processors, implement the terminal control method according to any one of the first aspects.
According to the terminal control method, the terminal control apparatus and the terminal device provided by the embodiments of the application, when the terminal device is in a call answering state, the acceleration sensor in the terminal device can be started to collect sensing data within a preset time period, so as to obtain the acceleration data of the terminal device in that period; whether the terminal device performs an effective movement in the preset time period is then judged by checking whether the acceleration data meet a preset condition. When the movement of the terminal device in the preset time period is one effective movement, the capacitance value of each area of the screen of the terminal device is acquired, it is further judged whether the effective movement is an approaching movement or a moving-away movement, and the type of the effective movement is then used to control lighting up or turning off the screen of the terminal device.
Drawings
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present application;
fig. 2 is a first schematic diagram illustrating that a terminal device performs effective movement according to an embodiment of the present application;
fig. 3 is a second schematic diagram illustrating that the terminal device performs effective movement according to the embodiment of the present application;
fig. 4 is a schematic diagram illustrating a screen comparison between two mobile phones provided in the embodiment of the present application;
fig. 5 is a schematic view of a screen of a proximity light sensor control terminal according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a terminal control method according to an embodiment of the present application;
fig. 7 is a schematic diagram of a preset time period provided in the embodiment of the present application;
fig. 8 is a flowchart illustrating a terminal control method according to another embodiment of the present application;
fig. 9 is a schematic coordinate system diagram of a terminal device according to an embodiment of the present application;
fig. 10A is a first schematic view of a shaking movement scene provided in the embodiment of the present application;
fig. 10B is a schematic view of a shaking movement scene two provided in the embodiment of the present application;
fig. 10C is a schematic view of a shaking movement scene three provided in the embodiment of the present application;
fig. 11A is a first schematic diagram illustrating a cross feature of a determination curve provided in an embodiment of the present application;
fig. 11B is a second schematic diagram of the cross feature of the determination curve provided in the embodiment of the present application;
fig. 11C is a third schematic diagram of the cross feature of the determination curve provided in the embodiment of the present application;
FIG. 12 is a schematic diagram of determining a peak of a curve provided by an embodiment of the present application;
fig. 13 is a schematic diagram of determining monotonicity of a curve according to an embodiment of the present disclosure;
FIG. 14 is a schematic diagram of different types of answering calls provided by embodiments of the present application;
fig. 15 is a schematic diagram of a non-shaking movement sensing data curve of a terminal device according to an embodiment of the present disclosure;
fig. 16 is a first schematic diagram of a curve of sensing data of shaking movement of a terminal device according to an embodiment of the present application;
fig. 17 is a second schematic diagram of a curve of sensing data of shaking movement of a terminal device according to an embodiment of the present application;
fig. 18 is a third schematic diagram of a curve of sensing data of shaking movement of a terminal device according to an embodiment of the present application;
fig. 19 is a schematic screen diagram of a mobile phone according to an embodiment of the present application;
FIG. 20 is a first schematic diagram illustrating capacitance approach detection provided in an embodiment of the present application;
FIG. 21 is a second schematic diagram of capacitance approach detection provided in the embodiments of the present application;
FIG. 22 is a third schematic diagram of capacitance approach detection provided by embodiments of the present application;
FIG. 23 is a fourth schematic diagram of capacitance approach detection provided by embodiments of the present application;
FIG. 24 is a fifth schematic diagram of capacitance approach detection provided by embodiments of the present application;
fig. 25 is a schematic structural diagram of a terminal control device according to an embodiment of the present application;
fig. 26 is a schematic hardware structure diagram of a terminal device according to an embodiment of the present application.
Detailed Description
To facilitate an understanding of the present application, the concepts related to the present application will first be explained.
The incoming call answering state: the incoming call answering state referred to in the embodiments of the application may be a state in which the terminal device conducts a call through a cellular network, or a state in which the terminal device conducts a voice-over-IP call transmitted through the Internet. Illustratively, when the terminal device establishes a call connection through a telephone network provided by a telecom operator, the terminal device is in the incoming call answering state; when the terminal device conducts a voice call through a network chat tool, the terminal device is also in the call answering state, and so on. The network chat tool may be an application installed in the terminal device, such as a mobile phone, and may be, for example, an embedded application or a downloaded application.
It should be noted that, when the terminal device performs a video call through the network chat tool, the video call needs to obtain a picture taken by another terminal device through a screen of the terminal device, and the screen of the terminal device needs to be in a normally bright state, so that the state of the terminal device does not belong to an incoming call answering state at this time.
Capacitive screen: a touch screen that performs touch control based on the current induced by the human body, and is widely used in various terminal devices. The principle of the capacitive screen is that when a finger touches the screen, a coupling capacitance is formed in the contact area between the finger and the screen due to the electric field of the human body; a small amount of current is drawn from the contact area, causing a voltage drop at the corner electrodes, from which the position of the contact area is determined and touch control of the screen is realized.
Area of capacitive screen: the capacitive screen is divided into a plurality of small areas, and each small area has a corresponding capacitance value. When the capacitive screen is not touched, the capacitance values of the areas of the capacitive screen are relatively close, and slight differences may exist. Under different environments of temperature, humidity and the like, the capacitance value of each area of the capacitive screen can fluctuate within a certain range. If a human body touches or clicks the capacitive screen, the capacitance value of the touch area or the area at the click position changes and is higher than the capacitance values of other surrounding areas, and whether the human body touches the capacitive screen can be judged according to the difference between the capacitance values of the areas.
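As a minimal illustration of this region-based detection (an assumption-laden sketch, not the patent's implementation), a region can be treated as touched when its capacitance rises clearly above its idle baseline; the drift margin below is an assumed value.

```python
import numpy as np

DRIFT_MARGIN = 20.0   # allowed environmental fluctuation of a region's capacitance (assumed)

def touched_regions(cap_map, baseline_map):
    """Return indices of regions whose capacitance rose clearly above their idle baseline."""
    delta = np.asarray(cap_map, dtype=float) - np.asarray(baseline_map, dtype=float)
    return np.argwhere(delta > DRIFT_MARGIN)
```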
The terminal equipment: the terminal equipment related by the application refers to equipment capable of providing a call function for a user, has a wireless connection function, and can communicate with one or more core networks through a wireless access network. For example, the terminal device may be a mobile phone, a computer, a watch, a tablet, an in-vehicle device, a wearable device, an industrial device, an artificial intelligence device/Augmented Reality (AR) device, a Virtual Reality (VR) device, and the like.
Fig. 1 is a schematic structural diagram of a terminal device according to an embodiment of the present disclosure, and referring to fig. 1, the terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display 194, and a Subscriber Identity Module (SIM) card interface 195. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K through an I2C interface, so that the processor 110 and the touch sensor 180K communicate through an I2C bus interface to implement the touch function of the terminal device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture function of terminal device 100. The processor 110 and the display 194 communicate through the DSI interface to implement the display function of the terminal device 100.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is only an exemplary illustration, and does not constitute a limitation on the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou satellite navigation system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display 194, and the application processor, etc. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display 194 is used to display images, video, and the like. The display 194 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N displays 194, with N being a positive integer greater than 1.
The terminal device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
Fig. 2 is a first schematic view showing that the terminal device performs an effective movement according to an embodiment of the present application. As shown in fig. 2, taking the terminal device as a mobile phone as an example, a user holds the mobile phone to perform a call answering operation; after the call is answered, the user moves the mobile phone from a first position to the ear to answer the call.
When the call is connected, the screen of the mobile phone is in a bright (lighted) state, so that the user can conveniently dial, mark the call, tap the hands-free function, and so on. When the user then brings the mobile phone to the ear to talk, the screen very easily comes into contact with the user's ear or face. In the bright state the screen senses this contact, and false touch operations easily occur; for example, some areas of the screen may be touched by mistake, causing unnecessary operations and disturbing the call. The screen therefore needs to be extinguished at this moment, so that it does not respond when the ear or face contacts it, preventing false touches. Moreover, when the user holds the mobile phone at the ear to talk, the user generally does not need to operate the mobile phone, and turning off the screen at this time also reduces the power consumption of the mobile phone.
Fig. 3 is a second schematic view showing that the terminal device provided in the embodiment of the present application performs an effective movement. As shown in fig. 3, again taking the terminal device as a mobile phone as an example, the user holds the mobile phone to perform a call hang-up operation: the user first takes the mobile phone away from the ear, moves it to a second position, and then performs the hang-up operation. When the user takes the mobile phone away from the ear, the screen of the mobile phone should be in a bright state, so that the user can directly perform corresponding operations on the lighted screen, for example, tapping hang-up to end the call or tapping hands-free to turn on the speaker.
It can be understood that the operation of answering the phone illustrated in fig. 2 is only one example of the many scenarios in which the terminal device is moved from another position to the ear, and similarly the operation of hanging up the phone illustrated in fig. 3 is only one example of the many scenarios in which the terminal device is moved from the ear to another position. In the scenarios of fig. 2 and fig. 3, the terminal device needs to be able to determine the type of the effective movement when the effective movement is performed, so as to carry out the operation of turning off or lighting up the screen.
In the prior art, turning the screen on or off while an incoming call is being answered is realized by a proximity light sensor. However, because full screens are a new trend and a full screen leaves no spare area for such a sensor, the prior-art scheme is not suitable for full-screen mobile phones. Two different types of mobile phone screens, a non-full screen and a full screen, are compared below with reference to fig. 4 and fig. 5.
Fig. 4 is a schematic diagram showing a comparison between the screens of two mobile phones provided in the embodiment of the present application. As shown in fig. 4, a first mobile phone 41 and a second mobile phone 42 are provided. The screen of the first mobile phone 41 is a non-full screen; it includes a sensing area 411, and a proximity light sensor is installed inside the first mobile phone 41. Fig. 5 is a schematic view of a proximity light sensor controlling the screen of a terminal according to an embodiment of the present application. As shown in fig. 5, when the first mobile phone 41 is in the call answering state, it emits infrared light outward through the sensing area 411. When the emitted infrared light meets an obstacle, the light is reflected and the first mobile phone 41 receives the corresponding reflected light. The distance s between the obstacle and the first mobile phone 41 is obtained from the time between emission and reception of the reflected light and the propagation speed of the infrared light. When the distance s is smaller than a preset value H, the first mobile phone 41 performs the screen-off operation, and when the distance s is larger than the preset value H, it performs the screen-on operation.
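For illustration only, the following is a minimal sketch of the prior-art proximity-light-sensor logic just described, assuming the time of flight is measured in seconds and that H is a configurable constant; the names and values are assumptions, not taken from the patent.

```python
# Illustrative sketch of the prior-art proximity-light-sensor logic; names and
# units are assumptions, not the patent's implementation.

SPEED_OF_LIGHT_M_PER_S = 3.0e8  # propagation speed of the infrared light
PRESET_DISTANCE_H_M = 0.05      # preset value H (assumed here: 5 cm)

def obstacle_distance(time_of_flight_s: float) -> float:
    """Distance s to the obstacle from the emit-to-receive time of the reflected light."""
    # The light travels to the obstacle and back, hence the division by 2.
    return SPEED_OF_LIGHT_M_PER_S * time_of_flight_s / 2.0

def screen_action(time_of_flight_s: float) -> str:
    s = obstacle_distance(time_of_flight_s)
    # s < H: the ear/face is close, turn the screen off; otherwise turn it on.
    return "turn_off_screen" if s < PRESET_DISTANCE_H_M else "turn_on_screen"
```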
Since it is necessary to install a device for emitting infrared light on the screen of the mobile phone, a part of the area of the first mobile phone 41 is occupied by the sensing area 411, and since a full-screen mobile phone like the second mobile phone 42 is more and more popular with users, the full-screen is gradually becoming a new trend. There is no place to place the sensing area 411 on the second mobile phone 42, so it is not suitable for the second mobile phone 42 to use the proximity light sensor to control the on/off of the incoming call screen.
In order to solve the problem, embodiments of the present application provide a method that can control lighting or extinguishing of a screen of a terminal device without using a proximity light sensor. Fig. 6 is a schematic flowchart of a terminal control method provided in an embodiment of the present application, where the method is applied to a terminal device, and the terminal device includes an acceleration sensor, as shown in fig. 6, the method may include:
S61: when the terminal device is in a call answering state, acquire the sensing data collected by the acceleration sensor within a preset time period.
The execution main body in the embodiment of the application is a terminal device, and the terminal device comprises an acceleration sensor, wherein the acceleration sensor is a sensor for measuring acceleration and is used for measuring the acceleration of the terminal device. The acceleration sensor may be one of the above different types of acceleration sensors, or any other sensor device capable of acquiring the acceleration of the terminal device, and this is not particularly limited in this embodiment of the present invention.
To control the screen of the terminal device during a call, in this embodiment of the application the sensing data collected by the acceleration sensor within the preset time period are acquired when the terminal device is in the incoming call answering state. The incoming call answering state refers to a state in which the terminal device is in voice communication with another device. Taking the terminal device as a mobile phone as an example, the incoming call answering state may include the case in which the user dials another number and the mobile phone is then in telephone communication with the other device, and the case in which the user operates a social communication application on the mobile phone to initiate a voice call request to another account; after the other account accepts the voice call request initiated by the user, the mobile phone is in voice communication with the other device.
It should be noted that dialing another number to carry out telephone communication, or initiating a voice call request to another account to carry out voice communication, means that the user operates the mobile phone to actively initiate a call to the other party. When the terminal device is instead the party receiving the call request and is on a call with the other device, the terminal device is also in the call answering state.
It should also be noted that when a terminal device communicates with another terminal device through a social application, the communication may be a voice call or a video call. A video call requires the video picture to be shared with the other party, so the screen-off operation is not performed during a video call. Therefore, when the terminal device is in a call through such an application, voice calls and video calls need to be treated separately: the terminal device is considered to be in the call answering state only for a voice call, not for a video call.
The acceleration is a vector and comprises an acceleration value and an acceleration direction, and sensing data acquired by the acceleration sensor in a preset time period not only reflects the acceleration value, but also reflects the acceleration direction. In a specific implementation, one acceleration sensor may be used to obtain the magnitude and direction of the acceleration, and then the acceleration is projected to each direction, or a plurality of acceleration sensors may be used, and each acceleration sensor only obtains the acceleration value in the corresponding direction.
The preset time period refers to a period of time before a certain moment. Fig. 7 is a schematic diagram of the preset time period provided in the embodiment of the present application. As shown in fig. 7, when it is necessary to decide at time t whether to light up or extinguish the screen, the sensing data collected by the acceleration sensor in the period from t−t0 to t are acquired; the period from t−t0 to t is the preset time period. In other words, whether to light up or extinguish the screen at a certain moment is determined from that moment and the sensing data collected in the period before it. The length of the preset time period [t−t0, t] is t0, and t0 can be set to 1 s, 2 s, 3 s, etc. according to actual needs. During the preset time period, the acceleration sensor collects multiple frames of sensing data.
When the acceleration sensor collects sensing data, it samples at a certain time interval Δt, for example every 50 ms. Each sample is one frame of acceleration data, and multiple frames of acceleration data are collected within the preset time period. It should be understood that 50 ms is only an example; other values may be set as needed, and the sampling intervals of the acceleration sensor may or may not be equal, which is not limited in this embodiment of the application.
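For illustration, the following is a minimal sketch of how the multi-frame sensing data of the preset time period could be buffered, assuming a preset period t0 of 2 s and a sampling interval of roughly 50 ms; the class and field names are assumptions.

```python
# Minimal sketch of buffering acceleration frames over the preset time period t0;
# all names are illustrative assumptions.
from collections import deque
from dataclasses import dataclass

@dataclass
class AccelFrame:
    t: float   # timestamp in seconds
    x: float   # acceleration along the X axis (parallel to the short side of the screen)
    y: float   # acceleration along the Y axis (parallel to the long side of the screen)
    z: float   # acceleration along the Z axis (perpendicular to the screen)

class SensingWindow:
    """Keeps only the frames collected within the last t0 seconds."""

    def __init__(self, t0_s: float = 2.0):
        self.t0_s = t0_s
        self.frames: deque = deque()

    def add(self, frame: AccelFrame) -> None:
        self.frames.append(frame)
        # Drop frames older than t - t0.
        while self.frames and frame.t - self.frames[0].t > self.t0_s:
            self.frames.popleft()

    def snapshot(self) -> list:
        return list(self.frames)
```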
S62: when it is judged that the sensing data meet a preset condition, acquire the capacitance value of each area in the screen of the terminal device, where the preset condition is the condition met by the sensing data collected by the acceleration sensor when the terminal device performs an effective movement whose movement distance is greater than a first threshold and whose movement speed is greater than a second threshold.
An active movement refers to a normal pick-up or set-down of the terminal device. When the terminal device is normally picked up once, it means that the user moves the terminal device from another position to a position close to the user's ear, and the terminal device performs a close action. After the approach action is performed, the screen of the terminal device is very close to or in direct contact with the ear or face of the calling user, and at this time, the screen needs to be extinguished, so as to prevent the mistaken touch operation of the terminal device caused by the contact of the user with the screen of the terminal device. When the terminal device is normally put down, the user moves the terminal device from a position close to the ear of the user to other positions, and the terminal device performs a far-away action. After the moving-away action is executed, the screen of the terminal device is relatively far away from the ear or face of the user, and at this time, the screen needs to be lighted up, so that the user can perform corresponding operations by clicking the screen, for example, operations including clicking the screen to start a hands-free function, hanging up the call, and the like.
The effective movement in this embodiment of the application refers to a movement whose distance within the preset time period is greater than the first threshold and whose speed is greater than the second threshold, where both thresholds are preset. A movement distance greater than the first threshold and a movement speed greater than the second threshold are a necessary condition for a movement to be an effective movement, not a sufficient one. When it is judged that the acquired sensing data meet the preset condition, the terminal device has performed an effective movement. At this point it cannot be judged from the sensing data alone whether the effective movement is an approaching action or a moving-away action; the judgment has to be made in combination with the capacitance values of the areas in the screen of the terminal device.
S63: light up or extinguish the screen of the terminal device according to the capacitance value of each area in the screen of the terminal device.
The screen of the terminal device is a capacitive touch screen. When the user touches the screen, the capacitance value of the area in contact changes; when the user does not touch the screen, the capacitance values of all areas of the touch screen are almost equal. Whether the user is touching the screen can therefore be judged from the capacitance values. In this embodiment of the application, the user is considered to be touching the screen, and the screen-off operation is performed, only when the capacitance values of certain areas change; these areas are generally the areas that a user is likely to contact when answering a call, such as the area near the earpiece of the terminal device. When the capacitance values of all areas on the touch screen are almost equal, the user is considered not to be touching the screen, the effective movement performed by the terminal device is a moving-away action, and the screen of the terminal device is lighted up. When the capacitance values of some areas change and differ greatly from those of the other areas, the user is considered to be touching the screen, the effective movement performed by the terminal device is an approaching action, and the screen of the terminal device is extinguished.
According to the terminal control method provided in this embodiment of the application, when the terminal device is in the incoming call answering state, the acceleration sensor in the terminal device is started to collect sensing data within the preset time period, so that the acceleration data of the terminal device in that period are obtained. Whether the movement of the terminal device within the preset time period is an effective movement is then judged by checking whether the acceleration data meet the preset condition. When the movement within the preset time period is an effective movement, the capacitance value of each area in the screen of the terminal device is acquired to further judge whether the effective movement is an approaching action or a moving-away action, and the type of the effective movement is then used to control the lighting or extinguishing of the screen of the terminal device.
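For illustration, a minimal sketch of this S61–S63 decision flow follows; the helper callables (the preset-condition check, the capacitance read-out and the proximity check) are assumed to be supplied elsewhere and are named here for illustration only.

```python
# Illustrative sketch of one S61-S63 decision cycle; helper functions are
# passed in by the caller, since their implementations are not specified here.
from typing import Callable, Sequence

def control_screen_once(
    frames: Sequence,                                    # sensing data of the preset period (S61)
    meets_preset_condition: Callable[[Sequence], bool],  # S62 check, supplied by the caller
    read_capacitance_grid: Callable[[], list],           # capacitance value of each screen area
    is_proximity_touch: Callable[[list], bool],          # earpiece-region / large-area touch check
    turn_screen_on: Callable[[], None],
    turn_screen_off: Callable[[], None],
) -> None:
    """One decision cycle of the flow described above (illustrative only)."""
    if not meets_preset_condition(frames):   # not an effective movement: keep current state
        return
    grid = read_capacitance_grid()           # S63: decide by capacitance values
    if is_proximity_touch(grid):             # approaching action -> extinguish the screen
        turn_screen_off()
    else:                                    # moving-away action -> light up the screen
        turn_screen_on()
```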
Fig. 8 is a flowchart illustrating a terminal control method according to another embodiment of the present application, as shown in fig. 8, the method may include:
S81: when the terminal device is in the incoming call answering state, the acceleration sensor collects the sensing data of the terminal device.
First, the acceleration sensor collects the sensing data of the terminal device only when the terminal device is in the incoming call answering state; as described above, different types of sensor devices can be used as the acceleration sensor. One possible implementation uses three acceleration sensors to collect the corresponding sensing data: an X-axis acceleration sensor, a Y-axis acceleration sensor and a Z-axis acceleration sensor. The X-axis acceleration sensor collects the sensing data in the X direction, that is, the acceleration value along the X axis; the Y-axis acceleration sensor collects the acceleration value along the Y axis; and the Z-axis acceleration sensor collects the acceleration value along the Z axis, where the X axis, Y axis and Z axis are perpendicular to one another.
Fig. 9 is a schematic coordinate system diagram of a terminal device according to an embodiment of the present application, and as shown in fig. 9, for one terminal device, a direction parallel to a screen of the terminal device and parallel to a short side of the screen of the terminal device is taken as an X-axis direction, a direction parallel to the screen of the terminal device and parallel to a long side of the screen of the terminal device is taken as a Y-axis direction, and a direction perpendicular to the screen of the terminal device is taken as a Z-axis direction.
When the terminal device moves, the X-axis acceleration sensor can acquire acceleration data in a direction parallel to a short side of a screen of the terminal device, similarly, the Y-axis acceleration sensor can acquire acceleration data in a direction parallel to a long side of the screen of the terminal device, and the Z-axis acceleration sensor can acquire acceleration data in a direction perpendicular to the screen of the terminal device.
The X-axis acceleration sensor, the Y-axis acceleration sensor and the Z-axis acceleration sensor may be integrated in one acceleration sensor, or may be independent acceleration sensors. When data are collected, the data collecting time of the three acceleration sensors is the same, namely for each frame of data, the three acceleration sensors simultaneously collect the accelerations of the terminal device in different directions at a certain moment.
S82: judge, according to the sensing data, whether the movement of the terminal device corresponds to a pick-up-to-answer or put-down state; if so, execute S83; if not, execute S84.
Whether the movement of the terminal equipment meets the one-time pick-up answering or putting-down state is judged according to whether the acquired sensing data meets a first preset condition. The movement of the terminal equipment comprises shaking movement and non-shaking movement, wherein the shaking movement refers to the back-and-forth movement of the terminal equipment within a certain range, the non-shaking movement refers to the movement except the shaking movement, and the first preset condition is the condition met by sensing data acquired by the acceleration sensor when the terminal equipment executes the non-shaking movement.
Fig. 10A is a first schematic view of a shaking movement scene provided in this embodiment. As shown in fig. 10A, the user is standing; when the user holds the mobile phone at the ear or simply holds it while standing, the mobile phone is not completely stationary relative to the user and undergoes small-amplitude movement, which is regarded as a shaking movement of the mobile phone. Fig. 10A shows two possible shaking movements: at the lower left, the mobile phone shakes up and down while the user is standing, and at the lower right, the mobile phone shakes left and right while the user is standing. It can be understood that these two types of shaking are only examples; the actual shaking may be one or both of them, or another shaking pattern.
Fig. 10B is a second schematic view of a shaking movement scene provided in this embodiment. As shown in fig. 10B, the user is going upstairs while holding the mobile phone at the ear to answer a call, so the mobile phone at the user's ear is also in a state of shaking movement. Unlike the scene shown in fig. 10A, where the user is standing, the shaking is mainly caused by the user being unable to hold the phone absolutely still, and the shaking amplitude is small, in fig. 10B the mobile phone moves together with the user: when going upstairs the mobile phone moves upward, and a small-amplitude shake similar to that in fig. 10A may also be present.
Fig. 10C is a schematic view of a third shaking movement scene provided in the embodiment of the present application, as shown in fig. 10C, at this time, the user is located in a traveling vehicle and is answering a call. When a user answers a call on a traveling vehicle, the mobile phone moves, but the user still holds the mobile phone to answer the call at the ear, and at the moment, the distance that the mobile phone moves along with the traveling vehicle is large, but the movement of the mobile phone is still considered to be shaking movement.
As can be seen from fig. 10A to 10C, an effective movement requires the movement distance to reach the first threshold and the movement speed to reach the second threshold; however, a movement whose distance reaches the first threshold or whose speed reaches the second threshold is not necessarily an effective movement, and further determination is required.
Firstly, whether the sensing data acquired in a preset time period meets a first preset condition is judged. When the acceleration sensor comprises an X-axis acceleration sensor, a Y-axis acceleration sensor and a Z-axis acceleration sensor, the acquired sensing data are respectively the sensing data corresponding to the X axis, the sensing data corresponding to the Y axis and the sensing data corresponding to the Z axis.
The first preset condition in the embodiment of the present application includes the following conditions:
firstly, acceleration data of an X axis, a Y axis and a Z axis have cross characteristics, namely, any one of an X-axis curve, a Y-axis curve and a Z-axis curve has an intersection point with other two curves;
secondly, aiming at acceleration data of any direction axis, continuous peak values and valley values do not exist, namely the number of the wave crests and the wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is smaller than a third threshold value;
finally, the acceleration data of the X axis, the Y axis and the Z axis all show a monotonic change trend within a period of time, that is, the X-axis curve, the Y-axis curve and the Z-axis curve all have a monotonic characteristic;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor within a preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor within the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor within the preset time period.
The determination of the first preset condition will be described in detail below with reference to fig. 11 to 14, respectively.
First, the determination of whether any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection with the other two curves is described, taking the cross characteristic of the X-axis curve and the Y-axis curve as an example.
Fig. 11A is a first schematic diagram of judging the cross characteristic of curves provided in this embodiment of the application. As shown in fig. 11A, it is first judged, from the acquired sensing data corresponding to the X axis and to the Y axis within the preset time period, whether there is a moment at which the acceleration value in the X-axis direction is equal to the acceleration value in the Y-axis direction. If at least one moment satisfies this condition, the X-axis curve and the Y-axis curve have an intersection point at that moment, and the intersection point is the point corresponding to that moment. For example, if the sensing data corresponding to the X axis show that the acceleration in the X-axis direction at a certain moment is 0.3 m/s², and the sensing data corresponding to the Y axis show that the acceleration in the Y-axis direction at the same moment is also 0.3 m/s², it can be determined that there is an intersection between the X-axis curve and the Y-axis curve.
In some embodiments, even if no moment can be found at which the acceleration values in the two directions are equal in the sensing data corresponding to the X axis and the Y axis, it cannot be directly concluded that the X-axis curve and the Y-axis curve have no intersection. The acceleration sensor collects the acceleration values of the terminal device at discrete moments separated by the corresponding time interval, so the obtained sensing data are a set of discrete points rather than continuous curve data. Because of this discreteness, the intersection point of two curves may fall exactly between two sampling moments, in which case the acceleration sensor does not collect the corresponding data. Therefore, even if the acceleration value in the X-axis direction is never exactly equal to the acceleration value in the Y-axis direction at any sampled moment, it cannot be directly determined that there is no intersection between the X-axis curve and the Y-axis curve.
For this second case there are various determination methods; two of them are described below. Fig. 11B is a second schematic diagram of judging the cross characteristic of curves provided in this embodiment. As shown in fig. 11B, the acceleration value of the X axis and the acceleration value of the Y axis are not equal at any sampled moment within the preset time period. One possible determination method is to fit the sensing data corresponding to the X axis to obtain the X-axis curve, fit the sensing data corresponding to the Y axis to obtain the Y-axis curve, and then place the two curves in the same coordinate system to check whether they intersect; alternatively, an X-axis function can be fitted from the X-axis curve and a Y-axis function from the Y-axis curve, and whether the two curves intersect within the preset time period is judged from the two functions.
Fig. 11C is a schematic diagram of a third cross feature of a determination curve provided in the embodiment of the present application, and as shown in fig. 11C, another possible determination manner is to determine, for any two axes of sensing data, such as an X axis and a Y axis, whether an acceleration value in the X direction is greater than an acceleration value in the Y axis at a first time, and at a second time, the acceleration value in the X direction is less than the acceleration value in the Y axis, where the first time and the second time are any two times within a preset time period, and a sequence of the first time and the second time is not limited. When it is satisfied that the acceleration value in the X direction is greater than the acceleration value in the Y axis at the first time and the acceleration value in the X direction is less than the acceleration value in the Y axis at the second time, it may be determined that the X-axis curve and the Y-axis curve have an intersection point, and the time corresponding to the intersection point is between the first time and the second time.
The judgment of whether the X-axis curve and the Z-axis curve intersect and whether the Y-axis curve and the Z-axis curve intersect follows the same process and is not repeated here. It should be noted that once the determination method corresponding to fig. 11A finds that two curves intersect, no further determination is needed; if no intersection is found by the method of fig. 11A, the determination can continue with the method of fig. 11B or fig. 11C, and the methods of fig. 11B and fig. 11C can each be used independently.
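For illustration, a minimal sketch of the cross-characteristic check under the readings of fig. 11A and fig. 11C, assuming the three axes are sampled at the same moments; the tolerance eps is an assumption.

```python
# Sketch of the cross-characteristic check: two axes intersect if their values
# are equal at some sampled moment (fig. 11A) or if one is larger at one moment
# and smaller at another (fig. 11C). Tolerances are illustrative.
from typing import Sequence

def curves_intersect(a: Sequence[float], b: Sequence[float], eps: float = 1e-6) -> bool:
    diffs = [ai - bi for ai, bi in zip(a, b)]
    if any(abs(d) <= eps for d in diffs):          # equal at some sampled moment (fig. 11A)
        return True
    # Greater at one moment and smaller at another (fig. 11C): the curves crossed in between.
    return any(d > eps for d in diffs) and any(d < -eps for d in diffs)

def has_cross_characteristic(x: Sequence[float], y: Sequence[float], z: Sequence[float]) -> bool:
    # Each curve must intersect the other two, i.e. every pair of curves intersects.
    return (curves_intersect(x, y)
            and curves_intersect(x, z)
            and curves_intersect(y, z))
```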
For the judgment that the number of the wave crests and the wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve are all smaller than the third threshold, the number of the wave crests and the wave troughs on each curve is judged first, and then the wave crests and the wave troughs are compared with the third threshold respectively. The determination of the number of peaks and valleys on the curve is illustrated below by an X-axis curve.
A peak refers to the maximum amplitude of a wave within one wavelength; taking a transverse wave as an example, the highest point of a protrusion is a peak, and conversely the lowest point of a depression is a trough. Peaks and troughs are defined for continuous curves; since discrete data are collected in this embodiment of the application, the peaks and troughs cannot be obtained directly.
Fig. 12 is a schematic diagram of determining a peak of a curve according to an embodiment of the present application. As shown in fig. 12, taking the sensing data corresponding to the X axis as an example, a peak can be determined as follows: for a certain point in the X-axis sensing data, judge whether the values of the M points before it are all smaller than its value and the values of the N points after it are also all smaller than its value, where M and N need to reach predetermined values. If this condition is satisfied, the point can be determined to be a peak. The reason M and N must reach the predetermined values is that the sensing data corresponding to the X axis may contain slight fluctuations; for example, if the values of the two points before a certain point are smaller than its value but the value of the third point before it is not, the point is considered to be a slight fluctuation rather than an effective peak.
The judgment of the trough is similar to the judgment of the peak, except that the values of M points before a certain point are all larger than the value of the point, and the values of N points after the point are all larger than the value of the point, which is not described herein again.
For example, assume that M and N are both 3. Taking fig. 12 as an example, there is a point B whose adjacent preceding and subsequent points both have values larger than the value of point B, and a point A whose preceding and subsequent points all have values smaller than the value of point A. Because only one point before point B has a value greater than that of point B, which is fewer than 3, point B is not regarded as a trough point but as a slight fluctuation of the curve. For point A, the values of the points before and after it are all smaller than the value of point A, so point A is regarded as a peak point, that is, an effective peak.
After the number of peaks and troughs within the preset time period is obtained according to the above method, it is compared with the third threshold. If the number of peaks and troughs exceeds the third threshold, the acceleration data of that direction axis have continuous peak and valley values; otherwise, they do not.
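For illustration, a minimal sketch of the peak/trough counting described with fig. 12, assuming M = N = 3 and an example third threshold; all values are assumptions.

```python
# Sketch of the peak/trough counting: a sample is a peak if the M samples before
# it and the N samples after it are all smaller, and a trough if they are all
# larger. M, N and the third threshold are illustrative assumptions.
from typing import Sequence

def count_peaks_and_troughs(values: Sequence[float], m: int = 3, n: int = 3) -> int:
    count = 0
    for i in range(m, len(values) - n):
        before = values[i - m:i]
        after = values[i + 1:i + 1 + n]
        is_peak = all(v < values[i] for v in before) and all(v < values[i] for v in after)
        is_trough = all(v > values[i] for v in before) and all(v > values[i] for v in after)
        if is_peak or is_trough:
            count += 1
    return count

def no_continuous_peaks_and_troughs(values: Sequence[float], third_threshold: int = 4) -> bool:
    # The condition holds when the number of peaks and troughs stays below the third threshold.
    return count_peaks_and_troughs(values) < third_threshold
```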
The judgment mode for the wave crest and the wave trough on each direction axis is similar to the above process, and is not described again here. The judgment of monotonicity of the curve will be explained below.
Fig. 13 is a schematic diagram of determining curve monotonicity provided by the embodiment of the present application, and as shown in fig. 13, a method for determining curve monotonicity adopted by the embodiment of the present application is that, if a ratio of a point, where a previous acceleration value is larger than an adjacent subsequent acceleration value, to a total number of points in a period of time exceeds a preset ratio, an acceleration curve in the period of time is considered to be monotonically decreasing; correspondingly, if the proportion of points with smaller acceleration value than the next adjacent acceleration value to the total number of points in a period of time exceeds a preset proportion, the acceleration curve in the period of time is considered to be monotonically increasing. The value of the preset proportion can be determined according to actual needs, and when the acceleration value is obtained, the preset proportion is usually less than 100% due to possible errors, for example, values such as 90% and 93% can be taken.
For example, the preset proportion is set to be 90%, in fig. 13, in the time period t1-t4, there are 21 points in total, where most of the points satisfy that the current point has a higher value than the previous point adjacent to the current point, and only the point C and the point D do not satisfy, then the proportion of the points satisfying the monotonic increase to the total number of points can be obtained:
A=[(21-2)/21]*100%=90.48%。
since 90.48% exceeds 90%, the acceleration curve in fig. 13 is monotonically increasing over the time period t1–t4.
The monotonically decreasing decision is similar to this and is not described here again.
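For illustration, a minimal sketch of the monotonicity check described with fig. 13, assuming a preset ratio of 90% computed over adjacent sample pairs; the exact counting convention may differ slightly from the worked example above.

```python
# Sketch of the monotonicity check: the curve is taken as monotonically
# increasing (or decreasing) when the share of adjacent pairs that increase
# (or decrease) reaches the preset ratio. The ratio is an assumption.
from typing import Sequence

def is_monotonic(values: Sequence[float], preset_ratio: float = 0.90) -> bool:
    if len(values) < 2:
        return True
    pairs = len(values) - 1
    increasing = sum(1 for a, b in zip(values, values[1:]) if b > a)
    decreasing = sum(1 for a, b in zip(values, values[1:]) if b < a)
    return (increasing / pairs) >= preset_ratio or (decreasing / pairs) >= preset_ratio
```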
The first preset condition is used to determine, according to the sensing data, whether the movement of the terminal device is a shaking movement or a non-shaking movement. Shaking movement and non-shaking movement are illustrated below with reference to fig. 14 to 18.
Fig. 14 is a schematic diagram of different types of answering calls provided in an embodiment of the present application, and as shown in fig. 14, a first way of answering a call is shown on the left side of fig. 14, where an ear is in contact with an upper end of a mobile phone screen, a lower end of the mobile phone screen is at a certain distance from a face of a user, and a vertical edge of the mobile phone screen forms an included angle with the face of the user. In the middle of fig. 14, there is a second way to answer a call, in which the ear is in contact with the upper end of the mobile phone screen, while the lower end of the mobile phone screen keeps a certain distance from the face of the user, and the vertical and horizontal edges of the mobile phone screen form a certain angle with the face of the user. On the right side of fig. 14, there is a third way to answer a call, in which the upper end of the mobile phone screen is in contact with the ear of the user, the lower end of the mobile phone screen is in contact with the face of the user, the mobile phone screen is tightly attached to the user, and the included angle between the mobile phone screen and the face of the user is small.
The following exemplifies scenes of shaking movement and non-shaking movement.
Fig. 15 is a schematic diagram of a curve of sensing data of non-shaking movement of a terminal device according to an embodiment of the present application, and as shown in fig. 15, the curve includes sensing data in three directions of an X axis, a Y axis, and a Z axis. Fig. 15 is acceleration data of each direction axis in a normal call receiving or hanging scene, and it can be known from the graph of fig. 15 that the X-axis curve intersects the Y-axis curve at time t1 and the Y-axis curve intersects the Z-axis curve at time t2, so that the sensing data of fig. 15 satisfies a condition that any one of the X-axis curve, the Y-axis curve, and the Z-axis curve has an intersection with the other two curves. In fig. 15, the number of peaks and the number of valleys are small, and the condition that there are no continuous peaks and valleys is satisfied. Within a certain time period, e.g. 50ms-75ms, monotonicity is fulfilled.
Fig. 16 is a first graph illustrating a sensing data curve of shaking movement of a terminal device according to an embodiment of the present application, and as shown in fig. 16, the first graph includes sensing data in three directions, namely, an X axis, a Y axis, and a Z axis. Fig. 16 is acceleration data of each direction axis in a scene that the mobile phone vertically shakes up and down along the ear, and the scene in fig. 16 may be various, for example, a specific scene such as going up and down stairs, running and the like may be used. According to the curve of fig. 16, the X-axis curve, the Y-axis curve and the Z-axis curve in fig. 16 are all non-intersecting in pairs, and the X-axis curve, the Y-axis curve and the Z-axis curve all have regular peaks and troughs.
Fig. 17 is a second schematic diagram of a sensing data curve of shaking movement of a terminal device according to an embodiment of the present application; as shown in fig. 17, it includes sensing data in the three directions of the X axis, the Y axis and the Z axis. Fig. 17 shows the acceleration data of each direction axis in a scene in which the mobile phone shakes at a large angle relative to the face. As can be seen from the curves in fig. 17, the X-axis, Y-axis and Z-axis curves all have regular peaks and troughs and present regular shaking.
Fig. 18 is a third schematic diagram of a sensing data curve of shaking movement of a terminal device according to an embodiment of the present application; as shown in fig. 18, it includes sensing data in the three directions of the X axis, the Y axis and the Z axis. Fig. 18 shows the acceleration data of each direction axis in a scene in which the mobile phone shakes at a small angle relative to the face. As can be seen from the curves in fig. 18, the X-axis, Y-axis and Z-axis curves have irregular peaks and troughs but exhibit regular shaking, and the three curves are pairwise non-intersecting.
By acquiring the sensing data of the terminal equipment in a preset time period and then judging whether the sensing data of the terminal equipment meets a first preset condition or not according to the method, when the sensing data of the terminal equipment meets the first preset condition, the movement of the terminal equipment is determined to be non-shaking movement, otherwise, when the sensing data of the terminal equipment does not meet the first preset condition, the movement of the terminal equipment is determined to be shaking movement.
When the movement of the terminal equipment is shaking movement, subsequent terminal screen control operation is not carried out; when the movement of the terminal device is a non-shaking movement, it is also necessary to determine whether the non-shaking movement is an effective movement, so as to determine whether the movement of the terminal device meets a false trigger state.
S83: perform the false-trigger determination to judge whether the movement of the terminal device is an effective movement; if so, execute S85; if not, execute S84.
Sensing data that meet the first preset condition indicate that the terminal device has performed a non-shaking movement, that is, a movement with a pick-up or put-down tendency. However, judging the first preset condition alone cannot avoid the problem of the screen flickering on and off during the call because of false triggering. If only the first preset condition were judged, the capacitance detection of the touch screen would be triggered whenever the sensing data meet it, so that a slight gesture during the call that merely resembles the defined pick-up or put-down type would be judged to be such an action, and the screen of the mobile phone would light up or go off for a moment.
Therefore, sensing data that meet the first preset condition cannot directly be judged to represent one effective movement; a false-trigger determination is also needed to judge whether the sensing data meeting the first preset condition meet a second preset condition. Here the effective movement includes moving the terminal device from the first position to the ear, where the distance between the first position and the ear is greater than the first threshold, and moving the terminal device from the ear to the second position, where the distance between the second position and the ear is greater than the first threshold. Judging whether the sensing data meeting the first preset condition meet the second preset condition is specifically as follows:
performing first processing on the sensing data meeting a first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets a second preset condition.
Specifically, the sensing data after the first processing is input into a preset model to obtain an output result of the preset model, and whether the sensing data after the first processing meets a second preset condition is judged according to the output result of the preset model, wherein the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is the sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and second sample output results, the second sample sensing data are sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output results are used for indicating the invalid movement.
The whitening processing and the down-sampling processing are two processing methods for data, the whitening processing is to remove redundant information of input data, and the down-sampling processing is a processing method for reducing the sampling rate of data to reduce the size of the data.
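For illustration, a minimal sketch of the first processing, assuming that whitening is approximated by per-axis standardization (zero mean, unit variance) and that down-sampling keeps every k-th frame; the actual whitening and down-sampling used with the preset model may differ.

```python
# Sketch of the "first processing": down-sampling followed by a simple whitening
# approximation. Both steps are assumptions about the exact preprocessing.
from typing import List, Sequence

def downsample(values: Sequence[float], keep_every: int = 2) -> List[float]:
    # Reduce the sampling rate by keeping every keep_every-th frame.
    return [v for i, v in enumerate(values) if i % keep_every == 0]

def whiten(values: Sequence[float]) -> List[float]:
    # Remove redundant scale/offset information: zero mean, unit variance per axis.
    if not values:
        return []
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    std = var ** 0.5 or 1.0
    return [(v - mean) / std for v in values]

def first_processing(axis_values: Sequence[float]) -> List[float]:
    return whiten(downsample(axis_values))
```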
The preset model is used to process the sensing data after whitening and down-sampling. Before the preset model is obtained, it needs to be trained. In this embodiment of the application, sensing data of the pick-up and put-down gestures are collected when calls are answered or hung up in different scenarios; when the user answers or hangs up a call, the mobile phone performs an effective movement, and the sensing data collected in this way are the forward (positive) data.
Meanwhile, the embodiment of the application also collects a plurality of scenes which can cause the fluctuation of the acceleration sensor under different scenes, such as answering a call by a user in the walking process, answering a call by a user in the running process, answering a call by a user in the vehicle taking process and the like, wherein the mobile phone executes invalid movement, and the collected sensing data is reverse data.
The method comprises the steps that multiple groups of first samples are formed according to forward data of multiple processed different scenes, multiple groups of second samples are formed according to reverse data of multiple processed different scenes, each group of first samples comprise acceleration values of the X-axis direction, the Y-axis direction and the Z-axis direction, wherein the acceleration values are acquired by an acceleration sensor when the mobile phone performs effective movement, and each group of second samples comprise acceleration values of the X-axis direction, the Y-axis direction and the Z-axis direction, the acceleration values are acquired by the acceleration sensor when the mobile phone performs ineffective movement.
Then, the multiple groups of first samples and the multiple groups of second samples are input into a neural network model for training. The neural network model performs binary classification on the input data, giving two possible output results: one output result judges that the movement corresponding to a group of sample data is an effective movement, and the other judges that the movement corresponding to a group of sample data is an ineffective movement.
The output of the model is calculated as follows:

O_j = 1 / (1 + e^(−(Σ_i w_{i,j} · O_i + θ_j)))

where O_j is the hidden-layer output obtained through this calculation after an input sample is fed in, w_{i,j} represents the weight between node i and node j, and θ_j represents the bias of node j.
The updating method of the weight is as follows:
w_{i,j} = w_{i,j} + μ · Err_j · O_i
where μ denotes the learning rate of the neural network model, O_i represents the actual output of node i for the input sample, and Err_j represents the error between the training result and the actual value. When Err_j is large, the output result differs greatly from the actual result; the error is then propagated backwards and the weights are updated. After continuous training, when the prediction error becomes smaller than a certain threshold, the model training is considered complete and training is exited.
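For illustration, a minimal sketch of the node output and weight update corresponding to the two formulas above, for a single fully connected node; the learning rate and shapes are assumptions, not the trained preset model.

```python
# Sketch of a single node's output (sigmoid of the weighted sum plus bias) and
# of the weight update rule; mu and the shapes are illustrative assumptions.
import math
from typing import List

def node_output(inputs: List[float], weights: List[float], theta: float) -> float:
    # O_j = 1 / (1 + e^-(sum_i w_ij * O_i + theta_j))
    net = sum(w * o for w, o in zip(weights, inputs)) + theta
    return 1.0 / (1.0 + math.exp(-net))

def update_weights(weights: List[float], inputs: List[float],
                   err_j: float, mu: float = 0.1) -> List[float]:
    # w_ij = w_ij + mu * Err_j * O_i
    return [w + mu * err_j * o for w, o in zip(weights, inputs)]
```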
S84: the screen of the terminal device remains in its original on or off state.
And when the movement of the terminal equipment is determined to be one-time ineffective movement, the screen of the terminal equipment maintains the original lighting or extinguishing state.
S85: start capacitance value detection.
After it is judged from the collected sensing data that the movement of the terminal device is one effective movement, capacitance value detection is carried out to judge whether the effective movement is of the normal pick-up type or the normal put-down type.
Capacitance value detection is performed for each area on the screen of the terminal device. The screen of the terminal device is described below with reference to fig. 19, taking the terminal device as a mobile phone as an example. Fig. 19 is a schematic view of a mobile phone screen provided in an embodiment of the present application. As shown in fig. 19, the screen is divided into a number of small areas; one possible division is into several rows and several columns, where the row heights may be equal or unequal and the column widths may be equal or unequal. When the mobile phone is placed horizontally, R0 on the right is the first column, T0 is the first row, and Tn is the nth row. When no user touches the screen, the capacitance values of the areas differ little from one another; when the user touches the screen, the capacitance value of the touched part is higher than that of the surrounding areas. In this way it can be determined whether the effective movement is of the pick-up type or the put-down type.
The process of capacitance detection will be described below with reference to fig. 20 to 23.
When the capacitance value of each area in the screen of the terminal device meets at least one of the third preset conditions, the type of the effective movement is determined to be the pick-up type; otherwise it is determined to be the put-down type. The third preset conditions include the following (a combined check over the capacitance grid is sketched after this list):
the area of a region in the screen of the terminal device whose capacitance value is greater than a first capacitance value is larger than a first area; or,
the capacitance value of a first region in the terminal device is greater than a second capacitance value, where the first region is a region of a first preset size where the earpiece of the terminal device is located; or,
the average capacitance value of a second region in the terminal device is N times the average capacitance value of a third region, where N is greater than 1, the second region is a region of a second preset size where the earpiece of the terminal device is located, and the third region is the area of the screen other than the second region, or a part of that area; or,
a fourth region of the terminal device contains a plurality of sub-regions whose capacitance values are greater than a third capacitance value, where the fourth region is a region of a third preset size where the earpiece of the terminal device is located.
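For illustration, a minimal sketch of a combined check of the third preset conditions over the capacitance grid, assuming a single earpiece region stands in for the first, second and third preset-size regions and that all thresholds (K1, K2, K3, the first area, N) are supplied by the caller; these simplifications are assumptions, not the patent's exact regions.

```python
# Sketch of the third preset conditions over a capacitance grid; the grid is a
# simple list of (row, column, capacitance) cells and all thresholds are
# illustrative parameters supplied by the caller.
from typing import List, Set, Tuple

Cell = Tuple[int, int, float]   # (row, column, capacitance value)

def is_pick_up(cells: List[Cell],
               earpiece_region: Set[Tuple[int, int]],
               k1: float, k2: float, k3: float,
               first_area: int, n_ratio: float = 3.0) -> bool:
    # Condition 1: the area (cell count) with capacitance above K1 exceeds the first area.
    if sum(1 for _, _, c in cells if c > k1) > first_area:
        return True
    ear_cells = [c for r, col, c in cells if (r, col) in earpiece_region]
    other_cells = [c for r, col, c in cells if (r, col) not in earpiece_region]
    if not ear_cells or not other_cells:
        return False
    # Condition 2: the capacitance of the earpiece region exceeds K2.
    if max(ear_cells) > k2:
        return True
    # Condition 3: the earpiece region's average capacitance is N times that of the rest.
    if (sum(ear_cells) / len(ear_cells)) > n_ratio * (sum(other_cells) / len(other_cells)):
        return True
    # Condition 4: several sub-areas inside the earpiece region exceed K3.
    if sum(1 for c in ear_cells if c > k3) >= 2:
        return True
    return False
```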
Fig. 20 is a first schematic view of capacitance proximity detection provided in the embodiment of the present application. As shown in fig. 20, in this call answering scenario the user brings the mobile phone to the ear, and the user's face makes large-area contact with the screen, the contact area being larger than the first area. Fig. 20 shows the contact region 201 between the user's face and the screen of the mobile phone 200; on the screen of the mobile phone 200 each cell is one area, and each area has a capacitance value. When no object is in contact with the screen, the capacitance value of each area is small. When the user's face contacts the screen and produces the contact region 201, if it is determined that the area of the contact region 201 is larger than the first area and its capacitance value is greater than the first capacitance value K1, the movement is considered to be of the normal pick-up type. The value of K1 may or may not be fixed: when the capacitance values of the other areas of the screen do not change, K1 can take a value larger than those capacitance values, and when the capacitance values of the other areas change with the environment, K1 needs to change accordingly.
Fig. 21 is a second schematic view of capacitance proximity detection provided in the embodiment of the present application. As shown in fig. 21, in this call answering scenario the user's ear contacts the screen of the mobile phone, and the contact region is the first region 211, that is, the region 213 of the first preset size where the earpiece 212 of the mobile phone is located. Because the first region 211 is in contact with the user's ear, its capacitance value is higher than that of the regions not in contact with the user, whose capacitance is the second capacitance value. If the capacitance value of the first region 211 is greater than the second capacitance value K2, the movement is considered to be of the normal pick-up type.
Another possible determination is made when the average capacitance value of the second region in the terminal device is N times the average capacitance value of the third region, where N is greater than 1, the second region is the region of the second preset size where the earpiece of the terminal device is located, and the third region is the area of the screen other than the second region, or a part of that area; in this case the movement is likewise considered to be of the normal pick-up type.
Fig. 22 is a third schematic view of capacitance proximity detection provided in the embodiment of the present application. As shown in fig. 22, in this call answering scenario both the ear and the face of the user contact the screen of the mobile phone, so the ear and the screen form a first contact region 221 and the face and the screen form a second contact region 222, that is, two contact regions. When the capacitance values of these two regions are higher than those of the other regions, that is, when they reach a certain threshold, the movement is considered to be of the normal pick-up type.
Fig. 23 is a fourth schematic view of capacitance proximity detection provided in the embodiment of the present application. As shown in fig. 23, in this scenario the ear of the user contacts the screen of the mobile phone, but, unlike the above scenarios, because users' ears differ, in practice the contact region of some users' ears with the screen is partially discontinuous. The region in contact with the screen then consists of a plurality of sub-regions, the capacitance value of each sub-region is greater than a third capacitance value K3, and the sub-regions are all located in a fourth region, where the fourth region is a region of a third preset size where the earpiece of the mobile phone is located.
It should be noted that the values of K1, K2 and K3 are not constant, because the capacitance values of the mobile phone screen change with the environment; for example, temperature, humidity or interference can all affect the capacitance values of the screen. In the same environment, however, the capacitance values of the regions of the screen that are not touched by the user differ only slightly, and the difference stays within a certain range. When the capacitance value of a certain region is detected to be obviously higher than that of the other regions of the screen, it indicates that the user is in contact with that region. Therefore, the values of K1, K2 and K3 are determined according to the actual situation. Further, a capacitance difference between a contact region and the other regions may also be set, and when the difference reaches a certain degree, the region is considered to be touched by the user.
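As one way to illustrate this environment-dependent choice of K1, K2 and K3, the minimal sketch below derives a contact threshold from the current readings of the untouched regions. It is only a sketch under assumed conventions: the embodiment does not specify any formula, and the grid representation, the `margin` parameter and the median-based baseline are all illustrative choices.

```python
import numpy as np

def adaptive_contact_threshold(cap_grid, margin=3.0):
    """Derive a contact threshold (e.g. a value playing the role of K1)
    relative to the current baseline of the screen.

    cap_grid: 2-D array of per-cell capacitance readings for the whole screen.
    margin:   how far above the untouched baseline (in robust spread units)
              a cell must be before it is treated as "in contact".
    """
    baseline = np.median(cap_grid)                    # untouched cells dominate the median
    spread = np.median(np.abs(cap_grid - baseline))   # robust spread (MAD)
    return baseline + margin * max(spread, 1e-6)

def contact_mask(cap_grid, margin=3.0):
    """Boolean mask of cells whose capacitance is clearly above the baseline."""
    return cap_grid > adaptive_contact_threshold(cap_grid, margin)
```

A cell whose reading exceeds this adaptive threshold is then treated the same way as a capacitance value exceeding K1 in the scenarios above.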
It should be noted that, even if the capacitance value of a region is high, the movement is not determined to be an approach movement when the region does not satisfy the above conditions; this is described below with reference to fig. 24. Fig. 24 is a schematic diagram of capacitance proximity detection provided in the embodiment of the present application. As shown in fig. 24, after it is determined that the movement of the mobile phone 240 is an effective movement, the user clicks the screen of the mobile phone with a finger. At this time there is a contact region 242 between the finger and the screen, and the capacitance value of the contact region 242 is higher than that of the other regions of the screen. However, the area of the contact region 242 does not reach the first area, and the contact region 242 is not in the first region, that is, it is not in the region 241 of the first preset size where the earpiece 243 of the mobile phone 240 is located. Therefore, even though the capacitance value of the contact region 242 is higher, this contact is not considered an approach operation, and the screen-off operation is not performed.
S86, judging whether the capacitance value reaches the threshold, if yes, executing S87, and if not, executing S84.
When the capacitance value of the terminal screen meets a third preset condition, that is, when at least one of the following conditions is met, the capacitance value is considered to reach the threshold:
the area of a region of the screen of the terminal equipment whose capacitance value is larger than the first capacitance value is larger than the first area; or,
the capacitance value of a first area in the terminal equipment is larger than a second capacitance value, and the first area is an area of a first preset size where the earpiece of the terminal equipment is located; or,
the average capacitance value of a second area in the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the earpiece of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a partial area of the area other than the second area; or,
the fourth area of the terminal device comprises a plurality of sub-areas whose capacitance values are larger than the third capacitance value, and the fourth area is an area of a third preset size where the earpiece of the terminal device is located.
Otherwise, the capacitance value is considered not to have reached the threshold.
And S87, the screen of the terminal equipment is turned on or off.
When the capacitance value reaches the threshold, the operation of turning off the screen is executed; otherwise, the operation of lighting up the screen is executed.
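To make the four alternatives of the third preset condition concrete, here is a minimal sketch of how S86/S87 could be evaluated on a per-cell capacitance grid. The grid/mask representation, the thresholds k1/k2/k3, the cell-count interpretation of "area", the choice of two sub-cells as "a plurality", and the reuse of a single earpiece mask for the first, second and fourth areas are all simplifying assumptions of this sketch, not details fixed by the embodiment.

```python
import numpy as np

def meets_third_preset_condition(cap_grid, earpiece_mask, k1, k2, k3,
                                 first_area_cells, n_ratio=2.0):
    """True if at least one of the four capacitance conditions of S86 holds.

    cap_grid:         2-D array of per-cell capacitance values of the screen.
    earpiece_mask:    boolean array of the same shape marking the preset-size
                      area around the earpiece (reused here for the first,
                      second and fourth areas as a simplification).
    first_area_cells: the "first area" expressed as a number of cells.
    """
    inside = cap_grid[earpiece_mask]
    outside = cap_grid[~earpiece_mask]

    # 1) the region with capacitance above K1 is larger than the first area
    if np.count_nonzero(cap_grid > k1) > first_area_cells:
        return True
    # 2) the capacitance of the earpiece area exceeds the second capacitance value K2
    if inside.size and inside.mean() > k2:
        return True
    # 3) the earpiece area averages N times the average of the rest of the screen
    if inside.size and outside.size and inside.mean() > n_ratio * outside.mean():
        return True
    # 4) several sub-regions inside the earpiece area exceed K3
    if np.count_nonzero(inside > k3) >= 2:
        return True
    return False

def step_s87(cap_grid, earpiece_mask, **thresholds):
    """S87: turn the screen off for a pick-up, light it up for a put-down."""
    if meets_third_preset_condition(cap_grid, earpiece_mask, **thresholds):
        return "turn_off_screen"
    return "light_up_screen"
```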
According to the terminal control method provided by the embodiment of the application, when the terminal equipment is in the call answering state, the acceleration sensor in the terminal equipment can be started to collect sensing data within a preset time period, so that the acceleration data of the terminal equipment in the preset time period is obtained, and whether the movement of the terminal equipment in the preset time period is an effective movement is then judged by judging whether the acceleration data meets the preset condition. When the movement of the terminal equipment in the preset time period is an effective movement, the capacitance value of each area in the screen of the terminal equipment is acquired, it is further judged whether the effective movement is an approach movement or a departure movement, and the type of the effective movement is then used to control the lighting up or turning off of the screen of the terminal equipment. Meanwhile, whether the movement of the terminal equipment in the preset time period is a shaking movement is judged by judging whether the collected sensing data meets the first preset condition; when the movement is determined to be a shaking movement, the capacitance values of the screen areas are not judged; when the movement is determined to be a non-shaking movement, whether the non-shaking movement is an effective movement is further judged through the second preset condition; only in the case of an effective movement are the capacitance values of the screen areas judged and the screen of the terminal equipment lit up or turned off according to the acquired capacitance values. This avoids the problem that the screen capacitance values are detected upon a slight gesture shake and the screen of the terminal equipment is wrongly lit up or turned off.
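For orientation, the sketch below strings these stages together for one sensing window. The helper callables are placeholders for the checks described in this document (they are not named by the embodiment), and the dict-of-callables structure and the names `screen_on`/`screen_off` are purely illustrative.

```python
def on_accel_window(accel_xyz, checks, read_capacitance_grid, screen_on, screen_off):
    """Rough end-to-end flow for one sensing window (illustrative only).

    checks is a dict of callables standing in for the steps described above:
      'non_shaking'   - first preset condition on the raw window
      'preprocess'    - whitening + down-sampling ("first processing")
      'effective'     - second preset condition (the preset model)
      'cap_threshold' - third preset condition on the capacitance grid
    """
    if not checks['non_shaking'](accel_xyz):       # first preset condition
        return                                      # shaking movement: ignore
    features = checks['preprocess'](accel_xyz)      # first processing
    if not checks['effective'](features):           # second preset condition
        return                                      # slight gesture: do nothing
    cap_grid = read_capacitance_grid()              # read per-cell capacitance only now
    if checks['cap_threshold'](cap_grid):           # third preset condition: pick-up
        screen_off()
    else:                                           # put-down
        screen_on()
```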
Fig. 25 is a schematic structural diagram of a terminal control device according to an embodiment of the present application, and as shown in fig. 25, the terminal control device includes an obtaining module 251, a processing module 252, and a control module 253, where:
the obtaining module 251 is configured to obtain sensing data acquired by the acceleration sensor within a preset time period when the terminal device is in an incoming call answering state;
the processing module 252 is configured to, when it is determined that the sensing data meets a preset condition, obtain the capacitance value of each area in a screen of the terminal device, where the preset condition is a condition that the sensing data collected by the acceleration sensor meets when the terminal device executes an effective movement in which the movement distance is greater than a first threshold and the movement speed is greater than a second threshold;
the control module 253 is configured to light up the screen of the terminal device or extinguish the screen of the terminal device according to the capacitance value of each area in the screen of the terminal device.
In a possible implementation manner, the preset conditions include a first preset condition and a second preset condition, and the processing module 252 is specifically configured to:
judging whether the sensing data meet the first preset condition or not, wherein the first preset condition is a condition met by the sensing data acquired by the acceleration sensor when the terminal equipment executes non-shaking movement;
if so, judging whether the sensing data meeting the first preset condition meets the second preset condition, wherein the second preset condition is a condition met by the sensing data acquired by the acceleration sensor when the non-shaking movement is the effective movement;
and if so, acquiring the capacitance value of each area in the screen of the terminal equipment.
In one possible embodiment, the acceleration sensor includes an X-axis acceleration sensor, a Y-axis acceleration sensor, and a Z-axis acceleration sensor; the first preset condition comprises the following conditions:
any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection point with the other two curves;
the number of wave crests and wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is less than a third threshold value;
the X-axis curve, the Y-axis curve and the Z-axis curve all have monotonic characteristics;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor in the preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor in the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor in the preset time period.
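A minimal sketch of how this first preset condition might be checked on the three acceleration curves is given below. The embodiment lists the conditions but not the numerics, so treating the three items as jointly required, counting peaks and troughs across all three curves together, and the tolerance used for the "monotonic characteristic" are assumptions of this sketch.

```python
import numpy as np

def curves_intersect(a, b):
    """True if curves a and b cross or touch at least once."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return bool(np.any(np.signbit(d[:-1]) != np.signbit(d[1:])) or np.any(d == 0))

def count_extrema(curve):
    """Number of peaks plus troughs: direction changes of the first difference."""
    diff = np.diff(np.asarray(curve, dtype=float))
    diff = diff[diff != 0]                          # ignore flat segments
    return int(np.count_nonzero(np.signbit(diff[:-1]) != np.signbit(diff[1:])))

def is_roughly_monotonic(curve, tol=0.8):
    """Monotonic characteristic: most steps share one direction (assumed tolerance)."""
    diff = np.diff(np.asarray(curve, dtype=float))
    if diff.size == 0:
        return True
    frac_up = np.mean(diff >= 0)
    return frac_up >= tol or frac_up <= 1 - tol

def meets_first_preset_condition(x, y, z, third_threshold=3):
    """Non-shaking check over one preset time window of X/Y/Z acceleration curves."""
    # any one curve has an intersection point with the other two curves
    pairwise = (curves_intersect(x, y) and curves_intersect(x, z)) or \
               (curves_intersect(y, x) and curves_intersect(y, z)) or \
               (curves_intersect(z, x) and curves_intersect(z, y))
    # the number of wave crests and troughs is less than the third threshold
    few_extrema = (count_extrema(x) + count_extrema(y) + count_extrema(z)) < third_threshold
    # all three curves have a monotonic characteristic
    monotonic = all(is_roughly_monotonic(c) for c in (x, y, z))
    return pairwise and few_extrema and monotonic
```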
In a possible implementation, the processing module 252 is specifically configured to:
performing first processing on the sensing data meeting the first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets the second preset condition.
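The "first processing" is described only as whitening plus down-sampling; a minimal sketch under common interpretations of those terms (per-axis standardisation and simple decimation, with an assumed factor of 4) could look like this.

```python
import numpy as np

def whiten(x, eps=1e-8):
    """Zero-mean, unit-variance normalisation per axis (one common reading of
    'whitening'; the embodiment does not fix the exact transform)."""
    x = np.asarray(x, dtype=float)                 # shape (N, 3): X/Y/Z samples
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

def downsample(x, factor=4):
    """Keep every `factor`-th sample to reduce the rate fed to the classifier."""
    return np.asarray(x)[::factor]

def first_processing(accel_xyz, factor=4):
    """'First processing' of the sensing data: whitening then down-sampling."""
    return downsample(whiten(accel_xyz), factor)
```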
In a possible implementation, the processing module 252 is specifically configured to:
inputting the sensing data subjected to the first processing into a preset model to obtain an output result of the preset model, and judging whether the sensing data subjected to the first processing meets the second preset condition or not according to the output result of the preset model;
the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is sensing data acquired by an acceleration sensor in sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and a second sample output result, the second sample sensing data is sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output result is used for indicating the invalid movement.
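The embodiment leaves the form of the preset model open; as one possible realisation, the sketch below trains an off-the-shelf logistic-regression classifier on flattened, first-processed windows labelled with the first-sample / second-sample scheme described above. The model choice, the flattening of equal-length windows into fixed-size feature vectors and the function names are assumptions of this sketch.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_training_set(first_samples, second_samples):
    """first_samples: processed windows recorded during effective movements (label 1);
    second_samples: processed windows recorded during ineffective movements (label 0).
    All windows are assumed to have the same shape after the first processing."""
    X = [np.ravel(w) for w in first_samples] + [np.ravel(w) for w in second_samples]
    y = [1] * len(first_samples) + [0] * len(second_samples)
    return np.array(X), np.array(y)

def train_preset_model(first_samples, second_samples):
    """Learn the preset model from the two groups of samples."""
    X, y = make_training_set(first_samples, second_samples)
    return LogisticRegression(max_iter=1000).fit(X, y)

def meets_second_preset_condition(model, processed_window):
    """The model's output indicates whether the non-shaking movement is effective."""
    features = np.ravel(processed_window).reshape(1, -1)
    return bool(model.predict(features)[0])
```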
In one possible embodiment, the effective movement comprises:
moving the terminal device from a first position to an ear, a distance between the first position and the ear being greater than the first threshold; and,
moving the terminal device from an ear to a second position, the distance between the second position and the ear being greater than the first threshold.
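Read literally, an effective movement is one whose distance exceeds the first threshold and whose speed exceeds the second threshold. Purely to illustrate those two thresholds (the embodiment itself decides this via the preset conditions and the preset model rather than by explicit integration), a rough estimate from one acceleration window might look as follows; the gravity-free input, the sampling step dt and the path-length approximation are assumptions of this sketch.

```python
import numpy as np

def movement_metrics(accel_xyz, dt):
    """Rough distance and peak speed from one acceleration window by numerical
    integration (gravity assumed already removed; sensor drift ignored)."""
    a = np.asarray(accel_xyz, dtype=float)    # shape (N, 3), m/s^2
    v = np.cumsum(a, axis=0) * dt             # velocity per axis, m/s
    speed = np.linalg.norm(v, axis=1)         # instantaneous speed
    distance = np.sum(speed) * dt             # approximate path length travelled
    return distance, speed.max()

def exceeds_movement_thresholds(accel_xyz, dt, first_threshold, second_threshold):
    """Moved far enough and fast enough within the window."""
    distance, peak_speed = movement_metrics(accel_xyz, dt)
    return distance > first_threshold and peak_speed > second_threshold
```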
In a possible implementation, the control module 253 is specifically configured to:
determining an effective movement type of the terminal equipment according to the capacitance value of each area in the screen of the terminal equipment, wherein the effective movement type is a pick-up type or a put-down type;
when the effective movement type of the terminal equipment is determined to be the pick-up type, turning off a screen of the terminal equipment;
and when the effective movement type of the terminal equipment is determined to be the put-down type, lightening a screen of the terminal equipment.
In a possible implementation, the control module 253 is specifically configured to:
judging whether the capacitance value of each area in the screen of the terminal equipment meets at least one of third preset conditions, if so, determining that the effective movement type is the pick-up type, and if not, determining that the effective movement type is the put-down type;
wherein the third preset condition includes at least one of:
the area of a region of the screen of the terminal equipment whose capacitance value is larger than the first capacitance value is larger than the first area; or,
the capacitance value of a first area in the terminal equipment is larger than a second capacitance value, and the first area is an area of a first preset size where the earpiece of the terminal equipment is located; or,
the average capacitance value of a second area in the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the earpiece of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a partial area of the area other than the second area; or,
the fourth area of the terminal device comprises a plurality of sub-areas whose capacitance values are larger than the third capacitance value, and the fourth area is an area of a third preset size where the earpiece of the terminal device is located.
The terminal control device provided in the embodiment of the present application may execute the technical solutions shown in the above method embodiments, and the implementation principles and beneficial effects thereof are similar, and are not described herein again.
Fig. 26 is a schematic diagram of a hardware structure of a terminal device provided in an embodiment of the present application, and as shown in fig. 26, the terminal device in the embodiment of the present application includes: a processor 261 and a memory 262; wherein
A memory 262 for storing computer-executable instructions;
the processor 261 is configured to execute the computer-executable instructions stored in the memory to implement the steps performed by the terminal control method in the foregoing embodiments. Reference may be made in particular to the description relating to the method embodiments described above.
Alternatively, the memory 262 may be separate or integrated with the processor 261.
When the memory 262 is separately provided, the terminal device further includes a bus 263 for connecting the memory 262 and the processor 261.
Optionally, the processor may be a Central Processing Unit (CPU), or may be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in the embodiments of the present application may be directly implemented by a hardware processor, or implemented by a combination of hardware and software modules in the processor.
An embodiment of the present application further provides a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the terminal control method executed by the above terminal device is implemented.
An embodiment of the present application provides a computer program product, which includes instructions, and when the instructions are executed, the instructions cause a computer to execute the terminal control method.
The embodiment of the present application provides a system on chip or a system chip, where the system on chip or the system chip can be applied to a terminal device, and the system on chip or the system chip includes: at least one communication interface, at least one processor and at least one memory, wherein the communication interface, the memory and the processor are interconnected through a bus, and the processor executes instructions stored in the memory to enable the terminal device to execute the terminal control method.
All or a portion of the steps of implementing the above-described method embodiments may be performed by hardware associated with program instructions. The aforementioned program may be stored in a readable memory. When executed, the program performs steps comprising the method embodiments described above; and the aforementioned memory (storage medium) includes: read-only memory (ROM), RAM, flash memory, hard disk, solid state disk, magnetic tape (magnetic tape), floppy disk (floppy disk), optical disk (optical disk), and any combination thereof.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the embodiments of the present application, the terms "include" and variations thereof may mean non-limiting inclusion; the term "or" and variations thereof may mean "and/or". The terms "first," "second," and the like in the embodiments of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. In the embodiments of the present application, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
It will be apparent to those skilled in the art that various changes and modifications may be made in the embodiments of the present application without departing from the spirit and scope of the embodiments of the present application. Thus, if such modifications and variations of the embodiments of the present application fall within the scope of the claims of the embodiments of the present application and their equivalents, the embodiments of the present application are intended to include such modifications and variations as well.

Claims (18)

1. A terminal control method, applied to a terminal device, wherein the terminal device comprises an acceleration sensor, and the method comprises the following steps:
when the terminal equipment is in a call answering state, acquiring sensing data acquired by the acceleration sensor within a preset time period;
when judging that the sensing data meet preset conditions, acquiring the capacitance value of each area in a screen of the terminal equipment, wherein the preset conditions are the conditions met by the sensing data acquired by the acceleration sensor when the terminal equipment executes an effective movement with the movement distance larger than a first threshold and the movement speed larger than a second threshold;
and lightening or extinguishing the screen of the terminal equipment according to the capacitance value of each area in the screen of the terminal equipment.
2. The method according to claim 1, wherein the preset conditions include a first preset condition and a second preset condition, and the obtaining the volume value of each area in the screen of the terminal device when it is determined that the sensing data meets the preset conditions includes:
judging whether the sensing data meet the first preset condition or not, wherein the first preset condition is a condition met by the sensing data acquired by the acceleration sensor when the terminal equipment executes non-shaking movement;
if so, judging whether the sensing data meeting the first preset condition meets the second preset condition, wherein the second preset condition is a condition met by the sensing data acquired by the acceleration sensor when the non-shaking movement is the effective movement;
and if so, acquiring the capacitance value of each area in the screen of the terminal equipment.
3. The method of claim 2, wherein the acceleration sensors comprise an X-axis acceleration sensor, a Y-axis acceleration sensor, and a Z-axis acceleration sensor; the first preset condition comprises the following conditions:
any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection point with the other two curves;
the number of wave crests and wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is less than a third threshold value;
the X-axis curve, the Y-axis curve and the Z-axis curve all have monotonic characteristics;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor in the preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor in the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor in the preset time period.
4. The method according to claim 2 or 3, wherein the determining whether the sensing data meeting the first preset condition meets the second preset condition comprises:
performing first processing on the sensing data meeting the first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets the second preset condition.
5. The method according to claim 4, wherein the determining whether the first processed sensing data meets the second preset condition includes:
inputting the sensing data subjected to the first processing into a preset model to obtain an output result of the preset model, and judging whether the sensing data subjected to the first processing meets the second preset condition or not according to the output result of the preset model;
the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is sensing data acquired by an acceleration sensor in sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and a second sample output result, the second sample sensing data is sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output result is used for indicating the invalid movement.
6. The method according to any of claims 1-5, wherein the effective movement comprises:
moving the terminal device from a first position to an ear, a distance between the first position and the ear being greater than the first threshold; and,
moving the terminal device from an ear to a second position, the distance between the second position and the ear being greater than the first threshold.
7. The method according to any one of claims 1 to 5, wherein lighting up the screen of the terminal device or turning off the screen of the terminal device according to the capacitance value of each region in the screen of the terminal device comprises:
determining an effective movement type of the terminal equipment according to the capacitance value of each area in the screen of the terminal equipment, wherein the effective movement type is a pick-up type or a put-down type;
when the effective movement type of the terminal equipment is determined to be the pick-up type, turning off a screen of the terminal equipment;
and when the effective movement type of the terminal equipment is determined to be the put-down type, lightening a screen of the terminal equipment.
8. The method according to claim 7, wherein determining the effective movement type of the terminal device according to the capacitance value of each area in the screen of the terminal device comprises:
judging whether the capacitance value of each area in the screen of the terminal equipment meets at least one of third preset conditions, if so, determining that the effective movement type is the pick-up type, and if not, determining that the effective movement type is the put-down type;
wherein the third preset condition includes at least one of:
the area of a region of the screen of the terminal equipment whose capacitance value is larger than the first capacitance value is larger than the first area; or,
the capacitance value of a first area in the terminal equipment is larger than a second capacitance value, and the first area is an area of a first preset size where the earpiece of the terminal equipment is located; or,
the average capacitance value of a second area in the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the earpiece of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a partial area of the area other than the second area; or,
the fourth area of the terminal device comprises a plurality of sub-areas whose capacitance values are larger than the third capacitance value, and the fourth area is an area of a third preset size where the earpiece of the terminal device is located.
9. A terminal control apparatus, comprising:
the acquisition module is used for acquiring sensing data acquired by the acceleration sensor within a preset time period when the terminal equipment is in a call answering state;
the processing module is used for acquiring the capacitance value of each area in the screen of the terminal equipment when judging that the sensing data meet preset conditions, wherein the preset conditions are the conditions met by the sensing data acquired by the acceleration sensor when the terminal equipment executes an effective movement with the movement distance larger than a first threshold and the movement speed larger than a second threshold;
and the control module is used for lightening the screen of the terminal equipment or extinguishing the screen of the terminal equipment according to the capacitance value of each area in the screen of the terminal equipment.
10. The apparatus according to claim 9, wherein the preset condition includes a first preset condition and a second preset condition, and the processing module is specifically configured to:
judging whether the sensing data meet the first preset condition or not, wherein the first preset condition is a condition met by the sensing data acquired by the acceleration sensor when the terminal equipment executes non-shaking movement;
if so, judging whether the sensing data meeting the first preset condition meets the second preset condition, wherein the second preset condition is a condition met by the sensing data acquired by the acceleration sensor when the non-shaking movement is the effective movement;
and if so, acquiring the capacitance value of each area in the screen of the terminal equipment.
11. The apparatus of claim 10, wherein the acceleration sensor comprises an X-axis acceleration sensor, a Y-axis acceleration sensor, and a Z-axis acceleration sensor; the first preset condition comprises the following conditions:
any one of the X-axis curve, the Y-axis curve and the Z-axis curve has an intersection point with the other two curves;
the number of wave crests and wave troughs in the X-axis curve, the Y-axis curve and the Z-axis curve is less than a third threshold value;
the X-axis curve, the Y-axis curve and the Z-axis curve all have monotonic characteristics;
the X-axis curve is formed by sensing data collected by the X-axis acceleration sensor in the preset time period, the Y-axis curve is formed by sensing data collected by the Y-axis acceleration sensor in the preset time period, and the Z-axis curve is formed by sensing data collected by the Z-axis acceleration sensor in the preset time period.
12. The apparatus according to claim 10 or 11, wherein the processing module is specifically configured to:
performing first processing on the sensing data meeting the first preset condition, wherein the first processing comprises whitening processing and down-sampling processing;
and judging whether the sensing data subjected to the first processing meets the second preset condition.
13. The apparatus of claim 12, wherein the processing module is specifically configured to:
inputting the sensing data subjected to the first processing into a preset model to obtain an output result of the preset model, and judging whether the sensing data subjected to the first processing meets the second preset condition or not according to the output result of the preset model;
the preset model is obtained by learning multiple groups of first samples and multiple groups of second samples, each group of first samples comprises first sample sensing data and a first sample output result, the first sample sensing data is sensing data acquired by an acceleration sensor in sample terminal equipment when the sample terminal equipment executes the effective movement, and the first sample output result is used for indicating the effective movement; each group of second samples comprises second sample sensing data and a second sample output result, the second sample sensing data is sensing data acquired by an acceleration sensor in the sample terminal equipment when the sample terminal equipment executes invalid movement, and the second sample output result is used for indicating the invalid movement.
14. The apparatus of any of claims 9-13, wherein the effective movement comprises:
moving the terminal device from a first position to an ear, a distance between the first position and the ear being greater than the first threshold; and,
moving the terminal device from an ear to a second position, the distance between the second position and the ear being greater than the first threshold.
15. The apparatus according to any one of claims 9-13, wherein the control module is specifically configured to:
determining an effective movement type of the terminal equipment according to the capacitance value of each area in the screen of the terminal equipment, wherein the effective movement type is a pick-up type or a put-down type;
when the effective movement type of the terminal equipment is determined to be the pick-up type, turning off a screen of the terminal equipment;
and when the effective movement type of the terminal equipment is determined to be the put-down type, lightening a screen of the terminal equipment.
16. The apparatus of claim 15, wherein the control module is specifically configured to:
judging whether the capacitance value of each area in the screen of the terminal equipment meets at least one of third preset conditions, if so, determining that the effective movement type is the pick-up type, and if not, determining that the effective movement type is the put-down type;
wherein the third preset condition includes at least one of:
the area of a region of the screen of the terminal equipment whose capacitance value is larger than the first capacitance value is larger than the first area; or,
the capacitance value of a first area in the terminal equipment is larger than a second capacitance value, and the first area is an area of a first preset size where the earpiece of the terminal equipment is located; or,
the average capacitance value of a second area in the terminal device is N times the average capacitance value of a third area, where N is greater than 1, the second area is an area of a second preset size where the earpiece of the terminal device is located, and the third area is the area of the screen of the terminal device other than the second area, or a partial area of the area other than the second area; or,
the fourth area of the terminal device comprises a plurality of sub-areas whose capacitance values are larger than the third capacitance value, and the fourth area is an area of a third preset size where the earpiece of the terminal device is located.
17. A terminal device, comprising: a memory storing a computer program and a processor running the computer program to perform the terminal control method according to any one of claims 1 to 8.
18. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a computer program which, when executed by one or more processors, implements the terminal control method of any one of claims 1-8.
CN201911139005.1A 2019-11-20 2019-11-20 Terminal control method and device and terminal equipment Pending CN111163213A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911139005.1A CN111163213A (en) 2019-11-20 2019-11-20 Terminal control method and device and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911139005.1A CN111163213A (en) 2019-11-20 2019-11-20 Terminal control method and device and terminal equipment

Publications (1)

Publication Number Publication Date
CN111163213A true CN111163213A (en) 2020-05-15

Family

ID=70556020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911139005.1A Pending CN111163213A (en) 2019-11-20 2019-11-20 Terminal control method and device and terminal equipment

Country Status (1)

Country Link
CN (1) CN111163213A (en)


Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2079011A1 (en) * 2007-12-28 2009-07-15 HTC Corporation Handheld electronic device and screen lock method thereof
CN103841246A (en) * 2012-11-20 2014-06-04 联想(北京)有限公司 Information processing method and system, and mobile terminal
CN104284012A (en) * 2014-06-14 2015-01-14 敦泰科技有限公司 Method for judging conversation state and personal mobile communication terminal
CN105677039A (en) * 2016-02-16 2016-06-15 北京博研智通科技有限公司 Method, device and wearable device for gesture-based driving status detection
CN105929940A (en) * 2016-04-13 2016-09-07 哈尔滨工业大学深圳研究生院 Rapid three-dimensional dynamic gesture recognition method and system based on character value subdivision method
CN107896272A (en) * 2017-08-11 2018-04-10 广东欧珀移动通信有限公司 A kind of call control method and device
CN108563387A (en) * 2018-04-13 2018-09-21 Oppo广东移动通信有限公司 Display control method and device, terminal, computer readable storage medium
CN108762813A (en) * 2018-05-22 2018-11-06 厦门美图移动科技有限公司 screen awakening method and device
CN108600557A (en) * 2018-07-13 2018-09-28 维沃移动通信有限公司 A kind of screen light on and off control method and mobile terminal
CN109582197A (en) * 2018-11-30 2019-04-05 北京小米移动软件有限公司 Screen control method, device and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112416060A (en) * 2020-11-19 2021-02-26 捷开通讯(深圳)有限公司 Screen state control method and device, storage medium and mobile terminal
WO2022104952A1 (en) * 2020-11-19 2022-05-27 捷开通讯(深圳)有限公司 Screen state control method and apparatus, and storage medium
CN115170785A (en) * 2021-11-22 2022-10-11 荣耀终端有限公司 Character recognition method for image, electronic device and storage medium
CN115170785B (en) * 2021-11-22 2023-05-09 荣耀终端有限公司 Character recognition method for image, electronic device and storage medium
CN115550504A (en) * 2022-08-08 2022-12-30 赫名迪科技(深圳)有限公司 Screen control method and device

Similar Documents

Publication Publication Date Title
EP4221164A1 (en) Display method for electronic device with flexible display and electronic device
CN110989852B (en) Touch screen, electronic equipment and display control method
EP3822831A1 (en) Voice recognition method, wearable device and electronic device
CN109766043A (en) The operating method and electronic equipment of electronic equipment
CN110032307A (en) A kind of moving method and electronic equipment of application icon
CN111163213A (en) Terminal control method and device and terminal equipment
CN112947755A (en) Gesture control method and device, electronic equipment and storage medium
CN109151428B (en) Automatic white balance processing method, device and computer storage medium
CN112578982A (en) Electronic equipment and operation method thereof
CN111552451B (en) Display control method and device, computer readable medium and terminal equipment
CN113805487B (en) Control instruction generation method and device, terminal equipment and readable storage medium
CN110742580A (en) Sleep state identification method and device
CN113934330A (en) Screen capturing method and electronic equipment
WO2020019355A1 (en) Touch control method for wearable device, and wearable device and system
CN112651510A (en) Model updating method, working node and model updating system
CN117130469A (en) Space gesture recognition method, electronic equipment and chip system
CN113971271A (en) Fingerprint unlocking method and device, terminal and storage medium
CN113448482A (en) Sliding response control method and device of touch screen and electronic equipment
CN110691165A (en) Navigation operation method and electronic equipment
CN112684969A (en) Always displaying method and mobile device
CN115393676A (en) Gesture control optimization method and device, terminal and storage medium
CN115079810A (en) Information processing method and device, main control equipment and controlled equipment
CN114116610A (en) Method, device, electronic equipment and medium for acquiring storage information
CN114554012A (en) Incoming call answering method, electronic equipment and storage medium
CN114362878B (en) Data processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
RJ01 Rejection of invention patent application after publication

Application publication date: 20200515