CN110505341B - Terminal control method and device, mobile terminal and storage medium - Google Patents

Terminal control method and device, mobile terminal and storage medium

Info

Publication number
CN110505341B
Authority
CN
China
Prior art keywords
mobile terminal
characteristic value
ultrasonic
state
module
Prior art date
Legal status
Active
Application number
CN201910702590.5A
Other languages
Chinese (zh)
Other versions
CN110505341A (en)
Inventor
林进全
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201910702590.5A
Publication of CN110505341A
Priority to PCT/CN2020/103305 (published as WO2021017947A1)
Application granted
Publication of CN110505341B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M1/00 Substation equipment, e.g. for use by subscribers
    • H04M1/72 Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724 User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463 User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W52/00 Power management, e.g. TPC [Transmission Power Control], power saving or power classes
    • H04W52/02 Power saving arrangements
    • H04W52/0209 Power saving arrangements in terminal devices
    • H04W52/0261 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level
    • H04W52/0267 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level by controlling user interface components
    • H04W52/027 Power saving arrangements in terminal devices managing power supply demand, e.g. depending on battery level by controlling user interface components by controlling a display operation or backlight unit
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a terminal control method, a device, a mobile terminal and a storage medium, wherein the terminal control method is applied to the mobile terminal and comprises the following steps: when the mobile terminal is in a call state, sending an ultrasonic signal through an ultrasonic transmitting module, and receiving an ultrasonic signal returned by the ultrasonic signal after encountering an object through an ultrasonic receiving module; acquiring first detection data detected by an acceleration sensor and second detection data detected by a gyroscope sensor; acquiring a first characteristic value of the ultrasonic signal in the transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data; and inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, wherein the preset model is used for acquiring the moving state of the mobile terminal relative to the object and controlling the state of a display screen of the mobile terminal according to the output result. The method can improve the accuracy of state control of the display screen.

Description

Terminal control method and device, mobile terminal and storage medium
Technical Field
The present application relates to the field of mobile terminal technologies, and in particular, to a terminal control method and apparatus, a mobile terminal, and a storage medium.
Background
Mobile terminals, such as mobile phones and tablet computers, have become one of the most common consumer electronic products in people's daily life. With the continuing development of mobile terminal technology, full-screen and curved-screen mobile phones have become mainstream products, and, in order to save space at the top of the mobile terminal, many manufacturers adopt an ultrasonic proximity detection scheme on the mobile terminal to replace the traditional infrared proximity detection scheme. However, due to the complexity of the internal structure of the mobile phone, the characteristics of ultrasonic waves and other factors, the accuracy of the ultrasonic detection result cannot be guaranteed in some special scenarios, leading to inaccurate control of the state of the display screen.
Disclosure of Invention
In view of the foregoing problems, the present application provides a terminal control method, an apparatus, a mobile terminal and a storage medium to improve accuracy of state control of a display screen.
In a first aspect, an embodiment of the present application provides a terminal control method, which is applied to a mobile terminal, where the mobile terminal includes an ultrasonic wave emitting module, an ultrasonic wave receiving module, an acceleration sensor, a gyroscope sensor, and a display screen, and the method includes: when the mobile terminal is in a call state, sending an ultrasonic signal through the ultrasonic transmitting module, and receiving an ultrasonic signal returned by the ultrasonic signal after encountering an object through the ultrasonic receiving module; acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor; acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data; and inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, wherein the preset model is used for acquiring the moving state of the mobile terminal relative to an object and controlling the state of a display screen of the mobile terminal according to the output result.
In a second aspect, an embodiment of the present application provides a terminal control device, which is applied to a mobile terminal, where the mobile terminal includes an ultrasonic wave emitting module, an ultrasonic wave receiving module, an acceleration sensor, a gyroscope sensor, and a display screen, and the device includes: the mobile terminal comprises a transceiving control module, a data acquisition module, a feature acquisition module and a screen control module, wherein the transceiving control module is used for sending an ultrasonic signal through the ultrasonic transmitting module when the mobile terminal is in a call state, and receiving an ultrasonic signal returned by the ultrasonic signal after encountering an object through the ultrasonic receiving module; the data acquisition module is used for acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor; the characteristic acquisition module is used for acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data; the screen control module is used for inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, and the preset model is used for acquiring the moving state of the mobile terminal relative to an object and controlling the state of a display screen of the mobile terminal according to the output result.
In a third aspect, an embodiment of the present application provides a mobile terminal, including: one or more processors; a memory; one or more application programs, wherein the one or more application programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the terminal control method provided by the first aspect above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the terminal control method provided in the first aspect.
According to the scheme, when the mobile terminal is in a call state, the ultrasonic signal is sent through the ultrasonic transmitting module, the ultrasonic signal returned when encountering an object is received by the ultrasonic receiving module, the first detection data detected by the acceleration sensor and the second detection data detected by the gyroscope sensor are obtained, the first characteristic value of the ultrasonic signal in the transmission process, the second characteristic value corresponding to the first detection data and the third characteristic value corresponding to the second detection data are obtained, then the first characteristic value, the second characteristic value and the third characteristic value are input into the trained preset model, an output result is obtained, the preset model is used for obtaining the moving state of the mobile terminal relative to the object, and finally the state of the display screen of the mobile terminal is controlled according to the output result. Therefore, the moving state of the mobile terminal relative to the object can be obtained according to the ultrasonic characteristic value, the characteristic value of the data detected by the acceleration sensor and the characteristic value of the data detected by the gyroscope sensor by utilizing the preset model for obtaining the moving state of the mobile terminal relative to the object, the moving state of the mobile terminal relative to the object can be accurately detected, and the accuracy of state control of the display screen in the conversation process is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 shows a schematic diagram of a propagation path of an ultrasonic wave provided by an embodiment of the present application.
Fig. 2 shows a flowchart of a terminal control method according to an embodiment of the present application.
Fig. 3 shows a flowchart of a terminal control method according to another embodiment of the present application.
Fig. 4 shows a schematic diagram of a model training process provided by an embodiment of the present application.
Fig. 5 shows a frequency spectrum diagram of audio data provided by an embodiment of the present application.
Fig. 6 shows a flowchart of a terminal control method according to still another embodiment of the present application.
Fig. 7 shows a block diagram of a terminal control device according to an embodiment of the present application.
Fig. 8 is a block diagram of a mobile terminal for executing a terminal control method according to an embodiment of the present application.
Fig. 9 is a storage unit for storing or carrying a program code implementing a terminal control method according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
At present, with the continuing development of mobile terminal technology, more and more mobile terminals adopt curved screens and full screens, and in order to save space at the top of the mobile terminal, many manufacturers adopt an ultrasonic proximity detection scheme on the mobile terminal to replace the traditional infrared proximity detection scheme. In a typical ultrasonic proximity detection scheme, the mobile terminal transmits ultrasonic waves through an ultrasonic transmitting device (such as an earphone, a loudspeaker, a dedicated ultrasonic transmitter, and the like); a part of the ultrasonic waves propagates through the air directly to the ultrasonic receiving module (a sound pick-up), as shown by path 1 of fig. 1, and another part propagates through the air, is reflected by a shielding object, and then reaches the ultrasonic receiving module, as shown by path 2 of fig. 1. The ultrasonic receiving module picks up the superposed signal of the direct sound and the reflected sound, and the superposed signal is converted into an audio signal through an A/D converter. The audio data are then processed by an algorithm to obtain the motion state of the shielding object relative to the mobile terminal, which in turn determines whether the display screen of the mobile terminal should be in a bright-screen state or an off-screen state.
The inventor has found that, due to factors such as the complexity of the internal structure of the mobile phone and the characteristics of ultrasonic waves, the display state of the display screen cannot always be accurately controlled. For example, when a user views the content displayed on the display screen during a call and operates the displayed content by hand, the ultrasonic proximity detection scheme may be triggered and detect that an object is close to the mobile terminal, so that the display screen is turned off and the user experience is degraded.
In view of the above problems, the inventor proposes a terminal control method, a terminal control device, a mobile terminal, and a storage medium according to embodiments of the present application, in which a feature value of an ultrasonic signal, a feature value corresponding to detection data of an acceleration sensor, and a feature value corresponding to detection data of a gyro sensor are input to a preset model, and a display state of a display screen is controlled according to an output result, so as to improve accuracy of state control of the display screen. The specific terminal control method is described in detail in the following embodiments.
Referring to fig. 2, fig. 2 is a schematic flowchart illustrating a terminal control method according to an embodiment of the present application. The terminal control method is used for inputting the characteristic value of the ultrasonic signal, the characteristic value corresponding to the detection data of the acceleration sensor and the characteristic value corresponding to the detection data of the gyroscope sensor into a preset model, and controlling the display state of the display screen according to the output result so as to improve the accuracy of state control of the display screen. In a specific embodiment, the terminal control method is applied to the terminal control apparatus 400 shown in fig. 7 and the mobile terminal 100 (fig. 8) configured with the terminal control apparatus 400. The following will describe a specific process of this embodiment by taking a mobile terminal as an example, and it is understood that the mobile terminal applied in this embodiment may be a smart phone, a tablet computer, a wearable electronic device, and the like, which is not limited herein. In this embodiment, the mobile terminal may include an ultrasonic wave emitting module, an ultrasonic wave receiving module, an acceleration sensor, a gyroscope sensor, and a display screen, and the following will be described in detail with respect to the flow shown in fig. 2, where the terminal control method may specifically include the following steps:
step S110: when the mobile terminal is in a call state, the ultrasonic wave transmitting module is used for transmitting an ultrasonic wave signal, and the ultrasonic wave receiving module is used for receiving an ultrasonic wave signal returned by the ultrasonic wave signal after encountering an object.
In the embodiment of the application, the mobile terminal can detect the call state, so that when the mobile terminal is in the call state, the ultrasonic wave transmitting module transmits the ultrasonic wave signal and the ultrasonic wave receiving module receives the ultrasonic wave signal, and then the characteristic value of the ultrasonic wave signal in the transmission process can be obtained, so as to determine the moving state of the mobile terminal relative to the object.
In some embodiments, the mobile terminal may monitor an incoming call or an outgoing call in real time through a built-in monitoring module, and when it is detected that the mobile terminal is ringing for an incoming call (CALL_STATE_RINGING) or a dialing operation is in progress, monitor whether the mobile terminal enters a call state. When the mobile terminal makes or receives a call, the system sends out a broadcast, which the mobile terminal can monitor with a BroadcastReceiver; in addition, whether the mobile terminal is in a call state can be determined by monitoring whether the in-call interface is displayed after a call is placed or answered. When it is detected that a call is in progress (CALL_STATE_OFFHOOK), it can be determined that the mobile terminal is in the call state.
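As an illustration of the call-state monitoring described above, the following Kotlin sketch listens for the system phone-state broadcast; the class name and the two callbacks are assumptions made for illustration, and registering the receiver additionally requires the READ_PHONE_STATE permission.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import android.telephony.TelephonyManager

// Minimal sketch of monitoring the call state via the system broadcast; the
// callbacks are hypothetical hooks for starting/stopping ultrasonic detection.
class CallStateReceiver(
    private val onCallActive: () -> Unit,  // e.g. start transmitting ultrasonic signals
    private val onCallEnded: () -> Unit    // e.g. stop detection
) : BroadcastReceiver() {

    override fun onReceive(context: Context, intent: Intent) {
        if (intent.action != TelephonyManager.ACTION_PHONE_STATE_CHANGED) return
        when (intent.getStringExtra(TelephonyManager.EXTRA_STATE)) {
            // OFFHOOK: a call is in progress (incoming answered or outgoing dialed)
            TelephonyManager.EXTRA_STATE_OFFHOOK -> onCallActive()
            TelephonyManager.EXTRA_STATE_IDLE -> onCallEnded()
            // EXTRA_STATE_RINGING could be used to prepare the detection modules early
        }
    }
}
```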
In the embodiment of the present application, the mobile terminal may include both the ultrasonic wave transmitting module and the ultrasonic wave receiving module. When the ultrasonic wave transmitting module moves relative to the object, it is in essence the mobile terminal that moves relative to the object, so the ultrasonic wave receiving module also moves relative to the object. According to the Doppler effect, the wavelength of the received radiation changes due to the relative motion of the source (mobile terminal) and the observer (object); the Doppler effect is formulated as follows:
f' = \frac{v \pm v_0}{v \mp v_s} \, f
where f' is the observed frequency, f is the original emission frequency of the source in the medium, v is the propagation velocity of the wave in the medium, v_0 is the moving speed of the observer (the sign in the numerator is plus if the observer approaches the emission source, and minus otherwise), and v_s is the moving speed of the emission source (the sign in the denominator is minus if the source approaches the observer, and plus otherwise). As can be seen from the Doppler effect formula, when the emission source and the observer approach each other, the frequency of the signal received by the observer becomes higher; when the emission source and the observer move away from each other, the frequency of the signal received by the observer becomes lower; and when the source and the observer are relatively stationary, the frequency of the signal received by the observer coincides with that of the source.
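The following small Kotlin sketch evaluates the Doppler relation above as a sanity check; the function name is illustrative, and the 340 m/s sound speed matches the value used later in this description.

```kotlin
// Sketch: expected received frequency under the Doppler relation above.
// Positive observerSpeed means the observer approaches the source; positive
// sourceSpeed means the source approaches the observer.
fun observedFrequency(
    emittedHz: Double,
    observerSpeed: Double,
    sourceSpeed: Double,
    soundSpeed: Double = 340.0
): Double = emittedHz * (soundSpeed + observerSpeed) / (soundSpeed - sourceSpeed)

fun main() {
    // A 22500 Hz tone observed while approaching at 0.6 m/s (source at rest)
    // is shifted up by roughly 40 Hz.
    println(observedFrequency(22_500.0, observerSpeed = 0.6, sourceSpeed = 0.0)) // ~22539.7
}
```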
In some embodiments, when it is monitored that the mobile terminal is in a call state, an ultrasonic signal with a fixed frequency may be sent by an ultrasonic sending module built into the mobile terminal. It can be understood that a part of the ultrasonic signal sent by the ultrasonic sending module propagates through the air directly to the ultrasonic receiving module, while another part propagates through the air, is reflected by a shielding object, and then reaches the ultrasonic receiving module; the ultrasonic receiving module picks up the superimposed signal of the direct sound and the reflected sound and converts it into an audio signal through an A/D converter, where the shielding object may include a human face, a human body, and the like. For example, an earphone, a speaker or a dedicated ultrasonic transmitter built into the mobile terminal may be used to transmit an ultrasonic signal with a fixed frequency; a part of the ultrasonic signal propagates through the air directly to the sound pick-up, and another part is reflected by the shielding object and then reaches the sound pick-up, so that the sound pick-up obtains a superposed signal of the direct sound and the reflected sound, which is converted into an audio signal through an A/D converter.
In this embodiment, when the mobile terminal is in a call state, the ultrasonic wave transmitting module may transmit an ultrasonic wave signal, and the ultrasonic wave receiving module may receive an ultrasonic wave signal returned after the ultrasonic wave signal encounters an object, or extract an ultrasonic wave signal (reflected sound) returned after the ultrasonic wave signal encounters an object from the ultrasonic wave signals (direct sound and reflected sound) received by the ultrasonic wave receiving module, which is not limited herein.
Step S120: and acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor.
In this embodiment, the mobile terminal may further include both an acceleration sensor and a gyroscope sensor. The acceleration sensor is a sensor capable of measuring acceleration: during acceleration, the sensor measures the inertial force acting on a proof mass and obtains the acceleration value using Newton's second law. The mobile terminal may adopt a three-axis acceleration sensor, which measures the acceleration of the terminal in three different directions (namely the x-axis, the y-axis, and the z-axis); by logically evaluating the values on the three axes, position changes of the terminal, such as movements and gestures, can be approximately measured. A gyroscope sensor, namely an angular velocity sensor, differs from an accelerometer (G-sensor) in that the physical quantity it measures is the angular velocity of rotation during deflection and inclination. In a mobile terminal, a complete 3D motion cannot be measured or reconstructed with the accelerometer alone, since the G-sensor can only detect linear motion along its axes; the gyroscope, however, can measure rotation and deflection well, so that the actual motion of the user can be accurately analyzed and judged.
In this embodiment of the application, the mobile terminal may control the acceleration sensor and the gyroscope sensor to keep an on state in a process of sending the ultrasonic signal through the ultrasonic sending module and receiving the ultrasonic signal through the ultrasonic receiving module, and acquire first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor. The first detection data may include accelerations of the x-axis, the y-axis, and the z-axis detected by the acceleration sensor, and the second detection data may include rotational angular velocities of the x-axis, the y-axis, and the z-axis detected by the gyroscope sensor, which is not limited herein.
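A minimal Kotlin sketch of collecting the first and second detection data is given below; it assumes an Android-style sensor framework and simply caches the latest accelerometer and gyroscope readings for later feature extraction.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Sketch: keep the acceleration sensor and gyroscope sensor on during detection
// and cache their latest readings (first and second detection data).
class MotionDataCollector(context: Context) : SensorEventListener {
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    @Volatile var latestAccel = FloatArray(3)  // x/y/z acceleration
        private set
    @Volatile var latestGyro = FloatArray(3)   // x/y/z rotational angular velocity
        private set

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> latestAccel = event.values.clone()
            Sensor.TYPE_GYROSCOPE -> latestGyro = event.values.clone()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```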
Step S130: and acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data.
In this embodiment of the application, the mobile terminal may obtain a first characteristic value of the ultrasonic signal sent by the ultrasonic sending module in the transmission process, a second characteristic value corresponding to the first detection data, and a third characteristic value corresponding to the second detection data.
In some embodiments, the first characteristic value of the ultrasonic signal during transmission may include one or more of a Doppler effect area difference, a Doppler effect area sum, and an absolute value of the ultrasonic amplitude change rate, which is not limited herein.
In some embodiments, the second feature value corresponding to the first detection data may be obtained by forming a feature vector from the acceleration values of the x-axis, the y-axis, and the z-axis in the first detection data; the third feature value corresponding to the second detection data may be obtained by forming a feature vector from the rotational angular velocities of the x-axis, the y-axis, and the z-axis in the second detection data, which is not limited herein.
Step S140: and inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, wherein the preset model is used for acquiring the moving state of the mobile terminal relative to an object and controlling the state of a display screen of the mobile terminal according to the output result.
In this embodiment of the application, after obtaining the first eigenvalue of the ultrasonic signal in the transmission process, the second eigenvalue corresponding to the first detection data, and the third eigenvalue corresponding to the second detection data, the first eigenvalue, the second eigenvalue, and the third eigenvalue may be input to the trained preset model as input parameters, so as to obtain an output result output by the preset model. Wherein, the output result may include a moving state of the mobile terminal relative to the object.
In some embodiments, the preset model may be obtained by training in advance according to a large number of training samples. The training sample may include an input sample and an output sample, the input sample may include a first feature value of the ultrasonic signal in the transmission process, a second feature value corresponding to data detected by the acceleration sensor, and a third feature value corresponding to data detected by the gyroscope sensor, and the output sample may be a moving state of the mobile terminal relative to the object corresponding to the first feature value, the second feature value, and the third feature value, so that the trained preset model may be used to output the moving state of the mobile terminal relative to the object according to the acquired first feature value, the second feature value, and the third feature value. The preset model may include a Support Vector Machine (SVM), a neural network, and the like, which is not limited herein.
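As an illustration of the structure of a training sample (input sample plus output sample) described above, the following Kotlin types are a sketch; the names and the exact set of ultrasonic features are assumptions, not taken from the patent.

```kotlin
// Sketch: one training sample = the three groups of characteristic values (input)
// plus the labeled moving state of the terminal relative to the object (output).
enum class MoveState { APPROACHING, RECEDING, STATIONARY }

class TrainingSample(
    val ultrasonicFeatures: DoubleArray,  // first characteristic value(s)
    val accelFeatures: DoubleArray,       // second characteristic value (x/y/z acceleration)
    val gyroFeatures: DoubleArray,        // third characteristic value (x/y/z angular velocity)
    val label: MoveState                  // annotated moving state (output sample)
) {
    // Concatenate the three groups into one input vector for an SVM or neural network.
    fun toInputVector(): DoubleArray = ultrasonicFeatures + accelFeatures + gyroFeatures
}
```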
In the embodiment of the application, after the mobile terminal acquires the moving state of the mobile terminal relative to the object, the display screen can be controlled to be in the bright screen state or the off screen state according to the moving state of the mobile terminal relative to the object, so that the accuracy and the stability of state control of the display screen in the conversation state of the mobile terminal are improved, the power consumption of the mobile terminal is effectively reduced, and the radiation of the display screen in the bright screen state to the face when the display screen is close to the face is reduced.
The terminal control method provided by the embodiment of the application realizes the purpose that the moving state of the mobile terminal relative to the object is obtained according to the ultrasonic characteristic value, the characteristic value of the data detected by the acceleration sensor and the characteristic value of the data detected by the gyroscope sensor by using the preset model for obtaining the moving state of the mobile terminal relative to the object, and the moving state of the mobile terminal relative to the object can be determined jointly according to the ultrasonic characteristic value, the acceleration characteristic value and the rotation angular velocity characteristic value, so that the moving state of the mobile terminal relative to the object is accurately detected, and the accuracy of state control of the display screen in the conversation process is improved.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating a terminal control method according to another embodiment of the present application. The method is applied to the mobile terminal, which includes an ultrasonic wave emitting module, an ultrasonic wave receiving module, an acceleration sensor, a gyroscope sensor and a display screen, and will be described in detail with respect to the flow shown in fig. 3, and the terminal control method may specifically include the following steps:
step S210: and acquiring a training data set, wherein the training data set comprises training data labeled with the moving state of the mobile terminal relative to the object, and the training data comprises a first characteristic value of the ultrasonic signal in the transmission process, a second characteristic value corresponding to the data detected by the acceleration sensor and a third characteristic value corresponding to the data detected by the gyroscope sensor.
In this embodiment of the present application, with respect to the preset model in the foregoing embodiment, a training method for the preset model is also provided. It is worth noting that the training of the preset model may be performed in advance according to the acquired training data set; thereafter, whenever the moving state of the mobile terminal relative to the object is to be detected, the trained preset model can be used directly, and it is not necessary to retrain the preset model each time the moving state of the mobile terminal relative to the object is detected.
In some embodiments, the training data set may include training data labeled with a moving state of the mobile terminal relative to the object, and the training data may include a first characteristic value of the ultrasonic signal during transmission, a second characteristic value corresponding to data of the acceleration sensor, and a third characteristic value corresponding to data detected by the gyro sensor. The contents of the first characteristic value, the second characteristic value and the third characteristic value may be the same as the first characteristic value, the second characteristic value and the third characteristic value used for acquiring the moving state of the mobile terminal relative to the object in the foregoing embodiment.
When the training data set is constructed, the first characteristic value, the second characteristic value and the third characteristic value calculated while a large number of users move the mobile terminal away from, toward, or keep it still relative to an object can be collected, and each group of calculated first, second and third characteristic values is labeled with the corresponding moving state of the mobile terminal relative to the object (moving away, approaching, or stationary).
In addition, in some special scenarios an ultrasonic proximity detection method may readily report that the mobile terminal is approaching an object, and the display screen would then be controlled to turn off even though the user does not want it to. For example, a user may look at the screen during a call and operate the displayed content by hand, and in this situation a conventional method may detect that the mobile terminal is close to an object and turn the display screen off. Therefore, the first, second and third characteristic values calculated in such special scenarios, in which a conventional detection method reports that the mobile terminal is close to an object but the user wants the display screen to stay bright, can be collected and labeled as the state in which the mobile terminal is moving away from or stationary relative to the object, so that the trained preset model recognizes the moving state in these situations as moving away or stationary. Similarly, the first, second and third characteristic values calculated in special scenarios in which a conventional detection method reports that the mobile terminal is far away from the object but the user wants the display screen to be turned off can be collected and labeled as the state in which the mobile terminal is approaching or stationary relative to the object. In this way, the trained model avoids the inaccurate control of the display screen that would otherwise be caused by the moving states detected in such special scenarios.
In this embodiment of the application, in the training data set, the first feature value, the second feature value, and the third feature value are the input samples used for training, the labeled moving state of the mobile terminal relative to the object is the output sample used for training, and each group of training data may include one input sample and one output sample.
Step S220: and training an initial model according to the training data set to obtain a trained preset model.
In the embodiment of the application, the training data set can be input to the initial model for training, so as to obtain the preset model used for acquiring the moving state of the mobile terminal relative to the object. The initial model may be an SVM, a neural network, etc., and is not limited herein.
The training of the initial model based on the training data set is described below using a neural network as an example.
FIG. 4 illustrates the process of training a fully connected neural network. As shown in fig. 4, the first eigenvalue, the second eigenvalue, and the third eigenvalue in a group of data in the training data set are used as the input sample of the neural network, and the moving state of the mobile terminal relative to the object labeled in that group of data is used as the output sample. After the input layer and the hidden layer, the output layer of the fully-connected neural network is divided into three branches, i.e., an approaching branch, a moving-away branch, and a stationary branch, so that the network is trained to distinguish the state in which the mobile terminal approaches the object, the state in which it moves away from the object, and the state in which it is stationary relative to the object.
In the neural network shown in fig. 4, the neurons in the input layer are fully connected with the neurons in the hidden layer, and the neurons in the hidden layer are fully connected with the neurons in the output layer, so that latent features of different granularities can be effectively extracted. Moreover, multiple hidden layers may be used, so that nonlinear relationships can be fitted better and the trained preset model is more accurate.
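A toy Kotlin sketch of the forward pass of such a fully connected network is shown below: one ReLU hidden layer and a three-way softmax output corresponding to the approaching, moving-away and stationary branches. The weights would come from offline training; the dimensions and names are assumptions.

```kotlin
import kotlin.math.exp
import kotlin.math.max

// Sketch: forward pass of a small fully connected network with one hidden layer
// and three output branches (approach / move away / stationary).
class TinyMlp(
    private val w1: Array<DoubleArray>,  // hiddenSize x inputSize
    private val b1: DoubleArray,         // hiddenSize
    private val w2: Array<DoubleArray>,  // 3 x hiddenSize
    private val b2: DoubleArray          // 3
) {
    fun forward(x: DoubleArray): DoubleArray {
        val hidden = DoubleArray(w1.size) { i ->
            max(0.0, b1[i] + w1[i].indices.sumOf { j -> w1[i][j] * x[j] })  // ReLU
        }
        val logits = DoubleArray(w2.size) { i ->
            b2[i] + w2[i].indices.sumOf { j -> w2[i][j] * hidden[j] }
        }
        val maxLogit = logits.maxOrNull() ?: 0.0
        val exps = logits.map { exp(it - maxLogit) }
        val sum = exps.sum()
        return DoubleArray(logits.size) { exps[it] / sum }  // softmax probabilities
    }
}
```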
It is understood that the training process for the preset model may or may not be performed by the mobile terminal itself. When the training process is not performed by the mobile terminal, the mobile terminal acts as a direct or indirect user of the model; that is, the mobile terminal can send the first characteristic value, the second characteristic value and the third characteristic value to the server storing the preset model, and obtain the moving state of the mobile terminal relative to the object from the server.
In some embodiments, the trained preset model may be stored locally in the mobile terminal, and the trained preset model may also be stored in a server in communication connection with the mobile terminal, so that the storage space occupied by the mobile terminal may be reduced, and the operation efficiency of the mobile terminal may be improved.
In some embodiments, the preset model may be periodically or aperiodically trained and updated by acquiring new training data.
Step S230: when the mobile terminal is in a call state, the ultrasonic wave transmitting module is used for transmitting an ultrasonic wave signal, and the ultrasonic wave receiving module is used for receiving an ultrasonic wave signal returned by the ultrasonic wave signal after encountering an object.
Step S240: and acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor.
In the embodiment of the present application, step S230 and step S240 may refer to the content of the above embodiments, and are not described herein again.
Step S250: and acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data.
In this embodiment, the first characteristic value of the ultrasonic signal during transmission may include one or more of a Doppler effect area difference, a Doppler effect area sum, and an absolute value of the ultrasonic amplitude change rate, which is not limited herein.
In some embodiments, when the first characteristic value includes a doppler effect area difference, acquiring the first characteristic value of the ultrasonic signal during transmission may include:
acquiring the sending frequency of the ultrasonic signal sent by the ultrasonic sending module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module; determining a first frequency variation interval and a second frequency variation interval based on the transmission frequency and the frequency range; calculating to obtain a first area according to the first frequency change interval and a first intensity change curve corresponding to the first frequency change interval; calculating to obtain a second area according to the second frequency change interval and a second intensity change curve corresponding to the second frequency change interval; and calculating the difference between the first area and the second area to obtain the Doppler effect area difference of the ultrasonic signals in the transmission process.
When the mobile terminal is in a call state, the motion of the mobile terminal relative to the object is essentially the user bringing the mobile terminal toward or away from the human body while using it. Considering that the speed at which the user moves the mobile terminal varies within a certain range, the frequency of the ultrasonic signal received by the ultrasonic receiving module correspondingly varies within a certain range, namely the frequency range of the ultrasonic signal.
In some embodiments, the mobile terminal may acquire a transmission frequency of an ultrasonic signal transmitted by its built-in ultrasonic transmission module and acquire a frequency range of an ultrasonic signal received by its built-in ultrasonic reception module. The transmission frequency of the ultrasonic signal transmitted by the ultrasonic transmission module may be a fixed frequency, and therefore, the mobile terminal may acquire the transmission frequency based on the set transmission parameters of the ultrasonic signal of the ultrasonic transmission module. In addition, the frequency range of the ultrasonic signal received by the ultrasonic receiving module is related to the relative motion relationship between the mobile terminal and the object, so that the change range of the motion speed of most users in the process of using the mobile terminal can be obtained, and the frequency range of the ultrasonic signal received by the ultrasonic receiving module is determined according to the change range of the motion speed.
Specifically, in the Doppler effect formula, f' is the frequency of the ultrasonic signal reflected by the object and received by the ultrasonic receiving module, f is the transmission frequency of the ultrasonic signal transmitted by the ultrasonic transmission module, and v is the propagation speed of sound in air, taken as 340 m/s. Assuming that the mobile terminal is stationary, v_s = 0. If the speed of movement of the object relative to the terminal is v_{01}, then, because the ultrasonic wave travels to the object and back, the moving speed of the object in the Doppler effect formula is v_0 = 2 v_{01}. Assuming that the transmission frequency of the ultrasonic signal transmitted by the ultrasonic transmission module is 22500 Hz and the frequency range of the ultrasonic signal received by the ultrasonic receiving module is [22420 Hz, 22580 Hz], the maximum relative speed of the object and the mobile terminal that can be identified according to the Doppler effect is:
v_{01,\max} = \frac{v \cdot \Delta f_{\max}}{2f} = \frac{340 \times 80}{2 \times 22500} \approx 0.60 \ \mathrm{m/s}
if the data length of the Fourier Transform (DFT) Transform is fftlen-8192 and the audio data sampling rate fs-48 kHz, the frequency resolution of the DFT result is:
Figure BDA0002151249540000072
Combining this frequency resolution with the relation between the Doppler frequency shift and the relative speed given above, the minimum relative speed of the object and the mobile terminal that can be identified is:
v_{01,\min} = \frac{v \cdot \Delta f}{2f} = \frac{340 \times 5.86}{2 \times 22500} \approx 0.044 \ \mathrm{m/s}
therefore, in the present embodiment, the maximum relative velocity and the minimum relative velocity of the mobile terminal and the object may be obtained based on the history data and the like, and the frequency range of the ultrasonic signal received by the ultrasonic receiving module may be obtained by reversely deriving the maximum relative velocity, the minimum relative velocity and the above formula.
In some embodiments, after the transmission frequency of the ultrasonic signal transmitted by the ultrasonic transmission module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module are acquired, the frequency variation interval may be determined based on the transmission frequency and the frequency range. For example, fig. 5 shows a spectrogram of audio data provided in an embodiment of the present application. A frequency spectrum is a distribution curve of signal intensity over frequency; for discrete audio data sampling points, the spectrum can be obtained by a discrete Fourier transform. In fig. 5, the spectrum is obtained by a discrete Fourier transform of a segment of audio data; each point on the abscissa corresponds to an actual frequency value, and the ordinate represents the signal intensity at that frequency.
In some embodiments, the feature extraction module performs a DFT on a data block of length fftlen = 8192 each time, obtaining a corresponding amplitude-frequency vector X as shown in fig. 5. The relation between the actual frequency f_n and the n-th element of the amplitude-frequency vector X is:
f_n = \frac{n \cdot f_s}{\mathrm{fftlen}}
where f_s is the sampling rate and fftlen is the data length. X[n] then represents the signal intensity at the actual frequency f_n.
Assuming that the key frequencies considered in the algorithm are the ultrasonic transmission frequency 22500 Hz, f_min_low = 22494 Hz, f_min_up = 22506 Hz, f_low = 22420 Hz, and f_up = 22580 Hz, the corresponding key frequency points (DFT bin indices) are n1 = point_low, n2 = point_mid_low, n3 = point_mid, n4 = point_mid_up, and n5 = point_up, where:
point_low = f_low × fftlen / f_s
point_mid_low = f_min_low × fftlen / f_s
point_mid = 22500 × fftlen / f_s
point_mid_up = f_min_up × fftlen / f_s
point_up = f_up × fftlen / f_s
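The mapping from a key frequency to its DFT bin index follows directly from the relation f_n = n·f_s/fftlen; the short Kotlin sketch below shows the computation, with rounding to the nearest bin as an assumption.

```kotlin
import kotlin.math.roundToInt

// Sketch: DFT bin index of a given frequency for fs = 48 kHz and fftlen = 8192.
fun binIndex(freqHz: Double, fs: Double = 48_000.0, fftlen: Int = 8192): Int =
    (freqHz * fftlen / fs).roundToInt()

// Examples: binIndex(22_500.0) = 3840 (point_mid),
//           binIndex(22_420.0) = 3826 (point_low),
//           binIndex(22_580.0) = 3854 (point_up).
```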
as shown in fig. 5, the sending frequency of the ultrasonic signal sent by the ultrasonic sending module is point _ mid, the signal strength corresponding to the sending frequency is ultrasonic _ amp, and the frequency range of the ultrasonic signal received by the ultrasonic receiving module is point _ low to point _ up, so that the frequency variation interval can be determined to be point _ low to point _ mid _ low and point _ min-up to point _ up.
In some embodiments, the first frequency variation interval and the second frequency variation interval may be determined based on the transmission frequency and the frequency range. For example, as shown in fig. 5, the first frequency variation interval is from point _ low to point _ mid _ low, and the second frequency variation interval is from point _ min-up to point _ up.
In some embodiments, after the frequency variation interval is acquired, an intensity variation curve corresponding to the frequency variation interval may be acquired based on the spectrogram, and the doppler effect area difference of the ultrasonic signal during transmission may be calculated based on the frequency variation interval and the intensity variation curve corresponding to the frequency variation interval.
Specifically, after the first frequency variation interval is acquired, a first intensity variation curve corresponding to the first frequency variation interval may be acquired based on a spectrogram, and a first area of the ultrasonic signal during transmission may be calculated based on the first intensity variation curve corresponding to the first frequency variation interval and the first frequency variation interval, and meanwhile, after the second frequency variation interval is acquired, a second intensity variation curve corresponding to the second frequency variation interval may be acquired based on the spectrogram, and a second area of the ultrasonic signal during transmission may be calculated based on the second frequency variation interval and a second intensity variation curve corresponding to the second frequency variation interval. Further, the difference between the first area and the second area is calculated, for example, by subtracting the second area from the first area or by subtracting the first area from the second area, the doppler effect area difference of the ultrasonic signal during transmission can be obtained.
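A Kotlin sketch of the area computation described above is given below; it approximates each area by summing the spectrum intensities over the corresponding bin interval, which is an assumed discretization of the area under the intensity variation curve.

```kotlin
// Sketch: Doppler effect areas over the two frequency variation intervals of one
// DFT frame. `spectrum` is the amplitude-frequency vector X (intensity per bin).
fun dopplerAreas(
    spectrum: DoubleArray,
    pointLow: Int, pointMidLow: Int,   // first interval [point_low, point_mid_low]
    pointMidUp: Int, pointUp: Int      // second interval [point_mid_up, point_up]
): Pair<Double, Double> {
    val firstArea = (pointLow..pointMidLow).sumOf { spectrum[it] }
    val secondArea = (pointMidUp..pointUp).sumOf { spectrum[it] }
    return firstArea to secondArea
}

// Usage: val (a1, a2) = dopplerAreas(x, pLow, pMidLow, pMidUp, pUp)
//        val areaDifference = a1 - a2   // Doppler effect area difference
//        val areaSum = a1 + a2          // Doppler effect area sum
```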
In some embodiments, the acquiring the first characteristic value of the ultrasonic signal during transmission may include:
acquiring the sending frequency of the ultrasonic signal sent by the ultrasonic sending module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module; determining a first frequency variation interval and a second frequency variation interval based on the transmission frequency and the frequency range; calculating to obtain a first area according to the first frequency change interval and a first intensity change curve corresponding to the first frequency change interval; calculating to obtain a second area according to the second frequency change interval and a second intensity change curve corresponding to the second frequency change interval; and calculating the sum of the first area and the second area to obtain the Doppler effect area sum of the ultrasonic signals in the transmission process.
The process of obtaining the Doppler effect area sum may be substantially the same as the process of obtaining the Doppler effect area difference; after the first area and the second area are obtained, the sum of the first area and the second area may be calculated to obtain the Doppler effect area sum. When the first feature value includes both the Doppler effect area difference and the Doppler effect area sum, both may be calculated once the first area and the second area have been obtained.
In some embodiments, when the first characteristic value includes an absolute value of an amplitude change rate of the ultrasonic wave, acquiring the first characteristic value of the ultrasonic wave signal during transmission may include:
acquiring a first ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module and a second ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module at the previous moment; and acquiring an absolute value of a difference value between the first ultrasonic amplitude and the second ultrasonic amplitude to obtain an absolute value of the ultrasonic amplitude change rate of the ultrasonic signal in the transmission process.
When the first characteristic value comprises the absolute value of the ultrasonic amplitude change rate, the mobile terminal can acquire a first ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module at the current moment and a second ultrasonic amplitude of the ultrasonic signal received by the ultrasonic receiving module at the previous moment. The specific interval between the current moment and the previous moment is not limited, and may be, for example, 0.5 s or 0.75 s. In some embodiments, when the mobile terminal receives the ultrasonic signal through the ultrasonic receiving module, the amplitude of the ultrasonic signal received at each moment may be recorded.
After the mobile terminal obtains the first ultrasonic amplitude and the second ultrasonic amplitude, a difference value between the first ultrasonic amplitude and the second ultrasonic amplitude can be calculated, and an absolute value of the difference value is taken, so that an ultrasonic amplitude change rate absolute value of the ultrasonic signal in the transmission process is obtained.
In some embodiments, obtaining a second feature value corresponding to the first detection data and a third feature value corresponding to the second detection data includes: generating a feature vector according to the first detection data to obtain a second feature value corresponding to the first detection data; and generating a feature vector according to the second detection data to obtain a third feature value corresponding to the second detection data.
The mobile terminal can generate a first feature vector according to acceleration values of an x axis, a y axis and a z axis in the first detection data, and the first feature vector is used as a second feature value corresponding to the first detection data; the mobile terminal may generate a second feature vector according to the rotation angular velocity of the x-axis, the y-axis, and the z-axis in the second detection data, and use the second feature vector as a third feature value corresponding to the second detection data. The specific manner of obtaining the second feature vector corresponding to the first detection data and obtaining the third feature vector corresponding to the second detection data may not be limited.
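The following Kotlin sketch assembles the model input from the features described in this embodiment; the helper name and the exact ordering of the features are assumptions.

```kotlin
import kotlin.math.abs

// Sketch: build one input vector from the first, second and third characteristic values.
fun buildInputVector(
    areaDifference: Double,      // Doppler effect area difference
    areaSum: Double,             // Doppler effect area sum
    currentAmplitude: Double,    // first ultrasonic amplitude (current moment)
    previousAmplitude: Double,   // second ultrasonic amplitude (previous moment)
    accel: FloatArray,           // x/y/z acceleration (first detection data)
    gyro: FloatArray             // x/y/z angular velocity (second detection data)
): DoubleArray {
    val amplitudeChangeRateAbs = abs(currentAmplitude - previousAmplitude)
    val first = doubleArrayOf(areaDifference, areaSum, amplitudeChangeRateAbs)
    val second = accel.map { it.toDouble() }.toDoubleArray()
    val third = gyro.map { it.toDouble() }.toDoubleArray()
    return first + second + third
}
```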
Step S260: and inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, wherein the preset model is used for acquiring the moving state of the mobile terminal relative to an object and controlling the state of a display screen of the mobile terminal according to the output result.
In some embodiments, the trained preset model is stored locally in the mobile terminal, the mobile terminal may directly input the first feature value, the second feature value, and the third feature value into the trained preset model stored locally, and the mobile terminal may also send an instruction to the preset model, where the instruction is used to instruct the preset model to read the acquired first feature value, the second feature value, and the third feature value, and output a result according to the first feature value, the second feature value, and the third feature value.
In other embodiments, when the trained preset model is stored in the server, the mobile terminal may generate a request or an instruction according to the first feature vector, the second feature vector, and the third feature vector, and send the generated instruction or request to the server, so as to instruct the server to input the first feature vector, the second feature vector, and the third feature vector into the preset model and obtain an output result. By storing the preset model in the server, the storage space occupied on the mobile terminal by the preset model can be effectively reduced, and the consumption of the mobile terminal's computing resources is also avoided.
Therefore, the mobile terminal can obtain the output result of the preset model according to the first feature vector, the second feature vector and the third feature vector, the output result comprises the moving state of the mobile terminal relative to the object, and the mobile terminal can control the display screen to be in the screen-on state or the screen-off state according to the moving state of the mobile terminal relative to the object, so that the recognition success rate of the mobile terminal in different scenes is improved, and the screen-on control accuracy and stability of the display screen are improved.
The terminal control method provided by the embodiment of the application also describes how the preset model is trained: the initial model is trained with training data labeled with the moving state of the mobile terminal relative to an object, so as to obtain the preset model. The preset model can then output the moving state of the mobile terminal relative to the object according to the ultrasonic characteristic value, the characteristic value of the data detected by the acceleration sensor, and the characteristic value of the data detected by the gyroscope sensor, so that the mobile terminal can determine the moving state of the mobile terminal relative to the object according to the ultrasonic characteristic value, the acceleration characteristic value, and the rotational angular velocity characteristic value. In this way, the moving state of the mobile terminal relative to the object can be accurately detected, and the accuracy of controlling the state of the display screen during a call is improved.
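As a hedged illustration of such training (the model family, the feature layout, the sample values, and the file name below are all assumptions; the patent leaves them open), a sketch might look like this:

```python
import pickle
import numpy as np
from sklearn.ensemble import RandomForestClassifier   # assumed model family

# Each training row is [first_value, second_value..., third_value...], labelled
# with the moving state relative to the object:
# 0 = approaching, 1 = moving away, 2 = relatively static.
X_train = np.array([
    [ 4.2,  0.1, -0.2, 9.7,  0.6, -0.1,  0.0],   # illustrative samples only
    [-3.9,  0.0,  0.3, 9.8, -0.5,  0.2,  0.1],
    [ 0.1,  0.0,  0.0, 9.8,  0.0,  0.0,  0.0],
])
y_train = np.array([0, 1, 2])                      # labelled moving states

initial_model = RandomForestClassifier(n_estimators=50, random_state=0)
preset_model = initial_model.fit(X_train, y_train)  # trained preset model

# Storing the trained model locally on the terminal (it could equally be
# kept on a server, as discussed above).
with open("preset_model.pkl", "wb") as f:
    pickle.dump(preset_model, f)
```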
Referring to fig. 6, fig. 6 is a flowchart illustrating a terminal control method according to another embodiment of the present application. The method is applied to the mobile terminal, which includes an ultrasonic wave emitting module, an ultrasonic wave receiving module, an acceleration sensor, a gyroscope sensor and a display screen, and will be described in detail with respect to the flow shown in fig. 6, and the terminal control method may specifically include the following steps:
Step S310: when the mobile terminal is in a call state, sending an ultrasonic signal through the ultrasonic transmitting module, and receiving, through the ultrasonic receiving module, the ultrasonic signal returned after the ultrasonic signal encounters an object.
Step S320: and acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor.
Step S330: and acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data.
In the embodiment of the present application, steps S310 to S330 may refer to the contents of the above embodiments, and are not described herein again.
Step S340: and inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, wherein the preset model is used for acquiring the moving state of the mobile terminal relative to an object and controlling the state of a display screen of the mobile terminal according to the output result.
In this embodiment of the present application, controlling the state of the display screen of the mobile terminal according to the output result of the preset model may include:
when the output result represents that the mobile terminal is relatively close to the object, controlling the display screen to be in a screen-off state; when the output result represents that the mobile terminal is relatively far away from the object, controlling the display screen to be in a bright screen state; and when the output result represents that the mobile terminal is static relative to the object, controlling the display screen to keep the current display state.
It can be understood that when the output result indicates that the mobile terminal and the object are relatively close to each other, the relative motion between the mobile terminal and the object is an approaching motion; during a call this means the mobile terminal is being brought close to the user's ear, so the display screen of the mobile terminal can be controlled to be in the screen-off state. When the output result indicates that the mobile terminal is relatively far away from the object, the relative motion between the mobile terminal and the object is a separating motion; during a call this means the mobile terminal is being moved away from the user's ear, so the display screen can be controlled to be in the bright screen state. The mobile terminal being static relative to the object may mean that both the mobile terminal and the object remain stationary, or that they have the same motion state, for example the same motion speed, motion direction, motion amplitude, and motion frequency, which is not limited herein. When the output result indicates that the mobile terminal and the object are relatively static, the relative motion relationship between them is unchanged, and the display screen can be controlled to keep its current display state: while the mobile terminal is in the call state, if the current display state is the bright screen state, the display screen stays in the bright screen state, and if the current display state is the screen-off state, it stays in the screen-off state.
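A minimal sketch of this mapping, using hypothetical names and the output labels assumed in the earlier sketches, might be:

```python
from enum import Enum

class ScreenState(Enum):
    ON = "bright screen"
    OFF = "screen off"

def control_screen(output_result: str, current: ScreenState) -> ScreenState:
    # Map the preset model's output result to the display-screen state
    # while the mobile terminal is in a call.
    if output_result == "approaching":    # moving toward the object (user's ear)
        return ScreenState.OFF
    if output_result == "moving_away":    # moving away from the object
        return ScreenState.ON
    return current                        # relatively static: keep current state

if __name__ == "__main__":
    print(control_screen("approaching", ScreenState.ON))   # -> ScreenState.OFF
```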
Step S350: and if the control operation on the state of the display screen is detected within the preset time, controlling the display screen to be in a screen-on state or a screen-off state according to the control operation.
In the embodiment of the application, after controlling the display state of the display screen according to the output result, the mobile terminal may further detect a control operation performed by the user on the state of the display screen; if such a control operation is detected within a preset duration, the display screen may be controlled to be in the display state corresponding to the control operation. The preset duration may be a duration within which the user can react after the mobile terminal controls the state of the display screen according to the output result, for example, 1 second to 5 seconds; the specific preset duration is not limited. The control operation may be an operation that triggers a bright screen, such as tapping the screen, pressing the home key, or pressing the power key, or an operation that turns off the screen, such as pressing the power key; the specific control operation is not limited.
It can be understood that if the mobile terminal controls the state of the display screen incorrectly according to the output result, the user can quickly adjust the display screen to the display state the user actually requires. Therefore, if the control operation is detected within the preset duration, the display screen is controlled to be in a bright screen state or a screen-off state according to the control operation. For example, if the mobile terminal switches the display screen from the bright screen state to the screen-off state according to the output result and this control is wrong, the user can manually light up the display screen.
Step S360: and marking the first characteristic value, the second characteristic value and the third characteristic value corresponding to the output result as a target moving state of the mobile terminal relative to the object, wherein the target moving state corresponds to the state of the display screen corresponding to the control operation.
In the embodiment of the application, if the control operation performed by the user on the display screen is detected within the preset duration, it may indicate that the state of the display screen controlled by the mobile terminal according to the output result is incorrect, that is, the output result of the preset model is incorrect. Therefore, the first characteristic value, the second characteristic value, and the third characteristic value corresponding to the output result, that is, the data that were input into the preset model to obtain that output result, may be labeled as a target moving state of the mobile terminal relative to the object. The target moving state corresponds to the state of the display screen corresponding to the control operation; for example, if the state of the display screen corresponding to the control operation is the bright screen state, the target moving state may be the state in which the mobile terminal is moving away from the object. In this way, the preset model can subsequently be trained with the first characteristic value, the second characteristic value, and the third characteristic value labeled with this moving state, so that the preset model learns that the moving state corresponding to these three characteristic values is the target moving state.
In some embodiments, before the first characteristic value, the second characteristic value, and the third characteristic value corresponding to the output result are labeled as the target moving state of the mobile terminal relative to the object, prompt content may be generated to ask the user whether to correct the preset model; and if an instruction confirming the correction is received, the first characteristic value, the second characteristic value, and the third characteristic value corresponding to the output result are labeled as the target moving state of the mobile terminal relative to the object.
Step S370: and inputting the first characteristic value, the second characteristic value and the third characteristic value marked with the target moving state into the preset model, and performing correction training on the preset model.
In this embodiment of the application, after the first characteristic value, the second characteristic value, and the third characteristic value corresponding to the output result are labeled as the target moving state of the mobile terminal relative to the object, the first characteristic value, the second characteristic value, and the third characteristic value labeled with the target moving state may be input into the preset model; that is, the three characteristic values are used as an input sample and the target moving state is used as the output sample, and the preset model is trained accordingly, thereby correcting the preset model and making its output result more accurate.
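A sketch of this correction flow (steps S350 to S370) is given below; the window length, the label names, and the idea of buffering corrected samples before retraining are assumptions made for illustration, not details fixed by the patent:

```python
import time

PRESET_WINDOW_S = 3.0     # assumed value within the 1-5 s range mentioned above
correction_buffer = []    # (feature values, target moving state) samples

def collect_correction(features, auto_state, poll_user_op):
    """If the user overrides the screen state within the preset window, label the
    feature values that produced the wrong output result with the target state.
    features: [first, *second, *third]; auto_state: "on"/"off" set automatically;
    poll_user_op: callable returning the user-chosen state ("on"/"off") or None."""
    deadline = time.monotonic() + PRESET_WINDOW_S
    while time.monotonic() < deadline:
        user_state = poll_user_op()          # e.g. power-key press or screen tap
        if user_state is not None:
            if user_state != auto_state:
                # Bright screen forced -> terminal was moving away from the object;
                # screen turned off -> terminal was approaching the object.
                target = "moving_away" if user_state == "on" else "approaching"
                correction_buffer.append((list(features), target))
            return
        time.sleep(0.05)

# The buffered (features, target) pairs are later used as input/output samples
# for correction training, e.g. by refitting the preset model together with the
# original training set sketched earlier.

if __name__ == "__main__":
    collect_correction([4.2, 0.1, -0.2, 9.7, 0.6, -0.1, 0.0], "off", lambda: "on")
    print(correction_buffer)
```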
According to the terminal control method provided by the embodiment of the application, the mobile terminal obtains the ultrasonic characteristic value, the characteristic value of the data detected by the acceleration sensor, and the characteristic value of the data detected by the gyroscope sensor, and uses the preset model to output the moving state of the mobile terminal relative to the object, so that the mobile terminal can jointly determine the moving state of the mobile terminal relative to the object according to the ultrasonic characteristic value, the acceleration characteristic value, and the rotational angular velocity characteristic value. The moving state of the mobile terminal relative to the object is thus accurately detected, and the accuracy of controlling the state of the display screen during a call is improved. In addition, the preset model is corrected according to the user's control operation on the display screen, which improves the accuracy of the output result of the preset model.
Referring to fig. 7, a block diagram of a terminal control apparatus 400 according to an embodiment of the present disclosure is shown. The terminal control apparatus 400 is applied to the mobile terminal, and the mobile terminal includes an ultrasonic transmitting module, an ultrasonic receiving module, an acceleration sensor, a gyroscope sensor, and a display screen. The terminal control apparatus 400 includes: a transceiving control module 410, a data acquisition module 420, a feature acquisition module 430, and a screen control module 440. The transceiving control module 410 is configured to send an ultrasonic signal through the ultrasonic transmitting module when the mobile terminal is in a call state, and receive, through the ultrasonic receiving module, the ultrasonic signal returned after the ultrasonic signal encounters an object; the data acquisition module 420 is configured to acquire first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor; the feature acquisition module 430 is configured to acquire a first characteristic value of the ultrasonic signal in the transmission process, a second characteristic value corresponding to the first detection data, and a third characteristic value corresponding to the second detection data; the screen control module 440 is configured to input the first characteristic value, the second characteristic value, and the third characteristic value into a trained preset model to obtain an output result, where the preset model is used to obtain the moving state of the mobile terminal relative to an object, and to control the state of the display screen of the mobile terminal according to the output result.
In some embodiments, the terminal control apparatus 400 may further include: the device comprises a data set acquisition module and a model training module. The data set acquisition module is used for acquiring a training data set, wherein the training data set comprises training data which are marked with the moving state of the mobile terminal relative to an object, and the training data comprise a first characteristic value of an ultrasonic signal in a transmission process, a second characteristic value corresponding to data detected by the acceleration sensor and a third characteristic value corresponding to data detected by the gyroscope sensor; and the model training module is used for training an initial model according to the training data set to obtain a trained preset model.
In some embodiments, the first characteristic value comprises a doppler effect area difference. The feature obtaining module 430 may be specifically configured to: acquiring the sending frequency of the ultrasonic signal sent by the ultrasonic sending module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module; determining a first frequency variation interval and a second frequency variation interval based on the transmission frequency and the frequency range; calculating to obtain a first area according to the first frequency change interval and a first intensity change curve corresponding to the first frequency change interval; calculating to obtain a second area according to the second frequency change interval and a second intensity change curve corresponding to the second frequency change interval; and calculating the difference between the first area and the second area to obtain the Doppler effect area difference of the ultrasonic signals in the transmission process.
In some embodiments, the first feature value comprises a doppler effect area sum. The feature obtaining module 430 may be specifically configured to: acquiring the sending frequency of the ultrasonic signal sent by the ultrasonic sending module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module; determining a first frequency variation interval and a second frequency variation interval based on the transmission frequency and the frequency range; calculating to obtain a first area according to the first frequency change interval and a first intensity change curve corresponding to the first frequency change interval; calculating to obtain a second area according to the second frequency change interval and a second intensity change curve corresponding to the second frequency change interval; and calculating the sum of the first area and the second area to obtain the Doppler effect area sum of the ultrasonic signals in the transmission process.
In some embodiments, the first characteristic value comprises an absolute value of a rate of change of amplitude of the ultrasound wave. The feature obtaining module 430 may be specifically configured to: acquiring a first ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module and a second ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module at the previous moment; and acquiring an absolute value of a difference value between the first ultrasonic amplitude and the second ultrasonic amplitude to obtain an absolute value of the ultrasonic amplitude change rate of the ultrasonic signal in the transmission process.
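As an illustration only (the patent does not specify how the two frequency-change intervals are chosen or how the areas are computed; the interval choice, the numerical integration, and all names below are assumptions), one plausible reading of the Doppler-area features, together with the amplitude change rate described just above, is:

```python
import numpy as np

def _trapezoid(y, x):
    # Simple trapezoidal integration, written out explicitly so the sketch does
    # not depend on any particular NumPy integration helper.
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def doppler_areas(freqs, intensity, f_tx):
    """freqs: received frequency bins (Hz); intensity: received intensity values;
    f_tx: transmit frequency of the ultrasonic signal."""
    below = freqs <= f_tx                                # assumed first frequency-change interval
    above = freqs >= f_tx                                # assumed second frequency-change interval
    area1 = _trapezoid(intensity[below], freqs[below])   # first area
    area2 = _trapezoid(intensity[above], freqs[above])   # second area
    return area1 - area2, area1 + area2                  # Doppler area difference and sum

def amplitude_rate_abs(current_amp, previous_amp):
    # Absolute value of the amplitude change between the current and previous moment.
    return abs(current_amp - previous_amp)

if __name__ == "__main__":
    f = np.linspace(19_000, 21_000, 201)                   # Hz
    spec = np.exp(-((f - 20_050) ** 2) / (2 * 40.0 ** 2))  # toy received spectrum
    print(doppler_areas(f, spec, 20_000), amplitude_rate_abs(0.82, 0.75))
```

Any of these three quantities (or several of them together) could serve as the first characteristic value fed to the preset model.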
In some embodiments, the feature obtaining module 430 may be further specifically configured to generate a first feature vector according to the first detection data, and use the first feature vector as a second feature value corresponding to the first detection data; and generating a second feature vector according to the second detection data, and taking the second feature vector as a third feature value corresponding to the second detection data.
In some embodiments, the screen control module 440 may be specifically configured to: when the output result represents that the mobile terminal is relatively close to the object, controlling the display screen to be in a screen-off state; when the output result represents that the mobile terminal is relatively far away from the object, controlling the display screen to be in a bright screen state; and when the output result represents that the mobile terminal is static relative to the object, controlling the display screen to keep the current display state.
In some embodiments, the screen control module 440 may be further configured to, after controlling the state of the display screen of the mobile terminal according to the output result, control the display screen to be in a bright screen state or a screen-off state according to the control operation if the control operation on the state of the display screen is detected within a preset duration.
Further, the terminal control device 400 may further include a state labeling module and a model correcting module. The state labeling module is used for labeling the first characteristic value, the second characteristic value and the third characteristic value corresponding to the output result as a target moving state of the mobile terminal relative to the object, wherein the target moving state corresponds to the state of the display screen corresponding to the control operation; and the model correction module is used for inputting the first characteristic value, the second characteristic value and the third characteristic value marked with the target moving state into the preset model and carrying out correction training on the preset model.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
To sum up, according to the scheme provided by the application, when the mobile terminal is in a call state, an ultrasonic signal is sent through the ultrasonic transmitting module, and the ultrasonic signal returned after encountering an object is received through the ultrasonic receiving module; first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor are obtained; a first characteristic value of the ultrasonic signal in the transmission process, a second characteristic value corresponding to the first detection data, and a third characteristic value corresponding to the second detection data are obtained; then the first characteristic value, the second characteristic value, and the third characteristic value are input into the trained preset model to obtain an output result, where the preset model is used for obtaining the moving state of the mobile terminal relative to the object; and finally the state of the display screen of the mobile terminal is controlled according to the output result. Therefore, by using the preset model for obtaining the moving state of the mobile terminal relative to the object, the moving state can be obtained according to the ultrasonic characteristic value, the characteristic value of the data detected by the acceleration sensor, and the characteristic value of the data detected by the gyroscope sensor, the moving state of the mobile terminal relative to the object can be accurately detected, and the accuracy of controlling the state of the display screen during a call is improved.
Referring to fig. 8, a block diagram of a mobile terminal according to an embodiment of the present application is shown. The mobile terminal 100 may be a smart phone, a tablet computer, an electronic book reader, or another mobile terminal capable of running applications. The mobile terminal 100 in the present application may include one or more of the following components: a processor 110, a memory 120, a display 130, an ultrasonic transmitting module 140, an ultrasonic receiving module 150, an acceleration sensor 160, a gyroscope sensor 170, and one or more applications, where the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more applications being configured to perform the methods described in the foregoing method embodiments.
The processor 110 may include one or more processing cores. The processor 110 connects various parts of the mobile terminal 100 using various interfaces and lines, and performs various functions of the mobile terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and by invoking data stored in the memory 120. Optionally, the processor 110 may be implemented in hardware in at least one of the forms of a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), and a Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, the user interface, application programs, and the like; the GPU is used for rendering and drawing display content; and the modem is used to handle wireless communication. It is understood that the modem may also not be integrated into the processor 110 and may instead be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like. The data storage area may store data created by the mobile terminal 100 in use, such as a phone book, audio and video data, and chat log data.
The display 130 is used for displaying information input by the user, information provided to the user, and various graphical user interfaces of the mobile terminal 100; these graphical user interfaces may be composed of graphics, text, icons, numbers, video, and any combination thereof. In one example, the display 130 may be a Liquid Crystal Display (LCD) or an Organic Light-Emitting Diode (OLED) display, which is not limited herein.
The ultrasonic transmitting module 140 is used for transmitting ultrasonic waves, and may be a receiver, a speaker, a dedicated ultrasonic transmitter, or the like, which is not limited herein. The ultrasonic receiving module 150 is used for receiving ultrasonic waves, and may be, for example, a microphone, which is not limited herein. The acceleration sensor 160 is a sensor capable of measuring acceleration. The gyroscope sensor 170, also called an angular velocity sensor, measures a different physical quantity from an accelerometer (G-sensor), namely the rotational angular velocity during deflection and tilt.
Referring to fig. 9, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer readable storage medium 800 has storage space for program code 810 to perform any of the method steps of the method described above. The program code can be read from or written to one or more computer program products. The program code 810 may be compressed, for example, in a suitable form.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.

Claims (9)

1. A terminal control method is applied to a mobile terminal, wherein the mobile terminal comprises an ultrasonic wave transmitting module, an ultrasonic wave receiving module, an acceleration sensor, a gyroscope sensor and a display screen, and the method comprises the following steps:
when the mobile terminal is in a call state, sending an ultrasonic signal through the ultrasonic transmitting module, and receiving an ultrasonic signal returned by the ultrasonic signal after encountering an object through the ultrasonic receiving module;
acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor;
acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data;
inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, wherein the preset model is used for acquiring the moving state of the mobile terminal relative to an object;
when the output result represents that the mobile terminal is relatively close to the object, controlling the display screen to be in a screen-off state;
when the output result represents that the mobile terminal is relatively far away from the object, controlling the display screen to be in a bright screen state;
when the output result represents that the mobile terminal is static relative to the object, controlling the display screen to keep the current display state;
if a control operation on the state of the display screen is detected within a preset duration, controlling the display screen to be in a bright screen state or a screen-off state according to the control operation, wherein the preset duration is a duration within which the user can react after the mobile terminal controls the state of the display screen according to the output result;
marking the first characteristic value, the second characteristic value and the third characteristic value corresponding to the output result as a target moving state of the mobile terminal relative to the object, wherein the target moving state corresponds to the state of the display screen corresponding to the control operation;
and inputting the first characteristic value, the second characteristic value and the third characteristic value marked with the target moving state into the preset model, and performing correction training on the preset model.
2. The method of claim 1, wherein before said sending an ultrasonic signal by said ultrasonic transmission module and receiving an ultrasonic signal returned by said ultrasonic signal after encountering an object by said ultrasonic reception module, said method further comprises:
acquiring a training data set, wherein the training data set comprises training data which are marked with the moving state of the mobile terminal relative to an object, and the training data comprise a first characteristic value of an ultrasonic signal in a transmission process, a second characteristic value corresponding to data detected by an acceleration sensor and a third characteristic value corresponding to data detected by a gyroscope sensor;
and training an initial model according to the training data set to obtain a trained preset model.
3. The method of claim 1, wherein the first characteristic value comprises a Doppler effect area difference, and wherein the obtaining the first characteristic value of the ultrasonic signal during transmission comprises:
acquiring the sending frequency of the ultrasonic signal sent by the ultrasonic sending module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module;
determining a first frequency variation interval and a second frequency variation interval based on the transmission frequency and the frequency range;
calculating to obtain a first area according to the first frequency change interval and a first intensity change curve corresponding to the first frequency change interval;
calculating to obtain a second area according to the second frequency change interval and a second intensity change curve corresponding to the second frequency change interval;
and calculating the difference between the first area and the second area to obtain the Doppler effect area difference of the ultrasonic signals in the transmission process.
4. The method of claim 1, wherein the first characteristic value comprises a Doppler effect area sum, and the obtaining the first characteristic value of the ultrasonic signal during transmission comprises:
acquiring the sending frequency of the ultrasonic signal sent by the ultrasonic sending module and the frequency range of the ultrasonic signal received by the ultrasonic receiving module;
determining a first frequency variation interval and a second frequency variation interval based on the transmission frequency and the frequency range;
calculating to obtain a first area according to the first frequency change interval and a first intensity change curve corresponding to the first frequency change interval;
calculating to obtain a second area according to the second frequency change interval and a second intensity change curve corresponding to the second frequency change interval;
and calculating the sum of the first area and the second area to obtain the Doppler effect area sum of the ultrasonic signals in the transmission process.
5. The method according to claim 1, wherein the first characteristic value comprises an absolute value of an amplitude change rate of the ultrasonic wave, and the acquiring the first characteristic value of the ultrasonic wave signal during transmission comprises:
acquiring a first ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module and a second ultrasonic amplitude corresponding to the ultrasonic signal received by the ultrasonic receiving module at the previous moment;
and acquiring an absolute value of a difference value between the first ultrasonic amplitude and the second ultrasonic amplitude to obtain an absolute value of the ultrasonic amplitude change rate of the ultrasonic signal in the transmission process.
6. The method of claim 1, wherein obtaining a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data comprises:
generating a first feature vector according to the first detection data, and taking the first feature vector as the second characteristic value corresponding to the first detection data;
and generating a second feature vector according to the second detection data, and taking the second feature vector as the third characteristic value corresponding to the second detection data.
7. A terminal control apparatus, applied to a mobile terminal, wherein the mobile terminal comprises an ultrasonic transmitting module, an ultrasonic receiving module, an acceleration sensor, a gyroscope sensor, and a display screen, and the apparatus comprises: a receiving and sending control module, a data acquisition module, a characteristic acquisition module, a screen control module, a state labeling module, and a model correction module, wherein,
the receiving and sending control module is used for sending an ultrasonic signal through the ultrasonic transmitting module when the mobile terminal is in a call state, and receiving an ultrasonic signal returned by the ultrasonic signal after encountering an object through the ultrasonic receiving module;
the data acquisition module is used for acquiring first detection data detected by the acceleration sensor and second detection data detected by the gyroscope sensor;
the characteristic acquisition module is used for acquiring a first characteristic value of the ultrasonic signal in a transmission process, a second characteristic value corresponding to the first detection data and a third characteristic value corresponding to the second detection data;
the screen control module is used for inputting the first characteristic value, the second characteristic value and the third characteristic value into a trained preset model to obtain an output result, and the preset model is used for acquiring the moving state of the mobile terminal relative to an object and controlling the state of a display screen of the mobile terminal according to the output result;
the screen control module controls the state of the display screen of the mobile terminal according to the output result, and the method comprises the following steps:
when the output result represents that the mobile terminal is relatively close to the object, controlling the display screen to be in a screen-off state;
when the output result represents that the mobile terminal is relatively far away from the object, controlling the display screen to be in a bright screen state;
when the output result represents that the mobile terminal is static relative to the object, controlling the display screen to keep the current display state;
the screen control module is further used for controlling the display screen to be in a bright screen state or a screen-off state according to the control operation if the control operation on the state of the display screen is detected within a preset duration, wherein the preset duration is a duration within which the user can react after the mobile terminal controls the state of the display screen according to the output result;
the state labeling module is used for labeling the first characteristic value, the second characteristic value and the third characteristic value corresponding to the output result as a target moving state of the mobile terminal relative to the object, wherein the target moving state corresponds to the state of the display screen corresponding to the control operation;
the model correction module is used for inputting the first characteristic value, the second characteristic value and the third characteristic value marked with the target moving state into the preset model and carrying out correction training on the preset model.
8. A mobile terminal, comprising:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of any of claims 1-6.
9. A computer-readable storage medium, having stored thereon program code that can be invoked by a processor to perform the method according to any one of claims 1 to 6.
CN201910702590.5A 2019-07-31 2019-07-31 Terminal control method and device, mobile terminal and storage medium Active CN110505341B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910702590.5A CN110505341B (en) 2019-07-31 2019-07-31 Terminal control method and device, mobile terminal and storage medium
PCT/CN2020/103305 WO2021017947A1 (en) 2019-07-31 2020-07-21 Terminal control method and device, mobile terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910702590.5A CN110505341B (en) 2019-07-31 2019-07-31 Terminal control method and device, mobile terminal and storage medium

Publications (2)

Publication Number Publication Date
CN110505341A CN110505341A (en) 2019-11-26
CN110505341B true CN110505341B (en) 2021-05-07

Family

ID=68586879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910702590.5A Active CN110505341B (en) 2019-07-31 2019-07-31 Terminal control method and device, mobile terminal and storage medium

Country Status (2)

Country Link
CN (1) CN110505341B (en)
WO (1) WO2021017947A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110505341B (en) * 2019-07-31 2021-05-07 Oppo广东移动通信有限公司 Terminal control method and device, mobile terminal and storage medium
CN110916725A (en) * 2019-12-19 2020-03-27 上海尽星生物科技有限责任公司 Ultrasonic volume measurement method based on gyroscope
CN112890860A (en) * 2021-01-20 2021-06-04 李逸丰 5G-based ultrasonic detection method and device
CN114298105B (en) * 2021-12-29 2023-08-22 东莞市猎声电子科技有限公司 Signal processing method for quickly responding to wrist lifting action and brightening screen in running process
CN116466058B (en) * 2023-06-15 2023-09-05 上海博取仪器有限公司 Water quality detection data processing method, water quality evaluation system, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108196778A (en) * 2017-12-29 2018-06-22 努比亚技术有限公司 Control method, mobile terminal and the computer readable storage medium of screen state
CN109710142A (en) * 2017-10-25 2019-05-03 华为技术有限公司 A kind of method and terminal for controlling terminal screen light on and off

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170138279A (en) * 2016-06-07 2017-12-15 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN106413060B (en) * 2016-10-24 2019-11-08 北京小米移动软件有限公司 Screen state control method and device
CN108234767B (en) * 2017-12-29 2021-07-23 努比亚技术有限公司 Control method of screen state, mobile terminal and computer readable storage medium
CN108562890B (en) * 2017-12-29 2023-10-03 努比亚技术有限公司 Method and device for calibrating ultrasonic characteristic value and computer readable storage medium
CN108989546B (en) * 2018-06-15 2021-08-17 Oppo广东移动通信有限公司 Approach detection method of electronic device and related product
CN109218538A (en) * 2018-11-29 2019-01-15 努比亚技术有限公司 Mobile terminal screen control method, mobile terminal and computer readable storage medium
CN110505341B (en) * 2019-07-31 2021-05-07 Oppo广东移动通信有限公司 Terminal control method and device, mobile terminal and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109710142A (en) * 2017-10-25 2019-05-03 华为技术有限公司 A kind of method and terminal for controlling terminal screen light on and off
CN108196778A (en) * 2017-12-29 2018-06-22 努比亚技术有限公司 Control method, mobile terminal and the computer readable storage medium of screen state

Also Published As

Publication number Publication date
CN110505341A (en) 2019-11-26
WO2021017947A1 (en) 2021-02-04

Similar Documents

Publication Publication Date Title
CN110505341B (en) Terminal control method and device, mobile terminal and storage medium
CN108615526B (en) Method, device, terminal and storage medium for detecting keywords in voice signal
CN111049979B (en) Application sharing method, electronic equipment and computer readable storage medium
CN110865710B (en) Terminal control method and device, mobile terminal and storage medium
EP3805982B1 (en) Gesture recognition method, apparatus and device
CN111402866B (en) Semantic recognition method and device and electronic equipment
CN108683850B (en) Shooting prompting method and mobile terminal
CN108391008B (en) Message reminding method and mobile terminal
CN107734170B (en) Notification message processing method, mobile terminal and wearable device
CN110795007B (en) Method and device for acquiring screenshot information
WO2021017927A1 (en) Method and device for controlling apparatus, electronic apparatus, and storage medium
CN112788583B (en) Equipment searching method and device, storage medium and electronic equipment
CN109189303B (en) Text editing method and mobile terminal
CN111026305A (en) Audio processing method and electronic equipment
US20230019967A1 (en) Electronic device, and interaction method and device
CN110830368A (en) Instant messaging message sending method and electronic equipment
CN110418023B (en) Ring processing method, device, mobile terminal and storage medium
CN110798327B (en) Message processing method, device and storage medium
CN110493459B (en) Screen state control method and device, mobile terminal and storage medium
CN109117037B (en) Image processing method and terminal equipment
CN110505660B (en) Network rate adjusting method and terminal equipment
CN112870697A (en) Interaction method, device, equipment and medium based on virtual relationship formation program
CN112653789A (en) Voice mode switching method, terminal and storage medium
CN110248401B (en) WiFi scanning control method and device, storage medium and mobile terminal
CN109286726B (en) Content display method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant