CN115580677A - Method for controlling equipment, electronic equipment and storage medium - Google Patents


Info

Publication number
CN115580677A
CN115580677A (application CN202211174569.0A)
Authority
CN
China
Prior art keywords
mobile phone
electronic device
screen
electronic equipment
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211174569.0A
Other languages
Chinese (zh)
Other versions
CN115580677B (en)
Inventor
曹明君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202211174569.0A priority Critical patent/CN115580677B/en
Publication of CN115580677A publication Critical patent/CN115580677A/en
Application granted granted Critical
Publication of CN115580677B publication Critical patent/CN115580677B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72454User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions according to context-related or environment-related conditions
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/72Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M1/724User interfaces specially adapted for cordless or mobile telephones
    • H04M1/72448User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions
    • H04M1/72463User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device
    • H04M1/724631User interfaces specially adapted for cordless or mobile telephones with means for adapting the functionality of the device according to specific conditions to restrict the functionality of the device by limiting the access to the user interface, e.g. locking a touch-screen or a keypad
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Security & Cryptography (AREA)
  • Environmental & Geological Engineering (AREA)
  • Telephone Function (AREA)

Abstract

The application discloses a method for controlling a device, an electronic device, and a storage medium, and relates to the field of electronic technologies. The method helps the user operate the first electronic device when its screen cannot be used, which improves the user experience. In addition, the first electronic device needs to be in an unlocked state when establishing a connection with the second electronic device; if it is in a locked state, it must be unlocked first, which ensures the security of the data in the first electronic device.

Description

Method for controlling equipment, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a method for controlling a device, an electronic device, and a storage medium.
Background
Electronic devices with screens, such as personal mobile phones and tablet computers, have become everyday tools, but their screens are easily damaged. Once the screen is damaged, the electronic device may no longer display normally or respond to the user's touch operations, so the user can neither operate the device nor obtain the data stored on it, which causes considerable inconvenience.
Disclosure of Invention
The application provides a method for controlling a device, an electronic device, and a storage medium, which make it possible to control the electronic device when its screen is damaged. The technical solution is as follows:
in a first aspect, an embodiment of the present application provides a method for controlling a device, applied to a first electronic device. The method includes: establishing a connection with a second electronic device when it is detected that the screen of the first electronic device is damaged, the first electronic device being in an unlocked state when the connection is established; projecting an operation interface of the first electronic device onto a display screen of the second electronic device; and operating the first electronic device through the second electronic device.
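For readability only, the overall flow of the first aspect can be summarized in a Kotlin-style sketch; the interface names (ScreenMonitor, LockManager, ConnectionManager, Projector) are hypothetical and are introduced purely for illustration, not taken from the application.

    // Hypothetical sketch of the first-aspect flow; all interface names are illustrative only.
    interface ScreenMonitor { fun isScreenDamaged(): Boolean }
    interface LockManager { fun isUnlocked(): Boolean; fun requestUnlock(): Boolean }
    interface ConnectionManager { fun connectToSecondDevice(): Boolean }
    interface Projector { fun projectOperationInterface() }

    class DeviceControlFlow(
        private val screen: ScreenMonitor,
        private val lock: LockManager,
        private val connection: ConnectionManager,
        private val projector: Projector
    ) {
        // Returns true once the first electronic device can be operated through the second one.
        fun run(): Boolean {
            if (!screen.isScreenDamaged()) return false                     // normal operation
            if (!lock.isUnlocked() && !lock.requestUnlock()) return false   // must be unlocked first
            if (!connection.connectToSecondDevice()) return false           // e.g. Bluetooth or NFC
            projector.projectOperationInterface()                           // mirror the UI
            return true
        }
    }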
With the method provided by the embodiment of the application, when the screen cannot be used the user can operate the first electronic device through the second electronic device, obtain data from the first electronic device, and handle urgent matters such as incoming calls and messages on the first electronic device. If the user replaces the first electronic device, the data in it can be migrated automatically, without replacing the display screen or having a repair technician fix it, which reduces the user's maintenance cost and improves the user experience. In addition, the first electronic device needs to be in an unlocked state when establishing the connection with the second electronic device and cannot connect to the second electronic device while locked, which ensures the security of the data in the first electronic device.
The first electronic device may detect whether its screen is damaged each time it is powered on. Alternatively, it may periodically acquire information related to its screen state and use that information to detect whether the screen is damaged.
With reference to the first aspect, in certain implementations of the first aspect, establishing a connection with the second electronic device when it is detected that the screen of the first electronic device is damaged includes: outputting fourth prompt information when it is detected that the screen of the first electronic device is damaged, where the fourth prompt information asks the user whether, given that the screen is damaged, the first electronic device needs to be connected with the second electronic device; receiving a third instruction, where the third instruction is obtained based on the fourth prompt information; and establishing the connection with the second electronic device according to the third instruction.
For example, the first electronic device outputs by voice the fourth prompt information "the current screen is damaged; does it need to be connected with another electronic device?". The user inputs the third instruction according to the fourth prompt information, for example "needed" or "yes", and after receiving such an instruction the first electronic device performs unlocking, connecting with the second electronic device, and so on. If the user inputs an instruction such as "not needed" or "no", the first electronic device ends the procedure after receiving it.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the first electronic device is in a locked state when it is detected that its screen is damaged, and the method further includes: receiving a first instruction in the locked state; and controlling the first electronic device to enter the unlocked state according to the first instruction.
That is, if the first electronic device is in the locked state, the user needs to input the first instruction to unlock the first electronic device.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, before receiving the first instruction, the method further includes: outputting first prompt information according to the unlocking manner set on the first electronic device, where the first prompt information prompts the user to unlock the first electronic device using that unlocking manner, and the first instruction is obtained based on the first prompt information.
In the embodiment of the present application, when the first electronic device is in the locked state, it obtains the unlocking manner set on it, which may include fingerprint unlocking, face unlocking, smart unlocking, and the like. The first electronic device outputs the first prompt information, for example by voice, according to the set unlocking manner. After receiving the first prompt information, the user can input the first instruction corresponding to the set unlocking manner, and the first electronic device then switches its screen locking state to the unlocked state according to the first instruction. For example, when the set unlocking manner is fingerprint unlocking, the first prompt information may be "please unlock using a fingerprint", and the user places a finger on the fingerprint-unlock position of the first electronic device to unlock it.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, the method further includes: determining whether the first electronic device currently supports the set unlocking manner. Outputting the first prompt information includes: outputting the first prompt information when the first electronic device currently supports the set unlocking manner, or, when it does not, prompting the user that the first electronic device currently cannot be unlocked using the set unlocking manner.
In the embodiment of the application, after acquiring the unlocking manner set on it, the first electronic device determines whether it currently supports that manner. For example, face unlocking requires the camera. When the set unlocking manner is face unlocking, the first electronic device determines whether the camera can be used normally; if it can, the device currently supports face unlocking and outputs the first prompt information. If the camera cannot be used normally, the device does not currently support face unlocking and instead prompts the user that unlocking with the set unlocking manner is not currently supported.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, before establishing the connection with the second electronic device, the method further includes: outputting second prompt information, where the second prompt information prompts the user to establish the connection between the first electronic device and the second electronic device in a target connection manner.
The target connection manner includes a Bluetooth connection or an NFC connection.
In the unlocked state, the first electronic device turns on connection functions such as Bluetooth, Wi-Fi, and NFC, outputs the second prompt information by voice, and prompts the user to establish the connection between the first electronic device and the second electronic device through a Bluetooth or NFC connection. The first electronic device searches for other electronic devices in real time. After receiving the second prompt information, the user selects a second electronic device and places it within connectable range of the first electronic device, so the first electronic device can find the second electronic device and connect to it. Alternatively, after receiving the second prompt information, the user may use the second electronic device to search for the first electronic device and connect.
With reference to the first aspect and the foregoing implementations, in some implementations of the first aspect, establishing a connection with the second electronic device includes: authenticating the second electronic device; and establishing the connection with the second electronic device if the authentication passes.
In the embodiment of the application, after the first electronic device finds the second electronic device, it needs to authenticate the second electronic device, and the connection can succeed only if the authentication passes, which ensures the security of the data in the first electronic device. In one implementation, after finding the second electronic device, the first electronic device queries its trusted-device information; if the second electronic device is a trusted device, the two devices have been connected before. If the second electronic device is not trusted, the first electronic device is connecting to it for the first time. In that case, the first electronic device outputs third prompt information by voice, asking the user whether to agree to establish the connection between the first electronic device and the second electronic device. After receiving the third prompt information, the user inputs a second instruction according to it, and the first electronic device authenticates the second electronic device according to the second instruction. If the second instruction indicates that the user agrees to establish the connection, the second electronic device passes the authentication. If the second instruction indicates that the user does not agree, the second electronic device fails the authentication and the connection is aborted.
In another implementation, after the first electronic device finds the second electronic device, regardless of whether the two have connected before, the first electronic device outputs the third prompt information, receives the second instruction, and authenticates the second electronic device according to the second instruction.
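As a loose illustration of this authentication step, the following Kotlin sketch checks the trusted-device information first and otherwise falls back to a voice confirmation; the VoiceIo interface, the sixty-second timeout, and the accepted replies are assumptions made for the example only.

    // Illustrative sketch of authenticating the second electronic device; names are assumptions.
    interface VoiceIo { fun say(prompt: String); fun listen(timeoutMs: Long): String? }

    class Authenticator(private val trustedDeviceIds: Set<String>, private val voice: VoiceIo) {
        // Returns true if the connection with the second electronic device may be established.
        fun authenticate(deviceId: String, deviceName: String): Boolean {
            if (deviceId in trustedDeviceIds) return true    // trusted device: no prompt needed
            voice.say("Please confirm whether to establish a connection with $deviceName")
            val reply = voice.listen(timeoutMs = 60_000) ?: return false   // no answer: abort
            return reply.trim().lowercase() in setOf("yes", "connect", "confirm")  // second instruction
        }
    }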
In a second aspect, an embodiment of the present application provides an electronic device, including: one or more processors; one or more memories; the memory stores one or more programs that, when executed by the processor, cause the electronic device to perform any of the possible methods of the first aspect described above.
In a third aspect, the present application provides an apparatus, which is included in an electronic device, and has a function of implementing the behavior of the electronic device in the foregoing aspects and possible implementations of the foregoing aspects. The functions may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the above-described functions. Such as a display module or unit, a detection module or unit, a processing module or unit, etc.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored therein instructions, which, when executed on a computer, cause the computer to perform the method of the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method of the first aspect.
The technical effects obtained by the second, third, fourth and fifth aspects are similar to the technical effects obtained by the corresponding technical means in the first aspect, and are not described herein again.
Drawings
Fig. 1 is a first schematic flowchart of a method for controlling a device provided by an embodiment of the present application;
FIG. 2 illustrates a function explanation interface of an emergency collaboration service provided by an embodiment of the application;
fig. 3 is a schematic diagram illustrating a mobile phone interacting with a user to enter an emergency collaborative service according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating intelligent unlocking of a mobile phone through interaction with a user according to an embodiment of the present application;
fig. 5 is a second flowchart of a method for controlling a device according to an embodiment of the present application;
fig. 6 is a schematic flowchart illustrating a process of establishing multi-screen collaboration with bluetooth through voice according to an embodiment of the present application;
fig. 7 illustrates a flow chart of establishing multi-screen collaboration with NFC through voice according to an embodiment of the present application;
fig. 8 is a schematic flowchart illustrating a process for operating a mobile phone through a tablet computer according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an example of the mobile phone 100 according to the embodiment of the present application;
FIG. 10 is a block diagram illustrating an example of a software architecture provided by an embodiment of the present application;
FIG. 11 is a first interaction diagram showing cooperation between the software structures in FIG. 10 to implement the method of the present application;
FIG. 12 is a second interaction diagram showing cooperation between the software structures in FIG. 10 to implement the method of the present application;
fig. 13 is a third interaction diagram showing cooperation between the software structures in fig. 10 to implement the method of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, embodiments of the present application are described in further detail below with reference to the accompanying drawings. In the following, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present embodiment, "a plurality" means two or more unless otherwise specified.
The term "screen projection" in the embodiment of the present application means that data of a display interface on one electronic device is transmitted to another electronic device, so that the other electronic device displays the same display interface.
In the embodiment of the application, the screen locking state of the electronic device includes a locked state and an unlocked state. In the locked state, the user can use only some functions of the electronic device, such as answering a call or emergency dialing. If a lock-screen password is set, the user can enter the unlocked state only by entering the password or unlocking in another way. In the unlocked state, the user can use all functions of the electronic device, such as opening applications and changing settings.
Electronic devices with screens, such as personal mobile phones and tablet computers, have become everyday tools, but their screens are easily damaged. Once the screen is damaged, the electronic device may no longer display normally or respond to the user's touch operations, so the user can neither operate the device nor obtain the data stored on it, which causes considerable inconvenience. If the user needs to obtain the data on the electronic device, the display screen can be replaced or a repair technician can be asked to fix it, but this incurs extra cost.
In view of this, the present application provides a method for controlling a device, applied to a first electronic device. When the first electronic device detects that its screen is damaged, it establishes a connection with a second electronic device, projects its operation interface onto the display screen of the second electronic device, and is then operated through the second electronic device. After the operation interface is projected, a window is displayed on the display screen of the second electronic device showing the operation interface of the first electronic device, and the user can interact with the first electronic device through this window. In addition, the first electronic device needs to be in an unlocked state when establishing the connection with the second electronic device; if it is in a locked state, it must be unlocked first, which ensures the security of the data in the first electronic device.
The first electronic device may be a mobile phone, a tablet computer, a PC, an ultra-mobile personal computer (UMPC), an in-vehicle device, a netbook, a Personal Digital Assistant (PDA), and the like, and the second electronic device may be a mobile phone, a tablet computer, a PC, and the like having a display screen. The method for controlling the device is described below with reference to fig. 1 to 13 by taking the first electronic device as a mobile phone as an example.
First, the process of confirming the screen locking state when the mobile phone detects that the screen is damaged is described. Fig. 1 is a schematic flowchart of a method for controlling a device according to an embodiment of the present application; as shown in fig. 1, the method includes the following steps S101 to S111.
S101, the mobile phone detects whether a screen of the mobile phone is damaged or not, and if the screen is not damaged, the mobile phone operates according to a conventional process; if the screen is damaged, the following S103-S111 are performed.
The screen damage may be damage that prevents the user from operating the mobile phone through the screen, for example: all or part of the screen is cracked, the screen does not respond to touch, or all or part of the screen stays black. In the embodiment of the application, the mobile phone may acquire information related to the screen state in real time to detect whether the screen is damaged, or acquire that information periodically at a preset interval. The mobile phone may also acquire the screen-state information each time it is powered on to detect whether the screen is damaged. The mobile phone may also register a monitoring event and use the TP/LCD driver to monitor whether the screen is damaged. The way of determining whether the screen is damaged is not limited here; for example, the information related to the screen state may include one or more of the screen's voltage, current, capacitance, touch sensitivity, gray scale, the light intensity around the screen, and the like.
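By way of example only, the decision could be made from periodically sampled screen-state readings as in the sketch below; the specific fields and threshold values are invented for the illustration and are not specified by the application.

    // Illustrative damage check over sampled screen-state information; thresholds are invented.
    data class ScreenReading(
        val panelVoltage: Double,      // V
        val panelCurrent: Double,      // mA
        val touchCapacitance: Double,  // pF
        val touchResponsive: Boolean
    )

    fun isScreenDamaged(reading: ScreenReading): Boolean {
        val voltageOk = reading.panelVoltage in 2.5..5.5
        val currentOk = reading.panelCurrent in 1.0..300.0
        val capacitanceOk = reading.touchCapacitance > 0.0
        return !(voltageOk && currentOk && capacitanceOk && reading.touchResponsive)
    }

    // A simple check that could run at boot or on a periodic timer.
    fun checkScreen(sample: () -> ScreenReading, onDamaged: () -> Unit) {
        if (isScreenDamaged(sample())) onDamaged()
    }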
In the embodiment of the application, the whole procedure may be called an emergency collaborative service; establishing the connection with the second electronic device under this service is multi-screen collaboration, and the second electronic device has a touch screen (also called a display screen or touch display).
In one implementation, after the mobile phone detects, through its acceleration sensor and collision sensor, dangerous behavior such as a drop or an impact, it shows the user a description of the emergency collaborative service. The description may state that when the screen is damaged (black screen, unresponsive screen, and so on) the phone can be restarted, that the emergency collaborative service is triggered after the restart, that multi-screen collaboration is then established with another device, and that the phone is operated through that device.
Alternatively, the mobile phone displays the description of the "emergency collaborative service" on a target interface. For example, as shown in fig. 2 (a), the phone has the "machine play skill" APP installed, and the user can tap the "machine play skill" icon to enter the APP. In the "machine play skill" APP, as shown in fig. 2 (b), the user can learn about various functions of the phone, such as the emergency collaborative service, video entertainment, smart life, and smart travel. The user taps the emergency collaborative service to enter its explanation interface. As shown in fig. 2 (c), the content displayed on this interface introduces the function and how to use it, so the user knows that restarting the phone triggers the emergency collaborative service, which improves the user experience.
S102, if no screen damage is detected, the mobile phone operates according to its normal flow.
S103, if screen damage is detected, the mobile phone outputs fourth prompt information, asking the user whether to make the phone enter the emergency collaborative service.
In one implementation, when the mobile phone detects that the screen is damaged, it first outputs the fourth prompt information, which asks the user whether, now that the screen is damaged, the phone needs to be connected with the second electronic device, that is, whether the phone should enter the emergency collaborative service.
In one implementation, the mobile phone may enter the emergency collaborative service automatically when it detects the screen damage, and steps S103 to S107 may be replaced with: acquiring the screen locking state of the phone when screen damage is detected. In this implementation, the phone may output a voice prompt telling the user that it is entering the emergency collaborative service, so that the user knows the current state of the phone in time. For example, the phone may output "preparing to connect with the second electronic device, please complete the relevant operations according to the voice prompts".
S104, the mobile phone turns on the microphone and listens for a voice instruction.
S105, the mobile phone detects whether a voice instruction has been received.
Because the screen is currently damaged, the user may not be able to use touch. As shown in fig. 3, the mobile phone may output the fourth prompt information by voice, turn on the microphone, and detect in real time through the microphone whether a voice instruction from the user is received. The content of the fourth prompt information may be "please confirm whether to enter the emergency collaborative service; to enter, reply 'enter'; otherwise, reply 'do not enter'". After receiving the prompt, the user replies according to the actual situation, and the reply may be by voice: for example, the user replies "enter" and the phone enters the emergency collaborative service after receiving that voice instruction, or the user replies "do not enter" and the phone exits the procedure after receiving that instruction. The content of the fourth prompt information is not limited in the present application. Optionally, the content of the fourth prompt information may also be "the current screen of the mobile phone is damaged; does it need to be connected with another electronic device? please reply 'connect' or 'do not connect'".
In some implementations, the user may not reply with the specified wording; after receiving the user's voice instruction, the phone may identify the content through semantic recognition to confirm whether to enter the emergency collaborative service.
S106, if no voice instruction is received before the timeout, or the voice instruction indicates not to enter the emergency collaborative service, the mobile phone ends the emergency collaborative service.
The mobile phone starts timing after outputting the prompt, and when the elapsed time exceeds a preset threshold it automatically ends the emergency collaborative service. For example, if one minute passes after the prompt and no voice instruction has been received, the user is evidently not ready for multi-screen collaboration, so the phone ends the emergency collaborative service automatically. If the user needs multi-screen collaboration later, restarting the phone triggers the emergency collaborative service again. The voice instruction mentioned in steps S104 to S107 corresponds to the third instruction.
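The prompt-listen-decide loop of S103 to S107 might look like the following sketch; the prompt wording, the one-minute timeout, and the keyword matching are assumptions used only to make the flow concrete.

    // Sketch of S103-S107: prompt by voice, listen with a timeout, interpret the reply loosely.
    interface VoicePort { fun say(prompt: String); fun listen(timeoutMs: Long): String? }
    enum class Decision { ENTER, DO_NOT_ENTER, TIMEOUT }

    fun askToEnterEmergencyService(voice: VoicePort, timeoutMs: Long = 60_000): Decision {
        voice.say("Please confirm whether to enter the emergency collaborative service.")
        val reply = voice.listen(timeoutMs) ?: return Decision.TIMEOUT   // no reply: end the service
        // Loose semantic matching so the user need not repeat an exact password.
        return when {
            listOf("enter", "yes", "need").any { it in reply.lowercase() } -> Decision.ENTER
            else -> Decision.DO_NOT_ENTER
        }
    }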
S107, the mobile phone receives the voice instruction and acquires its current screen locking state.
The screen locking state is either unlocked or locked. After entering the emergency collaborative service, if the phone is unlocked it can start establishing the connection with the second electronic device; if it is locked it must first be unlocked, so that it is in the unlocked state before the connection can be established, which ensures the security of the data in the phone.
S108, in the unlocked state, the mobile phone establishes a connection with the second electronic device.
S109, in the locked state, the mobile phone outputs the first prompt information.
In the locked state, the mobile phone may output the first prompt information to ask the user to unlock it, which requires the user to input an unlock instruction. For example, the content of the first prompt information may be "please unlock". The user then inputs an unlock instruction in the usual way to unlock the phone. The unlock instruction mentioned in steps S109 to S111 corresponds to the first instruction.
In one implementation, S109 may include S1091 and S1092 described below.
S1091, in the locked state, the mobile phone obtains the unlocking mode set by the mobile phone.
S1092, the mobile phone outputs a first prompt message according to the set unlocking mode.
The unlocking manner may include fingerprint unlocking, face unlocking, smart unlocking, and so on. The mobile phone outputs the first prompt information by voice according to the set unlocking manner, prompting the user to unlock with the manner already set on the phone, which avoids the user having to try several times without knowing which manner to use and improves the user experience. In addition, after obtaining its set unlocking manner, the phone needs to turn on the hardware required for unlocking, such as the camera or Bluetooth, to make sure it can receive the unlock instruction.
For example, if the user set fingerprint unlocking before the screen was damaged, the phone may output the first prompt information "please unlock using a fingerprint", and the user presses the fingerprint sensor with a finger to unlock the phone. If the user set face unlocking, the phone may output "please use face unlocking", and the user points the face at the camera to unlock the phone. If the user set both fingerprint and face unlocking, the phone may output "please use fingerprint or face unlocking", and the user inputs the unlock instruction accordingly.
Smart unlocking means the user has designated Bluetooth devices such as a smart band or smart watch as smart-unlock devices; when the phone detects that such a device is connected to it, the phone can be unlocked without a password, fingerprint, or face authentication. As shown in fig. 4, when the phone finds that the set unlocking manner is smart unlocking, it turns on Bluetooth and searches for nearby devices, and after finding the smart band or smart watch it switches the screen locking state to the unlocked state. In some scenes the distance between the phone and the Bluetooth device is too large and the phone cannot find the smart band or smart watch; the phone may then output first prompt information such as "please bring the phone close to the smart band or smart watch". Following this prompt, the user places the smart band or smart watch within the range the phone can search (that is, inputs the unlock instruction); once the phone finds the Bluetooth device, the scene is considered reliable and safe, and the phone unlocks itself.
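A minimal sketch of this smart-unlock path, assuming a scanner that reports nearby Bluetooth device identifiers (the names and prompt wording are illustrative, not part of the application):

    // Illustrative smart-unlock sketch: unlock if a designated wearable is found nearby,
    // otherwise ask the user to bring it closer and let the caller retry the scan.
    interface WearableScanner { fun nearbyDeviceIds(): Set<String> }

    fun trySmartUnlock(
        trustedWearableIds: Set<String>,
        scanner: WearableScanner,
        say: (String) -> Unit,
        unlock: () -> Unit
    ): Boolean {
        if (scanner.nearbyDeviceIds().any { it in trustedWearableIds }) {
            unlock()   // trusted wearable in range: treated as a reliable, safe scene
            return true
        }
        say("Please bring the phone close to the smart band or smart watch")
        return false
    }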
Depending on the phone model, fingerprint unlocking may be in-screen, on the side, or on the back. When the phone is dropped or hit and the screen is damaged, functions such as the camera and the in-screen fingerprint sensor may become unusable; in that case the phone may no longer support face unlocking or in-screen fingerprint unlocking.
In one implementation of the embodiment of the present application, S109 may include S1091, S1093, and S1094 described below.
S1091, in the locked state, the mobile phone obtains the unlocking mode set by the mobile phone.
S1093, the mobile phone determines whether the mobile phone currently supports unlocking using the set unlocking manner.
S1094, if unlocking with the set unlocking manner is currently supported, the mobile phone outputs the first prompt information according to the supported unlocking manner.
For example, when the phone finds that the set unlocking manners include in-screen fingerprint unlocking and face unlocking, it checks whether the camera and the in-screen fingerprint function are normal. For instance, through the camera driver it can check whether the camera is at its preset position and whether the number of cameras matches the preset number, to determine whether the camera can be used normally; through the fingerprint driver it can obtain information such as the temperature and working current of the in-screen fingerprint sensor, to determine whether in-screen fingerprint unlocking can be used normally.
If the camera is abnormal but the in-screen fingerprint function is normal, the phone does not support face unlocking but does support fingerprint unlocking, so it may output first prompt information asking the user to unlock with a fingerprint, that is, with a set unlocking manner it still supports. If the camera is normal but the in-screen fingerprint function is abnormal, the phone supports face unlocking but not fingerprint unlocking, and it may prompt the user to use face unlocking. If both the camera and the in-screen fingerprint function are normal, the phone may prompt the user to unlock with a face or a fingerprint. If both are abnormal, the phone supports neither fingerprint nor face unlocking, that is, none of the set unlocking manners, and it may output prompt information telling the user that unlocking is currently impossible, for example "the face unlocking and fingerprint unlocking functions are abnormal; unlocking is not possible now"; in that case the phone ends the emergency collaborative service.
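The selection of the first prompt information from the unlocking manners that remain usable could be expressed as below; the hardware checks are reduced to boolean inputs and the prompt texts are only examples.

    // Sketch of choosing the first prompt information from the still-usable unlocking manners.
    fun buildUnlockPrompt(
        faceUnlockConfigured: Boolean,
        fingerprintConfigured: Boolean,
        cameraUsable: Boolean,
        fingerprintSensorUsable: Boolean
    ): String {
        val face = faceUnlockConfigured && cameraUsable
        val finger = fingerprintConfigured && fingerprintSensorUsable
        return when {
            face && finger -> "Please unlock with your face or fingerprint"
            face -> "Please unlock with your face"
            finger -> "Please unlock with your fingerprint"
            else -> "Face unlocking and fingerprint unlocking are abnormal; unlocking is not possible now"
        }
    }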
In some cases, the mobile phone starts timing after outputting the first prompt information. The unlock instruction input by the user may not be received, and when the phone has not detected an unlock instruction for a long time (for example, the elapsed time exceeds two minutes), it outputs prompt information such as "no face or fingerprint unlocking detected, please try again", so that the user does not wait indefinitely. If the phone still detects no unlock instruction after prompting the user several times, it ends the emergency collaborative service and outputs prompt information such as "unlocking failed, exiting the emergency collaborative service" or "unlocking failed, please restart the phone later".
S110, the mobile phone receives the first instruction.
S111, the mobile phone is controlled to enter the unlocked state according to the first instruction.
The following describes a process of establishing a connection between the mobile phone and the second electronic device. Fig. 5 is a schematic flowchart illustrating a method for controlling a device according to an embodiment of the present application, where as shown in fig. 5, the method includes the following steps S501 to S507.
S501, in the unlocked state, the mobile phone sets a flag bit, and the flag bit is used to decide whether the mobile phone needs to perform multi-screen collaboration by voice.
The flag bit is a variable marker: the value set after unlocking with a damaged screen differs from the value set after normal unlocking. For example, after unlocking with a damaged screen the flag is set to True, meaning the phone needs to perform multi-screen collaboration by voice; after unlocking with a normal screen the flag is set to False, meaning the phone performs multi-screen collaboration by touch.
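A minimal sketch of the flag bit, assuming it is kept as a simple in-memory value (the names are invented for the illustration):

    // Sketch of the flag bit of S501/S503: set after unlocking, queried when a device is found.
    object CollaborationFlag {
        @Volatile var voiceModeRequired: Boolean = false   // the "True"/"False" flag in the text

        fun setAfterUnlock(screenDamaged: Boolean) {
            voiceModeRequired = screenDamaged   // damaged screen: voice-driven multi-screen collaboration
        }

        fun shouldPromptByVoice(): Boolean = voiceModeRequired
    }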
S502, the mobile phone turns on connection functions such as Bluetooth, Wi-Fi, and NFC, and outputs the second prompt information.
The mobile phone turns on connection functions such as wireless fidelity (Wi-Fi), Bluetooth (BT), and near field communication (NFC), and confirms that they can be used normally. After the connection functions are on, the phone outputs the second prompt information, prompting the user to establish the connection between the phone and the second electronic device through Bluetooth, Wi-Fi, NFC, or another connection manner, and reminding the user to operate on the other device, so that the user can choose the most convenient connection manner in time, which improves the user experience. For example, the content of the second prompt information may be "Bluetooth is on, please bring the phone close to the tablet computer and turn on the tablet's multi-screen collaboration function", or "NFC is on, please bring the phone close to the tablet computer for multi-screen collaboration", or "Bluetooth and NFC are on, please bring the phone close to another device and turn on its multi-screen collaboration function", or "Bluetooth is on, please initiate the connection on the other device". The embodiment of the present application is not limited to these.
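Step S502 could be organized roughly as follows; the Radio abstraction and the prompt sentence are assumptions made for the sketch, not an actual platform API.

    // Illustrative sketch of S502: turn on the connection functions that start successfully,
    // then announce the second prompt information by voice.
    interface Radio { val name: String; fun enable(): Boolean }

    fun startConnectionFunctions(radios: List<Radio>, say: (String) -> Unit) {
        val enabled = radios.filter { it.enable() }          // e.g. Bluetooth, Wi-Fi, NFC
        if (enabled.isEmpty()) {
            say("No connection function is currently available")
            return
        }
        val names = enabled.joinToString(" and ") { it.name }
        say("$names turned on, please bring the mobile phone close to the other device " +
            "and turn on multi-screen collaboration on it")   // second prompt information
    }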
S503, when the mobile phone finds the second electronic device or receives a connection request from it, the phone queries the flag bit.
After Bluetooth and Wi-Fi are turned on, the phone searches for surrounding electronic devices in real time. Following the prompt information, the user can place the device to be used as the second electronic device close to the phone; the phone then finds the second electronic device, and the user must confirm whether to connect to it. Alternatively, the user places the second electronic device close to the phone and uses it to search for the phone; after finding the phone it sends a connection request, and the user must confirm whether to accept the connection request of the second electronic device.
The phone needs to prompt the user so that the user knows whether the device that has been found is the intended second electronic device. To decide how to prompt, when the second electronic device is found or its connection request is received, the phone queries the flag bit and determines whether the user should be prompted by voice.
S504, the mobile phone outputs the third prompt information by voice to ask whether the user agrees to connect the mobile phone with the second electronic device.
The second electronic device is a tablet computer in this illustration. Fig. 6 illustrates multi-screen collaboration between the phone and the tablet computer over Bluetooth. As shown in fig. 6 (a), the user slides the notification panel out from the tablet's status bar, taps "multi-screen collaboration" in the panel, and the multi-screen collaboration panel is displayed on the tablet. The user places the phone and the tablet together (within about 20 cm); after finding the tablet, the phone queries the flag bit, and when the flag is True it outputs the third prompt information by voice, as shown in fig. 6 (b). The third prompt information includes an identifier of the tablet, such as its internet protocol (IP) address, MAC address, universally unique identifier (UUID), device identifier, or device name. Since the phone's voice broadcast should not be too long, it may not be able to read out all of the tablet's information, so the third prompt information may contain only the device identifier (for example, the device name). For example, the content of the third prompt information may be "please confirm whether to establish a connection with the tablet computer".
The phone listens for the user's voice instruction (the second instruction) through the microphone. If the voice instruction indicates that the connection between the phone and the tablet is confirmed, for example the user says "connect", "confirm", or "yes", the phone sends a connection request to the tablet. As shown in fig. 6 (c), after receiving the request the tablet pops up a prompt box 61 whose content may be "whether to allow the mobile phone to connect to me". As shown in fig. 6 (d), after the user taps "allow" on the tablet, the connection is established successfully and the phone's interface is displayed on the tablet. If the voice instruction indicates that the connection is not approved, for example the user says "do not connect" or "not allowed", the phone ends the emergency collaborative service.
Fig. 7 shows multi-screen collaboration between the phone and the tablet computer through NFC. As shown in fig. 7 (a), it is first ensured that the tablet 71 is paired with the external keyboard 72. As shown in fig. 7 (b), the external keyboard 72 has an NFC region, for example the "Shift" key 721 is the keyboard NFC region. Then, as shown in fig. 7 (c), the NFC region on the back of the phone 73 is tapped against the keyboard NFC region 721. After the phone detects the tablet, it queries the flag bit, and when the flag is True it outputs the third prompt information by voice, as shown in fig. 7 (d), for example "please confirm whether to establish a connection with the tablet computer". The phone listens for the user's voice instruction (the second instruction) through the microphone. After the user confirms, for example by replying "yes", the tablet receives the phone's connection request and pops up a prompt box 74, as shown in fig. 7 (e), whose content may be "whether to allow the mobile phone to connect to me". After the user taps "allow" on the tablet, the connection is established successfully, and as shown in fig. 7 (f), the phone's interface is displayed on the tablet.
S505, the mobile phone receives the second instruction and determines from it whether the user agrees to the connection.
S506, if the second instruction indicates that the connection between the mobile phone and the tablet computer is allowed, the mobile phone and the second electronic device are connected successfully.
As shown in fig. 8, after the phone and the tablet are connected successfully, a phone window 81 appears on the tablet's desktop. Through this window the user can interact with the phone on the tablet and operate the phone from the tablet: transferring files quickly, checking short messages, answering incoming calls, and using the phone's functions and APPs. For example, as shown in fig. 8 (a), when the user wants to transfer a photo from the phone to the tablet, the user can open the phone's album through the window 81, browse all the photos on the phone directly from the tablet desktop, and make selections with the tablet's keyboard and mouse. As shown in fig. 8 (b), after selecting a photo of the phone on the tablet, the user can press and hold the left mouse button to start the drag-and-drop function and drag the selected photo 82 from the phone onto the tablet, completing the transfer.
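Only as a rough illustration of operating the phone through the window on the tablet, the sketch below maps a pointer event in the collaboration window back to the phone's own screen coordinates; the event format and the injection callback are invented for the example and do not describe the actual collaboration protocol.

    // Loose sketch: a tap in the tablet-side window is scaled to the phone's resolution
    // and sent back so the phone can apply it to its own user interface.
    data class WindowTap(val xInWindow: Float, val yInWindow: Float)

    class WindowEventForwarder(
        private val windowWidth: Float, private val windowHeight: Float,
        private val phoneWidth: Float, private val phoneHeight: Float,
        private val sendToPhone: (x: Float, y: Float) -> Unit
    ) {
        fun onTap(tap: WindowTap) {
            val x = tap.xInWindow / windowWidth * phoneWidth
            val y = tap.yInWindow / windowHeight * phoneHeight
            sendToPhone(x, y)   // the phone injects this as a touch on its own interface
        }
    }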
S507, if the second instruction indicates that the connection between the mobile phone and the tablet computer is not approved, the emergency collaborative service is ended.
The tablet currently found may not be the user's target device; if the user does not agree to connect the phone with this tablet, the connection fails and the emergency collaborative service ends. After switching to a different second electronic device, the user can restart the phone to trigger the emergency collaborative service again.
In S503-S506 above, when the phone connects to the tablet, the user authenticates the tablet through a voice instruction, and the connection succeeds once the authentication passes. In one implementation, the voice authentication of S503-S506 is performed every time the phone and the tablet connect. In another implementation, after finding the second electronic device the phone obtains the trusted-device information: if the second electronic device is a trusted device it passes authentication and the connection is established directly; if it is not, the voice authentication of S503-S506 is performed. That is, the voice authentication of S503-S506 is needed when the phone and the tablet connect for the first time. After the first connection succeeds, the phone stores the tablet's device information and marks the tablet as a trusted device; the tablet may likewise store the phone's device information and mark the phone as a trusted device. On later connections, after finding the tablet the phone obtains the trusted-device information, and since the tablet is one of the phone's trusted devices it passes authentication and the phone can connect to it directly.
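The trusted-device bookkeeping described here could be as simple as the following sketch; an in-memory set stands in for whatever persistent store is actually used, so the names and storage choice are assumptions.

    // Sketch of trusted-device handling: remember a peer after the first successful connection
    // so that later connections can skip the voice authentication of S503-S506.
    class TrustedDeviceStore {
        private val trusted = mutableSetOf<String>()

        fun markTrusted(deviceId: String) { trusted += deviceId }        // after the first connection
        fun needsVoiceAuthentication(deviceId: String): Boolean = deviceId !in trusted
    }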
In some scenarios there are several electronic devices around the phone, for example a tablet computer and a laptop, and when searching the phone connects to whichever device it finds first. If the user wants the tablet to establish the connection with the phone, the phone can be moved away from the laptop and placed closer to the tablet. For example, if the phone first finds the laptop and prompts "connect with the laptop?", the user may refuse the connection, then bring the phone close to the tablet and away from the laptop, and restart the emergency collaborative service.
To sum up, in the method provided by the embodiment of the application, the phone detects whether its screen is damaged; when it is, the phone can switch its control mode to voice input, prompt the user by voice, and receive voice instructions from the user, so that in the unlocked state it connects to the tablet computer and is operated through the tablet. The user can therefore obtain the phone's data and handle incoming calls, short messages, and so on even though the screen cannot be used. If the user replaces the phone, the data can be migrated automatically to the new device, without replacing the display screen or having a repair technician fix it, which reduces the user's maintenance cost and improves the user experience. In addition, the phone needs to be in an unlocked state when the connection with the tablet is established; if it is in a locked state it must be unlocked first, which ensures the security of the data in the phone.
A schematic hardware structure of the mobile phone 100 for implementing the method is described below with reference to fig. 9.
The mobile phone 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the mobile phone 100. In other embodiments of the present application, the mobile phone 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the mobile phone 100 can be realized by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then passed to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the mobile phone 100, including wireless local area networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 for radiation.
The mobile phone 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like. The data storage area may store data created during use of the mobile phone 100 (e.g., audio data, a phone book, etc.), and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 executes various functional applications and data processing of the mobile phone 100 by executing the instructions stored in the internal memory 121 and/or the instructions stored in a memory provided in the processor.
The mobile phone 100 can implement audio functions through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into a sound signal. The cellular phone 100 can listen to music through the speaker 170A or listen to a hands-free call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it is possible to receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal into the microphone 170C by speaking close to the microphone 170C. The handset 100 may be provided with at least one microphone 170C. In other embodiments, the handset 100 may be provided with two microphones 170C to implement a noise reduction function in addition to collecting sound signals. In other embodiments, the handset 100 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The earphone interface 170D is used to connect a wired earphone. The earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The acceleration sensor 180E can detect the magnitude of acceleration of the cellular phone 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the handset 100 is stationary. The acceleration sensor can also be used to identify the posture of the electronic device and is applied in scenarios such as landscape/portrait switching and pedometers.
The ambient light sensor 180L is used to sense the ambient light level. The handset 100 may adaptively adjust the brightness of the display 194 according to the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the mobile phone 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The mobile phone 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, take a photograph of the fingerprint, answer an incoming call with the fingerprint, and the like.
The temperature sensor 180J is used to detect temperature. In some embodiments, the handset 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the mobile phone 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the mobile phone 100 heats the battery 142 to avoid an abnormal shutdown caused by the low temperature. In still other embodiments, when the temperature is below a further threshold, the mobile phone 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by the low temperature.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The cellular phone 100 may receive key input and generate key signal input related to user settings and function control of the cellular phone 100.
The motor 191 may generate a vibration cue. Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The software system of the mobile phone 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention uses an Android system with a layered architecture as an example to exemplarily explain a software structure of the mobile phone 100.
Fig. 10 is a block diagram of a software structure of the mobile phone 100 according to an embodiment of the present application. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, a kernel layer, and a hardware layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 10, the application package may include applications such as camera, gallery, calendar, call, map, navigation, WLAN, bluetooth, music, video, short message, emergency collaboration service, collaboration assistant, lock screen, etc.
The emergency cooperative service is used for detecting whether the screen of the mobile phone 100 is damaged or not, and controlling the mobile phone to establish connection with a tablet computer (second electronic device) in an unlocked state according to an input instruction when the screen of the mobile phone 100 is damaged.
The cooperation assistant establishes the connection between the mobile phone 100 and the tablet computer through connection modes such as Bluetooth and NFC, for example, by searching for other electronic devices in real time and handling the interaction between the mobile phone 100 and the tablet computer, and implements multi-screen cooperation between the mobile phone 100 and the tablet computer, for example, by projecting the operation interface of the mobile phone 100 onto the display screen of the tablet computer and transmitting data of the mobile phone 100 to the tablet computer through Wi-Fi.
The lock screen stores the screen locking state of the mobile phone and records the set unlocking mode of the mobile phone. The lock screen is also used to unlock the mobile phone according to the unlocking instruction of the user and to update the screen locking state; its responsibilities are sketched below.
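The following minimal sketch illustrates these lock-screen responsibilities. The class, enum and method names are assumptions made for illustration and are not part of the embodiment.

// Illustrative sketch of the lock-screen module's responsibilities; identifiers are assumptions.
public class LockScreenModule {
    public enum UnlockMode { FACE, FINGERPRINT, SMART }

    private boolean locked = true;                               // stored screen locking state
    private UnlockMode configuredMode = UnlockMode.FINGERPRINT; // recorded set unlocking mode

    public boolean isLocked() {
        return locked;
    }

    public UnlockMode getConfiguredMode() {
        return configuredMode;
    }

    // Unlocks according to the user's unlocking instruction and updates the stored state.
    public void handleUnlockInstruction(boolean credentialVerified) {
        if (credentialVerified) {
            locked = false;
        }
    }
}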
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 10, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and answered, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the handset 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages, which disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify that a download is complete, to give message reminders, and so on. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, text information is prompted in the status bar, a prompt tone is sounded, the electronic device vibrates, or an indicator light flashes.
The connection management module is used to provide the Bluetooth, NFC and Wi-Fi connection functions of the mobile phone 100. The capability of the Bluetooth chip can be invoked through the Bluetooth driver of the mobile phone 100, the capability of the NFC chip can be invoked through the NFC driver of the mobile phone 100, and the capability of the Wi-Fi chip can be invoked through the Wi-Fi driver of the mobile phone 100, so as to implement multi-screen cooperation between the mobile phone and the tablet computer.
The multimedia module provides the functions of the mobile phone 100 for interacting with the user in case of a damaged screen, such as playing voice prompt information, receiving voice commands, etc.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a TP/LCD driver (display driver), a camera driver, an audio driver, a sensor driver, a fingerprint driver, a Bluetooth driver, an NFC driver and a Wi-Fi driver.
The hardware layer contains at least the display screen, the camera, the microphone, the speaker, the Bluetooth chip, the NFC chip, the Wi-Fi chip, and the fingerprint sensor.
Fig. 11 is a schematic interaction diagram between exemplary software modules in an embodiment of the present application; through cooperation between the software modules, the mobile phone detects whether its screen is damaged and determines, according to the user's reply, whether to connect with another electronic device.
S1101, the emergency cooperative service detects the screen state through the HIDL service of the LCD.
The TP/LCD driver acquires parameters that represent the screen state, such as the voltage, current and capacitance values of the screen, and determines the screen state according to these parameters; the screen state can be marked as 'damaged' or 'normal'. Taking detection of the screen state after the mobile phone is powered on as an example, the user restarts the mobile phone after finding that the screen cannot be operated. After the power-on operation is detected, the emergency cooperative service of the mobile phone acquires the screen state from the TP/LCD driver through the HIDL service of the LCD, where the HIDL service of the LCD is the channel connecting the emergency cooperative service and the TP/LCD driver.
Alternatively, in a mode in which a monitoring event is set, the TP/LCD driver reports the screen-damage information to the emergency cooperative service after determining, according to parameters such as voltage, current and capacitance, that the screen is damaged, thereby triggering the emergency cooperative service. A possible decision rule is sketched below.
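The following is a minimal sketch of how such a decision rule could look. The parameter names and threshold values are assumptions for illustration; an actual TP/LCD driver would obtain the values from the panel hardware and expose the result through the HIDL service.

// Illustrative sketch of deciding the screen state from driver parameters; thresholds are assumed values.
public class ScreenStateChecker {
    public enum ScreenState { NORMAL, DAMAGED }

    // Hypothetical nominal operating ranges for the panel.
    private static final double MIN_VOLTAGE_V = 2.8;
    private static final double MAX_VOLTAGE_V = 3.3;
    private static final double MAX_CURRENT_A = 0.05;
    private static final double MIN_CAPACITANCE_PF = 10.0;

    public ScreenState evaluate(double voltage, double current, double capacitance) {
        boolean voltageOk = voltage >= MIN_VOLTAGE_V && voltage <= MAX_VOLTAGE_V;
        boolean currentOk = current <= MAX_CURRENT_A;
        boolean capacitanceOk = capacitance >= MIN_CAPACITANCE_PF;
        // Any out-of-range parameter is treated as evidence that the screen is damaged.
        return (voltageOk && currentOk && capacitanceOk) ? ScreenState.NORMAL : ScreenState.DAMAGED;
    }
}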
S1102, the TP/LCD driver returns the screen state, which is either normal or damaged.
S1103, in case that the screen is normal, the emergency cooperative service is ended.
This processing is completed inside the mobile phone and is imperceptible to the user; when the screen is normal, the user can use the mobile phone normally.
S1104, when the screen is damaged, the emergency cooperative service instructs the multimedia module to play the fourth prompt message and listens for a voice command.
Optionally, the content of the fourth prompt message may be, for example, "The current mobile phone screen is damaged. Do you want to connect with another electronic device? Please reply whether to connect."
S1105, the multimedia module creates a player, creates an audio playing stream and sets playing parameters; and creates a recording stream, setting recording parameters.
S1106, the multimedia module outputs a fourth prompt message.
The multimedia module outputs the fourth prompt message using the set playing parameters. The multimedia module also opens the microphone through the audio driver, receives the sound around the mobile phone in real time through the microphone, and determines whether the received sound is a voice instruction from the user. A sketch of this playback and recording setup is given below.
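The following minimal sketch shows one way the playback and recording setup described above could be written against the public Android media APIs. The prompt file path, the sample rate and the assumption that the RECORD_AUDIO permission has already been granted are illustrative; the embodiment itself does not specify these details.

import android.media.AudioFormat;
import android.media.AudioRecord;
import android.media.MediaPlayer;
import android.media.MediaRecorder;
import java.io.IOException;

// Illustrative sketch of playing a voice prompt and opening a recording stream; values are assumptions.
public class MultimediaHelper {
    private static final int SAMPLE_RATE_HZ = 16000;

    // Creates a player, sets the playing parameters (data source) and outputs the prompt.
    public void playPrompt(String promptFilePath) throws IOException {
        MediaPlayer player = new MediaPlayer();
        player.setDataSource(promptFilePath);            // path to a prompt audio file (assumed to exist)
        player.prepare();
        player.start();
        player.setOnCompletionListener(MediaPlayer::release);
    }

    // Creates a recording stream with the set recording parameters and opens the microphone.
    public AudioRecord openRecordingStream() {
        int bufferSize = AudioRecord.getMinBufferSize(SAMPLE_RATE_HZ,
                AudioFormat.CHANNEL_IN_MONO, AudioFormat.ENCODING_PCM_16BIT);
        AudioRecord recorder = new AudioRecord(MediaRecorder.AudioSource.MIC,
                SAMPLE_RATE_HZ, AudioFormat.CHANNEL_IN_MONO,
                AudioFormat.ENCODING_PCM_16BIT, bufferSize);
        recorder.startRecording();                       // sound around the phone is now captured in real time
        return recorder;
    }
}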
S1107, the user inputs a third instruction, and the multimedia module receives the third instruction.
S1108, the multimedia module sends the third instruction from the user to the emergency cooperative service.
The multimedia module generates a recording file from the third instruction and sends the recording file to the emergency cooperative service.
S1109, the emergency cooperative service parses the third instruction.
The third instruction indicates whether the user wants the mobile phone to be connected with another electronic device in the case that the screen is damaged; one way of interpreting such a reply is sketched below.
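The sketch below shows one simple way the user's spoken reply could be mapped onto an agree/refuse decision. It assumes that a speech-recognition step (not specified in the embodiment) has already produced a text transcript, and the keyword lists are purely illustrative.

// Illustrative sketch of interpreting the user's reply; the keywords are example assumptions.
public class InstructionParser {
    public enum Decision { AGREE, REFUSE, UNKNOWN }

    public Decision parseThirdInstruction(String transcript) {
        String reply = transcript == null ? "" : transcript.trim().toLowerCase();
        if (reply.contains("no") || reply.contains("refuse") || reply.contains("don't")) {
            return Decision.REFUSE;     // close the emergency cooperative service (S1110)
        }
        if (reply.contains("yes") || reply.contains("connect")) {
            return Decision.AGREE;      // continue: query the lock-screen state (S1111)
        }
        return Decision.UNKNOWN;        // e.g. replay the prompt and listen again
    }
}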
S1110, if the user does not agree, the emergency cooperative service is closed.
S1111, if the user agrees, the emergency cooperative service acquires the screen locking state from the lock screen.
Fig. 12 is an interaction diagram of exemplary software modules in an embodiment of the present application, and when a screen of a mobile phone is damaged and the mobile phone is in an unlocked state, connection is established between the mobile phone and a tablet computer through cooperation between the software modules.
S1201, the emergency cooperative service acquires a screen locking state from the screen locking.
The screen locking state comprises an unlocking state and a locking state, and the screen locking state is marked by different marks.
S1202, the lock screen returns the screen locking state to the emergency cooperative service; here the screen locking state is the unlocked state.
S1203, the emergency cooperative service sets a flag bit.
The flag bit set here is used for determining whether to use a voice connection mode when the connection between the mobile phone and the tablet personal computer is established subsequently.
S1204, the emergency cooperation service operates the cooperation assistant.
That is, the cooperation assistant is put into a running state so that the multi-screen cooperation function of the mobile phone can be used.
S1205, the emergency cooperative service instructs the connection management module to turn on the Bluetooth, NFC and Wi-Fi functions of the mobile phone.
S1206, the connection management module turns on the Bluetooth, NFC and Wi-Fi functions and sends the information that these functions have been turned on to the emergency cooperative service; a sketch of enabling these radios is given below.
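The following sketch shows how turning on the radios could look on Android. It assumes a privileged, system-level service: BluetoothAdapter.enable() and WifiManager.setWifiEnabled() are deprecated or restricted for ordinary applications on recent Android versions, and NFC cannot be toggled programmatically without system permissions at all, so that part is only noted in a comment.

import android.bluetooth.BluetoothAdapter;
import android.content.Context;
import android.net.wifi.WifiManager;

// Illustrative sketch of enabling the radios used for discovery; assumes system-level privileges.
public class ConnectionManager {
    public boolean enableRadios(Context context) {
        BluetoothAdapter bluetooth = BluetoothAdapter.getDefaultAdapter();
        boolean bluetoothOn = bluetooth != null && (bluetooth.isEnabled() || bluetooth.enable());

        WifiManager wifi = (WifiManager) context.getApplicationContext()
                .getSystemService(Context.WIFI_SERVICE);
        boolean wifiOn = wifi != null && (wifi.isWifiEnabled() || wifi.setWifiEnabled(true));

        // NFC enabling is omitted: it requires system-only APIs not available to normal apps.
        return bluetoothOn && wifiOn;
    }
}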
S1207, the emergency cooperative service instructs the multimedia module to play the second prompt message. The second prompt message prompts the user to perform multi-screen cooperation in an NFC connection or Bluetooth connection mode.
S1208, the multimedia module creates an audio playing stream and sets playing parameters.
S1209, the multimedia module outputs the second prompt message.
For example, the second prompt message may be "bluetooth, NFC is turned on, please bring the mobile phone close to the other device, and turn on the multi-screen cooperative function of the other device".
S1210, the user opens the multi-screen cooperative function of the tablet computer according to the prompt, and places the mobile phone and the tablet computer at positions where the mobile phone and the tablet computer can be searched mutually.
S1211, after the cooperation assistant finds the tablet computer, it queries the flag bit from the emergency cooperative service.
When the flag bit is True, it indicates that the mobile phone performs multi-screen cooperation in the voice mode; when the flag bit is False, it indicates that the mobile phone performs multi-screen cooperation in the touch mode.
S1212, the emergency cooperative service returns the information that the flag bit is True to the cooperation assistant.
S1213, in case that the flag bit is True, the cooperation assistant requests the emergency cooperation service to process the connection event.
The connection event means that the cooperation assistant searches for the tablet computer and needs to establish multi-screen cooperation with the tablet computer in a voice mode.
When the flag bit is False, the cooperation assistant establishes multi-screen cooperation between the mobile phone and the tablet computer in the conventional touch mode.
S1214, the emergency cooperative service instructs the multimedia module to play the third prompt message. The third prompt message asks whether the user agrees to multi-screen cooperation between the mobile phone and the tablet computer.
S1215, the multimedia module creates an audio playing stream and sets playing parameters; and creates a recording stream, setting recording parameters.
S1216, the multimedia module outputs the third prompt message.
S1217, the user inputs a second instruction.
S1218, after receiving the second instruction, the multimedia module sends the second instruction to the emergency cooperative service.
The multimedia module generates a recording file from the second instruction and sends the recording file to the emergency cooperative service.
S1219, the emergency cooperative service parses the second instruction.
The second instruction indicates whether the user agrees to connect the mobile phone with the tablet computer, and the emergency cooperative service authenticates the second electronic device according to the second instruction.
S1220, if the user does not agree, the emergency cooperative service is turned off.
S1221, if the user agrees, the emergency cooperative service sends the information that the connection is successfully established to the cooperation assistant.
After receiving the information that the connection is successfully established, the cooperation assistant can start multi-screen cooperation; that is, the connection between the mobile phone and the tablet computer is successfully established, and the user can operate the mobile phone through the tablet computer.
Fig. 13 is a schematic interaction diagram between exemplary software modules in an embodiment of the present application; through cooperation between the software modules, the mobile phone is unlocked in the case that the screen of the mobile phone is damaged and the mobile phone is in the locked state.
S1301, the emergency cooperative service acquires the screen locking state from the lock screen.
S1302, when the screen locking state is the locked state, the lock screen determines whether unlocking with the set unlocking mode is currently supported.
S1303, the lock screen sends the currently supported set unlocking mode to the emergency cooperative service.
S1304, the emergency cooperative service instructs the multimedia module to output the first prompt information; the first prompt information prompts the user to unlock the mobile phone using a set unlocking mode currently supported by the mobile phone.
S1305, the multimedia module creates an audio play stream and sets play parameters.
S1306, the multimedia module outputs a first prompt message.
S1307, the user inputs a first instruction.
The unlocking mode can include face unlocking, fingerprint unlocking, intelligent unlocking and the like, and the user inputs an unlocking instruction according to the prompt information. A sketch of checking whether a biometric unlocking mode is currently available is given below.
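As one illustration of how the lock screen might check whether a biometric unlocking mode is currently available, the sketch below uses the AndroidX BiometricManager. Mapping this check onto the embodiment's "set unlocking mode" is an assumption, and intelligent unlocking is not covered by this API.

import android.content.Context;
import androidx.biometric.BiometricManager;

// Illustrative sketch of checking biometric availability; the mapping to the embodiment is assumed.
public class UnlockModeChecker {
    public boolean biometricUnlockAvailable(Context context) {
        int result = BiometricManager.from(context)
                .canAuthenticate(BiometricManager.Authenticators.BIOMETRIC_WEAK);
        return result == BiometricManager.BIOMETRIC_SUCCESS;
    }
}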
S1308, after the lock screen receives the first instruction, it updates the screen locking state to the unlocked state.
S1309, the lock screen sends the information that the current screen locking state is the unlocked state to the emergency cooperative service.
S1310, the emergency cooperative service sets a flag.
For the processing after the flag is set, refer to S1204-S1221; details are not repeated here.
The electronic device provided by this embodiment is used to execute the above method, and therefore can achieve the same effects as the above implementations. Where an integrated unit is employed, the electronic device may include a processing module, a storage module and a communication module. The processing module may be used to control and manage the actions of the electronic device, for example, to support the electronic device in executing the steps executed by the processing unit. The storage module may be used to support the electronic device in storing program code, data and the like. The communication module may be used to support communication between the electronic device and other devices.
An embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the foregoing method embodiments.
Embodiments of the present application further provide a computer program product which, when run on a mobile terminal, enables an electronic device to implement the steps in the above method embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be specifically a chip, a component or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer execution instructions, and when the device runs, the processor can execute the computer execution instructions stored in the memory, so that the chip can execute the method in the above method embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/electronic device and method may be implemented in other ways. For example, the above-described apparatus/electronic device embodiments are merely illustrative, and for example, the division of the modules or units is only one type of logical function division, and other division manners may exist in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example a USB flash drive, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In the description above, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that reference to "a plurality" in the specification and the appended claims means two or more. In the description of the present application, "/" means "or" unless otherwise stated, for example, a/B may mean a or B; "and/or" herein is merely an associative relationship that describes an associated object, and refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to" determining "or" in response to detecting ". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, for the convenience of clearly describing the technical solutions of the present application, the terms "first", "second", and the like are used to distinguish the same items or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms first, second, etc. do not denote any order or importance, but rather the terms first, second, etc. are used to distinguish one element from another.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (13)

1. A method of controlling a device, the method comprising:
under the condition that a screen of first electronic equipment is detected to be damaged, connection with second electronic equipment is established, and the first electronic equipment is in an unlocked state when the connection with the second electronic equipment is established;
the operation interface of the first electronic device is projected to a display screen of the second electronic device;
operating the first electronic device through the second electronic device.
2. The method of claim 1, wherein the first electronic device is in a locked state when a screen damage of the first electronic device is detected, the method further comprising:
receiving a first instruction in a locked state;
and controlling the first electronic equipment to be in an unlocking state according to the first instruction.
3. The method of claim 2, wherein prior to receiving the first instruction, the method further comprises:
and outputting first prompt information according to the set unlocking mode of the first electronic equipment, wherein the first prompt information prompts a user to unlock the first electronic equipment by using the set unlocking mode of the first electronic equipment, and the first instruction is obtained based on the first prompt information.
4. The method of claim 3, further comprising:
determining whether the first electronic device currently supports a set unlocking mode;
the outputting the first prompt information includes:
outputting the first prompt message when the first electronic device currently supports the set unlocking mode, or,
and prompting a user that the first electronic equipment does not support the unlocking of the first electronic equipment by using the set unlocking mode under the condition that the first electronic equipment does not support the set unlocking mode currently.
5. The method of any of claims 1-4, wherein prior to said establishing a connection with a second electronic device, the method further comprises:
and outputting second prompt information, wherein the second prompt information prompts a user to establish connection between the first electronic equipment and the second electronic equipment in a target connection mode.
6. The method of claim 5, wherein the target connection comprises a Bluetooth connection or an NFC connection.
7. The method of any of claims 1-6, wherein establishing the connection with the second electronic device comprises:
authenticating the second electronic device;
and if the authentication is passed, establishing connection with the second electronic equipment.
8. The method of claim 7, wherein authenticating the second electronic device comprises:
outputting third prompt information under the condition that the first electronic equipment is connected with the second electronic equipment for the first time, wherein the third prompt information prompts whether a user agrees to establish connection between the first electronic equipment and the second electronic equipment;
receiving a second instruction, wherein the second instruction is obtained based on the third prompt message;
and authenticating the second electronic equipment according to the second instruction.
9. The method according to any one of claims 1 to 8, wherein the establishing a connection with a second electronic device in case of detecting a screen damage of a first electronic device comprises:
under the condition that the screen of the first electronic device is detected to be damaged, outputting fourth prompt information, wherein the fourth prompt information prompts a user whether the screen of the first electronic device is damaged or not, and the first electronic device is controlled to be connected with the second electronic device;
receiving a third instruction, wherein the third instruction is obtained based on the fourth prompt message;
and establishing connection with second electronic equipment according to the third instruction.
10. The method according to any one of claims 1 to 9, further comprising:
when the first electronic equipment is started, detecting whether a screen of the first electronic equipment is damaged.
11. The method according to any one of claims 1 to 9, further comprising:
periodically detecting whether a screen of the first electronic equipment is damaged.
12. An electronic device, comprising: one or more processors; one or more memories; the memory stores one or more programs that, when executed by the processor, cause the electronic device to perform the method of any of claims 1-11.
13. A computer-readable storage medium having stored therein instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1 to 11.
CN202211174569.0A 2022-09-26 2022-09-26 Method for controlling equipment, electronic equipment and storage medium Active CN115580677B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211174569.0A CN115580677B (en) 2022-09-26 2022-09-26 Method for controlling equipment, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211174569.0A CN115580677B (en) 2022-09-26 2022-09-26 Method for controlling equipment, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115580677A true CN115580677A (en) 2023-01-06
CN115580677B CN115580677B (en) 2024-07-26

Family

ID=84582540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211174569.0A Active CN115580677B (en) 2022-09-26 2022-09-26 Method for controlling equipment, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115580677B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117971551A (en) * 2024-04-02 2024-05-03 杭州海康威视数字技术股份有限公司 Method and device for processing abnormality of touch screen of data recorder
WO2024172925A1 (en) * 2023-02-16 2024-08-22 Qualcomm Incorporated Controlling a device with an inoperable user interface

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20170074371A (en) * 2015-12-22 2017-06-30 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN107659721A (en) * 2017-09-25 2018-02-02 联想(北京)有限公司 The control method and device of a kind of electronic equipment
CN110290526A (en) * 2019-06-26 2019-09-27 北京小米移动软件有限公司 A kind of data access method, device and medium
CN111163225A (en) * 2019-12-27 2020-05-15 维沃移动通信有限公司 Control method of electronic equipment and electronic equipment
CN114115770A (en) * 2020-08-31 2022-03-01 华为终端有限公司 Display control method and related device
CN114205364A (en) * 2020-08-27 2022-03-18 华为技术有限公司 Data backup method and equipment

Also Published As

Publication number Publication date
CN115580677B (en) 2024-07-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant