WO2022227767A1 - Method and apparatus for finding a wearable device - Google Patents

Method and apparatus for finding a wearable device

Info

Publication number
WO2022227767A1
Authority
WO
WIPO (PCT)
Prior art keywords
scene
event
location
electronic device
information
Application number
PCT/CN2022/075080
Other languages
English (en)
French (fr)
Inventor
董吉阳
Original Assignee
荣耀终端有限公司 (Honor Device Co., Ltd.)
Application filed by 荣耀终端有限公司 (Honor Device Co., Ltd.)
Priority to EP22794248.9A (published as EP4236402A4)
Priority to US18/038,789 (published as US20240007826A1)
Publication of WO2022227767A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R1/1016 Earpieces of the intra-aural type
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/80 Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/029 Location-based management or tracking services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04817 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance using icons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R1/1041 Mechanical or electronic switches, or control elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R29/00 Monitoring arrangements; Testing arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/025 Services making use of location information using location based information parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2201/00 Details of transducers, loudspeakers or microphones covered by H04R1/00 but not provided for in any of its subgroups
    • H04R2201/10 Details of earpieces, attachments therefor, earphones or monophonic headphones covered by H04R1/10 but not provided for in any of its subgroups
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2420/00 Details of connection covered by H04R, not provided for in its groups
    • H04R2420/07 Applications of wireless loudspeakers or wireless microphones
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2460/00 Details of hearing devices, i.e. of ear- or headphones covered by H04R1/10 or H04R5/033 but not provided for in any of their subgroups, or of hearing aids covered by H04R25/00 but not provided for in any of its subgroups
    • H04R2460/07 Use of position data from wide-area or local-area positioning systems in hearing devices, e.g. program or information selection

Definitions

  • The present application relates to the field of terminals, and in particular to a method and apparatus for finding a wearable device.
  • A wearable device is a portable device that can be worn directly on the user's body or integrated into the user's clothing or accessories.
  • Wearable devices may include Bluetooth headsets, smart bracelets, smart glasses, and other devices.
  • A Bluetooth headset (for example, a true wireless stereo (TWS) Bluetooth headset) is a hands-free headset that uses Bluetooth technology. Users can freely use a TWS Bluetooth headset in various ways (e.g., listening to music, making calls) without the hassle of wires. Especially for mobile business users, Bluetooth headsets are a good tool for improving efficiency.
  • While improving user experience, wearable devices (for example, TWS Bluetooth headsets) are also easily misplaced and hard to find.
  • The embodiments of the present application provide a method and apparatus for finding a wearable device, which can help a user find the wearable device conveniently and quickly, improving user experience.
  • An embodiment of the present application provides a method for finding a wearable device, applied to an electronic device, including: receiving a first event from the wearable device, where the first event includes a wearing event or a taking-off event; in response to receiving the first event, acquiring a first location or a first scene, where the first location indicates the geographic location of the electronic device or the wearable device and the first scene indicates the user's at-home state; recording the association between the first event and the first location or the first scene; in response to a user's first operation on a first application, displaying a first interface, where the first interface includes a first option used to find the wearable device; and in response to the user selecting the first option, displaying a second interface, where the second interface includes first information and second information corresponding to the wearable device, the first information and the second information are associated, and the first information corresponds to the first location or the first scene.
  • Associating and recording the wearing/taking-off state of the wearable device with the geographic location or life scene can better help the user recall where the wearable device was left, making it easier to retrieve the device and improving user experience.
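The recording step described above can be sketched as a small event log that ties each wearing/taking-off event to a location or scene. All names below are illustrative, not taken from the application:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WearRecord:
    """One wearing/taking-off event and the context it was recorded with."""
    event: str                         # "wear" or "take_off"
    location: Optional[tuple] = None   # (latitude, longitude), used outdoors
    scene: Optional[str] = None        # at-home scene, e.g. "reading", used indoors

class WearHistory:
    """Associates each event with a first location or first scene."""
    def __init__(self):
        self.records = []

    def record(self, event, location=None, scene=None):
        if location is None and scene is None:
            raise ValueError("an event must be tied to a location or a scene")
        self.records.append(WearRecord(event, location, scene))

    def last_take_off(self):
        """Most recent take-off record: the likely place the device was left."""
        for rec in reversed(self.records):
            if rec.event == "take_off":
                return rec
        return None

history = WearHistory()
history.record("wear", location=(39.9042, 116.4074))
history.record("take_off", scene="reading")
```

Querying `last_take_off()` is one way the second interface could surface the record most relevant to a lost device.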
  • When the first information corresponds to the first location, the method further includes: in response to the user's operation on the first information, displaying a visual map and indicating the first location in it. The user can thus find the first location on the map. If the wearable device was taken off and left behind at the first location, it is likely still nearby, which narrows the search range and helps the user find the wearable device faster and more conveniently.
  • Acquiring the first location or the first scene by the electronic device includes: if it is determined that the electronic device is located in a preset residential area, acquiring the first scene; if it is determined that the electronic device is located outside the preset residential area, acquiring the first location.
  • The preset residential area may be a residence/home. Since accurate positioning is usually impossible indoors, the usage scene of the headset can instead be recorded based on the indoor scene mode. In this way, if the wearable device is later lost, the user can quickly and accurately recall where it was used based on its usage scene, making it easier to retrieve.
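The branch described above (record a scene inside the preset residential area, coordinates outside it) might be sketched as a radius check against a preset home position. The coordinates and radius here are illustrative assumptions:

```python
import math

def haversine_m(a, b):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(h))

HOME = (39.9042, 116.4074)   # illustrative center of the preset residential area
HOME_RADIUS_M = 100          # illustrative radius of the residential area

def context_for(position, current_scene):
    """Inside the preset residential area record the scene, otherwise the position."""
    if haversine_m(position, HOME) <= HOME_RADIUS_M:
        return ("scene", current_scene)
    return ("location", position)
```

A real implementation might instead detect "at home" from a geofence or the connected home Wi-Fi, but the decision structure is the same.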
  • The wearable device can be positioned using a positioning technology (e.g., global positioning system (GPS) positioning, BeiDou positioning, etc.).
  • Acquiring the first scene includes querying the first scene from a second application; the second application includes multiple scenes, including at least one of a sleep scene, a wake-up scene, a theater scene, a dining scene, a leisure scene, a reading scene, an entertainment scene, a returning-home scene, or a leaving-home scene.
  • The first scene may be one of the above scenes.
  • The second application may be the same as the first application; for example, both may be a smart-life APP.
  • Alternatively, the second application may be different from the first application, which is not limited in this application.
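The scene list enumerated above could be modeled as a shared enumeration that the second application exposes to the finder application. The class and query function here are hypothetical stand-ins for the real inter-app interface:

```python
from enum import Enum

class Scene(Enum):
    """Scenes the second application may track for the user's at-home state."""
    SLEEP = "sleep"
    WAKE_UP = "wake-up"
    THEATER = "theater"
    DINING = "dining"
    LEISURE = "leisure"
    READING = "reading"
    ENTERTAINMENT = "entertainment"
    RETURNING_HOME = "returning home"
    LEAVING_HOME = "leaving home"

class SecondApp:
    """Stand-in for the smart-life app that knows the user's current scene."""
    def __init__(self, scene=Scene.LEISURE):
        self._scene = scene

    def set_scene(self, scene):
        self._scene = scene

    def query_first_scene(self):
        """What the finder queries when a wear/take-off event arrives indoors."""
        return self._scene

app = SecondApp()
app.set_scene(Scene.DINING)
```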
  • The method further includes: determining whether to switch from the first scene to a second scene, the first scene being different from the second scene; and, if it is determined to switch from the first scene to the second scene, recording the association between the first event and the second scene.
  • The method further includes: the electronic device receives a second event; when the first event is a wearing event, the second event is a taking-off event, and when the first event is a taking-off event, the second event is a wearing event. The electronic device acquires a second location or a third scene, where the second location indicates the geographic location of the electronic device or the wearable device and the third scene indicates the user's at-home state, and records the association between the second event and the second location or the third scene.
  • Because the wearing and taking-off states of the wearable device can switch, the use state (wearing or taking off) and the corresponding location or scene continue to be recorded after each switch. This records device usage more accurately and helps the user find the wearable device more conveniently and quickly.
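The two switching behaviors above (scene changes while a state is active, and wear/take-off toggles) could share one monitor that appends a new association whenever either changes. Names are illustrative:

```python
class UsageMonitor:
    """Re-records the association whenever the scene or the use state switches."""
    def __init__(self):
        self.log = []          # list of (event, context) associations
        self._scene = None
        self._event = None

    def on_event(self, event, context):
        """event: 'wear' or 'take_off'; context: scene name or coordinates."""
        self._event = event
        self.log.append((event, context))

    def on_scene_change(self, new_scene):
        """If the scene switched while a state is active, record it again."""
        if new_scene != self._scene:
            self._scene = new_scene
            if self._event is not None:
                self.log.append((self._event, new_scene))

mon = UsageMonitor()
mon.on_event("wear", "dining")
mon.on_scene_change("reading")       # scene switch: wearing state re-recorded
mon.on_event("take_off", "reading")  # state switch: new association recorded
```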
  • Acquiring the first location includes: acquiring the first location from the wearable device, where the first location indicates the geographic location of the wearable device; or acquiring the first location based on a network positioning technology, where the first location indicates the geographic location of the electronic device. Network positioning technologies include base station positioning, wireless fidelity (WiFi) positioning, and global positioning system (GPS) positioning. That is, the geographic location collected by the wearable device can be used as the user's location when using the wearable device, or the geographic location collected by the electronic device can be used instead (wearable devices are usually used together with electronic devices, so the difference between the two collected locations is small).
  • The first location is indicated by the geographic coordinates of the wearable device or the electronic device, or by a first name determined based on those geographic coordinates.
  • For example, the first name may be XX Cafe, XX Library, an XX Store in a shopping mall, and so on.
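Turning coordinates into a first name such as "XX Cafe" amounts to a reverse-geocoding lookup. A minimal nearest-point-of-interest sketch, with an illustrative POI table (a real implementation would query a map service), could look like:

```python
import math

# Illustrative POI table; names and coordinates are placeholders.
POIS = {
    "XX Cafe":    (39.9100, 116.4000),
    "XX Library": (39.9200, 116.4100),
    "XX Store":   (39.9300, 116.4200),
}

def _dist2(a, b):
    """Squared equirectangular distance: enough to rank nearby candidates."""
    return (a[0] - b[0]) ** 2 + ((a[1] - b[1]) * math.cos(math.radians(a[0]))) ** 2

def first_name_for(coords):
    """Return the name of the nearest known place to the given coordinates."""
    return min(POIS, key=lambda name: _dist2(coords, POIS[name]))
```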
  • When the wearable device is a Bluetooth headset that includes a left-ear headset and a right-ear headset, the second interface includes first information and second information corresponding to the left-ear headset and the right-ear headset, respectively.
  • The interface 730 may include the first information corresponding to the left earphone (for example, the icon 737 and icon 738 corresponding to the wearing state, and the icon 739a and the icon corresponding to the taking-off state).
  • The interface 730 may also include the first information corresponding to the right earphone (for example, the icon 733, icon 734, and icon 735 corresponding to the wearing state, and the icon 736 corresponding to the taking-off state) and second information (for example, the scene card 740, scene card 741, and scene card 742 corresponding to different scenes).
  • the second information corresponding to the left earphone and the second information corresponding to the right earphone may be represented by the same scene card.
  • An embodiment of the present application provides a method for finding a wearable device, applied to an electronic device, where the wearable device is a Bluetooth headset that includes a left-ear headset and a right-ear headset. The method includes: receiving a first event from the left earphone, the first event including a wearing event or a taking-off event; in response to receiving the first event, acquiring a first location or a first scene, where the first location indicates the geographic location of the electronic device or the left earphone and the first scene indicates the user's at-home state; recording the association between the first event and the first location or the first scene; receiving a second event from the right earphone, the second event including a wearing event or a taking-off event; in response to receiving the second event, acquiring a second location or a second scene, where the second location indicates the geographic location of the electronic device or the right earphone and the second scene indicates the user's at-home state; and recording the association between the second event and the second location or the second scene.
  • Associating and recording the wearing/taking-off state of each earphone (left-ear or right-ear) with the geographic location or life scene can better help the user recall where the earphone was left, making it easier to retrieve and improving user experience.
  • An embodiment of the present application provides a method for finding a wearable device, applied to a system composed of an electronic device and a wearable device, including: when the wearable device detects a first event, sending the first event to the electronic device; the electronic device receiving the first event from the wearable device, where the first event includes a wearing event or a taking-off event; in response to receiving the first event, the electronic device acquiring a first location or a first scene, where the first location indicates the geographic location of the electronic device or the wearable device and the first scene indicates the user's at-home state; the electronic device recording the association between the first event and the first location or the first scene; in response to a user's first operation on a first application, the electronic device displaying a first interface, where the first interface includes a first option used to find the wearable device; and in response to the user selecting the first option, the electronic device displaying a second interface, where the second interface includes first information and second information corresponding to the wearable device.
  • Associating and recording the wearing/taking-off state of the wearable device with the geographic location or life scene can better help the user recall where the wearable device was left, making it easier to retrieve the device and improving user experience.
  • The present application provides a chip system including one or more interface circuits and one or more processors.
  • The interface circuits and the processors are interconnected by wires.
  • The chip system may be applied to an electronic device including a communication module and a memory.
  • The interface circuit is configured to receive signals from the memory of the electronic device and send the received signals to the processor, the signals including computer instructions stored in the memory.
  • When the processor executes the computer instructions, the electronic device can perform the method described in the first, second, or third aspect and any possible design manner thereof.
  • The present application provides a computer-readable storage medium comprising computer instructions.
  • When the computer instructions are executed on an electronic device (such as a mobile phone), the electronic device is caused to perform the method described in the first, second, or third aspect and any possible design manner thereof.
  • The present application provides a computer program product that, when run on a computer, causes the computer to perform the method described in the first, second, or third aspect and any possible design manner thereof.
  • An embodiment of the present application provides an apparatus for finding a wearable device, including a processor coupled to a memory that stores program instructions; when the program instructions stored in the memory are executed by the processor, the apparatus implements the method described in the first, second, or third aspect and any possible design manner thereof.
  • The apparatus may be an electronic device, or a component of the electronic device, such as a chip.
  • An embodiment of the present application provides an apparatus for finding a wearable device; the apparatus can be divided into different logical units or modules by function, with each unit or module performing a different function, so that the apparatus performs the method described in the first, second, or third aspect and any possible design manner thereof.
  • An embodiment of the present application provides a system for finding a wearable device, including an electronic device and a wearable device, where the electronic device and the wearable device each perform part of the steps and cooperate with each other to implement the method described in the above aspects.
  • FIG. 1 is a schematic diagram of a system architecture provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a hardware structure of a wearable device according to an embodiment of the present application.
  • FIG. 4 is a flowchart provided by an embodiment of the present application.
  • FIG. 5 is another flowchart provided by an embodiment of the present application.
  • FIG. 6 is another flowchart provided by an embodiment of the present application.
  • FIG. 7 is a schematic display diagram provided by an embodiment of the present application.
  • FIG. 8 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 9 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 10 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of the scenes and states maintained for a headset according to an embodiment of the present application.
  • FIG. 12 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 13 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 14 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 15 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 16 is another flowchart provided by an embodiment of the present application.
  • FIG. 17 is another flowchart provided by an embodiment of the present application.
  • FIG. 18 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 19 is another schematic display diagram provided by an embodiment of the present application.
  • FIG. 20 is a schematic structural diagram of a chip system according to an embodiment of the present application.
  • the communication system architecture may include the wearable device 100 and the electronic device 200 .
  • the wearable device 100 and the electronic device 200 may perform Bluetooth communication.
  • The embodiments of the present application use an electronic device 200 (such as a mobile phone) as an example to illustrate the structure of the electronic device provided by the embodiments of the present application.
  • The electronic device 200 (such as a mobile phone) may include: a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headphone interface 270D, a sensor module 280, a key 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (SIM) card interface 295, and so on.
  • The aforementioned sensor module 280 may include sensors such as a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, and a bone conduction sensor.
  • the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 200 .
  • The electronic device 200 may include more or fewer components than shown, combine some components, split some components, or use a different component arrangement.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • The processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller may be the nerve center and command center of the electronic device 200 .
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 210 for storing instructions and data.
  • the memory in processor 210 is cache memory.
  • The memory may hold instructions or data that the processor 210 has just used or reused. If the processor 210 needs the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 210, thereby improving system efficiency.
  • the processor 210 may include one or more interfaces.
  • The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the interface connection relationship between the modules illustrated in this embodiment is only a schematic illustration, and does not constitute a structural limitation of the electronic device 200 .
  • The electronic device 200 may also adopt interface connection manners different from those in the above embodiments, or a combination of multiple interface connection manners.
  • the charging management module 240 is used to receive charging input from the charger.
  • The charger may be a wireless charger or a wired charger. While charging the battery 242, the charging management module 240 can also supply power to the electronic device through the power management module 241.
  • the power management module 241 is used to connect the battery 242 , the charging management module 240 and the processor 210 .
  • the power management module 241 receives input from the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the external memory, the display screen 294, the camera 293, and the wireless communication module 260.
  • the power management module 241 and the charging management module 240 may also be provided in the same device.
  • The wireless communication function of the electronic device 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like.
  • The antenna 1 of the electronic device 200 is coupled with the mobile communication module 250, and the antenna 2 is coupled with the wireless communication module 260, so that the electronic device 200 can communicate with networks and other devices through wireless communication technologies.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the electronic device 200 may be used to cover a single communication frequency band or multiple communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 250 may provide a wireless communication solution including 2G/3G/4G/5G, etc. applied on the electronic device 200 .
  • the mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), and the like.
  • the mobile communication module 250 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 250 can also amplify the signal modulated by the modulation and demodulation processor, and then convert it into electromagnetic waves for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 250 may be provided in the processor 210 .
  • at least part of the functional modules of the mobile communication module 250 may be provided in the same device as at least part of the modules of the processor 210 .
  • the wireless communication module 260 can provide wireless communication solutions applied on the electronic device 200, including wireless local area network (WLAN) (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • GNSS may include the Beidou navigation satellite system (BDS), GPS, the global navigation satellite system (GLONASS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the wireless communication module 260 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 260 receives electromagnetic waves via the antenna 2 , modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 210 .
  • the wireless communication module 260 can also receive the signal to be sent from the processor 210 , perform frequency modulation on the signal, amplify the signal, and then convert it into an electromagnetic wave for radiation through the antenna 2 .
  • the electronic device 200 implements a display function through a GPU, a display screen 294, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 294 is used to display images, videos, and the like.
  • the display screen 294 includes a display panel.
  • the electronic device 200 can realize the shooting function through the ISP, the camera 293, the video codec, the GPU, the display screen 294 and the application processor.
  • the ISP is used to process the data fed back by the camera 293 .
  • Camera 293 is used to capture still images or video.
  • the electronic device 200 may include 1 or N cameras 293 , where N is a positive integer greater than 1.
  • the external memory interface 220 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 200.
  • the external memory card communicates with the processor 210 through the external memory interface 220 to realize the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 221 may be used to store computer executable program code, which includes instructions.
  • the processor 210 executes various functional applications and data processing of the electronic device 200 by executing the instructions stored in the internal memory 221 .
  • the processor 210 may execute instructions stored in the internal memory 221, and the internal memory 221 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 200 and the like.
  • the internal memory 221 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the electronic device 200 may implement audio functions, such as music playback and recording, through an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, and an application processor.
  • the keys 290 include a power-on key, a volume key, and the like. The keys 290 may be mechanical keys or touch keys. The motor 291 can generate vibration cues; it can be used for vibrating alerts for incoming calls and for touch vibration feedback. The indicator 292 can be an indicator light, which can be used to indicate the charging status and changes in battery level, and can also be used to indicate messages, missed calls, notifications, and the like.
  • the SIM card interface 295 is used to connect a SIM card. The SIM card can be brought into contact with or separated from the electronic device 200 by being inserted into or pulled out of the SIM card interface 295.
  • the electronic device 200 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
  • the SIM card interface 295 can support Nano SIM card, Micro SIM card, SIM card and so on.
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the electronic device 200 .
  • the electronic device 200 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the electronic device 200 may be a mobile phone, a smart remote control, a handheld computer, an augmented reality (AR)/virtual reality (VR) device, a portable multimedia player (PMP), a media player device, etc.
  • the embodiments of the present application do not limit any specific types of electronic devices.
  • the wearable device 100 may include a wireless communication module 310 , a positioning module 320 , a processor 330 , an internal memory 340 , a power management module 350 , a battery 360 , a charging management module 370 , an antenna 3 and the like.
  • the processor 330 may include one or more processing units.
  • the processor 330 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • processor 330 may include one or more interfaces.
  • the interface may include an I2C interface, an I2S interface, a PCM interface, a UART interface, MIPI, a GPIO interface, a SIM card interface, and/or a USB interface, and the like.
  • the interface connection relationship between the modules illustrated in the embodiments of the present application is only a schematic illustration, and does not constitute a structural limitation of the wearable device 100 .
  • the wearable device 100 may also adopt different interface connection manners in the foregoing embodiments, or a combination of multiple interface connection manners.
  • the charging management module 370 is used to receive charging input from the charger.
  • the charger may be a wireless charger or a wired charger.
  • the power management module 350 is used to connect the battery 360 , the charging management module 370 and the processor 330 .
  • the power management module 350 receives input from the battery 360 and/or the charge management module 370, and supplies power to the processor 330, the internal memory 340, the external memory interface 220, the wireless communication module 310, and the like.
  • the power management module 350 can also be used to monitor battery capacity, battery cycle times, battery health status (leakage, impedance) and other parameters.
  • the wireless communication function of the wearable device 100 may be implemented by the antenna 3 , the wireless communication module 310 , the positioning module 320 and the like.
  • the positioning module 320 may provide positioning technology applied on the wearable device 100 .
  • the positioning technology may include positioning technologies based on systems such as the Beidou navigation satellite system (BDS), the global positioning system (GPS), the global navigation satellite system (GLONASS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
  • the positioning module 320 may be integrated with the wireless communication module 310 or provided separately, which is not limited in this application.
  • Internal memory 340 may be used to store one or more computer programs including instructions.
  • the wearable device 100 may also be provided with optical sensors (eg, infrared temperature sensors), motion sensors (eg, acceleration sensors, gyroscopes, etc.), capacitive sensors, and the like.
  • the wearable device is a Bluetooth headset
  • the Bluetooth headset can perform wearing and removal detection to determine whether the Bluetooth headset is in a wearing state or a taken-off (removed) state.
  • For example, the temperature change within a preset time can be sensed through the infrared temperature sensor, and whether a wearing action occurs within the preset time can be obtained according to the acceleration sensor; together these determine whether the Bluetooth headset is being put on or taken off.
  • If the Bluetooth headset is provided with a capacitive sensor, it can be determined whether the Bluetooth headset is in the wearing state or the taken-off state through the change in the capacitance value of the capacitive sensor during the process of putting on and taking off the headset.
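The put-on detection described above can be sketched as follows. This is a minimal illustration combining an infrared temperature rise with an accelerometer-detected wearing action; the thresholds, sample format, and function name are assumptions, not the patent's concrete implementation:

```python
# Hypothetical sketch: decide wearing vs. taken-off state from an IR
# temperature sensor and an accelerometer within a preset time window.
TEMP_RISE_THRESHOLD_C = 1.5   # assumed: skin contact warms the sensor
MOTION_THRESHOLD_G = 0.3      # assumed: a wearing action shows up as motion

def detect_wear_state(temp_samples, accel_samples):
    """Return 'wearing' if the temperature rose and a wearing action
    occurred within the window, otherwise 'taken_off'."""
    temp_rise = temp_samples[-1] - temp_samples[0]
    motion = max(abs(a) for a in accel_samples)
    if temp_rise >= TEMP_RISE_THRESHOLD_C and motion >= MOTION_THRESHOLD_G:
        return "wearing"
    return "taken_off"
```

A capacitive-sensor variant would replace the two checks with a single capacitance-delta threshold, but the decision structure stays the same.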
  • when the Bluetooth headset is in the wearing state, the Bluetooth headset is worn on the user's ear.
  • when the Bluetooth headset is in the taken-off state, the Bluetooth headset may be stored in the headset box, placed on a table, or left on the floor, a sofa, etc., which is not limited in this application.
  • the Bluetooth headset may further include a speaker and a microphone. The speaker plays audio to the user's ear.
  • the microphone may collect audio data such as the voice of the user on the phone call and/or ambient noise information (eg, for noise cancellation).
  • the state of the Bluetooth headset can be controlled according to a tap operation (eg, single tap, double tap, triple tap, etc.) detected by the accelerometer.
  • the Bluetooth headset can adjust audio playback (eg, can start or stop playing music when a click operation is detected).
  • the tap operation can also control the power management function of the Bluetooth headset. For example, music playback can be stopped when a tap is detected, and the Bluetooth headset can be put into a low-power sleep mode when it is not being used to play music for the user.
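The tap-driven control above can be sketched as a small state holder; the mapping of tap counts to actions is an illustrative assumption:

```python
# Hypothetical sketch: control playback and a low-power mode from tap
# counts reported by the accelerometer.
class HeadsetController:
    def __init__(self):
        self.playing = False
        self.low_power = False

    def on_tap(self, tap_count):
        if tap_count == 1:               # assumed: single tap toggles playback
            self.playing = not self.playing
            # when playback stops, drop into low-power sleep mode
            self.low_power = not self.playing
        return self.playing
```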
  • the structures illustrated in the embodiments of the present application do not constitute a specific limitation on the wearable device 100 .
  • the wearable device 100 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the wearable device 100 may include, for example, a Bluetooth headset, a smart bracelet, a smart watch, or smart glasses, which is not limited in this application.
  • an embodiment of the present application provides a method for finding a wearable device.
  • the following takes the case where the wearable device is an earphone (for example, a Bluetooth earphone) and the electronic device is a mobile phone as an example for description, including:
  • the mobile phone receives the wearing event sent by the headset.
  • a mobile phone and a headset can be connected via Bluetooth to communicate based on a Bluetooth link.
  • the Bluetooth connection may include classic Bluetooth (basic rate/enhanced data rate, BR/EDR), Bluetooth low energy (BLE), and the like.
  • After the headset detects the wearing event, it can send the wearing event to the mobile phone.
  • the identity (ID) and wearing status of the headset can be carried in the wearing event.
  • the earphones may include a left ear earphone (a wireless earphone device worn in the left ear) and a right ear earphone (a wireless earphone device worn in the right ear). One of the left and right earphones may be used as the master earphone and the other as the slave earphone.
  • the master and slave earphones may be selected based on the respective qualities of their wireless communication links. For example, if the communication link between the left earphone and the phone is better (eg, more reliable, faster, or lower in power consumption) than the communication link between the right earphone and the phone, the left earphone can be used as the master earphone and the right earphone as the slave earphone. Of course, other methods may also be used to determine the master earphone and the slave earphone, which is not limited in this application.
  • the primary headset may communicate with the cell phone (eg, the primary headset may send its own wearing events to the cell phone).
  • the slave headset can communicate with the master headset (eg, the slave headset can send its own donning events to the master headset).
  • the master headset can act as a medium for the slave headset to communicate with the mobile phone (eg, the master headset can send a wearing event of the slave headset to the mobile phone).
  • both the master and slave headsets can communicate with the mobile phone, which is not limited in this application.
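The master/slave selection described above can be sketched as a simple link-quality comparison. The score combining reliability, data rate, and power consumption is an assumption made here for illustration; the patent only says the link should be "better":

```python
# Hypothetical sketch: pick the master earphone from link-quality metrics.
def choose_master(left_link, right_link):
    """Each link is a dict with 'reliability' (0-1), 'rate_kbps', and
    'power_mw'; the higher-scoring link becomes the master earphone."""
    def score(link):
        # assumed score: more reliable and faster is better, lower power is better
        return link["reliability"] * link["rate_kbps"] / link["power_mw"]
    return "left" if score(left_link) >= score(right_link) else "right"
```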
  • After the mobile phone receives the wearing event sent by the headset, it can query the current geographic location of the mobile phone. If it is determined that the mobile phone is located at the residence/home (the location of the residence/home can be set by the user or determined by the mobile phone according to a preset algorithm), the mobile phone determines that the current scene is a home scene, and step 402 may be performed.
  • If the mobile phone has a positioning function (for example, a GPS or Beidou function) turned on, whether the mobile phone is at home can be determined according to the positioning information. Alternatively, if a corresponding mode (eg, "home mode") in an application program (eg, a smart life APP) of the mobile phone is turned on, it may be determined that the mobile phone is at home. Alternatively, if the mobile phone is connected to the Wi-Fi at home, it can be determined that the mobile phone is located at the home.
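The three "at home" checks (positioning, app mode, home Wi-Fi) can be sketched as follows. The stored home coordinates, radius, mode string, and SSID are hypothetical values introduced only for this example:

```python
import math

# Hypothetical stored home profile (assumptions for illustration).
HOME_COORD = (39.90, 116.40)
HOME_RADIUS_KM = 0.2
HOME_WIFI_SSID = "home-wifi"

def is_at_home(gps=None, app_mode=None, wifi_ssid=None):
    """Return True if any of the three checks indicates the home scene."""
    if gps is not None:
        # crude flat-earth distance; adequate for a ~200 m radius check
        dlat = (gps[0] - HOME_COORD[0]) * 111.0
        dlon = (gps[1] - HOME_COORD[1]) * 111.0 * math.cos(math.radians(HOME_COORD[0]))
        if math.hypot(dlat, dlon) <= HOME_RADIUS_KM:
            return True
    if app_mode == "home mode":
        return True
    return wifi_ssid == HOME_WIFI_SSID
```

The outing scene of the second embodiment is just the negation of this check.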
  • the mobile phone determines that the current scene is at home.
  • the usage scene of the headset can be recorded based on the scene mode provided by the application (for example, the smart life APP). In this way, when the earphone is lost subsequently, the user can quickly and accurately recall the usage location of the earphone based on the usage scene of the earphone, which is convenient for the user to retrieve the earphone.
  • The mobile phone queries the first life scene.
  • After receiving the wearing event sent by the headset, the mobile phone can query the current life scene (ie, the first life scene) from the application.
  • a plurality of different life scenarios can be preset in an application (eg, a smart life APP).
  • Different life scenarios are designed to meet the different needs of users in their lives.
  • Various adjustable, flexible, multi-scenario home modes are created, each being a combination of a series of home functions. Providing users with different life scenarios can help users reduce complicated equipment operations and save living costs.
  • the living scene may include at least one of a sleeping scene, a waking up scene, a theater scene, a dining scene (eg, a breakfast scene, a lunch scene, a dinner scene), a leisure scene, a reading scene, an entertainment scene, a coming home scene, or a leaving home scene.
  • the scene may also be replaced by a scene, a mode, or the like, which is not limited in the present application.
  • For example, in the wake-up scene, the bedroom curtains can be slowly opened, the speaker can play soft background music, the bread machine can automatically start to bake bread, the coffee machine can start grinding coffee, and the bathroom heating equipment and water heater can start to work.
  • For example, in the theater scene, the lights in the living room are automatically dimmed, the TV and audio equipment are turned on, and the user can select a favorite movie to start watching.
  • For example, in the leaving-home scene, the security system is activated: the door sensor, infrared human body sensor, and smoke alarm start working, and the user can go out with confidence.
  • Each of the above scenarios can be manually triggered by the user.
  • the user can trigger it immediately (for example, by directly selecting a scene on the mobile phone application), or it can be triggered at a preset time (for example, the user can set the wake-up scene to be automatically triggered at 7 o'clock).
  • Individual scenes can also be triggered by actions of smart home devices. For example, if the mobile phone detects that the curtains in the bedroom are opened and the coffee machine is turned on, it can actively trigger the wake-up scene.
  • the mobile phone determines whether the first life scene is queried.
  • the mobile phone can call a preset application programming interface (API) to query the current life scene (ie, the first life scene) from the smart life APP.
  • the first life scene and the wearing event have a corresponding/associated relationship, indicating that in the first life scene, the headset is in the wearing state.
  • For example, if the association between the wake-up scene and the wearing event is recorded, it indicates that the headset is in the wearing state in the wake-up scene.
  • the headset is in the wearing state at the current time/state.
  • the current time may be the time when the mobile phone receives the wearing event sent by the headset or the time when the headset sends the wearing event, and the current motion state is used to indicate the user's current motion state (eg, stationary state, walking state).
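Recording the scene/event associations described above can be sketched as a timestamped history list; the record layout and function names are assumptions, with "blank scene" standing in for the case where no scene can be queried:

```python
import time

history = []  # list of (timestamp, scene, event) records

def record_event(scene, event, now=None):
    """Record an association; scene may be None when the query fails,
    event is e.g. 'wearing' or 'removal'."""
    ts = now if now is not None else time.time()
    history.append((ts, scene if scene is not None else "blank scene", event))
    return history[-1]

def last_scene_for(event):
    """When the earphone is lost, recall the most recent scene for an event."""
    for ts, scene, ev in reversed(history):
        if ev == event:
            return scene
    return None
```

Querying `last_scene_for("removal")` is the lookup a user would trigger from the search-headset interface.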
  • the mobile phone determines whether the user takes off the headset.
  • If the mobile phone receives the removal event sent by the headset, it is determined that the user has removed the headset. If the mobile phone does not receive the removal event sent by the headset, it is determined that the user has not removed the headset.
  • If it is determined that the user has not taken off the earphone, step 408 may be executed. If it is determined that the user has taken off the earphone, step 410 may be executed.
  • Since the headset can experience changes across multiple life scenes while it is worn, in order to record the usage of the headset more accurately, the usage state of the headset and the scene after the switch can continue to be recorded when a scene switch occurs.
  • the smart life APP of the mobile phone can complete the conversion of life scenes according to the actions of the home equipment. For example, in the entertainment scene/cinema scene, it is detected that the TV is turned off and the lights are turned on, and it can be switched to the leisure scene/reading scene. For another example, the user can manually switch the life scene on the smart life APP, or the smart life APP can automatically switch the scene at the time defined by the user. For example, at 11 pm, automatically switch from the entertainment scene to the sleep scene.
  • step 409 may be executed.
  • the mobile phone can query the switched life scene (ie, the second life scene) from the application, and record the association between the second life scene and the wearing event.
  • the association relationship between the second life scene and the wearing event indicates that in the second life scene, the headset is in a wearing state.
  • After receiving the removal event sent by the headset, the mobile phone can query the current life scene (ie, the third life scene) from an application (eg, a smart life APP).
  • the mobile phone can query the current life scene (ie, the third life scene) from the application program, and record the association between the third life scene and the removal event.
  • the association between the third life scene and the removal event indicates that, in the third life scene, the headset is in the taken-off state.
  • For related description, reference may be made to step 406, which is not repeated here.
  • Step 413, step 414, and step 415 may also be performed.
  • For the description of the scene switching, reference may be made to step 408, which is not repeated here.
  • the mobile phone can query the switched life scene (ie, the fourth life scene) from the application, record the association between the fourth life scene and the removal event, and end the process.
  • step 414 may be executed cyclically.
  • When the user uses the earphone at home, if the earphone falls off accidentally, the user can quickly recall the location of the earphone by querying the historical state of the earphone and the corresponding scene, which helps the user retrieve the earphone quickly and easily.
  • an embodiment of the present application provides a method for searching for a wearable device; the following takes the case where the wearable device is an earphone (the earphone may refer to a left ear earphone or a right ear earphone) and the electronic device is a mobile phone as an example, including:
  • the mobile phone receives the wearing event sent by the headset.
  • For a description of the mobile phone receiving the wearing event sent by the headset, reference may be made to step 401, which is not repeated here.
  • the mobile phone can query the current geographic location of the mobile phone. If it is determined that the mobile phone is located in an area other than the residence/home, it is determined that the current scene is an outing scene, and step 502 can be performed.
  • If the mobile phone has a positioning function (eg, a GPS or Beidou function) turned on, it can be determined according to the positioning information that the mobile phone is located in an area outside the residence/home, so as to determine that the current scene is an outing scene.
  • Alternatively, if a corresponding mode (eg, "away-from-home mode") in an application (eg, a smart life APP) of the mobile phone is turned on, it may be determined that the current scene is an outing scene.
  • Alternatively, if the mobile phone is connected to Wi-Fi in an area other than the residence/home, it can be determined that the current scene is an outing scene.
  • the mobile phone determines that it is currently an outing scene.
  • the headset can be positioned through a positioning technology (eg, GPS positioning technology, Beidou positioning technology, etc.).
  • the mobile phone determines whether the wearing event carries the first coordinate information.
  • the first coordinate information may include geographic coordinates collected by the headset when the wearing event is detected, where the geographic coordinates are used to represent the geographic location of the headset when the wearing event is detected.
  • the wearing event reported by the headset may include GPS coordinates (ie, first coordinate information).
  • the wearing event is associated with the first coordinate information. It can be considered that a wearing event occurs at the geographic location indicated by the first coordinate information, that is, the user wears the headset at the geographic location indicated by the first coordinate information.
  • the second coordinate information may include geographic coordinates collected by the mobile phone when the mobile phone receives the wearing event.
  • the network positioning technology may include base station positioning technology (positioning technology based on cellular mobile communication), WiFi positioning technology (in which positioning software detects the ID (router address) of the WiFi, and positioning is completed with the cooperation of a WiFi location database and a map database), and GPS positioning technology.
  • the mobile phone can query the map database according to the first coordinate information or the second coordinate information to determine whether the first humanistic location can be queried.
  • the first human location may refer to a geographic location with a first name, and the first name may be designated by humans.
  • the first name may be XX Cafe, XX cafe, XX Library, XX Store in a shopping mall, and so on.
  • step 503 may not be performed, and step 505 may be directly performed.
  • In step 506, the first humanistic location can be directly queried according to the second coordinate information.
  • the first coordinate information or the second coordinate information has a corresponding first humanistic location, for example, XX cafe, XX library, XX store in a shopping mall, etc.
  • the first human position and the wearing event have a corresponding/associative relationship, which means that at the first human position, the headset is in a wearing state.
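Querying the map database for a humanistic (human-named) location can be sketched as follows. The in-memory database and the rounding-based coordinate match are illustrative assumptions; a real implementation would call a reverse-geocoding service:

```python
# Hypothetical map database: coordinates -> human-designated place name.
MAP_DB = {
    (31.2304, 121.4737): "XX Cafe",
    (31.2310, 121.4800): "XX Library",
}

def query_human_location(coord, precision=3):
    """Return the first named place whose rounded coordinates match,
    or None when no humanistic location can be queried."""
    key = (round(coord[0], precision), round(coord[1], precision))
    for (lat, lon), name in MAP_DB.items():
        if (round(lat, precision), round(lon, precision)) == key:
            return name
    return None
```

A `None` result corresponds to the case below where no corresponding human location can be queried and only the raw coordinates are recorded.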
  • the first queue is a first-in, first-out linear list.
  • the size of the first queue may be preset.
  • each record in the first queue is configured with a timestamp. Timestamps can indicate when a record was created.
  • the third coordinate information may be the first coordinate information or the second coordinate information.
  • the wearing event and the coordinate information (first coordinate information or second coordinate information) corresponding to the wearing event may be added to the first queue as a record.
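The first queue described above, a fixed-size first-in first-out list of timestamped event records, can be sketched like this; the size limit and record shape are assumptions:

```python
from collections import deque

class EventQueue:
    """FIFO of timestamped (event, coordinates) records; when full,
    the oldest record is dropped first."""
    def __init__(self, maxsize=50):
        self.records = deque(maxlen=maxsize)

    def add(self, timestamp, event, coord):
        self.records.append({"ts": timestamp, "event": event, "coord": coord})

    def latest(self, event):
        """Most recent record for an event type, e.g. the last removal."""
        for rec in reversed(self.records):
            if rec["event"] == event:
                return rec
        return None
```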
  • If the mobile phone receives the removal event sent by the headset, it is determined that the user has removed the headset. If the mobile phone does not receive the removal event sent by the headset, it is determined that the user has not removed the headset.
  • If it is determined that the user has not taken off the earphone, step 510 may be performed cyclically. If it is determined that the user has taken off the earphone, step 511 may be performed.
  • When the mobile phone receives the removal event sent by the headset, the fourth coordinate information is recorded.
  • the fourth coordinate information may be carried in the removal event sent by the headset, including geographic coordinates collected by the headset when the removal event is detected.
  • the fourth coordinate information may also be determined by the mobile phone, including geographic coordinates collected when the mobile phone receives the removal event sent by the headset.
  • the second human location may refer to a geographic location with a second name, and the second name may be artificially designated.
  • the second name may be XX cafe, XX cafe, XX Library, XX Store in a shopping mall, and so on.
  • the second humanities position may be the same or different from the first humanities position.
  • the second name may be the same or different from the first name.
  • the third coordinate information or the fourth coordinate information has a corresponding second humanistic location, for example, XX cafe, XX cafe, XX library, XX store in a shopping mall, etc.
  • the second human location corresponds to/is associated with the removal event, which means that at the second human location, the earphone is in the taken-off state.
  • the corresponding human location may not be queried.
  • For steps 501-505, steps 509-511, and step 515, please refer to the related descriptions above, which are not repeated here.
  • When the user uses the headset outdoors, if the headset falls off during the user's outdoor exercise (running, biking, mountain climbing), the user can query the historical state of the headset and the corresponding geographic coordinates to quickly recall the location where the earphone was lost, which helps the user retrieve the earphone conveniently and quickly.
  • In response to the user's operation of clicking Smart Life 702 in the main interface 701, as shown in (b) of FIG. 7, the mobile phone can display the main interface 703 of the Smart Life APP, in which the Bluetooth headset can be managed (for example, the name of the Bluetooth headset can be XX3).
  • In response to the operation of the user clicking the icon 704 corresponding to the Bluetooth headset, as shown in FIG. 8, the mobile phone can display the management interface 710 of the Bluetooth headset XX3.
  • the management interface 710 may also include other management options, which are not limited in this application.
  • In response to the operation of the user clicking on the Honor headset 721 in the main interface 720, as shown in (b) of FIG. 9, the mobile phone may display the device management interface 723 of the Honor headset APP; the device management interface 723 may include a Bluetooth headset XX3.
  • the device management interface 723 may also include other Bluetooth headsets, which are not limited in this application.
  • in response to the user clicking on the display area 724 corresponding to the Bluetooth headset XX3, as shown in FIG. 8, the mobile phone can display the management interface 710 of the Bluetooth headset XX3.
  • the mobile phone may display a search headset interface 730, which may include the trajectories and states of the earphones (the left-ear earphone and the right-ear earphone).
  • the state of the earphones (the left earphone and the right earphone) may be the wearing state or the taking off state.
  • the wearing status may be represented by solid circular icons (eg, icon 733, icon 734, icon 735, icon 737, icon 738).
  • the off state may be represented by a hollow circular icon (eg, icon 736, icon 739a, icon 739b).
  • the wearing state or the taking off state can also be represented by icons filled with other shapes or colors, or can also be represented by prompt text, which is not limited in this application.
  • the track of the headset can be composed of multiple scenes. Each scene may correspond to a scene card, for example, scene card 740 may represent a sleep scene (or a rest scene).
  • the scene card 741 can represent a leisure scene or a reading scene; the scene card 742 can represent a "blank scene" (a "blank scene" refers to the case where the current scene cannot be queried from an application, for example, the Smart Life APP); and the scene card 743 may represent an entertainment scene.
  • the state of the headset is associated with the scene.
  • the icon 733 , the icon 734 , and the icon 735 may indicate that the right earphone 731 is in the wearing state in the scenes corresponding to the scene card 740 , the scene card 741 , and the scene card 742 respectively.
  • the icon 736 may indicate that the right ear earphone 731 is in the off state in the scene corresponding to the scene card 743 .
  • the icon 737 and the icon 738 may indicate that the left ear earphone 732 is in the wearing state in the scenes corresponding to the scene card 740 and the scene card 741 respectively.
  • the icon 739a and the icon 739b may indicate that the left ear earphone 732 is in the removed state in the scenes corresponding to the scene card 742 and the scene card 743 respectively.
  • the life scene record of the headset's use, as shown in Figure 10, will greatly help the user recall the scene in which the headset was last used, thereby helping the user narrow the search range and quickly get the headphones back.
  • the scene and state records can be maintained separately for the left earphone and the right earphone.
  • the scene and state records corresponding to each headset may form a data queue; each piece of data includes a scene and the state in that scene, and the state in the data at the end of the queue is usually the taken-off state.
  • the scene and state records of the left earphone and the right earphone during the last N times of use can be maintained separately.
  • N is an integer greater than or equal to 1.
  • the scene and state records when the left earphone and the right earphone are used in the last M time units can be maintained separately.
  • M is an integer greater than or equal to 1.
  • time units may include minutes, hours, days, months, and the like. It should be understood that a single use of the headset covers the entire process from the wearing event to the taking-off event.
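The per-earphone queue of scene/state records described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, record shape, and the bound of 50 records are all assumptions made for the sketch.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class UsageRecord:
    scene: str   # e.g. "sleep", "leisure", "blank", "entertainment"
    state: str   # "worn" or "taken_off"


class EarphoneHistory:
    """Scene/state records for one earphone, bounded to the last N
    records (N = 50 here is illustrative)."""

    def __init__(self, max_records: int = 50):
        self.records: deque = deque(maxlen=max_records)

    def add(self, scene: str, state: str) -> None:
        self.records.append(UsageRecord(scene, state))

    def last_taken_off_scene(self):
        # Walk backwards: the most recent taken-off record is the best
        # hint for where the earphone may have been left.
        for rec in reversed(self.records):
            if rec.state == "taken_off":
                return rec.scene
        return None


left = EarphoneHistory()
left.add("sleep", "worn")
left.add("leisure", "worn")
left.add("entertainment", "taken_off")
print(left.last_taken_off_scene())  # entertainment
```

Maintaining one such queue per earphone matches the text's note that the left-ear and right-ear records are kept separately, and that the record at the end of the queue is usually a taken-off state.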
  • the mobile phone can query the current scene from the Smart Life APP and record the current scene (eg, sleep scene 751) and state (eg, wearing state 755).
  • the scenarios in the Smart Life APP may be automatically determined by the mobile phone based on preset parameters (for example, time, electronic device data (for example, whether the electronic device is turned on), etc.), or may be manually selected by the user, which is not limited in this application .
  • if the mobile phone detects that the Smart Life APP has undergone a scene switch, the switched scene (eg, leisure scene 752) and state (eg, wearing state 756) may be recorded. Further, if the mobile phone receives the removal event sent by the right earphone, it can query the current scene from Smart Life and record the "blank scene" 753 and state (eg, off state 757). Optionally, the current time and/or the user's status may be recorded along with the "blank scene"; the state of the user may be obtained based on sensor data (eg, a gyroscope).
  • since the mobile phone cannot query the current scene when the user takes off the right earphone, it is difficult for the user to recall the lost location when later searching for the earphone. Therefore, to help the user recall the time and place where the right earphone was lost to the greatest extent, after the mobile phone records the "blank scene" at removal, if a scene switch is detected, it can further record the scene after the switch (for example, entertainment scene 754) and state (eg, off state 758).
  • after the mobile phone receives the wearing event of the left-ear earphone, it determines that the left-ear earphone is in the wearing state. At this time, the current scene can be queried from the Smart Life APP, and the current scene (eg, sleep scene 759) and state (eg, wearing state 762) can be recorded. It should be noted that the user may perform multiple wearing and taking-off actions in the same scene (for example, the user may repeatedly adjust the position of the headset when first wearing it), that is, the mobile phone can receive multiple wearing events and taking-off events.
  • the recorded scenes and states within a preset time interval (eg, 5 s) may be normalized, and only the last recorded state and scene within the preset time interval are retained.
  • for example, the wearing state 762 corresponding to the sleep scene 759, the taking-off state 763 corresponding to the sleep scene 760, and the wearing state 764 corresponding to the sleep scene 761 can be normalized, with only the last record (the wearing state 764 in the sleep scene 761) retained.
  • if the mobile phone detects that the Smart Life APP has switched scenes, the switched scene (eg, leisure scene 765) and status (eg, wearing status 768) can be recorded. If the mobile phone detects that the Smart Life APP has exited the leisure scene 765, but no new scene is queried, the "blank scene" 766 and the status (eg, wearing status 769) can be recorded. Further, if the mobile phone receives the removal event sent by the left earphone, it can query the current scene from Smart Life and record the current scene (eg, entertainment scene 767) and state (eg, removal state 770).
  • the mobile phone can display the latest scene and status of the headset by default, and the user can query more historical records (earlier scenes and status of the headset) by sliding down. Of course, users can return to the latest scene and status by sliding up.
  • the time corresponding to the scene may be displayed near each scene card (for example, the time when the scene is queried by the mobile phone).
  • the scenes and states of the headset at different times can be displayed to the user, and the user can recall the location where the headset may have been left according to the scene and state of the headset, which is helpful for the user to retrieve the headset conveniently and quickly.
  • the mobile phone can record the location of the headset and the corresponding state and display it to the user.
  • the location of the headset can include a named (human-readable) location and geographic coordinates.
  • the geographic coordinates can be converted into a visual map by a geodecoder (GeoDecoder) of the corresponding map provider. If the user clicks on the corresponding location card (eg, location card 780, location card 781, location card 782, location card 783), the mobile phone can open the map and display the location in the map.
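The conversion from raw coordinates to a displayable place name can be sketched as below. The geodecoder interface is an assumption: a real implementation would call the map provider's reverse-geocoding service, whose API is not specified in the patent.

```python
def describe_location(lat: float, lon: float, geodecoder) -> str:
    """Turn coordinates into a human-readable name via the map
    provider's geodecoder; fall back to plain coordinates when no
    named place is returned."""
    name = geodecoder(lat, lon)
    return name if name else f"({lat:.5f}, {lon:.5f})"


# Stand-in geodecoder for illustration only (not a real service):
fake = lambda lat, lon: "XX cafe" if (round(lat), round(lon)) == (40, 116) else None

print(describe_location(39.99, 116.31, fake))  # XX cafe
print(describe_location(0.0, 0.0, fake))       # (0.00000, 0.00000)
```

The fallback branch corresponds to the earlier note that in some cases no named location can be queried, in which case only coordinates (or a "blank" entry) are shown.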
  • the scene card and the location card can be distinguished by different elements (shape, text) or color.
  • the scene card may be square, and the location card may be circular, so that the user can more easily distinguish the going out scene (eg, outdoors) and the home scene, making the UI interface more intuitive.
  • an interface 790 as shown in FIG. 14 may be displayed; the interface 790 includes the location card 791 corresponding to the outdoor scene and a corresponding location card for the indoor (home) scene.
  • the mobile phone calculates a position 795 where the earphone may have been lost according to historical records (scene records or location records during use) and prompts the user.
  • the foregoing description takes the wearable device as an earphone as an example, but the embodiments of the present application do not limit the type of the wearable device.
  • the wearable device may also be a bracelet, a watch, glasses, and the like.
  • the electronic device may give more prompts to the user according to the usage scenario of the wearable device. For example, the user may be prompted not to wear a watch in a sports state or not to wear glasses in a reading state, etc.
  • associating and recording the wearing/taking off state of the wearable device with the geographic location or life scene can better help the user recall the lost location of the wearable device, facilitate the user to retrieve the wearable device, and improve the user experience.
  • an embodiment of the present application provides a method for finding a wearable device, which is applied to an electronic device, including:
  • the first location is used to indicate the geographic location of the electronic device or the wearable device, and the first scene is used to indicate the home state of the user.
  • the first scene may be the aforementioned first life scene, and the first location may be the position indicated by the aforementioned first coordinate information.
  • if it is determined that the electronic device is located in a preset residential area, the first scene is acquired; if it is determined that the electronic device is located in an area outside the preset residential area, the first location is acquired.
  • the preset living area may be a residence/home.
  • the electronic device acquires the first location from the wearable device, and the first location is used to indicate the geographic location of the wearable device.
  • the electronic device acquires the first location based on a network positioning technology, where the first location is used to indicate the geographic location of the electronic device.
  • the network positioning technologies include base station positioning technology, wireless fidelity (WiFi) positioning technology, and global positioning system (GPS) positioning technology.
  • the first location is indicated by geographic coordinates of the wearable device or electronic device.
  • the first location is indicated by a first name, which is determined according to geographic coordinates of the wearable device or electronic device.
  • the first scene may be queried from a second application; the second application includes multiple scenes.
  • the plurality of scenes includes at least one of a sleeping scene, a waking up scene, a theater scene, a dining scene, a leisure scene, a reading scene, an entertainment scene, a returning home scene, or a leaving home scene.
  • the electronic device determines whether to switch from the first scene to the second scene, the first scene being different from the second scene; if it is determined to switch from the first scene to the second scene, the association between the first event and the second scene is recorded.
  • the electronic device can also receive a second event; when the first event is a wearing event, the second event is a taking-off event; when the first event is a taking-off event, the second event is a wearing event;
  • the electronic device obtains the second location or the third scene, where the second location is used to indicate the geographic location of the electronic device or the wearable device and the third scene is used to indicate the user's at-home state; the association between the second event and the second location or the third scene is recorded.
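The event-handling branch described in these steps (record a life scene indoors, a geographic location outdoors) can be sketched as a single handler. All callables and names here are assumptions for illustration; the patent does not specify this interface.

```python
def on_device_event(event: str, in_home_area: bool, query_scene, query_location):
    """On a wearing/taking-off event, associate the event with either
    the current life scene (indoors, where positioning is imprecise)
    or the geographic location (outdoors), per the branching above.

    query_scene()    -> scene name or None (e.g. from a companion app)
    query_location() -> (lat, lon) from the device or network positioning
    """
    if in_home_area:
        # Fall back to a "blank scene" when no scene can be queried.
        context = ("scene", query_scene() or "blank")
    else:
        context = ("location", query_location())
    return (event, context)  # association to append to the history queue


# Taking off the earphone at home while no scene is queryable:
rec = on_device_event("taken_off", True, lambda: None, lambda: (39.99, 116.31))
print(rec)  # ('taken_off', ('scene', 'blank'))
```

The returned pair is exactly the "association relationship" the method records: one event, one context, appended to the per-earphone record queue.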
  • the first interface may be, for example, the interface 710 shown in FIG. 8.
  • the first option may be, for example, the find-earphones option 711.
  • the second interface includes first information and second information corresponding to the wearable device, the first information and the second information are associated, and the first information corresponds to The first location or the first scene, and the second information corresponds to the first event.
  • the wearable device when the wearable device is a Bluetooth headset, and the Bluetooth headset includes a left-ear headset and a right-ear headset, the second interface includes first information and second information respectively corresponding to the left-ear headset and the right-ear headset.
  • the interface 730 may include the first information corresponding to the left earphone (for example, the icon 737 and the icon 738 corresponding to the wearing state, and the icon 739a and the icon 739b corresponding to the taking-off state) and second information (eg, scene card 740, scene card 741, and scene card 742 corresponding to different scenes).
  • the interface 730 may also include first information corresponding to the right earphone (for example, the icon 733, the icon 734, and the icon 735 corresponding to the wearing state, and the icon 736 corresponding to the taking-off state) and second information (eg, scene card 740, scene card 741, and scene card 742 corresponding to different scenes).
  • the second information corresponding to the left earphone and the second information corresponding to the right earphone may be represented by the same scene card.
  • the first information corresponds to the first position
  • a visual map is displayed, and the first position is indicated in the visual map.
  • the wearing/taking-off state of the wearable device is associated and recorded with the geographic location or the life scene, which can better help the user recall the lost location of the wearable device, facilitating retrieval of the wearable device and improving the user experience.
  • the electronic device in the embodiment of FIG. 16 may be the mobile phone in the previous embodiment, and the wearable device may be a Bluetooth headset.
  • an embodiment of the present application provides a method for finding a wearable device (taking the wearable device as a Bluetooth headset, and the Bluetooth headset includes a left-ear headset and a right-ear headset as an example), which is applied to an electronic device, including:
  • the first location is used to indicate the geographic location of the electronic device or the left earphone, and the first scene is used to indicate the home state of the user.
  • the order of steps 1701-1703 and steps 1704-1706 is not limited in this application. Steps 1701-1703 may be executed first and then steps 1704-1706; or steps 1704-1706 may be executed first and then steps 1701-1703; or steps 1701-1703 and steps 1704-1706 may be performed simultaneously.
  • an electronic device may display an interface 710 (first interface), and the interface 710 includes an option 801 for finding the left earphone and an option 802 for finding the right earphone.
  • a second interface is displayed, and the second interface includes the first information and the second information corresponding to the left-ear earphone; the first information and the second information are associated, the first information corresponds to the first location or the first scene, and the second information corresponds to the first event.
  • in response to the user selecting the option 801 corresponding to the left earphone, as shown in (b) of FIG. 18, the mobile phone may display an interface 803 (second interface), in which the scene and state corresponding to the left earphone 804 can be displayed.
  • a third interface is displayed, and the third interface includes the third information and the fourth information corresponding to the right-ear earphone; the third information is associated with the fourth information, the third information corresponds to the second location or the second scene, and the fourth information corresponds to the second event.
  • in response to the user selecting the option 802 corresponding to the right earphone, as shown in (b) of FIG. 19, the mobile phone may display an interface 805 (third interface), in which the scene and state corresponding to the right earphone 806 can be displayed.
  • step 1708 and step 1709 may be executed alternatively, or may be executed separately at different times, which is not limited in this application.
  • the wearing/taking-off state of the earphone (left-ear earphone or right-ear earphone) is associated and recorded with the geographic location or life scene, which can better help the user recall the location where the earphone was left, facilitating retrieval of the wearable device and improving the user experience.
  • the electronic device in the embodiment of FIG. 17 may be the mobile phone in the foregoing embodiment, and the parts not described in detail in the embodiment of FIG. 17 may refer to the foregoing embodiment, and will not be repeated here.
  • the chip system includes at least one processor 2001 and at least one interface circuit 2002 .
  • the processor 2001 and the interface circuit 2002 may be interconnected by wires.
  • interface circuit 2002 may be used to receive signals from other devices (eg, memory of an electronic device).
  • the interface circuit 2002 may be used to send signals to other devices (eg, the processor 2001).
  • the interface circuit 2002 can read instructions stored in a memory in the electronic device and send the instructions to the processor 2001 .
  • the electronic device may be the electronic device 200 shown in FIG. 2, and the wearable device may be the wearable device 300 shown in FIG. 3.
  • when the processor 2001 executes the instructions, the electronic device or the wearable device can be caused to perform the steps in the above embodiments.
  • the chip system may also include other discrete devices, which are not specifically limited in this embodiment of the present application.
  • Embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium includes computer instructions. When the computer instructions run on an electronic device (eg, the electronic device 200 shown in FIG. 2) or a wearable device (eg, the wearable device 300 shown in FIG. 3), the electronic device 200 is caused to perform each function or step performed by the electronic device in the above method embodiments, or the wearable device 300 is caused to perform each function or step performed by the wearable device in the above method embodiments.
  • Embodiments of the present application further provide a computer program product, which, when the computer program product runs on a computer, enables the computer to perform each function or step performed by the electronic device in the foregoing method embodiments.
  • An embodiment of the present application further provides a processing apparatus. The processing apparatus can be divided into different logical units or modules according to functions, with each unit or module performing a different function, so that the processing apparatus executes the individual functions or steps performed by the electronic device or the wearable device in the above method embodiments.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the shown or discussed mutual coupling or direct coupling or communication connection may be through some interfaces, indirect coupling or communication connection of devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed to multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium.
  • the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions to cause a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
  • the aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or other media that can store program code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Human Computer Interaction (AREA)
  • Databases & Information Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Otolaryngology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of this application provide a method and apparatus for finding a wearable device, relating to the terminal field. The method includes: receiving a first event from the wearable device, the first event including a wearing event or a taking-off event; in response to receiving the first event, acquiring a first location or a first scene, where the first location is used to indicate the geographic location of the electronic device or the wearable device and the first scene is used to indicate the user's at-home state; recording the association between the first event and the first location or the first scene; in response to a first operation by the user on a first application, displaying a first interface, where the first interface includes a first option used to find the wearable device; and in response to an operation of the user selecting the first option, displaying a second interface, where the second interface includes first information and second information corresponding to the wearable device, the first information and the second information are associated, the first information corresponds to the first location or the first scene, and the second information corresponds to the first event.

Description

Method and Apparatus for Finding a Wearable Device
This application claims priority to Chinese Patent Application No. 202110448810.3, filed with the China National Intellectual Property Administration on April 25, 2021 and entitled "Method and Apparatus for Finding a Wearable Device", which is incorporated herein by reference in its entirety.
Technical Field
This application relates to the terminal field, and in particular, to a method and apparatus for finding a wearable device.
Background
A wearable device is a portable device that can be worn directly on the user's body or integrated into the user's clothes or accessories. Wearable devices may include Bluetooth headsets, smart bands, smart glasses, and the like. A Bluetooth headset (for example, a true wireless stereo (TWS) Bluetooth headset) is a hands-free headset that uses Bluetooth technology. Free from cables, users can use a TWS Bluetooth headset in various ways (for example, listening to music or making calls). Especially for mobile business users, a Bluetooth headset is a good tool for improving efficiency.
However, while improving the user experience, wearable devices (for example, TWS Bluetooth headsets) are also easy to lose and hard to find, which troubles users.
Summary
Embodiments of this application provide a method and apparatus for finding a wearable device, which can help the user find the wearable device conveniently and quickly and improve the user experience.
According to a first aspect, an embodiment of this application provides a method for finding a wearable device, applied to an electronic device, including: receiving a first event from the wearable device, the first event including a wearing event or a taking-off event; in response to receiving the first event, acquiring a first location or a first scene, where the first location is used to indicate the geographic location of the electronic device or the wearable device and the first scene is used to indicate the user's at-home state; recording the association between the first event and the first location or the first scene; in response to a first operation by the user on a first application, displaying a first interface, where the first interface includes a first option used to find the wearable device; and in response to an operation of the user selecting the first option, displaying a second interface, where the second interface includes first information and second information corresponding to the wearable device, the first information and the second information are associated, the first information corresponds to the first location or the first scene, and the second information corresponds to the first event.
Based on the method provided in the embodiments of this application, the wearing/taking-off state of the wearable device is associated and recorded with the geographic location or life scene, which can better help the user recall where the wearable device was left, facilitate retrieving the wearable device, and improve the user experience.
In a possible implementation, when the first information corresponds to the first location, the method further includes: in response to the user's operation on the first information, displaying a visual map and indicating the first location in the visual map. In this way, the user can find the first location in the visual map. If the wearable device was in the taken-off state at the first location and has been lost, it is very likely to have been left near the first location, which narrows the user's search range and helps the user retrieve the wearable device more quickly and conveniently.
In a possible implementation, acquiring the first location or the first scene by the electronic device includes: if it is determined that the electronic device is located in a preset residential area, acquiring the first scene; if it is determined that the electronic device is located in an area outside the preset residential area, acquiring the first location. The preset residential area may be a residence/home. Because accurate positioning is usually not possible indoors, the usage scene of the headset can be recorded based on the indoor scene mode. In this way, if the wearable device is later lost, the user can quickly and accurately recall where it was used based on its usage scene, facilitating retrieval. In an outdoor scene, the wearable device can be positioned by a positioning technology (for example, global positioning system (GPS) positioning technology or BeiDou positioning technology). In this way, if the wearable device is later lost, the user can quickly and accurately recall where it was used based on its positioning information, facilitating retrieval.
In a possible implementation, acquiring the first scene includes: querying the first scene from a second application; the second application includes multiple scenes, and the multiple scenes include at least one of a sleep scene, a getting-up scene, a theater scene, a dining scene, a leisure scene, a reading scene, an entertainment scene, a returning-home scene, or a leaving-home scene. The first scene may be one of these scenes. The second application may be the same as the first application (for example, both may be the Smart Life APP), or the second application may be different from the first application, which is not limited in this application.
In a possible implementation, the method further includes: determining whether a switch from the first scene to a second scene occurs, the first scene being different from the second scene; if it is determined that the switch from the first scene to the second scene occurs, recording the association between the first event and the second scene. Considering that the wearable device may pass through multiple life scenes while worn, the usage state of the wearable device and the scene after the switch can continue to be recorded after a scene switch, so that its usage can be recorded more accurately.
In a possible implementation, the method further includes: receiving, by the electronic device, a second event; when the first event is a wearing event, the second event is a taking-off event; when the first event is a taking-off event, the second event is a wearing event; acquiring, by the electronic device, a second location or a third scene, where the second location is used to indicate the geographic location of the electronic device or the wearable device and the third scene is used to indicate the user's at-home state; and recording the association between the second event and the second location or the third scene. Considering that the wearable device can switch between the worn and taken-off states, the usage state (worn or taken off) and the corresponding location or scene can continue to be recorded after the switch, so that the usage of the wearable device can be recorded more accurately, helping the user find the wearable device more conveniently and quickly later.
In a possible implementation, acquiring the first location includes: acquiring the first location from the wearable device, where the first location is used to indicate the geographic location of the wearable device; or acquiring the first location based on a network positioning technology, where the first location is used to indicate the geographic location of the electronic device; the network positioning technologies include base station positioning technology, wireless fidelity (WiFi) positioning technology, and global positioning system GPS positioning technology. That is, the geographic location collected by the wearable device may be used as the location where the user uses the wearable device, or the geographic location collected by the electronic device may be used as that location (because the wearable device is usually used together with the electronic device, the difference between their respectively collected locations is small).
In a possible implementation, the first location is indicated by the geographic coordinates of the wearable device or the electronic device, or the first location is indicated by a first name, where the first name is determined according to the geographic coordinates of the wearable device or the electronic device. For example, the first name may be XX café, XX coffee shop, XX library, an XX store in a shopping mall, or the like.
In a possible implementation, when the wearable device is a Bluetooth headset and the Bluetooth headset includes a left-ear earphone and a right-ear earphone, the second interface includes first information and second information respectively corresponding to the left-ear earphone and the right-ear earphone. For example, as shown in FIG. 10, the interface 730 (second interface) may include the first information corresponding to the left-ear earphone (for example, the icons 737 and 738 corresponding to the worn state, and the icons 739a and 739b corresponding to the taken-off state) and the second information (for example, the scene cards 740, 741, and 742 corresponding to different scenes); the interface 730 (second interface) may further include the first information corresponding to the right-ear earphone (for example, the icons 733, 734, and 735 corresponding to the worn state, and the icon 736 corresponding to the taken-off state) and the second information (for example, the scene cards 740, 741, and 742 corresponding to different scenes). The second information corresponding to the left-ear earphone and the second information corresponding to the right-ear earphone may be represented by the same scene card.
According to a second aspect, an embodiment of this application provides a method for finding a wearable device, applied to an electronic device, where the wearable device is a Bluetooth headset and the Bluetooth headset includes a left-ear earphone and a right-ear earphone, the method including: receiving a first event from the left-ear earphone, the first event including a wearing event or a taking-off event; in response to receiving the first event, acquiring a first location or a first scene, where the first location is used to indicate the geographic location of the electronic device or the left-ear earphone and the first scene is used to indicate the user's at-home state; recording the association between the first event and the first location or the first scene; receiving a second event from the right-ear earphone, the second event including a wearing event or a taking-off event; in response to receiving the second event, acquiring a second location or a second scene, where the second location is used to indicate the geographic location of the electronic device or the right-ear earphone and the second scene is used to indicate the user's at-home state; recording the association between the second event and the second location or the second scene; in response to a first operation by the user on a first application, displaying a first interface, where the first interface includes an option for finding the left-ear earphone and an option for finding the right-ear earphone; in response to the user selecting the option corresponding to the left-ear earphone, displaying a second interface, where the second interface includes first information and second information corresponding to the left-ear earphone, the first information and the second information are associated, the first information corresponds to the first location or the first scene, and the second information corresponds to the first event; or, in response to the user selecting the option corresponding to the right-ear earphone, displaying a third interface, where the third interface includes third information and fourth information corresponding to the right-ear earphone, the third information and the fourth information are associated, the third information corresponds to the second location or the second scene, and the fourth information corresponds to the second event.
Based on the method provided in the embodiments of this application, the worn/taken-off state of the earphone (left-ear earphone or right-ear earphone) is associated and recorded with the geographic location or life scene, which can better help the user recall where the earphone was left, facilitate retrieving the wearable device, and improve the user experience.
According to a third aspect, an embodiment of this application provides a method for finding a wearable device, applied to a system composed of an electronic device and a wearable device, including: when the wearable device detects a first event, sending the first event to the electronic device; receiving, by the electronic device, the first event from the wearable device, the first event including a wearing event or a taking-off event; in response to receiving the first event, acquiring, by the electronic device, a first location or a first scene, where the first location is used to indicate the geographic location of the electronic device or the wearable device and the first scene is used to indicate the user's at-home state; recording, by the electronic device, the association between the first event and the first location or the first scene; in response to a first operation by the user on a first application, displaying, by the electronic device, a first interface, where the first interface includes a first option used to find the wearable device; and in response to an operation of the user selecting the first option, displaying, by the electronic device, a second interface, where the second interface includes first information and second information corresponding to the wearable device, the first information and the second information are associated, the first information corresponds to the first location or the first scene, and the second information corresponds to the first event.
Based on the method provided in the embodiments of this application, the wearing/taking-off state of the wearable device is associated and recorded with the geographic location or life scene, which can better help the user recall where the wearable device was left, facilitate retrieving the wearable device, and improve the user experience.
According to a fourth aspect, this application provides a chip system, including one or more interface circuits and one or more processors, where the interface circuits and the processors are interconnected by wires.
The chip system may be applied to an electronic device including a communication module and a memory. The interface circuit is configured to receive a signal from the memory of the electronic device and send the received signal to the processor, the signal including computer instructions stored in the memory. When the processor executes the computer instructions, the electronic device can perform the method described in the first aspect, the second aspect, or the third aspect or any possible design thereof.
According to a fifth aspect, this application provides a computer-readable storage medium, including computer instructions. When the computer instructions run on an electronic device (such as a mobile phone), the electronic device is caused to perform the method described in the first aspect, the second aspect, or the third aspect or any possible design thereof.
According to a sixth aspect, this application provides a computer program product. When the computer program product runs on a computer, the computer is caused to perform the method described in the first aspect, the second aspect, or the third aspect or any possible design thereof.
According to a seventh aspect, an embodiment of this application provides an apparatus for finding a wearable device, including a processor coupled with a memory, where the memory stores program instructions; when the program instructions stored in the memory are executed by the processor, the apparatus implements the method described in the first aspect, the second aspect, or the third aspect or any possible design thereof. The apparatus may be an electronic device, or may be a component of an electronic device, such as a chip.
According to an eighth aspect, an embodiment of this application provides an apparatus for finding a wearable device. The apparatus may be divided into different logical units or modules according to functions, with each unit or module performing a different function, so that the apparatus performs the method described in the first aspect, the second aspect, or the third aspect or any possible design thereof.
According to a ninth aspect, an embodiment of this application provides a system for finding a wearable device, including an electronic device and a wearable device, where the electronic device and the wearable device each perform some of the steps and cooperate with each other to implement the method described in the first aspect, the second aspect, or the third aspect or any possible design thereof.
It can be understood that, for the beneficial effects achievable by the chip system of the fourth aspect, the computer-readable storage medium of the fifth aspect, the computer program product of the sixth aspect, the apparatuses of the seventh and eighth aspects, and the system of the ninth aspect, reference may be made to the beneficial effects in the first aspect, the second aspect, or the third aspect and any possible design thereof; details are not repeated here.
Brief Description of Drawings
FIG. 1 is a schematic diagram of a system architecture according to an embodiment of this application;
FIG. 2 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of this application;
FIG. 3 is a schematic diagram of the hardware structure of a wearable device according to an embodiment of this application;
FIG. 4 is a flowchart according to an embodiment of this application;
FIG. 5 is another flowchart according to an embodiment of this application;
FIG. 6 is another flowchart according to an embodiment of this application;
FIG. 7 is a schematic display diagram according to an embodiment of this application;
FIG. 8 is another schematic display diagram according to an embodiment of this application;
FIG. 9 is another schematic display diagram according to an embodiment of this application;
FIG. 10 is another schematic display diagram according to an embodiment of this application;
FIG. 11 is a schematic diagram of scenes and states maintained for an earphone according to an embodiment of this application;
FIG. 12 is another schematic display diagram according to an embodiment of this application;
FIG. 13 is another schematic display diagram according to an embodiment of this application;
FIG. 14 is another schematic display diagram according to an embodiment of this application;
FIG. 15 is another schematic display diagram according to an embodiment of this application;
FIG. 16 is another flowchart according to an embodiment of this application;
FIG. 17 is another flowchart according to an embodiment of this application;
FIG. 18 is another schematic display diagram according to an embodiment of this application;
FIG. 19 is another schematic display diagram according to an embodiment of this application;
FIG. 20 is a schematic structural diagram of a chip system according to an embodiment of this application.
Detailed Description
FIG. 1 is a schematic architecture diagram of a communication system according to an embodiment of this application. The communication system architecture may include a wearable device 100 and an electronic device 200. The wearable device 100 and the electronic device 200 can communicate via Bluetooth.
As shown in FIG. 2, this embodiment uses the electronic device 200 (such as a mobile phone) as an example to describe the structure of the electronic device provided in the embodiments of this application. The electronic device 200 (such as a mobile phone) may include: a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, a headset jack 270D, a sensor module 280, a button 290, a motor 291, an indicator 292, a camera 293, a display screen 294, a subscriber identification module (SIM) card interface 295, and the like.
The sensor module 280 may include sensors such as a pressure sensor, a gyroscope sensor, a barometric pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, and a bone conduction sensor.
It can be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 200. In other embodiments, the electronic device 200 may include more or fewer components than shown, combine some components, split some components, or have a different component arrangement. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU). Different processing units may be independent devices or may be integrated into one or more processors.
The controller may be the nerve center and command center of the electronic device 200. The controller can generate operation control signals according to instruction operation codes and timing signals to control instruction fetching and execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. This memory can store instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instructions or data again, it can call them directly from the memory, which avoids repeated accesses, reduces the waiting time of the processor 210, and thus improves system efficiency.
In some embodiments, the processor 210 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment are merely schematic and do not constitute a structural limitation on the electronic device 200. In other embodiments, the electronic device 200 may also use interface connection manners different from those in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 240 is configured to receive charging input from a charger, which may be a wireless charger or a wired charger. While charging the battery 242, the charging management module 240 can also supply power to the electronic device through the power management module 241.
The power management module 241 is configured to connect the battery 242, the charging management module 240, and the processor 210. The power management module 241 receives input from the battery 242 and/or the charging management module 240 and supplies power to the processor 210, the internal memory 221, the external memory, the display screen 294, the camera 293, the wireless communication module 260, and the like. In some embodiments, the power management module 241 and the charging management module 240 may also be disposed in the same device.
The wireless communication function of the electronic device 200 may be implemented through the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, the modem processor, the baseband processor, and the like. In some embodiments, the antenna 1 of the electronic device 200 is coupled to the mobile communication module 250 and the antenna 2 is coupled to the wireless communication module 260, so that the electronic device 200 can communicate with networks and other devices through wireless communication technologies.
The antenna 1 and the antenna 2 are configured to transmit and receive electromagnetic wave signals. Each antenna in the electronic device 200 can be used to cover one or more communication frequency bands. Different antennas can also be multiplexed to improve antenna utilization. For example, the antenna 1 can be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, an antenna may be used in combination with a tuning switch.
The mobile communication module 250 can provide wireless communication solutions applied to the electronic device 200, including 2G/3G/4G/5G. The mobile communication module 250 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), and the like. The mobile communication module 250 can receive electromagnetic waves through the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation.
The mobile communication module 250 can also amplify signals modulated by the modem processor and convert them into electromagnetic waves for radiation through the antenna 1. In some embodiments, at least some functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some functional modules of the mobile communication module 250 may be disposed in the same device as at least some modules of the processor 210.
The wireless communication module 260 can provide wireless communication solutions applied to the electronic device 200, including WLAN (such as a wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR).
The GNSS may include the BeiDou navigation satellite system (BDS), GPS, the global navigation satellite system (GLONASS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to the processor 210. The wireless communication module 260 can also receive signals to be sent from the processor 210, perform frequency modulation and amplification on them, and convert them into electromagnetic waves for radiation through the antenna 2.
电子设备200通过GPU,显示屏294,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏294和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器210可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏294用于显示图像,视频等。该显示屏294包括显示面板。
电子设备200可以通过ISP,摄像头293,视频编解码器,GPU,显示屏294以及应用处理器等实现拍摄功能。ISP用于处理摄像头293反馈的数据。摄像头293用于捕获静态图像或视频。在一些实施例中,电子设备200可以包括1个或N个摄像头293,N为大于1的正整数。
外部存储器接口220可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备200的存储能力。外部存储卡通过外部存储器接口220与处理器210通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器221可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。处理器210通过运行存储在内部存储器221的指令,从而执行电子设备200的各种功能应用以及数据处理。例如,在本申请实施例中,处理器210可以通过执行存储在内部存储器221中的指令,执行本申请实施例提供的查找可穿戴设备的方法。内部存储器221可以包括存储程序区和存储数据区。
其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储电子设备200使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器221可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
电子设备200可以通过音频模块270,扬声器270A,受话器270B,麦克风270C,耳机接口270D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
按键290包括开机键,音量键等。按键290可以是机械按键。也可以是触摸式按键。马达291可以产生振动提示。马达291可以用于来电振动提示,也可以用于触摸振动反馈。指示器292可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。SIM卡接口295用于连接SIM卡。SIM卡可以通过插入SIM卡接口295,或从SIM卡接口295拔出,实现和电子设备200的接触和分离。电子设备200可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口295可以支持Nano SIM卡,Micro SIM卡,SIM卡等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备200的具体限定。在本申请另一些实施例中,电子设备200可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
示例性的,电子设备200可以是手机、智能遥控器、掌上电脑、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备、便携式多媒体播放器(portable multimedia player,PMP)、媒体播放器等类型的设备。本申请实施例对电子设备的具体类型不作任何限制。
如图3所示,可穿戴设备100可包括无线通信模块310,定位模块320,处理器330,内部存储器340,电源管理模块350,电池360,充电管理模块370,天线3等。
其中,处理器330可以包括一个或多个处理单元。例如:处理器330可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network  processing unit,NPU)等。
在一些实施例中,处理器330可以包括一个或多个接口。接口可以包括I2C接口,I2S接口,PCM接口,UART接口,MIPI,GPIO接口,SIM卡接口,和/或USB接口等。
可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对可穿戴设备100的结构限定。在本申请另一些实施例中,可穿戴设备100也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
充电管理模块370用于从充电器接收充电输入。其中,充电器可以是无线充电器,也可以是有线充电器。
电源管理模块350用于连接电池360,充电管理模块370与处理器330。电源管理模块350接收电池360和/或充电管理模块370的输入,为处理器330,内部存储器340和无线通信模块310等供电。电源管理模块350还可以用于监测电池容量,电池循环次数,电池健康状态(漏电,阻抗)等参数。
可穿戴设备100的无线通信功能可以通过天线3以及无线通信模块310、定位模块320等实现。
定位模块320可以提供应用在可穿戴设备100上的定位技术。定位技术可以包括基于北斗卫星导航系统(beidou navigation satellite system,BDS),全球定位系统(global positioning system,GPS),格洛纳斯卫星导航系统(global navigation satellite system,GLONASS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based augmentation systems,SBAS)等系统的定位技术。
定位模块320可以与无线通信模块310集成在一起或者分开设置,本申请不做限定。
内部存储器340可以用于存储一个或多个计算机程序,该一个或多个计算机程序包括指令。
可穿戴设备100上还可以设置光学传感器(例如,红外测温传感器)、运动传感器(例如,加速度传感器、陀螺仪等)和电容式传感器等。
当可穿戴设备为蓝牙耳机时,基于上述任一种或多种传感器,蓝牙耳机可以进行佩戴摘下检测,以确定蓝牙耳机处于佩戴状态还是摘下(取下)状态。
例如,当蓝牙耳机上设置有红外测温传感器和加速度传感器时,可以通过红外测温传感器感受预设时间内温度的变化,根据加速度传感器得到预设时间内是否发生佩戴动作,结合这两方面可以确定出蓝牙耳机是处于佩戴状态还是摘下状态。又例如,当蓝牙耳机上设置有电容式传感器时,通过耳机戴上和摘下过程中电容式传感器的电容值的变化可以判定蓝牙耳机是处于佩戴状态还是摘下状态。
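上述"温度变化+佩戴动作"相结合的判定思路可以用如下示意代码表示;其中温度阈值 `temp_rise_threshold` 等参数均为假设的示例值,并非专利限定的实现:

```python
def is_worn(temp_readings, wear_motion_detected, temp_rise_threshold=1.5):
    """判断蓝牙耳机佩戴状态的示意逻辑(假设性实现)。

    temp_readings: 预设时间内红外测温传感器采集的温度序列(摄氏度)
    wear_motion_detected: 加速度传感器在预设时间内是否检测到佩戴动作
    """
    if len(temp_readings) < 2:
        return False
    # 红外测温:佩戴后耳道附近温度通常上升
    temp_rise = temp_readings[-1] - temp_readings[0]
    # 两方面结合:温度上升且检测到佩戴动作,判定为佩戴状态
    return temp_rise >= temp_rise_threshold and wear_motion_detected
```

实际产品中阈值与时间窗需要结合传感器特性标定,此处仅用于说明"结合两方面信息"的判定结构。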
应该理解的是,蓝牙耳机处于佩戴状态时,蓝牙耳机佩戴在用户的耳朵上。蓝牙耳机处于摘下状态时,蓝牙耳机可以是存放在耳机盒中,也可以是放置在桌子上,或者遗落在地板上、沙发上等,本申请不做限定。
可选的,蓝牙耳机还可以包括扬声器和麦克风。扬声器可将音频播放到用户的耳部。麦克风可采集音频数据诸如正在进行电话呼叫的用户的语音和/或环境噪声信息(例如,用于噪声消除)。
在蓝牙耳机的操作期间,可根据加速度计检测到的轻击操作(例如,单击、双击、三击等)控制蓝牙耳机的状态。响应于控制蓝牙耳机的操作,蓝牙耳机可以调节音频播放(例如, 当检测到单击操作时,可以开始或停止播放音乐)。进一步的,响应于控制蓝牙耳机的操作,可以控制蓝牙耳机的电源管理功能。例如,检测到单击操作时,停止播放音乐,并且在不用于为用户播放音乐时将蓝牙耳机置于低功率睡眠模式。
可以理解的是,本申请实施例示意的结构并不构成对可穿戴设备100的具体限定。在本申请另一些实施例中,可穿戴设备100可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
可穿戴设备100例如可以包括蓝牙耳机、智能手环、智能手表或者智能眼镜等,本申请不做限定。
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行描述。其中,在本申请的描述中,除非另有说明,“至少一个”是指一个或多个,“多个”是指两个或多于两个。另外,为了便于清楚描述本申请实施例的技术方案,在本申请的实施例中,采用了“第一”、“第二”等字样对功能和作用基本相同的相同项或相似项进行区分。本领域技术人员可以理解“第一”、“第二”等字样并不对数量和执行次序进行限定,并且“第一”、“第二”等字样也并不限定一定不同。
为了便于理解,以下结合附图对本申请实施例提供的查找可穿戴设备的方法进行具体介绍。
如图4所示,本申请实施例提供一种查找可穿戴设备的方法,以可穿戴设备为耳机(例如,蓝牙耳机),电子设备为手机为例进行说明,包括:
401、手机接收耳机发送的佩戴事件。
手机与耳机(例如,蓝牙耳机)可以进行蓝牙连接,从而基于蓝牙链路进行通信。其中,蓝牙连接可以包括经典蓝牙(basic rate/enhanced data rate,BR/EDR)和低功耗蓝牙(bluetooth low energy,BLE)等。
耳机检测到佩戴事件后,可以向手机发送该佩戴事件。佩戴事件中可以携带耳机的标识(identity,ID)和佩戴状态。
耳机可以包括左耳耳机(位于左耳的无线耳机设备)或右耳耳机(位于右耳的无线耳机设备)。可以将左耳耳机或右耳耳机中的一个作为主耳机,另一个作为从耳机。
主耳机和从耳机可基于无线通信的相应质量来选择。例如,如果左耳耳机与手机之间的通信链路比右耳耳机与手机之间的通信链路更好(例如,更可靠、更快、或功耗更低),则左耳耳机可以作为主耳机,右耳耳机可以作为从耳机。当然,也可以使用其他方法确定主耳机和从耳机,本申请不做限定。
主耳机可以与手机通信(例如,主耳机可以向手机发送自身的佩戴事件)。从耳机可以与主耳机进行通信(例如,从耳机可以向主耳机发送自身的佩戴事件)。主耳机可以作为从耳机与手机通信的媒介(例如,主耳机可以向手机发送从耳机的佩戴事件)。或者,主从耳机都可以与手机进行通信,本申请不做限定。
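按链路质量选择主耳机的过程可以粗略示意如下;其中以 RSSI 与丢包率加权的评分方式为假设,仅用于说明"链路更好者为主耳机"的选择逻辑:

```python
def select_primary(left_link, right_link):
    """根据链路质量选择主耳机(示意;评分方式为假设)。

    left_link / right_link: 形如 {"rssi": dBm, "loss": 丢包率} 的链路质量描述
    返回 "left" 或 "right",表示被选为主耳机的一侧。
    """
    def score(link):
        # RSSI 越高、丢包率越低,链路越好(简化的加权评分)
        return link["rssi"] - 100 * link["loss"]

    return "left" if score(left_link) >= score(right_link) else "right"
```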
手机接收到耳机发送的佩戴事件后,可以查询手机当前的地理位置,若确定手机位于住所内/家中(住所/家的位置可以是用户设置的或者是手机根据预设算法确定的),确定当前为居家场景,可以执行步骤402。
示例性的,若手机打开了定位功能(例如,GPS功能或北斗功能),可以根据定位信息确定手机是否位于家中。或者,若手机的应用程序(例如,智慧生活APP)中的相应模式(例如,“回家模式”)被打开,可以确定手机位于家中。或者,若手机连接家中的WIFI,可以确定手机位于家中。
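上述三种判断方式(定位信息、"回家模式"、家中WIFI)可以组合成如下示意逻辑;函数名、参数与半径阈值均为示例假设:

```python
import math

def is_at_home(gps=None, home_gps=None, home_mode_on=False,
               wifi_ssid=None, home_wifi=None, radius_m=100):
    """综合定位、应用模式和 WIFI 判断是否处于居家场景(示意)。"""
    if home_mode_on:
        # 应用程序中的"回家模式"已打开
        return True
    if wifi_ssid is not None and wifi_ssid == home_wifi:
        # 手机连接了家中的 WIFI
        return True
    if gps and home_gps:
        # 定位信息落在住所附近(radius_m 为假设的判定半径)
        return _haversine_m(gps, home_gps) <= radius_m
    return False

def _haversine_m(p1, p2):
    # 计算两个(纬度, 经度)坐标间的大圆距离(米)
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = math.sin((lat2 - lat1) / 2) ** 2 + \
        math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371000 * math.asin(math.sqrt(a))
```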
402、手机确定当前为居家场景。
由于在室内通常无法进行精准定位,此时可以基于应用程序(例如,智慧生活APP)提供的场景模式,记录耳机的使用场景。这样后续耳机丢失时,用户可以基于耳机的使用场景快速准确地回忆起耳机的使用地点,方便用户找回耳机。
403、手机查询第一生活场景。
手机接收到耳机发送的佩戴事件后,可以从应用程序中查询当前的生活场景(即第一生活场景)。
其中,应用程序(例如,智慧生活APP)中可以预设多个不同的生活场景。不同生活场景是为了满足用户生活中的不同需求,借助一系列智能家居设备所打造出来的各种可调整、灵活性强、支持多场景应用的家居模式,是一系列家居功能的组合。为用户提供不同的生活场景,可以帮助用户减少繁复的设备操作,节省生活成本。
生活场景可以包括睡眠场景、起床场景、影院场景、用餐场景(例如,早餐场景、午餐场景、晚餐场景)、休闲场景、阅读场景、娱乐场景、回家场景或离家场景中的至少一个。本申请实施例中,场景也可以替换为情景、模式等,本申请不做限定。
示例性的,起床场景下:早晨7点,卧室窗帘可以缓缓打开,音箱可以播放轻柔的背景音乐,面包机可以自启动烘焙面包,咖啡机开始研磨咖啡,卫生间取暖设备、热水器开始工作。
娱乐场景/影院场景下:客厅灯光自动变暗,电视、音响设备打开,用户选择自己喜欢的电影开始欣赏。
休闲场景/阅读场景下:灯光变柔变弱,音箱可以开始播放轻音乐,电视机关闭。用户可以在舒适优雅的环境之下开始阅读。
离家场景下:灯光全部关闭、不需待机的设备断电(例如,空调、音箱等)。安防系统启动:门磁、红外人体感应器,烟雾报警器工作,用户可以放心出门。
当然,上述各个场景可以有其他称呼,本申请不做限定。
上述各个场景可以通过用户手动触发。例如,用户可以即时触发(例如,在手机的应用程序上直接选中某场景),也可以自定义在预设时间触发(例如,用户可以设定7点自动触发起床场景)。各个场景也可以由智能家居设备的操作触发。例如,检测到卧室窗帘打开、咖啡机开启,手机可以主动触发起床场景。
404、手机确定是否查询到第一生活场景。
示例性的,手机可以调用预设的应用程序编程接口(application programming interface,API),从智慧生活APP查询当前的生活场景(即第一生活场景)。
405、若查询到第一生活场景,记录第一生活场景和佩戴事件。
应该理解的是,第一生活场景和佩戴事件是相对应/关联的,表示在第一生活场景下,耳机为佩戴状态。
例如,若查询到第一生活场景为起床场景,记录起床场景和佩戴事件,表示耳机在起床场景下为佩戴状态。
406、若未查询到第一生活场景,记录当前时间/运动状态和佩戴事件。
例如,若智慧生活APP是关闭状态,或者智慧生活APP中未运行生活场景,则无法查询到结果。此时可以记录耳机在当前时间/运动状态下为佩戴状态。其中,当前时间可以是手机接收到耳机发送的佩戴事件的时间或者耳机发送佩戴事件的时间,当前运动状态用于指示用户当前的运动状态(例如,静止状态,步行状态)。
407、手机确定用户是否摘下耳机。
若手机接收到耳机发送的摘下事件,确定用户摘下耳机。若手机未接收到耳机发送的摘下事件,确定用户未摘下耳机。
若确定用户未摘下耳机,可以执行步骤408。若确定用户摘下耳机,可以执行步骤410。
408、若确定用户未摘下耳机,确定是否发生场景切换。
考虑到耳机在佩戴状态可以经历多个生活场景的变化,为了更准确地记录耳机的使用情况,可以在发生场景切换时继续记录耳机的使用状态和切换后的场景。
手机的智慧生活APP可根据家居设备的动作完成生活场景的转换。例如,在娱乐场景/影院场景下,检测到电视关闭,灯光打开,可以切换到休闲场景/阅读场景。又例如,用户可以手动在智慧生活APP上切换生活场景,或者,智慧生活APP可以在用户自定义的时间自动进行场景切换。例如,晚上11点,从娱乐场景自动切换为睡眠场景。
若确定未发生场景切换,可以返回执行步骤407。若确定发生场景切换,可以执行步骤409。
409、若确定发生场景切换,记录第二生活场景和佩戴事件。
在用户未摘下耳机的情况下,手机确定发生场景切换后,可以从应用程序中查询切换后的生活场景(即第二生活场景),并记录第二生活场景与佩戴事件的关联关系。第二生活场景与佩戴事件的关联关系表示在第二生活场景下,耳机为佩戴状态。
410、若确定用户摘下耳机,查询第三生活场景。
手机接收到耳机发送的摘下事件后,可以从应用程序(例如,智慧生活APP)中查询当前的生活场景(即第三生活场景)。
411、确定是否查询到第三生活场景。
412、若查询到第三生活场景,记录第三生活场景和摘下事件。
在用户摘下耳机的情况下,手机可以从应用程序中查询当前的生活场景(即第三生活场景),并记录第三生活场景与摘下事件的关联关系。第三生活场景与摘下事件的关联关系表示在第三生活场景下,耳机为摘下状态。
413、若未查询到第三生活场景,记录当前时间/运动状态和摘下事件。
相关描述可以参考步骤406,在此不做赘述。
可选的,步骤413之后,还可以执行步骤414和步骤415。
414、确定是否发生场景切换。
即确定电子设备是否从第三生活场景切换到其他生活场景(例如,第四生活场景)。
场景切换的描述可以参考步骤408,在此不做赘述。
415、若确定发生场景切换,记录第四生活场景和摘下事件。
在用户摘下耳机的情况下,手机确定发生场景切换后,可以从应用程序中查询切换后的生活场景(即第四生活场景),记录第四生活场景和摘下事件的关联关系,结束流程。
若确定未发生场景切换,可以循环执行步骤414。
基于本申请实施例提供的方法,当用户居家使用耳机时,若耳机不慎脱落,可以通过查询耳机的历史状态及相对应的场景,迅速回忆起耳机的遗落位置,有助于用户方便快速地找回耳机。
如图5所示,本申请实施例提供一种查找可穿戴设备的方法,以可穿戴设备为耳机(耳机可以是指左耳耳机或右耳耳机),电子设备为手机为例,包括:
501、手机接收耳机发送的佩戴事件。
手机接收耳机发送的佩戴事件的相关描述可以参考步骤401,在此不做赘述。
需要说明的是,手机接收到耳机发送的佩戴事件后,可以查询手机当前的地理位置,若确定手机位于住所/家以外的区域,确定当前为外出场景,可以执行步骤502。
示例性的,若手机打开了定位功能(例如,GPS功能或北斗功能),可以根据定位信息确定手机位于住所/家之外的区域,从而确定当前为外出场景。或者,若手机的应用程序(例如,智慧生活APP)中的相应模式(例如,“离家模式”)被打开,可以确定当前为外出场景。或者,若手机连接住所/家之外的区域的WIFI,可以确定当前为外出场景。
502、手机确定当前为外出场景。
在外出场景下,可以通过定位技术(例如,GPS定位技术、北斗定位技术等)对耳机进行定位。这样后续耳机丢失时,用户可以基于耳机的定位信息快速准确地回忆起耳机的使用地点,方便用户找回耳机。
503、手机确定佩戴事件是否携带第一坐标信息。
第一坐标信息可以包括耳机在检测到佩戴事件时采集到的地理坐标,该地理坐标用于表征耳机在检测到佩戴事件时的地理位置。例如,若耳机开启GPS功能,耳机上报的佩戴事件中可以包括GPS坐标(即第一坐标信息)。
504、若佩戴事件携带第一坐标信息,记录第一坐标信息。
佩戴事件与第一坐标信息是相关联的。可以认为在第一坐标信息指示的地理位置发生了佩戴事件,即用户在第一坐标信息指示的地理位置佩戴了耳机。
505、若佩戴事件未携带第一坐标信息,记录第二坐标信息。
第二坐标信息可以包括手机接收到佩戴事件时手机采集的地理坐标。
手机可以基于网络定位技术采集地理坐标。其中,网络定位技术可以包括基站定位技术(以蜂窝移动通信技术为基础的定位技术)、WiFi定位技术(原理是通过定位软件侦测WiFi的ID(路由器地址),然后在其WiFi位置数据库和地图数据库的配合下完成定位)、GPS定位技术。
506、根据第一坐标信息或第二坐标信息查询第一人文位置。
手机可以根据第一坐标信息或第二坐标信息查询地图数据库,确定是否可以查询到第一人文位置。第一人文位置可以是指具有第一名称的地理位置,第一名称可以是人为指定的。例如,第一名称可以是XX咖啡馆、XX咖啡厅,XX图书馆,商场的XX店铺等。
在一些实施例中,若耳机不具有定位功能,可以不执行步骤503,直接执行步骤505。并且,在步骤506中,可以直接根据第二坐标信息查询第一人文位置。
507、确定是否查询到第一人文位置。
即确定第一坐标信息或第二坐标信息是否有对应的第一人文位置,例如,XX咖啡馆、XX咖啡厅,XX图书馆,商场的XX店铺等。
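根据坐标查询人文位置的过程可以粗略示意如下;其中 `poi_db` 的数据结构与坐标容差 `tol_deg` 均为假设,实际实现通常依赖地图数据库提供的逆地理编码服务:

```python
def lookup_poi(coord, poi_db, tol_deg=0.0005):
    """根据地理坐标在地图数据中查询人文位置(示意)。

    coord: (纬度, 经度)
    poi_db: [{"name": "XX咖啡馆", "coord": (纬度, 经度)}, ...](假设的数据结构)
    返回匹配到的人文位置名称;查询不到时返回 None,
    对应"未查询到人文位置则记录坐标信息"的分支。
    """
    lat, lon = coord
    for poi in poi_db:
        plat, plon = poi["coord"]
        if abs(lat - plat) <= tol_deg and abs(lon - plon) <= tol_deg:
            return poi["name"]
    return None
```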
508、若查询到第一人文位置,记录第一人文位置和佩戴事件。
应该理解的是,第一人文位置与佩戴事件具有对应/关联关系,表示在第一人文位置,耳机处于佩戴状态。
若可以查询到第一人文位置,则可以将佩戴事件和第一人文位置作为一条记录(也可以称为一个元素)加入(存入/压入)第一队列。第一队列是一种先进先出的线性表。第一队列的大小可以是预设的。
可选的,第一队列中的每一条记录都配置有时间戳。时间戳可以指示一条记录产生的时间。
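第一队列(先进先出、大小预设、每条记录带时间戳)的一个最小示意实现如下;类名与字段名均为示例假设:

```python
from collections import deque
import time

class UsageLog:
    """耳机使用记录队列:先进先出,大小预设(示意实现)。"""

    def __init__(self, maxlen=10):
        # 队列满时自动淘汰最早的记录,实现先进先出
        self.queue = deque(maxlen=maxlen)

    def add(self, event, location, timestamp=None):
        # 每条记录包含事件(佩戴/摘下)、位置(人文位置或坐标)和时间戳
        self.queue.append({
            "event": event,
            "location": location,
            "timestamp": timestamp if timestamp is not None else time.time(),
        })

    def latest(self):
        # 返回最近一条记录;队列为空时返回 None
        return self.queue[-1] if self.queue else None
```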
509、若未查询到第一人文位置,记录第三坐标信息和佩戴事件。
第三坐标信息可以是第一坐标信息或第二坐标信息。
即若未查询到第一人文位置,可以将佩戴事件和该佩戴事件对应的坐标信息(第一坐标信息或第二坐标信息)作为一条记录加入第一队列。
510、确定用户是否摘下耳机。
若手机接收到耳机发送的摘下事件,确定用户摘下耳机。若手机未接收到耳机发送的摘下事件,确定用户未摘下耳机。
若确定用户未摘下耳机,可以循环执行步骤510。若确定用户摘下耳机,可以执行步骤511。
511、若确定用户摘下耳机,记录第四坐标信息。
即若手机接收到耳机发送的摘下事件,记录第四坐标信息。
其中,第四坐标信息可以携带在耳机发送的摘下事件中,包括耳机在检测到摘下事件时采集到的地理坐标。第四坐标信息也可以是手机确定的,包括手机接收到耳机发送的摘下事件时采集到的地理坐标。
512、根据第四坐标信息查询第二人文位置。
第二人文位置可以是指具有第二名称的地理位置,第二名称可以是人为指定的。例如,第二名称可以是XX咖啡馆、XX咖啡厅,XX图书馆,商场的XX店铺等。第二人文位置与第一人文位置可以相同或不同。第二名称与第一名称可以相同或不同。
513、确定是否查询到第二人文位置。
即确定第四坐标信息是否有对应的第二人文位置,例如,XX咖啡馆、XX咖啡厅,XX图书馆,商场的XX店铺等。
514、若查询到第二人文位置,记录摘下事件和第二人文位置。
应该理解的是,第二人文位置与摘下事件对应/关联,表示在第二人文位置,耳机处于摘下状态。
515、若未查询到第二人文位置,记录摘下事件和第四坐标信息。
在一些实施例中,在获取第一坐标信息或第二坐标信息后,可以不查询相应的人文位置(即第一人文位置)。具体的,如图6所示,本申请实施例提供一种查找可穿戴设备(例如,耳机)的方法,包括步骤501-步骤505、步骤509-步骤511以及步骤515,各步骤可以参考上文的相关描述,在此不做赘述。
基于本申请实施例提供的方法,当用户在户外使用耳机时,若耳机在用户进行户外运动(奔跑,骑行,爬山)过程中不慎脱落,可以通过查询耳机的历史状态及相对应的地理坐标,迅速回忆起耳机的遗落位置,有助于用户方便快速地找回耳机。
下面对本申请实施例提供的UI界面进行介绍:
如图7中的(a)所示,响应于用户在主界面701中点击智慧生活702的操作,如图7中的(b)所示,手机可以显示智慧生活APP的主界面703,可以在智慧生活APP中管理蓝牙耳机(例如,蓝牙耳机的名称可以为XX3)。响应于用户点击蓝牙耳机对应的图标704的操作,如图8所示,手机可以显示蓝牙耳机XX3的管理界面710,管理界面710可以包括按键功能自定义选项和查找耳机选项711等。当然,管理界面710还可以包括其他管理选项,本申请不做限定。
或者,如图9中的(a)所示,响应于用户在主界面720中点击荣耀耳机721的操作,如图9中的(b)所示,手机可以显示荣耀耳机APP的设备管理界面723,设备管理界面723中可以包括蓝牙耳机XX3。当然,设备管理界面723中还可以包括其他蓝牙耳机,本申请不做限定。响应于用户点击蓝牙耳机XX3对应的显示区域724,如图8所示,手机可以显示蓝牙耳机XX3的管理界面710。
如图8所示,响应于用户在管理界面710上点击查找耳机选项711的操作,如图10所示,手机可以显示查找耳机界面730,界面730中可以包括耳机(左耳耳机和右耳耳机)的轨迹和状态。其中,耳机(左耳耳机和右耳耳机)的状态可以为佩戴状态或摘下状态。佩戴状态可以用实心圆形图标表示(例如,图标733、图标734、图标735、图标737、图标738)。摘下状态可以用空心圆形图标表示(例如,图标736、图标739a、图标739b)。当然,佩戴状态或摘下状态也可以用其他形状或颜色填充的图标表示,或者也可以采用提示文字来表示,本申请不做限定。耳机的轨迹可以是多个场景组成的。每个场景可以对应一个场景卡片,例如,场景卡片740可以表示睡眠场景(或休息场景)。场景卡片741可以表示休闲场景或阅读场景,场景卡片742可以表示“空白场景”(“空白场景”可以是指无法从应用程序(例如,智慧生活APP)中查询到当前场景的情况),场景卡片743可以表示娱乐场景。耳机的状态和场景是关联的。
具体的,图标733、图标734、图标735可以表示右耳耳机731在场景卡片740、场景卡片741、场景卡片742分别对应的场景下为佩戴状态。图标736可以表示右耳耳机731在场景卡片743对应的场景下为摘下状态。图标737、图标738可以表示左耳耳机732在场景卡片740、场景卡片741分别对应的场景下为佩戴状态。图标739a和图标739b可以表示左耳耳机732在场景卡片742和场景卡片743分别对应的场景下为摘下状态。
当用户一时无法回忆起耳机最后的放置位置时,如图10所示的耳机使用时的生活场景记录,会大大的帮助用户回忆起上一次耳机使用时的场景,进而帮助用户缩小查找范围,快速找回耳机。
为了提升双耳耳机(左耳耳机和右耳耳机)的找回效果,可以针对左耳耳机和右耳耳机分别维护场景和状态记录。应该理解的是,每个耳机对应的场景和状态记录可以组成一个数据队列,每条数据包括一个场景和该场景下的状态,该队列尾部的数据中的状态通常为摘下。
在一种可能的设计中,可以分别维护左耳耳机和右耳耳机最近N次使用时的场景和状态记录。其中,N为大于或等于1的整数。在另一种可能的设计中,可以分别维护左耳耳机和右耳耳机在最近M个时间单元内使用时的场景和状态记录。其中,M为大于或等于1的整数。时间单元可以包括分钟、小时、天、月等。应该理解的是,耳机的一次使用过程包括耳机发生佩戴事件至摘下事件的全部过程。
下面以手机分别维护左耳耳机和右耳耳机最近1次使用时的场景和状态记录为例进行说明。如图11中的(a)所示,手机接收到右耳耳机的佩戴事件后,可以从智慧生活APP中查询当前的场景,并可以记录当前的场景(例如,睡眠场景751)和状态(例如,佩戴状态755)。其中,智慧生活APP中的场景可以是手机基于预设参数(例如,时间、电子设备数据(例如,电子设备是否开启)等)自动确定的,也可以是用户手动选择的,本申请不做限定。进一步的,若手机检测到智慧生活APP发生了场景切换,可以记录切换后的场景(例如,休闲场景752)和状态(例如,佩戴状态756)。进一步的,若手机接收到右耳耳机发送的摘下事件,可以从智慧生活查询当前场景,若无法查询到当前场景(例如,智慧生活APP已关闭,或者用户未手动选择场景),可以记录“空白场景”753和状态(例如,摘下状态757)。可选的,可以在“空白场景”的基础上记录当前时间和/或用户的状态。其中,用户的状态可以是基于传感器数据(例如,陀螺仪)得到的。由于用户摘下右耳耳机时,手机无法查询到当前场景,后续用户在查找耳机时难以回想起耳机的丢失位置。因此,为了最大程度的帮助用户回想右耳耳机的丢失时间和位置,手机在右耳耳机为摘下状态时记录“空白场景”后,若检测到场景切换,可以进一步记录切换后的场景(例如,娱乐场景754)和状态(例如,摘下状态758)。
类似的,如图11中的(b)所示,手机接收到左耳耳机的佩戴事件后,确定左耳耳机为佩戴状态,此时可以从智慧生活APP中查询当前的场景,并可以记录当前的场景(例如,睡眠场景759)和状态(例如,佩戴状态762)。需要说明的是,用户可能在同一场景下进行了多次佩戴和摘下动作(例如,用户初次佩戴耳机时可能会频繁调整耳机的位置),即手机可以接收到多个佩戴事件和摘下事件。可以认为同一场景下不同状态(佩戴状态和摘下状态)的频繁切换是一种“抖动”,因此,需要进行“防抖”处理。例如,可以将预设时间间隔(例如,5s)内的记录的场景和状态归一处理,仅保留预设时间间隔内最后一次记录的状态和场景。如图11中的(b)所示,可以将睡眠场景759对应的佩戴状态762、睡眠场景760对应的摘下状态763,以及睡眠场景761对应的佩戴状态764进行归一处理,仅保留最后一次记录的状态和场景,即佩戴状态764和睡眠场景761。
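上述"防抖"归一处理(预设时间间隔内仅保留最后一次记录的状态和场景)可以示意如下;5 秒窗口与记录格式均为示例假设:

```python
def debounce(records, window_s=5):
    """对短时间内的佩戴/摘下"抖动"做归一处理(示意)。

    records: [(时间戳, 场景, 状态), ...],按时间升序排列
    归一规则:与上一条保留记录的时间间隔不超过 window_s 的记录,
    覆盖上一条保留记录,即仅保留窗口内最后一次记录。
    """
    result = []
    for rec in records:
        if result and rec[0] - result[-1][0] <= window_s:
            result[-1] = rec      # 覆盖窗口内的上一条记录
        else:
            result.append(rec)
    return result
```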
如图11中的(b)所示,进一步的,若手机检测到智慧生活APP发生了场景切换,可以记录切换后的场景(例如,休闲场景765)和状态(例如,佩戴状态768)。若手机检测到智慧生活APP退出了休闲场景765,但未查询到新的场景,可以记录“空白场景”766和状态(例如,佩戴状态769)。进一步的,若手机接收到左耳耳机发送的摘下事件,可以从智慧生活查询当前场景,并记录当前的场景(例如,娱乐场景767)和状态(例如,摘下状态770)。
如图12所示,若用户最近在室内使用过耳机,手机可以默认显示耳机最新的场景和状态,用户可以通过下滑操作查询更多的历史记录(耳机更早的场景和状态)。当然,用户可以通过上滑操作返回最新的场景和状态。
可选的,每个场景卡片附近可以显示该场景对应的时间(例如,可以是手机查询到该场景时的时间)。
这样,可以将耳机在不同时间的场景和状态展示给用户,用户可以根据耳机的场景和状态回忆耳机可能遗落的位置,有助于用户方便快捷地找回耳机。
如图13所示,用户在外出场景下使用耳机时,手机可以记录耳机的位置和对应的状态并显示给用户,耳机的位置可以包括人文位置和地理坐标。其中,地理坐标可以由对应地图供应商的地理解码器(GeoDecoder)转化为可视化地图。若用户点击了相应的位置卡片(例如,位置卡片780、位置卡片781、位置卡片782、位置卡片783),手机可以打开地图,显示地图中的定位位置。
可选的,可以通过不同的元素(形状、文字)或颜色区分场景卡片和位置卡片。例如,场景卡片可以为方形,位置卡片可以为圆形,从而用户可以更加方便地区分外出场景(例如,在户外)和居家场景,使UI界面更加直观。
在一些实施例中,用户在使用耳机的过程中,若从室外进入室内,可以显示如图14所示的界面790,界面790中包括室外对应的位置卡片791和室内(家居场景下)对应的场景卡片792、场景卡片793和场景卡片794。
在一些实施例中,如图15所示,手机可以根据历史记录(耳机在使用过程中的场景记录或位置记录)推算出耳机可能遗落的位置795,并提示给用户。
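根据历史记录推算遗落位置的一种示意推断规则如下(以最后一次摘下事件对应的位置/场景作为候选);该规则是假设的简化示例,并非专利限定的推算方式:

```python
def guess_lost_location(records):
    """根据历史记录推算耳机可能遗落的位置(示意推断规则)。

    records: 按时间升序的 [(事件, 位置或场景), ...]
    规则:最后一次"摘下"事件对应的位置最可能是遗落位置;
    若无摘下记录,则返回最后一条记录的位置;无记录时返回 None。
    """
    for event, location in reversed(records):
        if event == "摘下":
            return location
    return records[-1][1] if records else None
```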
上述实施例以可穿戴设备为耳机为例进行说明,但本申请实施例不限定可穿戴设备的类型。例如,可穿戴设备还可以是手环,手表,眼镜等。
进一步的,在一些实施例中,电子设备可以根据可穿戴设备的使用场景对用户进行更多提示。例如,可以提示用户在运动状态未佩戴手表或在阅读状态未佩戴眼镜等。
这样,将可穿戴设备的佩戴/摘下状态与地理位置或生活场景进行关联并记录,可以更好地帮助用户回想可穿戴设备的遗落位置,方便用户找回可穿戴设备,提高用户体验。
如图16所示,本申请实施例提供一种查找可穿戴设备的方法,应用于电子设备,包括:
1601、从可穿戴设备接收第一事件,第一事件包括佩戴事件或摘下事件。
1602、响应于接收到第一事件,获取第一位置或第一场景,第一位置用于指示电子设备或可穿戴设备的地理位置,第一场景用于指示用户的居家状态。
其中,第一场景可以是前文所述的第一生活场景,第一位置可以是前文所述的第一坐标信息指示的位置。
若确定电子设备位于预设居住区域,获取第一场景;若确定电子设备位于预设居住区域之外的区域,获取第一位置。其中,预设居住区域可以是住宅/家。
在一种可能的设计中,电子设备从可穿戴设备获取第一位置,第一位置用于指示可穿戴设备的地理位置。或者,电子设备基于网络定位技术获取第一位置,第一位置用于指示电子设备的地理位置。其中,网络定位技术包括基站定位技术、无线保真WiFi定位技术和全球卫星定位系统GPS定位技术。
示例性的,第一位置是通过可穿戴设备或电子设备的地理坐标指示的。或者,第一位置是通过第一名称指示的,第一名称是根据可穿戴设备或电子设备的地理坐标确定的。
在一种可能的设计中,可以从第二应用程序中查询第一场景;第二应用程序包括多个场景。多个场景包括睡眠场景、起床场景、影院场景、用餐场景、休闲场景、阅读场景、娱乐场景、回家场景或离家场景中的至少一个。
在一种可能的设计中,电子设备确定是否从第一场景切换到第二场景,第一场景与第二场景不同;若确定从第一场景切换到第二场景,记录第一事件与第二场景的关联关系。
在一种可能的设计中,电子设备还可以接收第二事件;当第一事件为佩戴事件时,第二事件为摘下事件;当第一事件为摘下事件时,第二事件为佩戴事件;电子设备获取第二位置或第三场景,第二位置用于指示电子设备或可穿戴设备的地理位置,第三场景用于指示用户的居家状态;记录第二事件与第二位置或第三场景的关联关系。
1603、记录第一事件与第一位置或第一场景的关联关系。
1604、响应于用户对第一应用程序的第一操作,显示第一界面;其中,第一界面包括第一选项,第一选项用于查找可穿戴设备。
示例性的,第一界面例如可以是如图8所示的界面710,第一选项例如可以是查找耳机选项711。
1605、响应于用户选中第一选项的操作,显示第二界面;其中,第二界面包括可穿戴设备对应的第一信息和第二信息,第一信息和第二信息相关联,第一信息对应第一位置或第一场景,第二信息对应第一事件。
示例性的,当可穿戴设备为蓝牙耳机,蓝牙耳机包括左耳耳机和右耳耳机时,第二界面包括左耳耳机和右耳耳机分别对应的第一信息和第二信息。示例性的,如图10所示,界面730(第二界面)可以包括左耳耳机对应的第一信息(例如,对应佩戴状态的图标737和图标738,以及对应摘下状态的图标739a和图标739b)和第二信息(例如,对应不同场景的场景卡片740、场景卡片741、场景卡片742);界面730(第二界面)还可以包括右耳耳机对应的第一信息(例如,对应佩戴状态的图标733、图标734和图标735,以及对应摘下状态的图标736)和第二信息(例如,对应不同场景的场景卡片740、场景卡片741、场景卡片742)。其中,左耳耳机对应的第二信息和右耳耳机对应的第二信息可以用同一个场景卡片表示。
当第一信息对应第一位置时,响应于用户对第一信息的操作,显示可视化地图,在可视化地图中指示第一位置。
基于本申请实施例提供的方法,将可穿戴设备的佩戴/摘下状态与地理位置或生活场景进行关联并记录,可以更好地帮助用户回想可穿戴设备的遗落位置,方便用户找回可穿戴设备,提高用户体验。
需要说明的是,图16的实施例中的电子设备可以为前述实施例中的手机,可穿戴设备可以为蓝牙耳机,图16的实施例中未详述的部分,可以参考前述实施例,在此不做赘述。
如图17所示,本申请实施例提供一种查找可穿戴设备(以可穿戴设备为蓝牙耳机,蓝牙耳机包括左耳耳机和右耳耳机为例)的方法,应用于电子设备,包括:
1701、从左耳耳机接收第一事件,第一事件包括佩戴事件或摘下事件。
1702、响应于接收到第一事件,获取第一位置或第一场景。
其中,第一位置用于指示电子设备或左耳耳机的地理位置,第一场景用于指示用户的居家状态。
1703、记录第一事件与第一位置或第一场景的关联关系。
1704、从右耳耳机接收第二事件,第二事件包括佩戴事件或摘下事件。
1705、响应于接收到第二事件,获取第二位置或第二场景,第二位置用于指示电子设备或右耳耳机的地理位置,第二场景用于指示用户的居家状态。
1706、记录第二事件与第二位置或第二场景的关联关系。
需要说明的是,步骤1701-1703与步骤1704-1706的执行顺序本申请不做限定,可以先执行步骤1701-1703,再执行步骤1704-1706;也可以是先执行步骤1704-1706,再执行步骤1701-1703;或者,可以是同时执行步骤1701-1703与步骤1704-1706。
1707、响应于用户对第一应用程序的第一操作,显示第一界面;其中,第一界面包括用于查找左耳耳机的选项和用于查找右耳耳机的选项。
示例性的,如图18所示,电子设备(例如,手机)可以显示界面710(第一界面),界面710包括用于查找左耳耳机的选项801和用于查找右耳耳机对应的选项802。
1708、响应于用户选中左耳耳机对应的选项,显示第二界面,第二界面包括左耳耳机对应的第一信息和第二信息;第一信息和第二信息相关联,第一信息对应第一位置或第一场景,第二信息对应第一事件。
示例性的,如图18中的(a)所示,响应于用户选中左耳耳机对应的选项801,如图18中的(b)所示,手机可以显示界面803(第二界面),界面803中可以显示左耳耳机804对应的场景和状态。
1709、响应于用户选中右耳耳机对应的选项,显示第三界面,第三界面包括右耳耳机对应的第三信息和第四信息;第三信息和第四信息相关联,第三信息对应第二位置或第二场景,第四信息对应第二事件。
示例性的,如图19中的(a)所示,响应于用户选中右耳耳机对应的选项802,如图19中的(b)所示,手机可以显示界面805(第三界面),界面805中可以显示右耳耳机806对应的场景和状态。
需要说明的是,步骤1708与步骤1709可以是择一执行的,或者可以是在不同时刻分别执行的,本申请不做限定。
基于本申请实施例提供的方法,将耳机(左耳耳机或右耳耳机)的佩戴/摘下状态与地理位置或生活场景进行关联并记录,可以更好地帮助用户回想耳机的遗落位置,方便用户找回可穿戴设备,提高用户体验。
需要说明的是,图17的实施例中的电子设备可以为前述实施例中的手机,图17的实施例中未详述的部分,可以参考前述实施例,在此不做赘述。
本申请实施例还提供一种芯片系统,如图20所示,该芯片系统包括至少一个处理器2001和至少一个接口电路2002。处理器2001和接口电路2002可通过线路互联。例如,接口电路2002可用于从其它装置(例如,电子设备的存储器)接收信号。又例如,接口电路2002可用于向其它装置(例如处理器2001)发送信号。
例如,接口电路2002可读取电子设备中存储器中存储的指令,并将该指令发送给处理器2001。当所述指令被处理器2001执行时,可使得电子设备(如图2所示的电子设备200)或可穿戴设备(如图3所示的可穿戴设备100)执行上述实施例中的各个步骤。
当然,该芯片系统还可以包含其他分立器件,本申请实施例对此不作具体限定。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质包括计算机指令,当所述计算机指令在电子设备(如图2所示的电子设备200)或可穿戴设备(如图3所示的可穿戴设备100)上运行时,使得电子设备200执行上述方法实施例中电子设备执行的各个功能或者步骤,使得可穿戴设备100执行上述方法实施例中可穿戴设备执行的各个功能或者步骤。
本申请实施例还提供一种计算机程序产品,当所述计算机程序产品在计算机上运行时,使得所述计算机执行上述方法实施例中电子设备执行的各个功能或者步骤。
本申请实施例还提供了一种处理装置,所述处理装置可以按照功能划分为不同的逻辑单元或模块,各单元或模块执行不同的功能,以使得所述处理装置执行上述方法实施例中电子设备或可穿戴设备执行的各个功能或者步骤。
通过以上实施方式的描述,所属领域的技术人员可以清楚地了解到,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(read only memory,ROM)、随机存取存储器(random access memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上内容,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (13)

  1. 一种查找可穿戴设备的方法,应用于电子设备,其特征在于,包括:
    从可穿戴设备接收第一事件,所述第一事件包括佩戴事件或摘下事件;
    响应于接收到所述第一事件,获取第一位置或第一场景,所述第一位置用于指示所述电子设备或所述可穿戴设备的地理位置,所述第一场景用于指示用户的居家状态;
    记录所述第一事件与所述第一位置或所述第一场景的关联关系;
    响应于用户对第一应用程序的第一操作,显示第一界面;其中,所述第一界面包括第一选项,所述第一选项用于查找所述可穿戴设备;
    响应于用户选中所述第一选项的操作,显示第二界面;其中,所述第二界面包括所述可穿戴设备对应的第一信息和第二信息,所述第一信息和所述第二信息相关联,所述第一信息对应所述第一位置或所述第一场景,所述第二信息对应所述第一事件。
  2. 根据权利要求1所述的方法,其特征在于,当所述第一信息对应所述第一位置时,所述方法还包括:
    响应于用户对所述第一信息的操作,显示可视化地图,所述可视化地图中包括用于指示所述第一位置的标识。
  3. 根据权利要求1或2所述的方法,其特征在于,所述电子设备获取第一位置或第一场景包括:
    若确定所述电子设备位于预设居住区域,获取所述第一场景;
    若确定所述电子设备位于所述预设居住区域之外的区域,获取所述第一位置。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,所述获取第一场景包括:
    从第二应用程序中查询所述第一场景;所述第二应用程序包括多个场景,所述第一场景为所述多个场景中的任一个,所述多个场景包括睡眠场景、起床场景、影院场景、用餐场景、休闲场景、阅读场景、娱乐场景、回家场景或离家场景中的至少一个。
  5. 根据权利要求1-4任一项所述的方法,其特征在于,所述方法还包括:
    确定是否从所述第一场景切换到第二场景,所述第一场景与所述第二场景不同;
    若确定从所述第一场景切换到第二场景,记录所述第一事件与所述第二场景的关联关系。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,所述方法还包括:
    接收第二事件;当所述第一事件为佩戴事件时,所述第二事件为摘下事件;当所述第一事件为摘下事件时,所述第二事件为佩戴事件;
    响应于接收到所述第二事件,获取第二位置或第三场景,所述第二位置用于指示所述电子设备或所述可穿戴设备的地理位置,所述第三场景用于指示用户的居家状态;
    记录所述第二事件与所述第二位置或所述第三场景的关联关系。
  7. 根据权利要求1-6任一项所述的方法,其特征在于,所述获取第一位置包括:
    从所述可穿戴设备获取所述第一位置,所述第一位置用于指示所述可穿戴设备的地理位置;或者
    基于网络定位技术获取所述第一位置,所述第一位置用于指示所述电子设备的地理位置;所述网络定位技术包括基站定位技术、无线保真WiFi定位技术和全球卫星定位系统GPS定位技术。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,
    所述第一位置是通过所述可穿戴设备或所述电子设备的地理坐标指示的,或者所述第一位置是通过第一名称指示的,所述第一名称是根据所述可穿戴设备或所述电子设备的地理坐标确定的。
  9. 根据权利要求1-8任一项所述的方法,其特征在于,
    当所述可穿戴设备为蓝牙耳机,所述蓝牙耳机包括左耳耳机和右耳耳机时,所述第二界面包括所述左耳耳机和所述右耳耳机分别对应的所述第一信息和所述第二信息。
  10. 一种查找可穿戴设备的方法,应用于电子设备,所述可穿戴设备为蓝牙耳机,所述蓝牙耳机包括左耳耳机和右耳耳机,其特征在于,包括:
    从所述左耳耳机接收第一事件,所述第一事件包括佩戴事件或摘下事件;
    响应于接收到所述第一事件,获取第一位置或第一场景,所述第一位置用于指示所述电子设备或所述左耳耳机的地理位置,所述第一场景用于指示用户的居家状态;
    记录所述第一事件与所述第一位置或所述第一场景的关联关系;
    从所述右耳耳机接收第二事件,所述第二事件包括佩戴事件或摘下事件;
    响应于接收到所述第二事件,获取第二位置或第二场景,所述第二位置用于指示所述电子设备或所述右耳耳机的地理位置,所述第二场景用于指示用户的居家状态;
    记录所述第二事件与所述第二位置或所述第二场景的关联关系;
    响应于用户对第一应用程序的第一操作,显示第一界面;其中,所述第一界面包括用于查找所述左耳耳机的选项和用于查找所述右耳耳机的选项;
    响应于用户选中所述左耳耳机对应的选项,显示第二界面,所述第二界面包括所述左耳耳机对应的第一信息和第二信息;所述第一信息和所述第二信息相关联,所述第一信息对应所述第一位置或所述第一场景,所述第二信息对应所述第一事件;或者
    响应于用户选中所述右耳耳机对应的选项,显示第三界面,所述第三界面包括所述右耳耳机对应的第三信息和第四信息;所述第三信息和所述第四信息相关联,所述第三信息对应所述第二位置或所述第二场景,所述第四信息对应所述第二事件。
  11. 一种查找可穿戴设备的方法,应用于电子设备和可穿戴设备组成的系统,其特征在于,包括:
    所述可穿戴设备检测到第一事件时,向所述电子设备发送所述第一事件;所述第一事件包括佩戴事件或摘下事件;
    所述电子设备从可穿戴设备接收所述第一事件;
    响应于接收到所述第一事件,所述电子设备获取第一位置或第一场景,所述第一位置用于指示所述电子设备或所述可穿戴设备的地理位置,所述第一场景用于指示用户的居家状态;
    所述电子设备记录所述第一事件与所述第一位置或所述第一场景的关联关系;
    响应于用户对第一应用程序的第一操作,所述电子设备显示第一界面;其中,所述第一界面包括第一选项,所述第一选项用于查找可穿戴设备;
    响应于用户选中所述第一选项的操作,所述电子设备显示第二界面;其中,所述第二界面包括所述可穿戴设备对应的第一信息和第二信息,所述第一信息和所述第二信息相关联,所述第一信息对应所述第一位置或所述第一场景,所述第二信息对应所述第一事件。
  12. 一种电子设备,其特征在于,所述电子设备包括:无线通信模块、存储器和一个或多个处理器;所述无线通信模块、所述存储器与所述处理器耦合;
    其中,所述存储器用于存储计算机程序代码,所述计算机程序代码包括计算机指令;当所述计算机指令被所述处理器执行时,使得所述电子设备执行如权利要求1-10中任一项所述的方法。
  13. 一种计算机可读存储介质,其特征在于,包括计算机指令;
    当所述计算机指令在电子设备上运行时,使得所述电子设备执行如权利要求1-10中任一项所述的方法。
PCT/CN2022/075080 2021-04-25 2022-01-29 一种查找可穿戴设备的方法和装置 WO2022227767A1 (zh)



