CN112134995A - Method, terminal and computer readable storage medium for searching application object - Google Patents

Method, terminal and computer readable storage medium for searching application object

Info

Publication number
CN112134995A
CN112134995A (application CN202010786041.3A)
Authority
CN
China
Prior art keywords
target object
electronic device
application
target
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010786041.3A
Other languages
Chinese (zh)
Inventor
罗·沃特加尔
周轩
龙嘉裕
肖恩·布瑞利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202010786041.3A
Publication of CN112134995A
Legal status: Pending

Classifications

    • G06F3/0414 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means, using force sensing means to determine a position
    • G06F3/04817 — Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, using icons
    • G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F9/451 — Execution arrangements for user interfaces
    • H04W4/021 — Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H04W4/023 — Services making use of mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H04W4/026 — Services making use of location based information parameters using orientation information, e.g. compass
    • H04W4/029 — Location-based management or tracking services
    • H04M2250/12 — Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Abstract

The application relates to the technical field of electronics and provides a method, a terminal, and a computer-readable storage medium for searching for an application object. The method includes the following steps: first, a first device obtains its pointing direction; then, if the first device is identified as pointing at a target object, the first device determines a target application object according to the target object, where the target application object corresponds to the target object; finally, the first device displays a corresponding interface according to the target application object. By implementing this method, the efficiency of searching for an application object can be improved.

Description

Method, terminal and computer readable storage medium for searching application object
Technical Field
The present application relates to the field of electronic technologies, and in particular to a method, a terminal, and a computer-readable storage medium for searching for an application object.
Background
With the rapid development of electronic technology and the rapid popularization of terminals, terminal functions are becoming increasingly rich. To enrich daily life, a user may install various applications in a terminal, such as applications for accessing network data resources, photographing applications, and instant messaging applications, so more and more applications accumulate on the terminal. Because the number of application icons that can be displayed on each page of the terminal's display interface is limited, when the user searches for an application to operate, the user often needs to turn several pages to find it, and this implementation makes the efficiency of searching for an application too low.
Disclosure of Invention
The application provides a method, a terminal and a computer readable storage medium for searching an application object, which can improve the searching efficiency of the application object.
In a first aspect, a method for searching for an application object is provided, and the method may include the following steps: first, a pointing direction of a first device is obtained; for example, the pointing direction may be that device A points at object B. Then, if it is identified that the first device points at a target object, the first device determines a target application object according to the target object, where the target application object corresponds to the target object; the application object may be the application program itself, or a functional component within the application program, and the like. Finally, the first device displays a corresponding interface according to the target application object. For example, taking the application object as an application program: when one application program is found, the display interface includes the name and/or icon of that application program, or the display interface is the main interface of that application program; when multiple application programs are found, the display interface includes the names and/or icons corresponding to the multiple application programs.
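As an illustrative sketch (not from the patent text; the function name and return values are hypothetical), the display decision described in the first-aspect example can be expressed as:

```python
def choose_display(matches):
    """Decide what the first device displays for the found application
    programs: one match opens its main interface, several matches are
    listed by name/icon, and no match displays nothing (a sketch of the
    first-aspect example, not a definitive implementation)."""
    if not matches:
        return ("none", [])
    if len(matches) == 1:
        return ("main_interface", matches)   # open the single app's main interface
    return ("icon_list", matches)            # let the user pick from a name/icon list
```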
Compared with the prior art, in which the user needs to turn pages multiple times to find the target application object, in the present application the first device determines the target application object directly from the identified target object, so the efficiency of searching for the application object can be improved.
In one possible implementation, the method may further include: the first device receives a first trigger instruction from the user, where the first trigger instruction is used to trigger the search for an application object. Specifically, the first trigger instruction may be an operation initiated by the user on the first device, so that the action of obtaining the pointing direction of the first device is performed only when triggered rather than at all times; this reduces the resource consumption of the first device and makes the search for the application object more targeted.
In one possible implementation, when identifying that the first device points at the target object, the target object may be determined according to the pointing direction of the first device and the spatial positions of objects around the first device. Specifically, the objects around the first device may be objects within the signal-strength coverage of the first device, objects within a preset range, or the like. In this way, the first device can accurately identify the object it points at, improving the accuracy of the subsequent application-object search.
In one possible implementation, when the first device determines the target object according to its pointing direction and the spatial positions of surrounding objects, the target object may be determined according to the pointing direction of the first device and the historical spatial positions of the surrounding objects. Considering that in practice the spatial positions of surrounding objects do not change within a period of time, after the first device determines these spatial positions through ultra-wideband (UWB) technology, it can store them so that they can be reused later. This implementation improves the execution efficiency of the first device and reduces its resource consumption.
In one possible implementation, obtaining the pointing direction of the first device may include: determining the pointing direction according to a preset reference direction and the obtained tilt angle of the first device. The preset reference direction may be, for example, a preset central-axis direction, where the central axis passes through a reference point of the device: when the device is a mobile phone, the central-axis direction may be determined by the long edge of the phone and the reference point may be set as the center point of the display screen; when the device is AR glasses, the reference point is a point on the line connecting the centers of the two lenses, and the central axis passes through the reference point perpendicular to the plane of the lenses. These are only examples; the central axis may be set according to specific application requirements. In practice, the tilt angle of the first device can be obtained through a gyroscope sensor, whose characteristics can be used to identify, for example, the left/right tilt of the device when it is held in portrait orientation.
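As a hedged illustration of how a preset reference direction and gyroscope-derived tilt angles could yield a pointing direction (the yaw/pitch convention and the choice of +x as the reference axis are assumptions, not specified by the patent):

```python
import math

def pointing_vector(yaw_deg, pitch_deg):
    """Unit vector along the device's central axis, given yaw (rotation
    about the vertical) and pitch (tilt above the horizontal) measured
    relative to a preset reference direction (+x by convention here)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```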
In one possible implementation, before identifying that the first device points at the target object, the spatial position of the target object may be obtained in real time. Specifically, the first device broadcasts an ultra-wideband measurement request to the target object, and the target object determines its own orientation parameters according to the request; the orientation parameters may include one or more of the distance between the target object and the first device, the angle of arrival (AOA) of the signal received at the first device from the target object, and the received signal strength. The first device receives the orientation parameters from the target object, and determines the spatial position of the target object according to those orientation parameters. After determining the spatial position, the first device can identify the target object based on its pointing direction and the determined spatial position. By implementing this embodiment, when the target object is an electronic device, a two-way connection is established between the first device and the target object to obtain the orientation parameters of the target object, so that positioning between devices can be realized accurately and user experience is improved.
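A minimal planar sketch of turning two of the orientation parameters (distance and AOA) into a relative spatial position; real UWB ranging is three-dimensional and hardware-specific, so this is illustrative only:

```python
import math

def position_from_uwb(distance_m, aoa_deg):
    """Relative 2-D position of the responding object, from the measured
    distance and the angle of arrival (AOA) of its signal at the first
    device (0 degrees taken as the device's reference axis)."""
    a = math.radians(aoa_deg)
    return (distance_m * math.cos(a), distance_m * math.sin(a))
```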
In one possible implementation, before identifying that the first device points at the target object, the method may further include: the first device broadcasts a measurement request to at least three wireless positioning modules preset in the same space, where the at least three wireless positioning modules are not on the same straight line and the distance between any two of them is not less than a preset distance; each of the at least three wireless positioning modules obtains, according to the measurement request, the distance between itself and the first device; the first device receives the at least three pieces of distance information; and the first device determines the spatial position of the target object from the at least three pieces of distance information and the obtained tilt angle of the first device, according to the three-point positioning principle. By implementing this embodiment, when the target object is a non-electronic device, the first device establishes two-way connections with the at least three wireless positioning modules preset in the same space and determines the spatial position of the target object from the distance information and the tilt angle combined with the three-point positioning principle, so that positioning between the device and the object can be realized accurately and user experience is improved.
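The three-point positioning principle with three non-collinear anchors can be sketched in 2-D as follows (a linearised solution assuming exact distance measurements; the patent does not specify the solver, so this is one common construction):

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three non-collinear anchor positions p1..p3
    and the measured distances r1..r3 to each anchor. Subtracting the
    first circle equation from the other two yields a linear system."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21          # non-zero iff anchors are non-collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return (x, y)
```

A third, out-of-plane anchor (or the device's tilt angle, as the patent describes) would be needed to resolve the height in a full 3-D setting.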
In one possible implementation, the process by which the first device determines the target application object according to the target object may include: the first device obtains a pre-built correspondence between objects and application objects; for example, the correspondence is at least one of a correspondence between electronic devices contained in the environment where the first device is located and application objects, and a correspondence between non-electronic devices contained in that environment and application objects. The first device then determines the target application object corresponding to the target object based on this correspondence. By implementing this embodiment, because the correspondence takes into account the environment where the first device is located and the type of the target object, the data size of the correspondence is small, and the efficiency of searching for the application object can be improved.
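A hypothetical correspondence table (the object names and application objects below are invented for illustration) makes the lookup step concrete:

```python
# Pre-built correspondence between recognisable objects in the environment
# and application objects (illustrative entries, not from the patent).
CORRESPONDENCE = {
    "sound_box": ["Music Player", "Voice Assistant"],
    "television": ["Screen Projection"],
    "potted_plant": ["Watering Reminder"],
}

def target_application_objects(target_object):
    """Return the application objects mapped to the identified target
    object, or an empty list if no correspondence was pre-built."""
    return CORRESPONDENCE.get(target_object, [])
```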
In one possible implementation, the method may further include: when the target application object on the first device receives a message notification, sending indication information to the target object, where the indication information instructs the target object to output prompt information. By implementing this embodiment, when the target application object on the first device receives a message notification, the prompt is output through the target object rather than through the first device; for example, when the first device is in a silent or vibration state, this prevents the user from missing important message notifications, provides a better presentation of message notifications, and improves user experience.
In one possible implementation, the target object outputs the prompt information according to its own capability information, where the capability information represents the output modes supported by the target object. An output mode is a presentation mode of the message, and may include display, voice broadcast, and the like.
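One way the capability information could drive the choice of output mode (the preference ordering of display before voice broadcast is an assumption made for the sketch):

```python
def choose_prompt_mode(capabilities):
    """Pick how the target object presents the prompt information, given
    its capability info as a collection of supported output modes."""
    for mode in ("display", "voice_broadcast"):   # assumed preference order
        if mode in capabilities:
            return mode
    return None   # no supported presentation mode
```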
In a second aspect, an embodiment of the present application provides an apparatus for searching for an application object, where the apparatus may include: a pointing obtaining unit, configured to obtain the pointing direction of the first device; an identification unit; an application object determining unit, configured to determine a target application object according to the target object when the identification unit identifies that the first device points at the target object, where the target application object corresponds to the target object; and a display unit, configured to instruct the first device to display a corresponding interface according to the target application object.
In one possible implementation, the apparatus further includes: an instruction receiving unit, configured to receive a first trigger instruction from the user, where the first trigger instruction is used to trigger the search for an application object.
In a possible implementation manner, the identification unit is specifically configured to: and determining the target object according to the pointing direction of the first equipment and the spatial position of the object around the first equipment.
In a possible implementation manner, the identification unit is specifically configured to: and determining the target object according to the pointing direction of the first equipment and the historical space positions of the objects around the first equipment.
In one possible implementation, the pointing obtaining unit is specifically configured to determine the pointing direction of the first device according to a preset reference direction and the obtained tilt angle of the first device.
In one possible implementation, the apparatus further includes: a first sending unit, configured to broadcast an ultra-wideband measurement request to the target object, where the target object determines its own orientation parameters according to the ultra-wideband measurement request; a first receiving unit, configured to receive the orientation parameters from the target object; and a first position determining unit, configured to determine the spatial position of the target object according to the orientation parameters corresponding to the target object.
In one possible implementation, the apparatus further includes:
the second sending unit is used for broadcasting the measurement request to at least three wireless positioning modules preset in the same space; the at least three wireless positioning modules are not on the same straight line, and the distance between every two wireless positioning modules is not less than the preset distance; each wireless positioning module in the at least three wireless positioning modules is used for acquiring distance information between the wireless positioning module and the first equipment according to the measurement request; a second receiving unit for receiving at least three distance information; and the second position determining unit is used for determining the spatial position of the target object according to the at least three pieces of distance information and by combining a three-point positioning principle.
In a possible implementation manner, the application object determining unit is specifically configured to: acquiring a corresponding relation between a pre-constructed object and an application object; and determining a target application object corresponding to the target object based on the corresponding relation.
In one possible implementation, the apparatus further includes: a third sending unit, configured to send, when a target application object on the first device receives a message notification, instruction information to the target object, where the instruction information is used to instruct the target object to output prompt information.
In a possible implementation manner, when the target object outputs the prompt information, the target object outputs the prompt information according to the capability information of the target object, and the capability information of the target object is used for representing an output manner of the target object.
In a third aspect, an embodiment of the present application further provides an electronic device, serving as the first device, which may include a memory and a processor. The memory stores a computer program supporting the electronic device in performing the method described above, the computer program includes program instructions, and the processor is configured to call the program instructions to perform the method of the first aspect.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium, in which a computer program is stored, the computer program comprising program instructions, which, when executed by a processor, cause the processor to perform the method of the first aspect.
In a fifth aspect, the present application further provides a computer program, where the computer program includes computer software instructions that, when executed by a computer, cause the computer to perform any one of the methods for searching for an application object according to the first aspect.
Drawings
Fig. 1A and fig. 1B are schematic diagrams of a scenario provided by an embodiment of the present application;
FIG. 2 is a system architecture diagram according to an embodiment of the present application;
fig. 3A is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3B is a schematic coordinate system diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3C to fig. 3E are schematic UWB antenna distribution diagrams provided by embodiments of the present application;
fig. 4 is a schematic structural diagram of another electronic device provided in the embodiment of the present application;
fig. 5A is a schematic flowchart of a method for searching an application object according to an embodiment of the present application;
FIG. 5B is a diagram illustrating an embodiment of a method for generating a trigger command;
FIG. 5C is a schematic diagram illustrating another example of generating a trigger instruction according to the present disclosure;
fig. 5D is a schematic flowchart of determining a target device according to an embodiment of the present application;
fig. 5E is a schematic flowchart of determining a target device according to an embodiment of the present application;
fig. 5F is a schematic structural diagram of a UWB chip system according to an embodiment of the present application;
fig. 5G is a schematic flowchart of a process for determining a target object according to an embodiment of the present application;
fig. 5H is a schematic diagram illustrating a wireless positioning module according to an embodiment of the present disclosure;
fig. 6 is a schematic flowchart of another method for searching an application object according to an embodiment of the present application;
fig. 7 is a schematic diagram of a target object output prompt message according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of an apparatus for searching an application object according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the accompanying drawings, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The terms "first" and "second" and the like in the description and drawings of the present application are used for distinguishing different objects, or for distinguishing different processes on the same object, and are not used for describing a specific order of the objects. Furthermore, the terms "including" and "having," and any variations thereof, as used in the description of the present application, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations, or descriptions. Any embodiment or design described herein as "exemplary" or "for example" should not be construed as preferred or advantageous over other embodiments or designs; rather, such words are intended to present related concepts in a concrete fashion. In the embodiments of the present application, "A and/or B" means A alone, B alone, or both A and B. "A, and/or B, and/or C" means any one of A, B, and C, any two of them, or all three of A, B, and C.
In order to facilitate better understanding of the schemes described in the present application, the following describes application scenarios to which the present application may be applied.
A first application scenario:
At present, intelligent terminals are becoming more and more popular, and a plurality of intelligent terminals (i.e., electronic devices) such as sound boxes, televisions, and air conditioners may exist in the office area or residence where a user is located.
In practical applications, a plurality of intelligent terminals in a certain area may access a network (e.g., a local area network or the Internet) to interact with a cloud or with other devices.
In the method, when a user triggers a first trigger instruction (e.g., a voice instruction), the first device receives the trigger operation initiated by the user and obtains its pointing direction; for example, the first device 100 points at a specific device among a plurality of nearby intelligent terminals. Then, the orientation parameters of the plurality of intelligent terminals are obtained through a wireless positioning technology (e.g., ultra-wideband positioning), and the target device pointed at by the first device can be determined according to those orientation parameters. Specifically, the orientation parameters of an intelligent terminal may include the distance between the device and the first device 100, the Angle of Arrival (AOA) of the device's signal at the first device, and the Received Signal Strength Indication (RSSI), where the RSSI may be used to determine whether there is an occlusion between the device and the first device 100. For a scenario with multiple intelligent terminals, the method can accurately realize positioning between devices and improves user experience.
Illustratively, as shown in fig. 1A, there are a plurality of electronic devices near the user, such as a sound box 201, a television 202, a refrigerator 203, and an air conditioner 204. When the user triggers a first trigger instruction (e.g., a voice instruction) and the first device 100 (e.g., a smart band) points at the sound box, the electronic devices receive the ultra-wideband measurement request initiated by the first device, each measure their own orientation parameters (i.e., the orientation parameters relative to the smart band), and send the measured orientation parameters to the first device. The first device can thereby determine the spatial positions of the electronic devices and, according to its own pointing direction and those spatial positions, identify the target object it points at, for example the sound box 201.
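The selection of the pointed-at device from measured spatial positions can be sketched as a smallest-angle search within a tolerance cone (the 15-degree threshold and the 2-D simplification are assumptions for illustration):

```python
import math

def pick_pointed_device(pointing, devices, max_angle_deg=15.0):
    """Among (name, relative_position) pairs, return the device whose
    direction from the first device is closest to the pointing vector,
    provided it falls within the tolerance cone; otherwise None."""
    best, best_angle = None, max_angle_deg
    px, py = pointing
    pn = math.hypot(px, py)
    for name, (dx, dy) in devices:
        dn = math.hypot(dx, dy)
        if pn == 0 or dn == 0:
            continue  # degenerate vector: cannot compute an angle
        cosang = max(-1.0, min(1.0, (px * dx + py * dy) / (pn * dn)))
        angle = math.degrees(math.acos(cosang))
        if angle <= best_angle:
            best, best_angle = name, angle
    return best
```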
It should be noted that the scenario shown in fig. 1A merely illustrates a scenario with multiple electronic devices and does not limit the present application; various implementation scenarios may include a different number and/or different types of devices from those shown in the figure, for example, more or fewer electronic devices, or electronic devices other than those shown in fig. 1A.
A second application scenario:
Currently, there may be multiple objects (i.e., non-electronic devices) in the office area or residence where the user is located, such as a television cabinet, a potted plant, a tea table, etc.
In this method, when a user triggers a first trigger instruction (e.g., a voice instruction), the first device receives the trigger operation that the user initiates on the first device and acquires its pointing direction; for example, the first device 100 points at a specific object among a plurality of nearby objects. The first device acquires distance information corresponding to at least three wireless positioning modules preset in the same space and determines its inclination angle through a gyroscope sensor (here, the inclination angle of the first device represents the direction in which the first device points). The spatial position corresponding to the target object pointed at by the first device can then be determined by combining the three-point positioning principle, and the target object is recognized according to the pointing direction of the first device and the spatial position of the target object. In a scenario with multiple non-electronic devices, the method can accurately locate objects relative to the device and improves the user experience.
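The three-point positioning principle referenced above can be sketched as follows for the 2-D case. The positions of the three wireless positioning modules and the measured distances are made-up illustrative values; a real deployment would additionally use the inclination angle from the gyroscope sensor to establish the pointing direction.

```python
import math

# Illustrative sketch of three-point positioning (trilateration): given the
# known positions of three wireless positioning modules and the distances
# measured to a point, solve for that point's position.

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for (x, y) from three circle equations |p - pi| = ri."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise cancels the quadratic
    # terms and leaves two linear equations a*x + b*y = c.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # nonzero when the modules are not collinear
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Modules at three corners of a 4 m x 3 m room; the true point is (1, 1).
x, y = trilaterate((0, 0), math.hypot(1, 1),
                   (4, 0), math.hypot(3, 1),
                   (0, 3), math.hypot(1, 2))
```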
Illustratively, as shown in FIG. 1B, there are a plurality of non-electronic devices near the user, such as a potted plant 205, a television cabinet 206, and a tea table 207. When the user triggers a first trigger instruction (e.g., a voice instruction) while the first device 100 (e.g., a smart band) points at the tea table 207, the at least three wireless positioning modules receive broadcast information initiated by the first device and each measure the distance between themselves and the first device; the first device then identifies, from the at least three pieces of distance information combined with the three-point positioning principle, that the target object at which it points is the tea table 207.
It should be noted that the scenario shown in fig. 1B merely illustrates a scenario with one or more non-electronic devices and does not limit the present application; various implementation scenarios may include a different number and/or different types of objects from those shown in the figure, for example, more or fewer objects, or non-electronic devices other than those shown in fig. 1B.
It should be noted that, in the embodiments of the present application, an electronic device refers to an electronic product having a positioning function, while a non-electronic device refers to an electronic product or object without a positioning function.
Based on the application scenario shown in fig. 1A, a communication system provided in the embodiment of the present application is described below. Referring to fig. 2, fig. 2 schematically illustrates a communication system 300. As shown in fig. 2, the communication system 300 includes the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, the electronic device 204, and the like. The electronic device 100 may search for the target application object corresponding to the target device at which the electronic device 100 points through the stored correspondence between objects and application objects. Wherein:
an electronic device (e.g., the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, or the electronic device 204) has an Ultra Wide Band (UWB) communication module and may further have one or more of a bluetooth communication module, a WLAN communication module, and an infrared communication module. Taking the electronic device 100 as an example, the electronic device 100 may detect and scan electronic devices (e.g., the electronic device 201, the electronic device 202, the electronic device 203, or the electronic device 204) near the electronic device 100 by transmitting signals through one or more of a UWB communication module, a bluetooth communication module, a WLAN communication module, and an infrared communication module, so that the electronic device 100 may discover the nearby electronic devices through one or more near-field wireless communication protocols of UWB, bluetooth, WLAN, and infrared, establish wireless communication connections with the nearby electronic devices, and may transmit data to the nearby electronic devices.
The type of the electronic device (e.g., the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, or the electronic device 204) is not particularly limited in this application. In some embodiments, the electronic device in the embodiments of the present application may be a portable device such as a mobile phone, a wearable device (e.g., a smart band), a tablet computer, a laptop computer, a handheld computer, a notebook computer, an ultra-mobile personal computer (UMPC), a cellular phone, a personal digital assistant (PDA), or an augmented reality (AR)/virtual reality (VR) device, and may also be a sound box, a television, a refrigerator, an air conditioner, a vehicle-mounted device, a printer, a projector, and the like. Exemplary embodiments of the electronic device include, but are not limited to, devices running the operating systems shown in the original image (Figure BDA0002619977320000061) or another operating system.
In one possible implementation, electronic device 100, electronic device 201, electronic device 202, electronic device 203, and electronic device 204 may communicate directly with each other.
In one possible implementation, the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 may be connected to a local area network (LAN) by a wired connection or a wireless fidelity (WiFi) connection. For example, the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are all connected to the same electronic device 301 and may communicate indirectly through the electronic device 301. The electronic device 301 may be one of the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204, or may be an additional third-party device, such as a router, a cloud server, a gateway, or a smart device controller. The cloud server may be a hardware server or may be embedded in a virtualization environment; for example, the cloud server may be a virtual machine executing on a hardware server that may include one or more other virtual machines. The electronic device 301 may transmit data to the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 via a network, and may receive data transmitted from them.
The electronic device 301 may include a memory, a processor, and a transceiver. The memory may be used to store voice wake-up words and related programs for UWB positioning; the memory may also be used to store orientation parameters of an electronic device (e.g., the electronic device 201) acquired via UWB positioning technology; the memory may also be used to store messages exchanged via the electronic device 301, as well as data and/or configurations related to the electronic device 100 and nearby devices. The processor may be configured to determine, when obtaining the orientation parameters of the plurality of nearby devices in the local area network, the target device that should respond according to those orientation parameters. The transceiver may be used to communicate with the electronic devices connected to the local area network. It should be noted that, in the embodiment of the present application, the multiple nearby devices may or may not be connected to the same local area network, which is not specifically limited herein.
It is to be understood that the configuration shown in the present embodiment does not constitute a specific limitation to the communication system 300. In other embodiments of the present application, communication system 300 may include more or fewer devices than those shown.
The communication system 300 may also include the electronic device 100, the wireless positioning module a, the wireless positioning module B, and the wireless positioning module C. It is understood that the electronic device 100, the wireless positioning module a, the wireless positioning module B, and the wireless positioning module C may be connected to a Local Area Network (LAN) by a wired or wireless fidelity (WiFi) connection to realize communication therebetween.
The electronic device 100 related to the embodiment of the present application is described in detail below.
Referring to fig. 3A, fig. 3A illustrates a schematic structural diagram of an electronic device 100 provided in an embodiment of the present application.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
The NPU may perform artificial intelligence operations using convolutional neural network (CNN) processing. For example, a CNN model can be used to carry out large-scale information recognition and information screening, enabling the training and recognition required for scene intelligence.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect earphones and play audio through them, and to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including UWB, Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (WiFi) network), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division synchronous code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or a satellite-based augmentation system (SBAS).
Among them, UWB wireless communication is a wireless personal area network communication technology with low power consumption and high-speed transmission. Unlike the continuous-carrier approach used by common communication technologies, UWB transmits data using pulse signals: non-sinusoidal narrow pulses on the order of nanoseconds (ns) to picoseconds (ps), whose time modulation can greatly increase the transmission rate. Because very short pulses are used, a UWB device transmits very little power at high speed, only a few percent of that of current continuous-carrier systems, and thus consumes relatively little power.
Compared with the traditional narrow-band system, the UWB system has the advantages of strong penetrating power, low power consumption, good multipath resistance effect, high safety, low system complexity, capability of providing accurate positioning precision and the like. UWB may be applied to wireless communication applications requiring high quality services, and may be used in the fields of Wireless Personal Area Networks (WPANs), home network connections, short-range radars, and the like. UWB will become a technological means to solve the contradiction between the demand for high-speed internet access in enterprises, homes, public places, etc. and the increasingly crowded allocation of frequency resources.
In the embodiment of the present application, the electronic device 100 may implement distance and RSSI measurements through a UWB antenna, and may implement AOA measurements through at least two UWB antennas. The following describes an arrangement of UWB antennas exemplarily provided by an embodiment of the present application.
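As a rough sketch of how a UWB antenna can measure distance, the following shows single-sided two-way ranging, a common UWB ranging scheme (this application does not prescribe a particular ranging protocol): the initiator measures the round-trip time and subtracts the responder's reported reply delay. The timestamp values below are illustrative.

```python
# Illustrative sketch: UWB single-sided two-way ranging.
# Real UWB chips report timestamps in ticks of their own clock.

C = 299_792_458.0  # speed of light, m/s

def two_way_ranging(t_round_s, t_reply_s):
    """The initiator measures the round-trip time t_round; the responder
    reports its processing delay t_reply. The one-way time of flight is
    half the difference, and distance = c * time_of_flight."""
    tof = (t_round_s - t_reply_s) / 2.0
    return C * tof

# A 3 m separation gives a ~10 ns one-way flight time; assume the
# responder takes 100 microseconds to reply.
tof = 3.0 / C
d = two_way_ranging(2 * tof + 100e-6, 100e-6)  # recovers ~3.0 m
```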
First, a reference coordinate system of the electronic device is defined. For example, as shown in fig. 3B, the coordinate system of the electronic device may be defined as follows: the X axis is parallel to the short side direction of the screen of the electronic equipment and points to the right side of the screen from the left side of the screen; the Y axis is parallel to the long edge direction of the screen and points to the top of the screen from the bottom of the screen; the Z axis is perpendicular to the plane formed by the X axis and the Y axis, namely the Z axis is perpendicular to the plane of the screen. When the electronic device is placed horizontally and the screen faces upwards, the direction of the Z axis is opposite to the direction of gravity.
It should be noted that, in the embodiments of the present application, the top, the bottom, the left, and the right are relative, and are exemplary descriptions in specific implementation manners, and should not be construed as limiting the embodiments of the present application. It can be understood that when the posture of the electronic device is changed, the top, the bottom, the left side and the right side of the electronic device mentioned in the embodiment of the present application are not changed.
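As a small illustration of the reference frame above, the following sketch rotates the device's Y axis (pointing out of the top of the screen) by a pitch angle about the X axis to obtain a pointing direction in world coordinates; deriving the tilt this way from the gyroscope sensor is an assumption for illustration, not an algorithm prescribed by this application, and only pitch is shown for brevity.

```python
import math

# Illustrative sketch: world-frame pointing direction of the device Y axis
# after pitching the device about its X axis (X right, Y top, Z out of
# the screen, per the reference frame defined above).

def pointing_direction(pitch_rad):
    """Rotate the unit vector (0, 1, 0) about the X axis by pitch_rad:
    y -> (0, cos(pitch), sin(pitch))."""
    return (0.0, math.cos(pitch_rad), math.sin(pitch_rad))

v = pointing_direction(math.pi / 2)  # device pitched straight up: ~(0, 0, 1)
```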
Fig. 3C to 3E exemplarily show the electronic device 100 having UWB antennas. In fig. 3C, the electronic device 100 has 2 UWB antennas, namely antenna A and antenna B, where the line connecting antenna A and antenna B is parallel to the X axis of the electronic device and the 2 UWB antennas present a one-dimensional arrangement. In fig. 3D, the electronic device 100 has 3 antennas, namely antenna A, antenna B, and antenna C, where the line connecting antenna B and antenna C is parallel to the Y axis of the electronic device and the 3 UWB antennas present a two-dimensional arrangement. In fig. 3E, the electronic device has 4 UWB antennas, namely antenna A, antenna B, antenna C, and antenna D. In some embodiments, the line connecting antenna C and antenna D is parallel to the Z axis of the electronic device, and the 4 UWB antennas present a three-dimensional arrangement. In fig. 3C to 3E, the distance between antenna A and antenna B is d1, the distance between antenna A and antenna C is d2, and the distance between antenna A and antenna D is d3, where d1, d2, and d3 are all smaller than λ/2, and λ is the wavelength of the electromagnetic wave.
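As background for the spacing constraint above, a two-antenna AOA estimate typically recovers the angle from the phase difference of the incoming wave; with spacing of λ/2 or more, the arcsine becomes ambiguous, which is why the antenna spacings stay below λ/2. The carrier frequency (near UWB channel 5) and the quarter-wavelength spacing below are illustrative assumptions, not values specified by this application.

```python
import math

# Illustrative sketch: angle of arrival from the phase difference between
# two antennas spaced less than half a wavelength apart.

def aoa_from_phase(delta_phi_rad, spacing_m, wavelength_m):
    """The angle of arrival theta satisfies
    delta_phi = 2*pi * spacing * sin(theta) / wavelength,
    so theta = arcsin(delta_phi * wavelength / (2*pi * spacing))."""
    s = delta_phi_rad * wavelength_m / (2 * math.pi * spacing_m)
    return math.asin(s)

# UWB channel 5 sits near 6.5 GHz, i.e. a wavelength of about 4.6 cm.
wl = 299_792_458.0 / 6.5e9
theta = aoa_from_phase(math.pi / 4, wl / 4, wl)  # quarter-wave spacing
```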
It should be noted that the arrangement of the UWB antennas and the distribution positions on the electronic device shown in fig. 3C to fig. 3E are only for illustration and are not limiting to the present application. For example, in the same arrangement, the electronic device 100 may have a greater number of UWB antennas in addition to the number of UWB antennas shown in FIGS. 3C-3E; in the same arrangement, there may be other distribution positions than the distribution positions of the UWB antennas shown in fig. 3C to 3E.
In the embodiment of the present application, the UWB antenna may be multiplexed with the antennas 1 and 2, or may be independent of each other. And is not particularly limited herein.
In some embodiments, the UWB communication module of the electronic device 100 may be in a powered-on state while the electronic device is in a standby state.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
In some embodiments of the present application, the interface content currently output by the system is displayed in the display screen 194. For example, the interface content is an interface provided by an instant messaging application.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera's photosensitive element through the lens, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP, which processes it and converts it into an image visible to the naked eye. The ISP can also perform algorithmic optimization of the noise, brightness, and skin color of the image, as well as of parameters such as the exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mic", is used to collect sound (e.g., ambient sounds, including sounds made by a person or by a device) and convert the sound signal into an electrical signal. When making a call or sending voice information, the user can input a voice signal to the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. When the voice wake-up function of the electronic device is turned on, the microphone 170C may collect ambient sound in real time to obtain audio data. The sound collected by the microphone 170C depends on the environment. For example, when the surrounding environment is noisy and the user speaks a wake-up word, the sound collected by the microphone 170C includes both the ambient noise and the wake-up word uttered by the user. As another example, when the surrounding environment is quiet and the user speaks the wake-up word, the sound collected by the microphone 170C is only the wake-up word uttered by the user. As a further example, when the surrounding environment is noisy and the voice wake-up function is turned on but the user does not speak the wake-up word, the sound collected by the microphone 170C is only the ambient noise. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. In some alternative embodiments of the present application, the pressure sensor 180A may be configured to capture the pressure value generated when a user's finger portion contacts the display screen and transmit the pressure value to the processor, so that the processor can identify the finger portion with which the user entered the operation.
The pressure sensor 180A can be of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. A capacitive pressure sensor may include at least two parallel plates made of an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation using the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed. In some alternative embodiments of the present application, the pressure sensor 180A may transmit the detected capacitance value to the processor, so that the processor can recognize through which finger portion (knuckle, finger pad, etc.) the user entered the operation. In some alternative embodiments of the present application, the pressure sensor 180A may also calculate the number of touch points from the detected signals and transmit the calculated value to the processor, so that the processor can recognize whether the user operation was entered with a single finger or multiple fingers.
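The pressure-threshold behavior described above can be sketched as a simple dispatch function. This is an illustrative sketch only: the function name, the icon identifier, and the 0.5 threshold value are assumptions for demonstration, not values from the patent.

```python
def dispatch_touch(position, pressure, first_pressure_threshold=0.5):
    """Map a touch on the short-message application icon to an instruction.

    Illustrative sketch: names and the 0.5 threshold are hypothetical.
    """
    if position != "sms_icon":
        return None  # touches elsewhere are handled by other logic
    if pressure < first_pressure_threshold:
        return "view_sms"    # lighter press: view the short message
    return "create_sms"      # firmer press: create a new short message
```

For example, a press of intensity 0.2 on the icon yields the viewing instruction, while 0.7 yields the creation instruction.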
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (the X, Y, and Z axes of the electronic device) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip cover according to the magnetic sensor 180D. Features such as automatic unlocking upon flipping open are then set according to the detected open or closed state of the holster or flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The acceleration sensor 180E can also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications. In some alternative embodiments of the present application, the acceleration sensor 180E may be configured to capture the acceleration value generated when a user's finger portion touches the display screen (or when the user's finger strikes the rear bezel of the electronic device 100) and transmit the acceleration value to the processor, so that the processor can identify the finger portion with which the user entered the operation.
In this embodiment, the electronic device 100 may determine the posture change of the electronic device 100 through a gyroscope sensor and/or an acceleration sensor, so as to recognize the user operation. For example, the current user operation is recognized as a pointing operation according to a change in the posture of the electronic device 100, and the pointing operation may be that the user points the electronic device 100 in a specific direction and keeps pointing in the specific direction for a preset time.
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light to the outside through the light-emitting diode. The electronic device 100 detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there is no object near the electronic device 100. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the display screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown caused by low temperature. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown due to low temperature.
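The three temperature thresholds described above can be illustrated as a small policy function. The threshold values and action names below are hypothetical placeholders chosen for the sketch, not values from the patent.

```python
def thermal_policy(temp_c, high=45.0, low_heat=0.0, low_boost=-10.0):
    """Return the temperature-processing actions for a reading (in Celsius).

    Illustrative sketch: the thresholds and action names are assumptions.
    """
    actions = []
    if temp_c > high:
        actions.append("reduce_cpu_performance")  # thermal protection
    if temp_c < low_heat:
        actions.append("heat_battery")            # avoid cold shutdown
    if temp_c < low_boost:
        actions.append("boost_battery_voltage")   # keep voltage up in the cold
    return actions
```

Note that the two low-temperature actions are independent checks, so a very cold reading triggers both heating and voltage boosting.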
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation applied on or near it, i.e., an operation of a user's hand, elbow, stylus, or the like contacting the display screen 194. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100 at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195.
The following describes a structure of an apparatus provided in an embodiment of the present application, taking the electronic apparatus 201 as an example.
Fig. 4 exemplarily shows a schematic structural diagram of an electronic device 201 provided in an embodiment of the present application.
As shown in fig. 4, the electronic device 201 may include: the wireless communication device comprises a processor 401, a memory 402, a wireless communication processing module 403, an antenna 404, a power switch 405, a wired LAN communication processing module 406, a USB communication processing module 407 and an audio module 408. Wherein:
processor 401 may be used to read and execute computer-readable instructions. In a specific implementation, the processor 401 may mainly include a controller, an arithmetic unit, and registers. The controller is mainly responsible for instruction decoding and for sending out control signals for the operations corresponding to the instructions. The arithmetic unit is mainly responsible for saving register operands, intermediate operation results, and the like temporarily stored during instruction execution. In a specific implementation, the hardware architecture of the processor 401 may be an application-specific integrated circuit (ASIC) architecture, a MIPS architecture, an ARM architecture, an NP architecture, or the like.
In some embodiments, the processor 401 may be configured to parse signals received by the wireless communication module 403 and/or the wired LAN communication processing module 406, such as probe requests broadcast by the terminal 100. The processor 401 may be used to perform corresponding processing operations according to the parsing result, such as generating a probe response.
In some embodiments, the processor 401 may also be configured to generate signals, such as bluetooth broadcast signals, beacon signals, which are transmitted to the outside by the wireless communication module 403 and/or the wired LAN communication processing module 406.
A memory 402 is coupled to the processor 401 for storing various software programs and/or sets of instructions. In particular implementations, memory 402 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state storage devices. The memory 402 may store an operating system, such as an embedded operating system like uCOS, VxWorks, RTLinux, etc. The memory 402 may also store communication programs that may be used for communication by the terminal 100, one or more servers, or accessory devices.
The wireless communication module 403 may include one or more of a UWB communication module 403A, a Bluetooth communication module 403B, a WLAN communication module 403C, and an infrared communication module 403D. The UWB communication module 403A may be integrated into a chip (system on chip, SoC), and the UWB communication module 403A may also be integrated with other communication modules (e.g., the Bluetooth communication module 403B) in hardware (or software).
In some embodiments, one or more of the UWB communication module 403A, the Bluetooth communication module 403B, the WLAN communication module 403C, and the infrared communication module 403D may listen to signals transmitted by other devices (e.g., the electronic device 100), such as measurement signals and scanning signals, and may send response signals, such as measurement responses and scanning responses, so that the other devices (e.g., the electronic device 100) can discover the electronic device 201 and establish a wireless communication connection with it via one or more of UWB, Bluetooth, WLAN, or infrared for data transmission.
In other embodiments, one or more of the UWB communication module 403A, the Bluetooth communication module 403B, the WLAN communication module 403C, and the infrared communication module 403D may also transmit signals, such as broadcast UWB measurement signals and beacon signals, so that other devices (e.g., the electronic device 100) can discover the electronic device 201 and establish a wireless communication connection with it via one or more of UWB, Bluetooth, WLAN, or infrared for data transmission.
The wireless communication module 403 may also include a cellular mobile communication module (not shown). The cellular mobile communication processing module may communicate with other devices, such as servers, via cellular mobile communication technology.
The antenna 404 may be used to transmit and receive electromagnetic wave signals. The antennas of different communication modules can be multiplexed or can be mutually independent, so as to improve antenna utilization. For example, the antenna of the Bluetooth communication module 403B may be multiplexed as the antenna of the WLAN communication module 403C, whereas the UWB communication module 403A requires a separate UWB antenna.
In this embodiment, the electronic device 201 has at least one UWB antenna for implementing UWB communication.
The power switch 405 may be used to control the power supply of the power source to the electronic device 201.
The wired LAN communication processing module 406 may be used to communicate with other devices in the same LAN through a wired LAN, and may also be used to connect to a WAN through a wired LAN, communicating with devices in the WAN.
The USB communication processing module 407 may be used to communicate with other devices through a USB interface (not shown).
The audio module 408 may be configured to output audio signals through the audio output interface, which may enable the electronic device 201 to support audio playback. The audio module may also be configured to receive audio data via the audio input interface. The electronic device 201 may be a television, a sound box, or other media playing device, or may also be an air conditioner, a refrigerator, or other non-media playing device. When the voice wake-up function of the electronic device 201 is turned on, the audio module 408 may collect ambient sound in real time to obtain audio data. The audio module can also perform speech recognition on audio data received by the audio module.
It should be understood that the electronic device 201 shown in fig. 4 is merely an example, and that the electronic device 201 may have more or fewer components than shown in fig. 4, may combine two or more components, or may have a different configuration of components. The various components shown in the figures may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
Referring to fig. 5A, a flowchart of a method for searching an application object according to an embodiment of the present application is provided to specifically describe how to search an application object in the embodiment of the present application, which may include, but is not limited to, the following steps:
step S100, the first device acquires the pointing direction of the first device.
In the embodiment of the present application, the first device may include, but is not limited to: smartphones (e.g., Android phones, iOS phones, etc.), tablet computers, palmtop computers, mobile Internet devices (MIDs), PADs, and other electronic devices.
In this embodiment of the present application, an implementation process of acquiring, by the first device, the pointing direction of the first device may include: determining the pointing direction of the first device according to a preset reference direction and the acquired tilt angle of the first device. The preset reference direction may be determined based on the coordinate systems shown in fig. 3B to 3E. The first device may use a gyro sensor to determine the tilt angle of the electronic device 100. Here, the tilt angle of the electronic device 100 is used to characterize the pointing direction.
In particular, a gyro sensor is an instrument capable of accurately determining the orientation of a moving object. Its working principle is as follows: the direction pointed to by the rotation axis of a rotating object does not change when the object is not affected by external forces. Based on this principle, the gyro sensor maintains the direction, reads the direction pointed to by the axis, and automatically transmits the data signal to the control system.
In general, the gyro sensor may be considered to have three axes, i.e., an X axis, a Y axis, and a Z axis, and its output data are the angular velocity values of rotation around these three axes. When the electronic device 100 is placed in portrait orientation, the portrait direction is the Y-axis direction of the gyroscope. When the electronic device 100 is tilted left or right and returned, the gyro sensor may integrate the angular velocity of rotation around the Y axis over time to obtain the angular displacement of the left or right tilt of the electronic device 100, i.e., the tilt angle in radians when the device is placed in portrait orientation. Therefore, by using these characteristics of the gyro sensor, the left or right tilt of the electronic device 100 in portrait orientation can be recognized.
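The integration of Y-axis angular velocity over time into a tilt angle can be sketched as follows. The fixed sampling interval and the simple rectangular integration scheme are illustrative assumptions; a production implementation would use the sensor's actual timestamps.

```python
def integrate_tilt(angular_velocity_samples, dt):
    """Integrate Y-axis angular velocity samples (rad/s), taken at a fixed
    interval dt (s), to obtain the angular displacement (tilt angle, rad)."""
    angle = 0.0
    for omega in angular_velocity_samples:
        angle += omega * dt  # rectangular (Euler) integration step
    return angle
```

For example, ten samples of 0.1 rad/s taken 0.1 s apart integrate to a tilt of about 0.1 rad.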
In practical applications, in order to reduce resource consumption, the first device acquires its pointing direction only after receiving a first trigger instruction from the user, rather than acquiring the pointing direction continuously. The first trigger instruction is used to trigger the search for the application object.
In an example, as shown in fig. 5B, a home page 221 is displayed on a display screen (e.g., the touch screen 210) of the first device. The page 221 includes icons of a plurality of applications and icons corresponding to function keys 225: function key 225-1 is a return function key; function key 225-3 is a menu function key; function key 225-5 is a locate function key. When the user performs a touch operation on the function key 225-3 (the touch operation may include, but is not limited to, clicking, double-clicking, pressing, and the like), the first device receives a first trigger instruction initiated by the user for the first device, where the first trigger instruction is used to establish the association between the first device and the target object. Specifically, the association is embodied as: triggering the first device to recognize that the first device is pointed at the target object.
In one example, as shown in fig. 5C, the palm 201 of the user floats on the first device 202, and when the direction of the movement of the palm 201 of the user on the surface of the first device 202 is consistent with the set direction or the movement track is consistent with the set track, this indicates that the first device receives a first trigger instruction initiated by the user for the first device, where the first trigger instruction is used to establish an association between the first device and the target object. Specifically, the association is embodied as: triggering the first device to recognize that the first device is pointed at the target object. It will be appreciated that the palm 201 suspended above the first device may be replaced by a finger.
In one example, the first trigger instruction may be a piece of voice, such as "detecting a distance between the first device and the target object," after the voice piece is acquired by the first device, indicating that the first device receives a first trigger instruction initiated by a user for the first device. The representation form of the first trigger instruction initiated by the user is only an example, and should not be construed as a limitation.
Step S102: identify whether the first device is pointing at the target object; if so, execute step S104.
In embodiments of the present application, the target object may include an electronic device or a non-electronic device.
In an embodiment of the present application, the implementation process of identifying that the first device is pointed at the target object may include: the first device determines the target object according to the pointing direction of the first device and the spatial positions of objects around the first device. In the case where the target object is an electronic device, the first device establishes a bidirectional connection with the target object to acquire the orientation parameters of the target object, so that the spatial position of the target object can be determined from those orientation parameters. In the case where the target object is a non-electronic device, the first device establishes bidirectional connections with at least three wireless positioning modules preset in the same space, and determines the spatial position of the target object from at least three pieces of distance information combined with the three-point positioning (trilateration) principle. This is specifically illustrated below:
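The three-point positioning principle mentioned above can be sketched as a 2-D trilateration computation from three anchor positions and three measured distances. The 2-D simplification and the function name are illustrative assumptions, not from the patent; a real implementation would work in 3-D and handle measurement noise.

```python
def trilaterate(p1, p2, p3, d1, d2, d3):
    """Locate a point in 2-D from three anchor positions (x, y) and the
    measured distances to them. Illustrative trilateration sketch."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting pairs of circle equations linearises the problem: A @ [x, y] = b
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1 ** 2 - d2 ** 2 + x2 ** 2 - x1 ** 2 + y2 ** 2 - y1 ** 2
    b2 = d1 ** 2 - d3 ** 2 + x3 ** 2 - x1 ** 2 + y3 ** 2 - y1 ** 2
    det = a11 * a22 - a12 * a21
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is ambiguous")
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y
```

This is also why at least three non-collinear wireless positioning modules are needed: with fewer anchors, or collinear ones, the target's position cannot be determined uniquely.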
In one example, based on the scenario shown in fig. 1A, the devices involved include the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204. The electronic device 201, the electronic device 202, the electronic device 203, and the electronic device 204 are objects around the electronic device 100. In practical applications, an object around the first device may be an object within the signal strength coverage area of the first device, an object within a preset range, or the like. Fig. 5D illustrates a positioning method provided in an embodiment of the present application. In this method, when the electronic device 100 detects a first trigger instruction (for example, a voice instruction) initiated by the user for the first device, it initiates a UWB measurement request; the electronic device 201 determines the distance between the electronic device 100 and the electronic device 201 from the UWB measurement request. Specifically, the implementation process may include, but is not limited to, the following steps:
step S102-1, the electronic device 100 broadcasts a measurement request.
Step S102-2, the electronic device 201, the electronic device 202, and the electronic device 203 receive measurement requests, respectively.
Taking the electronic device 201 as an example, the electronic device 201 receives the UWB measurement request, where the UWB measurement request carries an Identity (ID) 1 of the electronic device 100.
Specifically, the electronic device 100 broadcasts a first measurement request, and records that the sending time of the first measurement request is T1, the first measurement request carries an ID1, and the first measurement request is used for measuring the orientation parameter of the electronic device 201. The electronic device 201 receives the first measurement request sent by the electronic device 100 at time T2, and records the reception time of the first measurement request as T2.
Step S102-3, the electronic device 201 determines its own orientation parameters according to the UWB measurement request.
In an embodiment of the application, the orientation parameters include at least one of distance, signal AOA (angle of arrival), and RSSI (received signal strength indication).
In some embodiments, the UWB measurement request carries the transmission time of the UWB measurement request, and the electronic device 201 may determine the distance between the electronic device 100 and the electronic device 201 from the transmission time and the reception time of the UWB measurement request.
In some embodiments, the electronic device 201 may determine the signal AOA from the UWB measurement request.
In some embodiments, the electronic device 201 may determine the RSSI from the UWB measurement request.
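The distance determination in the step above can be sketched as a one-way time-of-flight calculation from the send time T1 and the receive time T2. This is a minimal illustration, not the patent's exact algorithm; it assumes the two device clocks are synchronized, whereas practical UWB systems usually use two-way ranging to avoid that requirement.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def distance_from_timestamps(t1_s: float, t2_s: float) -> float:
    """Estimate the distance between two UWB devices from the measurement
    request's send time t1 and receive time t2 (one-way time of flight).

    Assumes synchronized clocks; real UWB ranging typically uses
    two-way ranging instead of this simplification.
    """
    time_of_flight = t2_s - t1_s
    if time_of_flight < 0:
        raise ValueError("receive time precedes send time")
    return time_of_flight * SPEED_OF_LIGHT_M_PER_S
```

For example, a 10 ns flight time corresponds to roughly 3 meters.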
Step S102-4, the electronic device 201, the electronic device 202, and the electronic device 203 respectively send the determined orientation parameters to the electronic device 100.
Step S102-5, the electronic device 100 determines respective spatial positions according to the orientation parameters of the electronic device 201, the electronic device 202 and the electronic device 203.
In step S102-6, the electronic device 100 identifies that the first device is pointing at the target object according to the pointing direction of the first device and the spatial position of the target object, for example, the target object is the electronic device 201.
In the embodiment of the present application, the spatial position of the target object is a position relative to the electronic device 100. For example, the target object is located due south of the electronic device 100.
In this embodiment, the electronic device 100 may determine the respective spatial positions according to one or more of the distance, the signal AOA, and the RSSI corresponding to each of the electronic device 201, the electronic device 202, and the electronic device 203, so that the target device may be determined to be the electronic device 201 based on the pointing direction of the first device and the spatial position of the target object.
In practical applications, the electronic device 100 may also determine that the target device is the electronic device 201 directly according to the orientation parameters of the electronic device 201, the electronic device 202, and the electronic device 203, without first determining their respective spatial positions from those orientation parameters.
Referring to fig. 5E, the determining, by the electronic device 100, that the target device is the electronic device 201 according to the orientation parameters of the electronic device 201, the electronic device 202, and the electronic device 203 may specifically include:
S1, determine, according to the RSSI (or a first identifier, where, for example, the first identifier equal to 1 indicates no occlusion and equal to 0 indicates occlusion) in the orientation parameters of the electronic device 201, the electronic device 202, and the electronic device 203, whether there is a device that is not occluded from the electronic device 100; if yes, perform step S2; otherwise, perform step S4.
In some embodiments, when the RSSI of the signal sent by the electronic device 201 is less than a preset RSSI threshold, it is determined that there is occlusion between the electronic device 100 and the electronic device 201; otherwise, there is no occlusion.
In some embodiments, when the first identifier of the electronic device 201 is equal to a first value (e.g., 1), there is no occlusion between the electronic device 201 and the electronic device 100; when the first identifier of the electronic device 201 is equal to a second value (e.g., 0), there is occlusion between the electronic device 201 and the electronic device 100.
S2, if the number of unoccluded devices among the electronic device 201, the electronic device 202, and the electronic device 203 is equal to 1, perform step S3; otherwise, perform step S4.
S3, determine the unoccluded electronic device (here, the electronic device 201) as the target device.
It is to be appreciated that when there is only one unobstructed device in the vicinity of the electronic device 100, the unobstructed device is determined to be the target device that the user intends to point to.
S4, determine the two electronic devices whose signal AOA is closest to the preset angle.
In some embodiments, if it is determined in step S1 that none of the electronic device 201, the electronic device 202, and the electronic device 203 is unoccluded from the electronic device 100, the electronic device 100 determines, in S4, the two electronic devices among them whose signal AOA is closest to the preset angle.
In some embodiments, if it is determined in step S2 that the number of devices out of the electronic device 201, the electronic device 202, and the electronic device 203 that are not shielded from the electronic device 100 is greater than 1, the electronic device 100 determines two electronic devices out of the electronic device 201, the electronic device 202, and the electronic device 203 that the signal AOA is closest to the preset angle in S4.
In some embodiments, the electronic device 100 determines the signal AOA of the electronic device 201 from the phase difference between antenna A and antenna C, the line connecting antenna A and antenna C being parallel to the Y-axis. In this implementation, the preset angle is equal to 0 degrees. In other embodiments, the electronic device 100 determines the AOA of the signal transmitted by the electronic device 201 from the phase difference between antenna A and antenna B, the line connecting antenna A and antenna B being parallel to the X-axis. In this implementation, the preset angle is equal to 90 degrees. In other embodiments, the electronic device 100 determines the AOA of the signal transmitted by the electronic device 201 from the phase difference between antenna B and antenna C, the line connecting antenna B and antenna C being parallel to the Z-axis. In this implementation, the preset angle is equal to 90 degrees.
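The phase-difference-to-angle conversion mentioned above can be sketched with the standard two-antenna AOA model. The formula sin(theta) = phase_diff * wavelength / (2 * pi * spacing) is an assumption on my part; the patent does not state its AOA algorithm.

```python
import math

def signal_aoa_degrees(phase_diff_rad: float, antenna_spacing_m: float,
                       wavelength_m: float) -> float:
    """Estimate the angle of arrival (degrees) of a signal from the phase
    difference measured between two antennas a known distance apart.

    Uses the common far-field model (an assumption, not the patent's
    stated method): sin(theta) = dphi * lambda / (2 * pi * d).
    """
    sin_theta = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    sin_theta = max(-1.0, min(1.0, sin_theta))  # clamp numerical noise
    return math.degrees(math.asin(sin_theta))
```

A zero phase difference corresponds to a signal arriving broadside (0 degrees), matching the 0-degree preset angle case above.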
It is understood that the signal AOA of the electronic device 201 is determined in different manners, and the preset angle may be different. In the embodiment of the present application, the preset angle may also be other values, and is not specifically limited herein.
S5, determine whether the difference between the signal AOAs of the two electronic devices is greater than a first threshold; if yes, perform step S6; otherwise, perform step S7.
For example, the first threshold is 10 degrees.
S6, determine the electronic device whose signal AOA is closer to the preset angle as the target device (here, the electronic device 201).
S7, determine the electronic device with the smaller distance as the target device (here, the electronic device 201).
It is understood that, in the above embodiment, the electronic device 100 determines that, among the unoccluded devices (or among all the devices), the device whose signal AOA is closest to the pointing direction of the electronic device 100, or, failing a clear winner, the closest device, is the device that the user intends to point to.
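The S1 to S7 procedure above can be sketched as follows. The class and function names are hypothetical, and the choice of comparing only unoccluded devices when any exist (falling back to all devices otherwise) follows the parenthetical "among the non-occluded devices (or among all the devices)" reading of the text.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Candidate:
    name: str
    distance_m: float
    aoa_deg: float   # signal AOA measured at the first device
    occluded: bool   # derived from RSSI or the "first identifier"

def pick_target(candidates: List[Candidate],
                preset_angle_deg: float = 0.0,
                first_threshold_deg: float = 10.0) -> Optional[Candidate]:
    """Sketch of steps S1-S7: prefer the single unoccluded device;
    otherwise compare the two devices whose AOA is closest to the preset
    angle, falling back to the smaller distance on a near-tie."""
    if not candidates:
        return None
    unoccluded = [c for c in candidates if not c.occluded]
    if len(unoccluded) == 1:                            # S2/S3: one clear device
        return unoccluded[0]
    pool = unoccluded if unoccluded else candidates     # S4: candidate pool
    pool = sorted(pool, key=lambda c: abs(c.aoa_deg - preset_angle_deg))
    if len(pool) == 1:
        return pool[0]
    a, b = pool[0], pool[1]                             # two best-aimed devices
    gap = abs(abs(a.aoa_deg - preset_angle_deg)
              - abs(b.aoa_deg - preset_angle_deg))
    if gap > first_threshold_deg:                       # S5/S6: clear AOA winner
        return a
    return a if a.distance_m <= b.distance_m else b     # S7: closer device wins
```

The 10-degree default mirrors the example first threshold given above.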
In other embodiments of the present application, the orientation parameters include the distance and the signal AOA. In that case, step S102-5 may specifically include only S4 to S7. That is, according to the orientation parameters of the nearby devices, the electronic device 100 determines, as the electronic device 201, the nearby device whose distance is smallest or whose signal AOA is closest to the preset angle.
It should be noted that, in some embodiments, after receiving a first trigger instruction initiated by the user acting on the first device, the first device broadcasts a UWB measurement request; all of the user's electronic devices that receive the UWB measurement request (e.g., the electronic device 201, the electronic device 202, and the electronic device 203) send measurement responses to the electronic device 100, and the electronic device 100 may then determine the orientation parameters of the electronic device 201, the electronic device 202, and the electronic device 203 according to the measurement responses. Taking the measurement response sent by the electronic device 201 as an example, the implementation process may include, but is not limited to, steps S201 to S209, where:
s201, the electronic device 100 receives a first trigger instruction.
S202, the electronic device 100 broadcasts a UWB measurement request, and the electronic device 201 receives the UWB measurement request.
Specifically, the electronic device 100 broadcasts a first measurement request at time T1, where the first measurement request carries the ID1, and the first measurement request is used for measuring the orientation parameter of the electronic device 201. Meanwhile, the electronic apparatus 100 records the first measurement request transmission time as T1. The electronic device 201 receives the first measurement request sent by the electronic device 100 at time T2, and records the reception time of the first measurement request as T2.
S204, the electronic device 201 sends a measurement response to the electronic device 100.
S205, the electronic device 100 determines the orientation parameters of the electronic device 201 according to the measurement response sent by the electronic device 201.
S206, the electronic device 100 determines that the target device is the electronic device 201 according to the orientation parameters of the electronic device 201, the electronic device 202 and the electronic device 203.
Specifically, for how the electronic device 100 determines the target device according to the orientation parameters of the electronic device 201, the electronic device 202, and the electronic device 203, reference may be made to the foregoing related embodiments; details are not described herein again.
To facilitate a better understanding of the ultra-wideband positioning technique, the UWB chip system architecture 500 of the present application is described below.
As shown in fig. 5F, the present application provides a UWB chip system architecture 500, which may include, but is not limited to, an application processor (AP) 501 and a UWB chip 502. The application processor 501 may include a UWB positioning service 501A and a UWB protocol stack 501B, and the UWB chip 502 may include a UWB positioning management module 502A and a UWB positioning measurement module 502B. The UWB positioning service 501A may be, among other things, a function/service/application that requires distance measurements, AOA measurements, and/or RSSI measurements.
In the present example, the UWB chip system 500 may be on an electronic device of a user (e.g., the electronic device 100, the electronic device 201, the electronic device 202, the electronic device 203, the electronic device 204, and the like of the above-described embodiments).
For a UWB location-initiating device (e.g., electronic device 100), the following steps may be implemented in the UWB chip system 500:
1. The UWB positioning service 501A sends an enabling instruction to the UWB protocol stack 501B, instructing the UWB protocol stack 501B to perform UWB positioning, i.e., to enable measurement of the orientation parameters (including distance measurement, AOA measurement, and RSSI measurement).
2. The UWB protocol stack 501B may send a UWB positioning broadcast instruction to the UWB positioning management module 502A instructing the UWB positioning management module 502A to perform positioning broadcast.
3. UWB location management module 502A may trigger UWB location measurement module 502B to broadcast a UWB location measurement request.
4. Upon receiving a UWB positioning measurement response transmitted by a nearby device (e.g., the electronic device 201), the UWB positioning measurement module 502B may transmit the measurement response to the UWB positioning management module 502A.
5. After receiving the measurement response, the UWB positioning management module 502A may parse from it measurement parameters such as the reception time of the measurement request, the transmission time of the measurement response, the phase difference information, and the RSSI of the measurement response. The UWB positioning management module 502A may then send the above measurement parameters to the UWB protocol stack 501B.
6. After receiving the measurement parameters, the UWB protocol stack 501B determines the positioning parameters of the nearby device through the UWB positioning algorithms (including determining the distance through the distance measurement algorithm, determining the signal AOA through the AOA measurement algorithm, and determining, from the distance and the RSSI of the measurement response, whether the electronic device 201 is occluded).
7. After calculating the location parameters, the UWB protocol stack 501B may send the location parameters of the nearby device to the UWB location service 501A.
8. The UWB location service 501A may determine the target device based on the location parameters of a plurality of nearby devices.
It can be understood that, with the positioning method provided by the present application, the first device may determine the spatial position of the target object, so that the first device may recognize that the first device points to the target object according to the pointing direction of the first device and the spatial position of the target object.
In one example, based on the scenario shown in fig. 1B, the involved objects include the electronic device 100, the non-electronic device 205, the non-electronic device 206, and the non-electronic device 207. Referring to fig. 5G, fig. 5G illustrates a positioning method provided in an embodiment of the present application. In the method, when the electronic device 100 detects a first trigger instruction (for example, a voice instruction) initiated by the user acting on the first device, it initiates a measurement request to at least three wireless positioning modules preset in the same space; the at least three wireless positioning modules each determine the distance information between themselves and the electronic device 100 according to the measurement request. Specifically, the implementation process may include, but is not limited to, the following steps:
step S102-11, the electronic device 100 broadcasts a measurement request.
Step S102-12, the wireless positioning module A, the wireless positioning module B and the wireless positioning module C receive the measurement request respectively.
In the embodiment of the present application, when the wireless positioning module A, the wireless positioning module B, and the wireless positioning module C are deployed, the three wireless positioning modules need to be placed in the same space (for example, the same room), must not lie on the same straight line, and must be no less than a preset distance apart from one another. For example, fig. 5H is a schematic diagram of wireless positioning module placement provided in the embodiment of the present application, where (a) in fig. 5H represents an erroneous placement and (b) in fig. 5H represents a correct placement. As can be understood from (b) in fig. 5H, an actual installation needs to satisfy the following requirements: the electronic device 100 can communicate with each wireless positioning module, each wireless positioning module is mounted on the wall at a height of at least 2 meters, the wireless positioning modules cover the whole space as much as possible, there are no obstacles (e.g., walls or other objects) between the wireless positioning modules, and the like.
Specifically, the electronic device 100 broadcasts a first measurement request, and records that the sending time of the first measurement request is T1, where the first measurement request carries an ID1, and the first measurement request is used to measure distance information between the wireless positioning module a and the electronic device 100. The wireless positioning module a receives the first measurement request sent by the electronic device 100 at time T2, and records the reception time of the first measurement request as T2.
Step S102-13, the wireless positioning module A determines the distance information between itself and the electronic device 100 according to the measurement request.
Specifically, since the measurement request carries the sending time of the measurement request, the wireless positioning module a may determine the distance information between the wireless positioning module a and the electronic device 100 according to the sending time and the receiving time of the measurement request.
Step S102-14, the wireless positioning module a, the wireless positioning module B, and the wireless positioning module C respectively send the determined distance information to the electronic device 100.
Step S102-15, the electronic device 100 determines the spatial position of the target object according to the at least three pieces of distance information combined with the three-point positioning principle, so that the target object pointed at by the first device can be identified as the tea table 207 based on the pointing direction of the first device and the spatial position of the target object.
In the embodiment of the application, three-point positioning (trilateration) calculates the current position from the coordinates of three known points. Given the three known position coordinates (x0, y0), (x1, y1), (x2, y2) and the distances d0, d1, d2 from the sought position point (x, y) to the three points respectively, three circles are drawn with radii d0, d1, and d2, and according to the Pythagorean theorem the position calculation formulas for the unknown point are obtained, which can be expressed as:
(x - x0)^2 + (y - y0)^2 = d0^2
(x - x1)^2 + (y - y1)^2 = d1^2
(x - x2)^2 + (y - y2)^2 = d2^2
the coordinates of the unknown point can be determined by the above position calculation formula.
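The three circle equations above can be solved in closed form: subtracting the first from the other two eliminates the quadratic terms and leaves a 2x2 linear system in (x, y). A minimal sketch (not the patent's implementation):

```python
def trilaterate(p0, p1, p2, d0, d1, d2):
    """Solve (x - xi)^2 + (y - yi)^2 = di^2 for the unknown point (x, y),
    given three anchor points p0, p1, p2 and measured distances d0, d1, d2.

    Subtracting the first circle equation from the other two yields a
    linear system, solved here by Cramer's rule. The anchors must not be
    collinear, matching the placement requirement stated in the text.
    """
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    a1, b1 = 2 * (x1 - x0), 2 * (y1 - y0)
    c1 = d0**2 - d1**2 + x1**2 - x0**2 + y1**2 - y0**2
    a2, b2 = 2 * (x2 - x0), 2 * (y2 - y0)
    c2 = d0**2 - d2**2 + x2**2 - x0**2 + y2**2 - y0**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("anchors are collinear; position is not unique")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With noisy real-world distances the three circles do not intersect in a single point, so practical systems solve this in a least-squares sense instead.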
After the distance information between the electronic device 100 and the at least three wireless positioning modules is determined, the target object pointed by the first device is identified as the tea table 207 by combining the three-point positioning principle.
It can be understood that, with the positioning method provided by the present application, the first device may determine the spatial position of the target object, so that the first device may recognize that the first device points to the target object according to the pointing direction of the first device and the spatial position of the target object.
It should be noted that, when at least three electronic devices in the same space are not on the same straight line and the mutual distance is not less than the preset distance, the electronic device 100 may also determine the spatial position of the target object based on the distance information between each of the three electronic devices and the electronic device 100, the inclination angle of the electronic device 100, and by combining the three-point positioning principle.
In the embodiment of the present application, after acquiring its pointing direction, the first device acquires the spatial position of the target object it points to in real time through the UWB wireless positioning technology or the three-point positioning technology described above. In practical applications, when determining the target object according to its pointing direction and the spatial positions of the surrounding objects, the first device may instead use the historical spatial positions of those objects. Considering that the spatial positions of the objects around the first device do not change over a period of time, the first device may store these spatial positions once it has determined them through the UWB technology or the three-point positioning technology, so that it can retrieve them later. This implementation improves the execution efficiency of the first device and reduces its resource consumption.
Further, the first device may periodically update the stored spatial positions of the objects around it: after a preset time period elapses, the spatial positions of the surrounding objects are re-determined through the UWB technology or the three-point positioning technology, and the stored spatial positions are updated accordingly.
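The store-then-periodically-refresh scheme above can be sketched as a small cache. The class and parameter names are hypothetical; the measurement callback stands in for whichever UWB or trilateration routine produces the positions.

```python
import time
from typing import Callable, Dict, Tuple

class PositionCache:
    """Sketch of the caching scheme described above: spatial positions of
    surrounding objects are measured once, stored, and re-measured only
    after a preset refresh period elapses."""

    def __init__(self, measure: Callable[[], Dict[str, Tuple[float, float]]],
                 refresh_period_s: float,
                 clock: Callable[[], float] = time.monotonic):
        self._measure = measure          # e.g. a UWB/trilateration routine
        self._period = refresh_period_s
        self._clock = clock
        self._positions: Dict[str, Tuple[float, float]] = {}
        self._last_update = float("-inf")

    def positions(self) -> Dict[str, Tuple[float, float]]:
        """Return cached positions, re-measuring only when stale."""
        now = self._clock()
        if now - self._last_update >= self._period:
            self._positions = self._measure()   # costly measurement
            self._last_update = now
        return self._positions
```

Repeated pointing gestures within the refresh period then reuse stored positions, which is the efficiency gain the text claims.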
Step S104, indicating the first device to determine a target application object according to the target object; the target application object has a correspondence with the target object.
In an embodiment of the present application, the process by which the first device determines the target application object according to the target object may include: first, the first device acquires the pre-constructed correspondence between objects and application objects; then, the first device determines, based on the correspondence, the target application object corresponding to the target object.
In the embodiment of the present application, the pre-constructed correspondence between objects and application objects may be stored in the first device. When the target object is determined, the first device may search for the target application object corresponding to the target object according to the stored correspondence. Conversely, when the first device fails to determine a target object, it may be considered that the user has no tendency or need to search for an application program; in this case, the currently running application program or task of the first device may be maintained, which is not specifically limited in this embodiment of the application.
In the embodiment of the present application, the application object may include the application itself or a functional component in the application. Specifically, a functional component refers to a component for performing a specific function of an application. For example, taking the WeChat application as an example, it includes a "Discover" functional component, under which there are a plurality of sub-functional components such as "Moments", "Channels", "Scan", "Shake", "Top Stories", "Search", and the like. It should be noted that, for convenience of description, both the functional components and the sub-functional components they contain are referred to as functional components.
In this embodiment of the present application, the first device may search the installed application programs for the target application object corresponding to the target object as follows: according to the stored mapping relationship between objects and application objects, the first device searches the installed application programs for the application program corresponding to the target object, or for the functional component within an installed application program that corresponds to the target object. The correspondence between objects and application objects may be set by the user as needed, may be set by the first device according to the user's historical setting records, or may be set by the first device according to the user's usage habits, which is not limited in this embodiment of the application. In practical applications, constructing the correspondence between objects and application objects may be implemented before step S100.
In one example, in the stored correspondence between the objects and the application objects, one object may correspond to one application object, for example, the correspondence may be as shown in table 1:
Object           Application object
Electric lamp    Memo application
Table            Weather application
Chair            WeChat application

TABLE 1
As can be understood from table 1, in the stored correspondence between objects and application objects, the electric lamp corresponds to the memo application, the table corresponds to the weather application, and the chair corresponds to the WeChat application. The correspondence shown in table 1 is merely an example and should not be construed as limiting.
In one example, in the stored correspondence between the object and the application object, one object may correspond to a plurality of application objects, for example, the correspondence may be as shown in table 2:
Object           Application objects
Electric lamp    Memo application and notes application
Table            Weather application and clock application
Chair            WeChat application and QQ application

TABLE 2
As can be seen from table 2, in the stored correspondence between objects and application objects, the electric lamp corresponds to the memo application and the notes application, the table corresponds to the weather application and the clock application, and the chair corresponds to the WeChat application and the QQ application. The correspondence shown in table 2 is merely an example and should not be construed as limiting.
Further, when one object corresponds to a plurality of application objects, the plurality of application objects may be application objects of the same type or similar types (similar type means that the similarity between the application programs is greater than a set threshold), for example, in table 2, a chair corresponds to a wechat application program and a QQ application program, respectively, where the wechat application program and the QQ application program are both instant messaging application programs; the plurality of application objects may also be different types of application objects, for example, in table 2, the tables correspond to the weather application and the clock application, respectively, which is not limited in this application.
In an alternative embodiment, after identifying the target object pointed by the first device, the method may further include:
and step S11-1, determining the type of the target object.
As described above, the target object pointed at by the first device may be identified through different positioning manners: for example, the orientation parameters between electronic devices may be obtained through the ultra-wideband wireless positioning technology, and the distance information between the electronic device and a non-electronic device may be obtained through the three-point positioning technology. The type of the target object may then be determined based on the positioning manner used, for example, whether the target object is an electronic device or a non-electronic device.
And step S11-2, when the target object is the electronic device, searching the target application object corresponding to the target object according to the corresponding relation between the electronic device and the application object.
In practical applications, when the corresponding relationship between the electronic device and the application object is constructed, the corresponding relationship may be constructed according to the electronic device included in the environment where the first device is located. For example, the electronic devices included in the environmental scene in which the first device is located include the electronic device 1, the electronic device 2, and the electronic device 3. The stored correspondence between the object and the application object is a correspondence between the electronic device and the application object, for example, the electronic device 1 is associated with a memo application; the electronic device 2 is associated with a weather application; the electronic device 3 is associated with a wechat application.
It is understood that, in the case that the electronic devices included in the environmental scene in which the first device is located are different, the corresponding relationship between the electronic devices and the application objects may present different representation forms.
And step S11-3, when the target object is a non-electronic device, searching a target application object corresponding to the target object according to the corresponding relation between the non-electronic device and the application object.
In practical applications, when constructing the correspondence between non-electronic devices and application objects, the correspondence may be constructed according to the non-electronic devices included in the environment where the first device is located. For example, the non-electronic devices included in the environmental scene in which the first device is located include the non-electronic device 1, the non-electronic device 2, and the non-electronic device 3. The stored correspondence between the object and the application object is then a correspondence between the non-electronic device and the application object; for example, the non-electronic device 1 is associated with a memo application, the non-electronic device 2 is associated with a weather application, and the non-electronic device 3 is associated with a WeChat application.
It is understood that, in the case that the non-electronic devices included in the environmental scene in which the first device is located are different, the correspondence between the non-electronic devices and the application objects may take different forms.
In this implementation manner, since the corresponding relationship takes into account the environmental scene where the first device is located and the type of the target object, the data size of the corresponding relationship is small, and the efficiency of searching for the application object can be improved.
It can be understood that, in the case that the stored correspondence between the objects and the application objects is that one object corresponds to one application object, the number of the found application objects is one; and under the condition that the stored corresponding relation between the object and the application objects is that one object corresponds to a plurality of application objects, the number of the searched application objects is a plurality.
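The one-to-one and one-to-many lookups described above reduce to a simple mapping query. The object and application names below merely mirror the example tables; the function name is hypothetical.

```python
from typing import Dict, List

# Hypothetical mapping mirroring Tables 1/2: one object may map to one
# or several application objects (applications or functional components).
CORRESPONDENCE: Dict[str, List[str]] = {
    "electric lamp": ["Memo"],
    "table": ["Weather", "Clock"],
    "chair": ["WeChat", "QQ"],
}

def find_target_application_objects(target_object: str) -> List[str]:
    """Return every application object associated with the target object.
    An empty list means no correspondence is stored, in which case the
    first device keeps its current application running, as described above."""
    return CORRESPONDENCE.get(target_object, [])
```

A single-element result corresponds to the one-object-one-application-object case; a multi-element result corresponds to the one-object-many-application-objects case.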
And S106, instructing the first device to display a corresponding interface according to the target application object.
When the first device determines the target application object corresponding to the target object, the corresponding interface of the target application object can be displayed on the display screen of the first device. For example, taking the application object being an application program as an example: when the number of found application programs is one, the display interface includes the name and/or icon of the application program, or the display interface is the main interface of the application program; further, when displaying the name and/or icon of the found target application object, the name (icon) may be displayed in an enlarged manner or in another manner convenient for the user to operate. When the number of found application programs is more than one, the display interface includes the names and/or icons corresponding to the multiple application programs; the first device may then receive the user's selection of one of the multiple application programs and launch the selected application program.
It will be appreciated that the display interface is different for different application objects.
By implementing this embodiment of the application, the user does not need to turn pages repeatedly to find the target application object; instead, once the first device determines the target object, the target application object is determined from it directly.
As can be seen from the above description, the method for searching for an application object provided in this embodiment associates application objects with real objects in physical space, organizing application objects around the real world. On one hand, the user does not need to turn pages repeatedly to find the target application program; on the other hand, the user's memory burden is reduced, because the user no longer needs to remember the position of the application program on a display page. This improves the efficiency of finding applications and also improves the user experience.
Optionally, fig. 6 is a flowchart of another method for searching for an application object provided in an embodiment of the present application. After the target application object is determined, the method may further include:
Step S108: when the target application object on the first device receives a message notification, send indication information to the target object, where the indication information instructs the target object to output prompt information.
For example, taking the target application object as the WeChat application, the message notification may be an update notification of the WeChat application, or instant messaging information from it. For another example, taking the target application object as a mail application, the message notification may be a new-mail notification of the mail application. For another example, taking the target application object as a browser application, the message notification may be a push notification of the latest news, or an update notification of the browser application.
In this embodiment of the application, when the target application object on the first device receives a message notification, the first device does not itself output the prompt information; instead, the prompt information is output through the found target object, to prompt the user that the first device has received a message notification. Specifically, the target object outputs the prompt information according to its own capability information, which characterizes the output modes the target object supports. An output mode is a presentation mode for the message, and may include display, voice broadcast, and so on.
In one example, the target object is an electronic device that has both audio capability and display capability. In this case, the target object may output the prompt information through its audio capability, for example by voice-broadcasting the prompt "you have received a message notification", and may also display the specific content of the message notification through its display capability.
For example, in the stored correspondence between objects and application objects, a TV is associated with a news application. Because the TV has both audio capability and display capability, when the first device outputs the prompt information through the TV, a visual notification may be shown on the TV's display screen, as shown in fig. 7, and the prompt information may also be output through the TV's audio playback device.
In one example, the target object is an electronic device that has display capability but no audio capability. In this case, the target object may output the prompt information through its display capability, but cannot announce it by audio.
For example, in the stored correspondence between objects and application objects, an electric lamp is associated with a word-processing application. Because the lamp has display capability, when a remote user changes the shared document and a message notification is generated, the lamp can adjust its brightness or color to alert the user.
In one example, the target object is a non-electronic device, which has neither audio capability nor display capability. In this case, the first device does not output the prompt information through the target object.
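The capability-based routing in the three examples above can be sketched as follows; representing the capability information as boolean flags is an illustrative assumption, not part of the patent.

```python
def prompt_channels(capabilities: dict) -> list:
    """Return the output channels a target object can use for the prompt,
    based on its capability information (boolean flags, an assumption)."""
    channels = []
    if capabilities.get("audio"):
        channels.append("voice")    # e.g. voice-broadcast the prompt
    if capabilities.get("display"):
        channels.append("display")  # e.g. show content, or adjust brightness
    return channels                 # empty for a non-electronic object
```

A TV would yield both channels, the lamp only the display channel, and a non-electronic object an empty list, in which case the first device falls back to not routing the prompt at all.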
It can be understood that after the first device outputs the prompt information through the target object, the user may again hold the first device and point it at, or move it close to, the target object. The first device then determines the object being pointed at or approached and, once it confirms the target object, outputs the prompt information itself. In this implementation, output of the prompt information is switched back to the first device.
In practical applications, the dwell time for which the user holds the first device pointed at or near the target object may be detected; for example, the prompt information is output through the first device only when the dwell time exceeds a set duration, which avoids false triggers.
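The dwell-time check can be sketched as follows; the threshold value and the update interface are hypothetical.

```python
class DwellDetector:
    """Report True only after the device has pointed at the target for
    longer than a set duration, to avoid false triggers."""

    def __init__(self, threshold_s: float = 1.5):  # threshold is hypothetical
        self.threshold_s = threshold_s
        self.since = None  # timestamp when pointing at the target began

    def update(self, pointing_at_target: bool, now_s: float) -> bool:
        """Feed one pointing sample; True means output through the first device."""
        if not pointing_at_target:
            self.since = None          # pointing broke off: reset the timer
            return False
        if self.since is None:
            self.since = now_s         # pointing just began
        return (now_s - self.since) >= self.threshold_s
```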
By implementing this embodiment of the application, when the target application object on the first device receives a message notification, the prompt information is output through the target object rather than through the first device. For example, when the first device is in silent or vibrate mode, this prevents the user from missing important message notifications, makes the presentation of message notifications better, and improves the user experience.
The foregoing embodiments mainly illustrate how to find the target application program corresponding to a target object according to the stored correspondence between objects and application objects; the following describes the related apparatus. Referring to fig. 8, fig. 8 shows an apparatus for searching for an application object according to an embodiment of the present application. As shown in fig. 8, the apparatus 80 may include:
an obtaining pointing unit 800, configured to obtain a pointing direction of a first device;
an application object determining unit 804, configured to determine a target application object according to a target object when the identifying unit 802 identifies that the first device is pointed to the target object; the target application object and the target object have a corresponding relationship;
a display unit 806, configured to instruct the first device to display a corresponding interface according to the target application object.
In one possible implementation, the apparatus 80 may further include:
an instruction receiving unit 808, configured to receive a first trigger instruction from a user before the obtaining pointing unit 800 obtains the pointing direction of the first device, where the first trigger instruction is used to trigger searching for an application object.
In a possible implementation manner, the identifying unit 802 is specifically configured to:
determining the target object according to the pointing direction of the first device and the spatial positions of objects around the first device.
In a possible implementation manner, the identifying unit 802 is specifically configured to:
determining the target object according to the pointing direction of the first device and the historical spatial positions of objects around the first device.
In a possible implementation manner, the obtaining pointing unit 800 is specifically configured to:
determining the pointing direction of the first device according to a preset reference direction and the acquired inclination angle of the first device.
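One way to combine a preset reference direction with measured tilt is sketched below; treating the patent's "inclination angle" as an azimuth offset plus a pitch is an assumption made for illustration.

```python
import math

def pointing_vector(reference_azimuth_deg: float,
                    tilt_azimuth_deg: float,
                    tilt_elevation_deg: float) -> tuple:
    """Unit vector of the device's pointing direction: the preset
    reference direction plus the measured azimuth offset, with the
    measured pitch as elevation."""
    az = math.radians((reference_azimuth_deg + tilt_azimuth_deg) % 360.0)
    el = math.radians(tilt_elevation_deg)
    return (math.cos(el) * math.cos(az),
            math.cos(el) * math.sin(az),
            math.sin(el))
```

The resulting vector can then be intersected with the stored spatial positions of surrounding objects to decide which one the device is pointed at.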
In one possible implementation, the apparatus 80 further includes:
a first sending unit 8010, configured to broadcast an ultra-wideband (UWB) measurement request to the target object, where the target object is configured to determine its own orientation parameters according to the measurement request;
a first receiving unit 8012, configured to receive the orientation parameters from the target object;
a first position determining unit 8014, configured to determine the spatial position of the target object according to the orientation parameters corresponding to the target object.
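One way the orientation parameters could be turned into a spatial position is sketched below, assuming they consist of a UWB-measured range and bearing relative to the first device (the parameter names are assumptions, not from the patent):

```python
import math

def position_from_orientation(range_m: float,
                              azimuth_deg: float,
                              elevation_deg: float) -> tuple:
    """Convert a UWB range-and-bearing measurement into Cartesian
    coordinates relative to the first device."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (range_m * math.cos(el) * math.cos(az),
            range_m * math.cos(el) * math.sin(az),
            range_m * math.sin(el))
```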
In one possible implementation, the apparatus 80 further includes:
a second sending unit 8016, configured to broadcast a measurement request to at least three wireless positioning modules preset in the same space, where the at least three wireless positioning modules are not on the same straight line, the distance between any two of them is not less than a preset distance, and each of the at least three wireless positioning modules is configured to obtain, according to the measurement request, distance information between itself and the first device;
a second receiving unit 8018, configured to receive the at least three pieces of distance information;
a second position determining unit 8020, configured to determine the spatial position of the target object from the at least three pieces of distance information in combination with the three-point positioning principle.
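The three-point positioning (trilateration) principle can be sketched in two dimensions as follows. The anchor layout is hypothetical, and linearizing the circle equations is one standard way to solve it, not necessarily the patent's method:

```python
def trilaterate(anchors, distances):
    """Solve for (x, y) from three non-collinear anchors and the measured
    distance to each. Subtracting the first circle equation
    (x - xi)^2 + (y - yi)^2 = ri^2 from the other two yields a 2x2
    linear system, solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # non-zero because the anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)
```

The non-collinearity and minimum-spacing requirements in the embodiment map directly onto this computation: collinear anchors make the determinant zero, and closely spaced anchors make the solution numerically unstable.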
In a possible implementation manner, the application object determining unit 804 is specifically configured to:
acquiring a pre-constructed correspondence between objects and application objects;
determining, based on the correspondence, the target application object corresponding to the target object.
In one possible implementation, the apparatus 80 may further include:
a sending unit, configured to send indication information to the target object when the target application object on the first device receives a message notification, where the indication information instructs the target object to output prompt information.
In a possible implementation manner, the target object outputs the prompt information according to its own capability information, which characterizes the output modes of the target object.
It should be noted that, for specific implementation of each functional device, reference may be made to relevant descriptions in the foregoing method embodiments, and details are not described in this application embodiment again.
Referring to fig. 9, fig. 9 is a schematic structural diagram of a terminal according to an embodiment of the present disclosure. The terminal 90 may include at least one processor 901, at least one memory 902, a communication bus 903, and at least one communication interface 904, where the processor 901 is connected to the memory 902 and the communication interface 904 through the communication bus 903, through which they also communicate with one another.
The processor 901 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), a graphics processing unit (GPU), a neural-network processing unit (NPU), or one or more integrated circuits, and is configured to execute related programs so as to perform the method for searching for an application object described in the method embodiments of the present application.
The processor 901 may also be an integrated circuit chip with signal processing capability. In implementation, the steps of the method of the present application may be completed by hardware integrated logic circuits in the processor 901 or by instructions in the form of software. The processor 901 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present application. A general-purpose processor may be a microprocessor, or any conventional processor. The steps of the methods disclosed in the embodiments of the present application may be performed directly by a hardware decoding processor, or by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM, or a register. The storage medium is located in the memory 902; the processor 901 reads the information in the memory 902 and, in combination with its hardware, performs the method for searching for an application object of the embodiments of the present application.
The memory 902 may be a read-only memory (ROM), a static storage device, a dynamic storage device, or a random access memory (RAM). The memory 902 may store programs and data, such as the program of the method for searching for an application object in the embodiments of the present application. When the program stored in the memory 902 is executed by the processor 901, the processor 901 and the communication interface 904 perform the steps of that method.
That is, the method for searching for an application object in the embodiments of the present application may be embodied as such a program, for example the method according to the first aspect or the second aspect of the embodiments of the present application.
The communication interface 904 uses a transceiver apparatus, such as but not limited to a transceiver, to enable communication between the terminal 90 and other devices or communication networks.
Optionally, the terminal may further include an artificial intelligence processor 905, which may be any processor suitable for large-scale parallel operations, such as a neural-network processing unit (NPU), a tensor processing unit (TPU), or a graphics processing unit (GPU). The artificial intelligence processor 905 may be mounted as a coprocessor on a host CPU, which assigns tasks to it. The artificial intelligence processor 905 may implement one or more of the operations involved in the above method for searching for an application object. Taking an NPU as an example, the core part of the NPU is an arithmetic circuit; a controller directs the arithmetic circuit to fetch matrix data from the memory 902 and perform multiply-accumulate operations.
The processor 901 is used for calling data and program codes in the memory and executing:
acquiring the pointing direction of the first device;
if it is identified that the first device is pointed at a target object, determining a target application object according to the target object, where the target application object and the target object have a corresponding relationship;
instructing the first device to display a corresponding interface according to the target application object.
Wherein the processor 901 is further configured to:
receiving a first trigger instruction from a user, where the first trigger instruction is used to trigger searching for an application object.
Wherein the processor 901 identifies that the first device is pointed at a target object, comprising:
determining the target object according to the pointing direction of the first device and the spatial positions of objects around the first device.
The determining, by the processor 901, the target object according to the pointing direction of the first device and the spatial positions of the objects around the first device includes:
determining the target object according to the pointing direction of the first device and the historical spatial positions of objects around the first device.
The acquiring, by the processor 901, the pointing direction of the first device includes:
determining the pointing direction of the first device according to a preset reference direction and the acquired inclination angle of the first device.
Wherein before the processor 901 identifies that the first device is pointed at the target object, the method further includes:
broadcasting an ultra-wideband (UWB) measurement request to the target object, where the target object determines its own orientation parameters according to the measurement request;
receiving the orientation parameters from the target object;
determining the spatial position of the target object according to the orientation parameters corresponding to the target object.
Wherein before the processor 901 identifies that the first device is pointed at the target object, the method further includes:
broadcasting a measurement request to at least three wireless positioning modules preset in the same space, where the at least three wireless positioning modules are not on the same straight line, the distance between any two of them is not less than a preset distance, and each of the at least three wireless positioning modules obtains, according to the measurement request, distance information between itself and the first device;
receiving at least three pieces of distance information;
determining the spatial position of the target object from the at least three pieces of distance information in combination with the three-point positioning principle.
Wherein the processor 901 determines a target application object according to the target object, including:
acquiring a pre-constructed correspondence between objects and application objects;
determining, based on the correspondence, the target application object corresponding to the target object.
Wherein, the processor 901 is further configured to:
sending, when the target application object on the first device receives a message notification, indication information to the target object through the communication interface 904, where the indication information instructs the target object to output prompt information.
The target object outputs the prompt information according to its own capability information, which characterizes the output modes of the target object.
The embodiments of the present application also provide a computer storage medium having instructions stored therein which, when run on a computer or a processor, cause the computer or processor to perform one or more steps of the method of any of the above embodiments. If the constituent modules of the above apparatus are implemented in the form of software functional units and sold or used as independent products, they may be stored in the computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of it, may be embodied in the form of a software product stored in such a computer-readable storage medium.
The computer readable storage medium may be an internal storage unit of the device according to the foregoing embodiment, such as a hard disk or a memory. The computer readable storage medium may be an external storage device of the above-described apparatus, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the computer-readable storage medium may include both an internal storage unit and an external storage device of the device. The computer-readable storage medium is used for storing the computer program and other programs and data required by the apparatus. The above-described computer-readable storage medium may also be used to temporarily store data that has been output or is to be output.
It will be understood by those skilled in the art that all or part of the processes of the methods of the above embodiments may be implemented by a computer program, which can be stored in a computer-readable storage medium, and can include the processes of the above embodiments of the methods when the computer program is executed. And the aforementioned storage medium includes: various media that can store program codes, such as ROM, RAM, magnetic or optical disks.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The modules in the device can be merged, divided and deleted according to actual needs.
Those of ordinary skill in the art will recognize that the units and algorithm steps of the examples described in the embodiments disclosed herein can be implemented as electronic hardware or as combinations of computer software and electronic hardware. Whether such functions are performed by hardware or software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functions differently for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
Those of skill would appreciate that the functions described in connection with the various illustrative logical blocks, modules, and algorithm steps disclosed in the various embodiments disclosed herein may be implemented as hardware, software, firmware, or any combination thereof. If implemented in software, the functions described in the various illustrative logical blocks, modules, and steps may be stored on or transmitted over as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. The computer-readable medium may include a computer-readable storage medium, which corresponds to a tangible medium, such as a data storage medium, or any communication medium including a medium that facilitates transfer of a computer program from one place to another (e.g., according to a communication protocol). In this manner, a computer-readable medium may generally correspond to (1) a non-transitory tangible computer-readable storage medium, or (2) a communication medium, such as a signal or carrier wave. A data storage medium may be any available medium that can be accessed by one or more computers or one or more processors to retrieve instructions, code and/or data structures for implementing the techniques described herein. The computer program product may include a computer-readable medium.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (22)

1. A method for finding an application object, comprising:
obtaining, by a first device, the pointing direction of the first device;
if the first device is identified to be directed to a target object, the first device determines a target application object according to the target object; the target application object and the target object have a corresponding relationship;
instructing the first device to display a corresponding interface according to the target application object.
2. The method of claim 1, wherein the method further comprises:
the first device receives a first trigger instruction from a user, wherein the first trigger instruction is used to trigger searching for an application object.
3. The method of claim 1, wherein the identifying that the first device is pointed at a target object comprises:
the first device determines the target object according to the pointing direction of the first device and the spatial positions of objects around the first device.
4. The method of claim 3, wherein the first device determining the target object based on the pointing direction of the first device and the spatial locations of objects around the first device comprises:
the first device determines the target object according to the pointing direction of the first device and the historical spatial positions of objects around the first device.
5. The method of claim 1, wherein the obtaining the pointing direction of the first device comprises:
determining the pointing direction of the first device according to a preset reference direction and the acquired inclination angle of the first device.
6. The method of claim 1, wherein prior to identifying that the first device is pointed at a target object, further comprising:
the first device broadcasts an ultra-wideband (UWB) measurement request to the target object, wherein the target object is configured to determine its own orientation parameters according to the measurement request;
the first device receiving orientation parameters from the target object;
the first device determines the spatial position of the target object according to the orientation parameters corresponding to the target object.
7. The method of claim 1, wherein prior to identifying that the first device is pointed at a target object, further comprising:
the first device broadcasts a measurement request to at least three wireless positioning modules preset in the same space, wherein the at least three wireless positioning modules are not on the same straight line, the distance between any two of the wireless positioning modules is not less than a preset distance, and each of the at least three wireless positioning modules is configured to obtain, according to the measurement request, distance information between itself and the first device;
the first device receives at least three pieces of distance information;
the first device determines the spatial position of the target object from the at least three pieces of distance information in combination with the three-point positioning principle.
8. The method of claim 1, wherein the first device determines a target application object from the target object, comprising:
the first device acquires a pre-constructed correspondence between objects and application objects;
the first device determines, based on the correspondence, the target application object corresponding to the target object.
9. The method of any of claims 1-8, wherein the target object comprises an electronic device; the method further comprises the following steps:
and under the condition that a target application object on the first device receives a message notification, sending indication information to the target object, wherein the indication information is used for indicating the target object to output prompt information.
10. The method according to claim 9, wherein when the target object outputs the prompt information, the target object outputs the prompt information according to capability information of the target object, and the capability information of the target object is used for representing an output mode of the target object.
11. An apparatus for finding an application object, comprising:
the acquisition pointing unit is used for acquiring the pointing direction of the first equipment;
an application object determining unit, configured to determine a target application object according to the target object when the identifying unit identifies that the first device is pointed at the target object, wherein the target application object and the target object have a corresponding relationship;
and the display unit is used for indicating the first equipment to display a corresponding interface according to the target application object.
12. The apparatus of claim 11, wherein the apparatus further comprises:
an instruction receiving unit, configured to receive a first trigger instruction from a user, wherein the first trigger instruction is used to trigger searching for an application object.
13. The apparatus according to claim 11, wherein the identification unit is specifically configured to:
determine the target object according to the pointing direction of the first device and the spatial positions of objects around the first device.
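One way to realize this identification step is to compute the bearing from the device to each known surrounding object and pick the object whose bearing best matches the pointing direction. A sketch under assumed conventions (2D positions, bearings in degrees, a tolerance window; all names are illustrative):

```python
import math

def select_target(device_pos, pointing_deg, objects, tolerance_deg=15.0):
    """Pick the surrounding object whose bearing from the device best matches
    the pointing direction. objects: {name: (x, y)}. Returns a name or None."""
    best, best_diff = None, tolerance_deg
    for name, (x, y) in objects.items():
        bearing = math.degrees(math.atan2(y - device_pos[1], x - device_pos[0])) % 360
        # Smallest angular difference, wrapping around 360 degrees.
        diff = abs((bearing - pointing_deg + 180) % 360 - 180)
        if diff <= best_diff:
            best, best_diff = name, diff
    return best
```

Claim 14's variant would feed historical spatial positions into the same selection when live positions are unavailable.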
14. The apparatus according to claim 13, wherein the identification unit is specifically configured to:
determine the target object according to the pointing direction of the first device and the historical spatial positions of objects around the first device.
15. The apparatus of claim 11, wherein the pointing acquisition unit is specifically configured to:
determine the pointing direction of the first device according to a preset reference direction and an acquired tilt angle of the first device.
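Combining a preset reference direction with a measured rotation angle reduces to an offset and a wrap into [0, 360). A minimal sketch, assuming the reference direction is a compass heading in degrees (e.g. magnetic north = 0) and the tilt angle is the device's rotation about the vertical axis:

```python
def pointing_direction(reference_deg: float, tilt_deg: float) -> float:
    """Absolute pointing direction: reference heading plus measured rotation,
    normalized to [0, 360). The angle conventions are assumptions."""
    return (reference_deg + tilt_deg) % 360.0
```

A real implementation would derive both angles from magnetometer and IMU readings, but the composition step is this simple.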
16. The apparatus of claim 11, further comprising:
a first sending unit, configured to broadcast an ultra-wideband (UWB) measurement request to the target object, the target object being configured to determine its own orientation parameters according to the UWB measurement request;
a first receiving unit, configured to receive the orientation parameters from the target object; and
a first position determining unit, configured to determine the spatial position of the target object according to the orientation parameters corresponding to the target object.
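The patent does not define the orientation parameters; UWB ranging typically yields a distance plus angle-of-arrival. Assuming the parameters are (distance, azimuth, elevation), converting them to a position relative to the first device is a spherical-to-Cartesian transform (a sketch, not the claimed implementation):

```python
import math

def position_from_orientation(distance: float, azimuth_deg: float, elevation_deg: float):
    """Convert assumed UWB orientation parameters (distance in meters,
    azimuth/elevation in degrees) into Cartesian (x, y, z) relative to the device."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance * math.cos(el) * math.cos(az)
    y = distance * math.cos(el) * math.sin(az)
    z = distance * math.sin(el)
    return (x, y, z)
```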
17. The apparatus of claim 11, further comprising:
a second sending unit, configured to broadcast a measurement request to at least three wireless positioning modules preset in the same space, wherein the at least three wireless positioning modules are not on the same straight line, the distance between any two wireless positioning modules is not less than a preset distance, and each of the at least three wireless positioning modules is configured to acquire distance information between itself and the first device according to the measurement request;
a second receiving unit, configured to receive at least three pieces of distance information; and
a second position determining unit, configured to determine the spatial position according to the at least three pieces of distance information in combination with the three-point positioning (trilateration) principle.
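The three-point positioning (trilateration) step can be sketched as follows, under the assumption of a 2D plane with three non-collinear anchor modules; subtracting the circle equations pairwise turns the problem into a 2x2 linear system:

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) from three anchor positions and measured distances.
    Assumes the anchors are not collinear, as the claim requires."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting circle equations pairwise gives A @ [x, y] = b.
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # non-zero because anchors are not collinear
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return (x, y)
```

The non-collinearity and minimum-spacing conditions in the claim exist precisely so that this system is well-conditioned; with noisy distances, more anchors and a least-squares solve would be used instead.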
18. The apparatus of claim 11, wherein the application object determination unit is specifically configured to:
acquire a pre-constructed correspondence between objects and application objects; and
determine the target application object corresponding to the target object based on the correspondence.
19. The apparatus of any of claims 11-18, wherein the target object comprises an electronic device, and the apparatus further comprises:
a third sending unit, configured to send indication information to the target object in a case where a target application object on the first device receives a message notification, the indication information being used to instruct the target object to output prompt information.
20. The apparatus according to claim 19, wherein when outputting the prompt information, the target object outputs the prompt information according to capability information of the target object, the capability information being used to represent the output modes that the target object supports.
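Capability-driven output selection amounts to choosing the best available mode from the target's reported capabilities. A minimal sketch, assuming a set of capability strings and a preference order (both are assumptions, not specified by the patent):

```python
def choose_output_mode(capabilities: set) -> str:
    """Pick a prompt output mode from the target's capability information,
    using an assumed preference order; 'none' if nothing is supported."""
    for mode in ("screen", "speaker", "led"):
        if mode in capabilities:
            return mode
    return "none"
```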
21. An electronic device comprising a processor, an input device, an output device, and a memory, the processor, the input device, the output device, and the memory being interconnected, wherein the memory is configured to store a computer program comprising program instructions, the processor being configured to invoke the program instructions to perform the method of any of claims 1-10.
22. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the method according to any one of claims 1-10.
CN202010786041.3A 2020-08-05 2020-08-05 Method, terminal and computer readable storage medium for searching application object Pending CN112134995A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010786041.3A CN112134995A (en) 2020-08-05 2020-08-05 Method, terminal and computer readable storage medium for searching application object

Publications (1)

Publication Number Publication Date
CN112134995A (en)

Family

ID=73851514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010786041.3A Pending CN112134995A (en) 2020-08-05 2020-08-05 Method, terminal and computer readable storage medium for searching application object

Country Status (1)

Country Link
CN (1) CN112134995A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102298533A (en) * 2011-09-20 2011-12-28 宇龙计算机通信科技(深圳)有限公司 Method for activating application program and terminal equipment
CN110568767A (en) * 2019-07-31 2019-12-13 华为技术有限公司 Intelligent household equipment selection method and terminal

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112838968A (en) * 2020-12-31 2021-05-25 青岛海尔科技有限公司 Equipment control method, device, system, storage medium and electronic device
CN112838968B (en) * 2020-12-31 2022-08-05 青岛海尔科技有限公司 Equipment control method, device, system, storage medium and electronic device
CN112954758A (en) * 2021-02-02 2021-06-11 维沃移动通信有限公司 Network switching method and device and electronic equipment
CN115150646A (en) * 2021-03-31 2022-10-04 华为技术有限公司 Method for displaying control window of second electronic equipment and first electronic equipment
CN113179487A (en) * 2021-04-22 2021-07-27 Oppo广东移动通信有限公司 Working mode control method and device, electronic equipment and storage medium
CN113179487B (en) * 2021-04-22 2023-08-29 Oppo广东移动通信有限公司 Method and device for controlling working mode, electronic equipment and storage medium
CN113505153A (en) * 2021-05-11 2021-10-15 深圳软牛科技有限公司 Memorandum backup method based on iOS system and related equipment
WO2023134445A1 (en) * 2022-01-12 2023-07-20 湖北星纪魅族科技有限公司 Terminal control method and device based on uwb

Similar Documents

Publication Publication Date Title
CN110495819B (en) Robot control method, robot, terminal, server and control system
CN112134995A (en) Method, terminal and computer readable storage medium for searching application object
CN108924737B (en) Positioning method, device, equipment and computer readable storage medium
WO2020062267A1 (en) Information prompt method and electronic device
CN111369988A (en) Voice awakening method and electronic equipment
CN112312366B (en) Method, electronic equipment and system for realizing functions through NFC (near field communication) tag
CN110401767B (en) Information processing method and apparatus
CN112637758B (en) Equipment positioning method and related equipment thereof
CN110649719A (en) Wireless charging method and electronic equipment
CN111182140B (en) Motor control method and device, computer readable medium and terminal equipment
CN110557740A (en) Electronic equipment control method and electronic equipment
CN111835907A (en) Method, equipment and system for switching service across electronic equipment
CN113921002A (en) Equipment control method and related device
WO2022048453A1 (en) Unlocking method and electronic device
WO2022100219A1 (en) Data transfer method and related device
CN115032640B (en) Gesture recognition method and terminal equipment
CN112099741A (en) Display screen position identification method, electronic device and computer readable storage medium
CN112992127A (en) Voice recognition method and device
WO2022152174A9 (en) Screen projection method and electronic device
CN114338642B (en) File transmission method and electronic equipment
CN115035187A (en) Sound source direction determining method, device, terminal, storage medium and product
CN115250428A (en) Positioning method and device
CN115248693A (en) Application management method and electronic equipment
CN114095542A (en) Display control method and electronic equipment
CN114116610A (en) Method, device, electronic equipment and medium for acquiring storage information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (Application publication date: 20201225)