CN117289205A - Method for positioning external device, electronic device and positioning system - Google Patents


Info

Publication number: CN117289205A
Application number: CN202311226537.5A
Authority: CN (China)
Prior art keywords: information, electronic device, external device, detection points, distances
Legal status: Pending (assumed; not a legal conclusion)
Other languages: Chinese (zh)
Inventors: 葛玲玲, 谢志栋, 张佩茹, 葛莹, 孙宇
Current Assignee: Samsung Electronics China R&D Center; Samsung Electronics Co Ltd
Original Assignee: Samsung Electronics China R&D Center; Samsung Electronics Co Ltd
Application filed by Samsung Electronics China R&D Center and Samsung Electronics Co Ltd

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 5/00: Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S 5/02: Position-fixing by co-ordinating two or more direction or position line determinations, or two or more distance determinations, using radio waves
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/048: Activation functions
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/08: Learning methods
    • G06N 3/084: Backpropagation, e.g. using gradient descent
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 64/00: Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H04W 64/006: Locating users or terminals or network equipment for network management purposes with additional information processing, e.g. for direction or speed determination


Abstract

The present disclosure provides a method and an electronic device for locating an external device, and a locating system. The method comprises the following steps: detecting, at a plurality of detection points of an electronic device, a plurality of distances of the plurality of detection points from the external device, wherein at least four detection points among the plurality of detection points are not coplanar; based on the detected plurality of distances, position information and distance information of the external device relative to the electronic device are determined.

Description

Method for positioning external device, electronic device and positioning system
Technical Field
The present disclosure relates to the field of electronics, and more particularly, to a method and an electronic device for locating an external device, and a locating system.
Background
With the development of intelligent electronic devices such as smartphones, augmented reality (Augmented Reality, AR) devices, and navigation devices, the demand for positioning target objects keeps growing: for example, users may wish to locate a target object outside the visible range, to locate a target object at a certain height above a given horizontal position (i.e., spatial positioning), or to use a portable positioning device.
In order to improve the universality and convenience of the electronic device, how to conveniently and accurately position an external device in space is a problem that needs to be solved by those skilled in the art.
Disclosure of Invention
The embodiment of the disclosure provides a method for positioning an external device, an electronic device and a positioning system, which are convenient for a user to accurately position a target external device so as to perform corresponding operation according to a positioning result.
According to a first aspect of embodiments of the present disclosure, there is provided a method for locating an external device, comprising: detecting, at a plurality of detection points of an electronic device, a plurality of distances of the plurality of detection points from the external device, wherein at least four detection points among the plurality of detection points are not coplanar; based on the detected plurality of distances, position information and distance information of the external device relative to the electronic device are determined.
Optionally, at a plurality of detection points of the electronic device, the step of detecting a plurality of distances of the plurality of detection points from the external device comprises: the distance from the external device is detected at each of the plurality of detection points, respectively, by using Ultra Wideband (UWB) technology.
Optionally, the step of determining the azimuth information of the external device relative to the electronic device based on the detected plurality of distances comprises: obtaining, through a neural network model and based on the input plurality of distances, azimuth information of the external device relative to the electronic device, wherein the azimuth information comprises horizontal viewing angle information and vertical viewing angle information.
Optionally, the neural network model is pre-trained by using training data related to the plurality of detection points, wherein the training data is obtained by: creating a space rectangular coordinate system with an estimated center point of the electronic device as the origin; calculating reference azimuth information and distance information of each sample point relative to the plurality of detection points, wherein each sample point is one of the vertices of the unit cubes into which the space of the space rectangular coordinate system is divided; correcting the coordinate values of the origin of the space rectangular coordinate system based on the calculated reference azimuth information and distance information; recalculating azimuth information of each sample point relative to the plurality of detection points based on the corrected coordinate values of the origin; and determining the recalculated azimuth information and distance information of each sample point as the training data.
Optionally, the method further comprises: identifying whether the external device is within a predetermined range of the electronic device based on the determined azimuth information; and providing related information about the external device based on the result of the identification.
Optionally, the related information of the external device includes at least one of the following information: orientation information and distance information with respect to the electronic device, guide information guiding the electronic device toward the external device, information of the external device combined with environment information, and control information for controlling the external device.
Optionally, the method further comprises: re-detecting a plurality of distances of the plurality of detection points from the external device in response to movement of at least one of the electronic device and the external device; based on the re-detected plurality of distances, location information and distance information of the external device relative to the electronic device are updated.
According to a second aspect of embodiments of the present disclosure, there is provided an electronic device for locating an external device, wherein the electronic device comprises: a detection module configured to: detecting, at a plurality of detection points of the electronic device, a plurality of distances of the plurality of detection points from the external device, wherein at least four detection points among the plurality of detection points are not coplanar; a positioning module configured to: based on the detected plurality of distances, position information and distance information of the external device relative to the electronic device are determined.
Optionally, the detection module is configured to detect a plurality of distances of a plurality of detection points from the external device at the plurality of detection points of the electronic device by: the distance from the external device is detected at each of the plurality of detection points, respectively, by using ultra wideband UWB technology.
Optionally, the positioning module is configured to determine the azimuth information of the external device relative to the electronic device based on the detected plurality of distances by: obtaining, through a neural network model and based on the input plurality of distances, azimuth information of the external device relative to the electronic device, wherein the azimuth information comprises horizontal viewing angle information and vertical viewing angle information.
Optionally, the electronic device further comprises a model training module, wherein the model training module is configured to pre-train the neural network model by using training data related to the plurality of detection points, and to obtain the training data by: creating a space rectangular coordinate system with an estimated center point of the electronic device as the origin; calculating reference azimuth information and distance information of each sample point relative to the plurality of detection points; correcting the coordinate values of the origin of the space rectangular coordinate system based on the calculated reference azimuth information and distance information; recalculating azimuth information of each sample point relative to the plurality of detection points based on the corrected coordinate values of the origin; and determining the recalculated azimuth information and distance information of each sample point as the training data.
Optionally, the electronic device further comprises a target recognition module, wherein the target recognition module is configured to: identify whether the external device is within a predetermined range of the electronic device based on the determined azimuth information; and provide related information about the external device based on the result of the recognition.
Optionally, the related information of the external device includes at least one of the following information: orientation information and distance information with respect to the electronic device, guide information guiding the electronic device toward the external device, information of the external device combined with environment information, and control information for controlling the external device.
Optionally, the electronic device further comprises an information updating module, wherein the information updating module is configured to: re-detecting a plurality of distances of the plurality of detection points from the external device in response to movement of at least one of the electronic device and the external device; based on the re-detected plurality of distances, location information and distance information of the external device relative to the electronic device are updated.
According to a third aspect of embodiments of the present disclosure, there is provided a positioning system, wherein the positioning system comprises one or more external devices and an electronic device for positioning the one or more external devices, wherein the one or more external devices are capable of communicating with the electronic device, the electronic device performing the method as described above.
According to a fourth aspect of embodiments of the present disclosure, there is provided an electronic device comprising: at least one processor; at least one memory storing computer-executable instructions, wherein the computer-executable instructions, when executed by the at least one processor, cause the at least one processor to perform the method as described above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium, wherein instructions in the computer readable storage medium, when executed by at least one processor, cause the at least one processor to perform the method as described above.
According to the method for positioning an external device, the electronic device, and the positioning system of the embodiments of the present disclosure, spatial positioning can be achieved without fixed-point pre-installed routers. Furthermore, by introducing height positioning on the basis of two-dimensional navigation, users' accurate positioning needs (e.g., height navigation in buildings with height differences) can be satisfied. Moreover, by using the ranging technique, positioning of and subsequent operations on the external device are achieved without depending on image recognition, regardless of whether the external device is occluded.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which like reference numerals represent like parts, and in which:
fig. 1 is a flowchart illustrating a method for locating an external device according to an embodiment of the present disclosure;
fig. 2 is a diagram illustrating an example of a method for locating an external device according to an embodiment of the present disclosure;
FIG. 3 is a diagram illustrating an example of spatial partitioning according to an embodiment of the present disclosure;
FIG. 4A is a flowchart illustrating a training process of a neural network model, according to an embodiment of the present disclosure;
fig. 4B is a diagram illustrating an example structure of a neural network model according to an embodiment of the present disclosure;
FIG. 5 is a flow chart illustrating obtaining training data according to an embodiment of the present disclosure;
fig. 6 is a diagram illustrating an example of a horizontal predetermined range according to an embodiment of the present disclosure;
fig. 7 is a diagram illustrating an example of a vertical predetermined range according to an embodiment of the present disclosure;
fig. 8 is a diagram illustrating an example of electronic device movement according to an embodiment of the present disclosure;
Fig. 9 is a diagram illustrating an electronic apparatus for locating an external device according to an embodiment of the present disclosure;
FIG. 10 is a diagram illustrating a positioning system according to an embodiment of the present disclosure; and
fig. 11 to 16 are diagrams illustrating examples of application scenarios of the embodiments of the present disclosure.
Detailed Description
In order to enable those of ordinary skill in the art to better understand the technical solutions of the present disclosure, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The embodiments described in the examples below are not representative of all embodiments consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the disclosure as detailed in the claims.
The expression "at least one of ..." as used in the present disclosure covers the following parallel cases: any one of the listed items, any combination of a plurality of the listed items, and all of the listed items. For example, "including at least one of A and B" covers three parallel cases: (1) including A; (2) including B; (3) including A and B. Further, in this document, an item expressed in the singular may include one or more items unless the context clearly indicates otherwise.
As used herein, the term "module" may include units implemented in hardware, software, firmware, or a combination thereof, and may be used interchangeably with other terms (e.g., "logic," "logic block," "portion," "unit," or "circuit"). A module may be a single integrated component configured to perform one or more functions or a minimal unit or portion of the single integrated component. For example, according to an embodiment, a module may be implemented in the form of an Application Specific Integrated Circuit (ASIC).
According to embodiments of the present disclosure, at least one of the electronic device (or electronic apparatus) and the external device may be an internet of things (IoT) device, and may include at least one of a smart phone, a tablet Personal Computer (PC), a mobile phone, a video phone, an electronic book reader, a desktop PC, a laptop computer, a netbook computer, a workstation, a server, a Personal Digital Assistant (PDA), a Portable Multimedia Player (PMP), an MP3 player, an ambulatory medical device, a camera, or a wearable device, etc., but is not limited thereto.
According to embodiments of the present disclosure, the wearable device may include at least one of an accessory device (e.g., a watch, a ring, a bracelet, an anklet, a necklace, glasses (e.g., AR glasses), contact lenses, or a Head Mounted Device (HMD)), a fabric- or clothing-integrated device (e.g., electronic clothing), a body-attached device (e.g., a skin pad or tattoo), or a body-implantable device (e.g., an implantable circuit), etc., but is not limited thereto.
With the development of intelligent electronic devices, various positioning methods have been developed in the related art, but they have a number of drawbacks. For example, related spatial positioning methods generally require a spatial coordinate system to be preset using a plurality of fixed electronic devices, so positioning depends on the fixed-point presetting of, and simultaneous operation by, those devices, which leads to high implementation cost and a small application range (generally limited to indoor positioning). As another example, related navigation devices typically rely on the global positioning system (Global Positioning System, GPS), which generally determines only a horizontal position and cannot accurately locate a target object at a certain height, i.e., cannot achieve spatial positioning. As another example, related image-recognition-based positioning methods are generally applicable only to target objects within the visual range, so they cannot recognize and locate occluded target objects; moreover, they rely on various sensors (e.g., image sensors, acceleration sensors, gyroscope sensors, etc.), resulting in high cost and susceptibility to environmental factors (e.g., scene light).
In view of at least the above-mentioned problems in the related art, the present disclosure proposes a method, an electronic device, and a positioning system that determine the azimuth of an external device relative to the electronic device based only on distances from the external device. Embodiments according to the present disclosure are described below with reference to figs. 1 to 16.
Fig. 1 is a flowchart illustrating a method for locating an external device according to an embodiment of the present disclosure. Methods for locating an external device according to embodiments of the present disclosure may be performed by the above and other various types of electronic devices.
According to an embodiment, at step S101, at a plurality of detection points of an electronic device, a plurality of distances of the plurality of detection points from an external device are detected, wherein at least four detection points among the plurality of detection points are not coplanar. This step is described below with reference to fig. 2 taking AR glasses as an example.
Fig. 2 is a diagram illustrating an example of a method for locating an external device according to an embodiment of the present disclosure. In fig. 2, although AR glasses are described as an example of an electronic device, the electronic device of the present disclosure is not limited thereto.
Referring to fig. 2, four detection points (a first detection point a, a second detection point B, a third detection point C, and a fourth detection point D) are provided on AR glasses.
Herein, the term "detection point" may denote a position point on the electronic device at which various parameters are detected; for example, a detection sensor such as a gesture sensor, a gyro sensor, an atmospheric pressure sensor, a magnetic sensor, an acceleration sensor, a grip sensor, a proximity sensor, a color sensor, an infrared (IR) sensor, a biometric sensor, or an illuminance sensor may be provided at the detection point.
According to an embodiment, to achieve spatial positioning, the number of detection points is greater than or equal to four, and at least four of them are non-coplanar. For example, the first detection point A, the second detection point B, the third detection point C, and the fourth detection point D in fig. 2 are not located in the same plane, which enables detection over a stereoscopic space.
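The non-coplanarity condition can be verified with a scalar triple product; below is a minimal sketch with a hypothetical detection-point layout (the coordinates are illustrative, not taken from the patent):

```python
# Check that four detection points are non-coplanar via the scalar
# triple product (A-D) . ((B-D) x (C-D)); a non-zero value means the
# four points span a 3-D volume, so spatial positioning is possible.

def sub(p, q):
    return (p[0] - q[0], p[1] - q[1], p[2] - q[2])

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1] + u[2]*v[2]

def non_coplanar(a, b, c, d, eps=1e-9):
    return abs(dot(sub(a, d), cross(sub(b, d), sub(c, d)))) > eps

# Hypothetical detection-point layout on a pair of AR glasses (metres).
A, B = (-0.07, 0.0, 0.0), (0.07, 0.0, 0.0)
C, D = (0.0, 0.02, 0.01), (0.0, -0.02, 0.03)
print(non_coplanar(A, B, C, D))  # True for this layout
```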
According to an embodiment, a plurality of detection points may be symmetrically (such as centrosymmetric, axisymmetric, etc.) distributed on the electronic device to improve the accuracy of the detection. For example, the first detection point a and the second detection point B in fig. 2 are axisymmetric, and the third detection point C and the fourth detection point D are axisymmetric.
According to an embodiment of the present disclosure, the electronic device and the external device in step S101 may be any of the above and other various types of electronic devices, respectively, as long as communication between the electronic device and the external device is possible. According to an embodiment, a direct (e.g., wired) communication channel or a wireless communication channel may be established between the electronic device and the external device, via which the electronic device and the external device may perform communication.
Herein, the term "external device" denotes a target object to be located that is capable of communicating with an electronic device, and thus may also be referred to as a "target object".
Referring to fig. 2, first, second, third, and fourth distances d1, d2, d3, and d4 from the target object are detected at the first, second, third, and fourth detection points A, B, C, and D of the AR glasses, respectively.
According to an embodiment of the present disclosure, the distance from the external device is detected at each of the plurality of detection points by using Ultra Wideband (UWB) technology. In particular, UWB technology can measure distance by using the time of flight (ToF) of signal transmission. According to an embodiment of the present disclosure, the distance may be detected at each detection point based on a signal received from the external device (i.e., the target object) by using UWB technology. For example, in fig. 2, the external device sends a signal to a sensor at each detection point on the electronic device (e.g., the AR glasses), so that the electronic device obtains the plurality of distances of the plurality of detection points from the external device. The UWB ranging method itself is not described in detail herein.
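As a rough sketch of how ToF-based ranging converts timing into distance, consider single-sided two-way ranging with hypothetical timings (this illustrates the principle, not the patent's specific protocol):

```python
# Single-sided two-way ranging (SS-TWR) sketch: the measuring side
# records the round-trip time, subtracts the responder's known reply
# delay, and converts the remaining one-way flight time to a distance.
C = 299_792_458.0  # speed of light, m/s

def ss_twr_distance(t_round_s, t_reply_s):
    """Distance in metres from round-trip and reply times (seconds)."""
    tof = (t_round_s - t_reply_s) / 2.0
    return tof * C

# Hypothetical timings: a target about 5 m away, 100 microsecond
# responder turnaround.
t_round = 2 * 5.0 / C + 100e-6
print(round(ss_twr_distance(t_round, 100e-6), 3))  # 5.0
```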
According to an embodiment of the present disclosure, the number of external devices in step S101 may be one or more. When there are a plurality of external devices, the method further includes determining identification information of the corresponding external device when detecting a plurality of distances of the plurality of detection points from each external device.
For example, for the first external device ED1, distance information (d1, d2, d3, d4, ED1) may be determined, wherein the first distance d1, the second distance d2, the third distance d3, and the fourth distance d4 respectively correspond to the respective detection points, and ED1 indicates that the distance information is that of the first external device ED1 relative to the electronic device.
By executing step S101, a plurality of distances can be detected at detection points at different positions of the mobile or portable electronic device for use in subsequent positioning processing.
Referring back to fig. 1, in step S102, the azimuth information (v, h) and the distance information (d1, d2, d3, d4, d) of the external device with respect to the electronic device are determined based on the detected plurality of distances. In particular, the localization may be performed by using a neural network model.
According to embodiments of the present disclosure, distance information of an external device with respect to an electronic device may be determined based on a plurality of detected distances.
According to an embodiment of the present disclosure, based on the input plurality of distances (e.g., the first distance d1, the second distance d2, the third distance d3, and the fourth distance d4 in fig. 2), distance information (d1, d2, d3, d4, d) of the external device with respect to the electronic device may be obtained, wherein d may represent the distance of the external device from the electronic device determined based on the plurality of distances, e.g., the distance of the external device from a center point of the electronic device. According to an embodiment, the center point of the electronic device may be a structural center point of the electronic device, a user-desired center point, etc. For example, when the electronic device is AR glasses, the center point may be the midpoint of the line between the center points of the two lenses of the AR glasses (or between the positions of the user's two eyes), or a center point determined according to user input.
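While the patent obtains the azimuth through a neural network model, the way four distances determine a distance d from the device center can be illustrated with classic multilateration. The sketch below assumes the detection-point coordinates are known in a device-centered frame; the layout and target position are hypothetical:

```python
# Multilateration sketch: linearise the four sphere equations
# |x - p_i|^2 = d_i^2 by subtracting the first one, solve the resulting
# 3x3 linear system for the target position x, and take |x| as the
# distance d from the device centre (the origin).

def solve3(a, b):
    """Gaussian elimination with partial pivoting for a 3x3 system."""
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

def locate(points, dists):
    """Estimate the target position from 4 anchor points and distances."""
    p1, d1 = points[0], dists[0]
    a, b = [], []
    for p, d in zip(points[1:], dists[1:]):
        a.append([2 * (p[k] - p1[k]) for k in range(3)])
        b.append(sum(p[k] ** 2 - p1[k] ** 2 for k in range(3)) - d ** 2 + d1 ** 2)
    return solve3(a, b)

# Hypothetical non-coplanar detection points and a target at (1, 2, 2).
pts = [(-0.07, 0.0, 0.0), (0.07, 0.0, 0.0), (0.0, 0.02, 0.01), (0.0, -0.02, 0.03)]
target = (1.0, 2.0, 2.0)
ds = [sum((t - p) ** 2 for t, p in zip(target, q)) ** 0.5 for q in pts]
x = locate(pts, ds)
d = sum(v * v for v in x) ** 0.5  # distance from the device centre
print([round(v, 3) for v in x], round(d, 3))  # [1.0, 2.0, 2.0] 3.0
```

Note that the linear system is solvable exactly when the four anchors are non-coplanar, which is why the method requires at least four non-coplanar detection points.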
According to an embodiment of the present disclosure, azimuth information (v, h) of the external device with respect to the electronic device is obtained through a neural network model based on the input plurality of distances (e.g., the first distance d1, the second distance d2, the third distance d3, and the fourth distance d4 in fig. 2), wherein the azimuth information includes horizontal viewing angle information h and vertical viewing angle information v.
According to embodiments of the present disclosure, the azimuth information and the distance information of the external device with respect to the electronic device may be jointly represented as a one-dimensional array or vector (v, h, d1, d2, d3, d4, d).
Herein, the azimuth is described by a vertical viewing angle V and a horizontal viewing angle H. According to an embodiment, the vertical viewing angle V and the horizontal viewing angle H of the space or field of view relative to the electronic device are each divided into N parts (1 ≤ N ≤ 360), so that the viewing angle values are discretized into a limited number of values. That is, the vertical viewing angle V and the horizontal viewing angle H are each divided into N equal parts. Herein, the horizontal viewing angle information included in the azimuth information may be represented as h and the vertical viewing angle information as v, where h and v are each greater than or equal to 1 and less than or equal to N, and the azimuth information may be represented as (v, h). According to an embodiment, a spherical coordinate system (d', v, h) relative to the electronic device may be used herein, where d' represents the distance from the origin, v represents the zenith angle (i.e., the vertical viewing angle information), and h represents the azimuth angle (i.e., the horizontal viewing angle information). This is described with reference to fig. 3.
Fig. 3 is a diagram illustrating an example of spatial partitioning according to an embodiment of the present disclosure.
In fig. 3, the spatial division is described by taking a horizontal viewing angle H as an example. The horizontal viewing angle H in fig. 3 is divided into 4 areas, that is, N is 4. In this case, the values of h may be 1, 2, 3 and 4.
Although four equally divided regions are shown in fig. 3, the space may be divided at any angle, i.e., the angle of each region may be adjustable, according to embodiments of the present disclosure. Although the spatial division is described in fig. 3 by taking the horizontal viewing angle H as an example, embodiments according to the present disclosure are not limited thereto, and the spatial division manner may be extended to the division of a three-dimensional stereoscopic space.
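The equal division of a viewing angle into N parts can be sketched as a simple binning function (the convention that 0° falls into bin 1 is an illustrative assumption, not specified by the patent):

```python
# Discretise a viewing angle into one of N equal sectors, returning a
# 1-based bin index as in the (v, h) notation above.

def angle_bin(angle_deg, n):
    """Map an angle (degrees) to a bin index in 1..n over [0, 360)."""
    a = angle_deg % 360.0
    return int(a // (360.0 / n)) + 1

# With N = 4, as in fig. 3, each sector spans 90 degrees.
print(angle_bin(0, 4), angle_bin(95, 4), angle_bin(359.9, 4))  # 1 2 4
```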
According to embodiments of the present disclosure, one or more regions may be determined as a predetermined range to be described below. For example, when the electronic device is AR glasses, one area corresponding to h=1 may be determined as a predetermined range, that is, a viewing area of the AR glasses.
According to an embodiment of the present disclosure, a neural network model is pre-trained using training data associated with a plurality of detection points. Here, "pre-training" refers to training a neural network model on a plurality of training data via a training algorithm to obtain a predefined operating rule or a neural network model configured to perform a desired feature (or purpose).
According to embodiments of the present disclosure, the neural network may include a radial basis function (Radial Basis Function, RBF) neural network. The training process of the RBF neural network model is described below with reference to fig. 4.
Fig. 4 is a flowchart illustrating a training process of a neural network model according to an embodiment of the present disclosure.
Referring to fig. 4, in step S401, training data is extracted by using a finite element method (Finite Element Method, FEM). Herein, "training data" means data used to train the neural network model, which may also be referred to as "label data". This step is described in detail below with reference to fig. 5.
Fig. 5 is a flowchart illustrating obtaining training data according to an embodiment of the present disclosure.
Referring to fig. 5, in step S501, a space rectangular coordinate system having an estimated center point of the electronic device as an origin is created. That is, a space rectangular coordinate system based on the x-axis, the y-axis, and the z-axis, which is different from the spherical coordinate system based on d, v, and h, is created.
According to an embodiment, the electronic device and its surrounding space may be estimated as an abstract cube, wherein a rectangular coordinate system may be created with the center point of the abstract cube as the origin.
According to an embodiment, a space rectangular coordinate system may be created with an arbitrary point in space as the origin. For example, a space rectangular coordinate system is created with the estimated center point of the electronic device as the origin (0, 0, 0). In this case, the coordinates of the detection points of the electronic device may be determined; for example, the coordinates of a first detection point A are (X1, Y1, Z1), the coordinates of a second detection point B are (X2, Y2, Z2), the coordinates of a third detection point C are (X3, Y3, Z3), and the coordinates of a fourth detection point D are (X4, Y4, Z4).
In step S502, reference azimuth information and distance information, relative to the plurality of detection points, is calculated for each sample point, where each sample point is one of the vertices of the plurality of unit cubes into which the space of the space rectangular coordinate system is divided.
According to an embodiment, an abstract cube of the space rectangular coordinate system as described above may be divided or partitioned into a plurality of unit cubes, and a plurality of vertices of the plurality of unit cubes may be determined as a plurality of sample points (for example, vertices of a partitioned square in a horizontal view angle in fig. 3). In this case, the coordinates of a plurality of sample points in a space rectangular coordinate system are known.
According to an embodiment, for each of the plurality of sample points, reference azimuth information (vsr, hsr) and distance information (ds1, ds2, ds3, ds4, ds) of the sample point relative to the plurality of detection points are calculated by using the coordinates of the sample point and the coordinates of the plurality of detection points, wherein the reference azimuth information may include reference horizontal viewing angle information hsr and reference vertical viewing angle information vsr of the sample point. According to an embodiment, the reference azimuth information and distance information for a sample point may also be collectively represented as a one-dimensional array or vector (vsr, hsr, ds1, ds2, ds3, ds4, ds), where ds represents the distance of the sample point from the origin.
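The computation of (vsr, hsr, ds1, ..., dsP, ds) for one sample point can be sketched as below. This is an illustrative sketch only: the function name `reference_info` is hypothetical, and the choices of binning the zenith angle over 180° and the azimuth angle over 360° are assumptions not fixed by the disclosure.

```python
import math

def reference_info(sample, detections, n=4):
    """For a sample point (x, y, z), compute distances ds1..dsP to each
    detection point, distance ds to the origin, and discretized reference
    azimuth information (vsr, hsr) relative to the origin."""
    x, y, z = sample
    dists = [math.dist(sample, p) for p in detections]   # ds1..dsP
    ds = math.dist(sample, (0.0, 0.0, 0.0))              # distance to origin
    # Spherical angles of the sample point: zenith (vertical), azimuth (horizontal).
    zenith = math.degrees(math.acos(z / ds)) if ds else 0.0
    azim = math.degrees(math.atan2(y, x)) % 360.0
    vsr = min(n, int(zenith // (180.0 / n)) + 1)         # vertical region index
    hsr = min(n, int(azim // (360.0 / n)) + 1)           # horizontal region index
    return (vsr, hsr, *dists, ds)
```

With four detection points, the result is the seven-component vector (vsr, hsr, ds1, ds2, ds3, ds4, ds) described above.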
According to an embodiment, the estimated center point determined as the origin in step S501 may not be suitable for a coordinate system including the plurality of detection points; that is, the estimated center point may not coincide with the center point of the structure of the actual electronic device or the center point of the electronic device desired by the user, in which case the information on the relative relationship between the respective points would be erroneous. For example, when the electronic device is AR glasses, the origin of the space rectangular coordinate system may not correspond to the center point determined according to the position of the eyes; therefore, the coordinate value of the origin of the coordinate system needs to be corrected to obtain accurate coordinate values.
In step S503, the coordinate value of the origin of the space rectangular coordinate system is corrected based on the calculated reference azimuth information and distance information.
According to an embodiment, to avoid that the origin of the created coordinate system deviates from the actual center point of the electronic device (e.g. the center point of the structure of the electronic device or the center point desired by the user), the coordinates of the origin of the created coordinate system are modified. For example, when the electronic device is AR glasses, as described above, the origin of the space rectangular coordinate system created in step S501 may be offset from the actual center point of the electronic device.
According to an embodiment, coordinates of a center point of an actual electronic device may be calculated based on the calculated reference azimuth information and distance information. In particular, since symmetry exists in the plurality of sample points and/or detection points, an actual center point of the electronic device may be determined based on the calculated reference azimuth information and distance information.
According to embodiments, the center point of the electronic device, i.e., the specific point to be corrected to the origin, may also be determined in a spatial rectangular coordinate system based on user input or other suitable means.
According to the embodiment, the space rectangular coordinate system is adjusted based on the coordinates of the determined center point of the electronic device or the specific point to be corrected as the origin, or the coordinates of the respective points are directly adjusted.
For example, the determined center point of the electronic device or a specific point to be corrected as the origin may be at (l, m, n) in the created space rectangular coordinate system.
In step S504, the azimuth information of each sample point relative to the plurality of detection points is recalculated based on the corrected coordinate value of the origin. According to an embodiment, the vertical viewing angle information and horizontal viewing angle information of each sample point relative to the plurality of detection points are re-determined using the corrected coordinate system or the corrected points (including each sample point, each detection point, and the like). According to an embodiment, as described above, when the determined center point of the electronic device, or the specific point to be corrected to the origin, is at (l, m, n), correcting the coordinate value (l, m, n) to the origin (i.e., to (0, 0, 0)) correspondingly corrects any point (x, y, z) in the original coordinate system to (x-l, y-m, z-n).
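The origin correction is a uniform translation of every point, which can be sketched in a few lines (the function name `correct_origin` is hypothetical):

```python
def correct_origin(points, new_origin):
    """Shift a list of (x, y, z) points so that new_origin = (l, m, n)
    becomes the origin (0, 0, 0): each (x, y, z) maps to (x-l, y-m, z-n)."""
    l, m, n = new_origin
    return [(x - l, y - m, z - n) for (x, y, z) in points]
```

Applying this translation to all sample points and detection points before recomputing angles implements the correction of step S504.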
In step S505, the recalculated azimuth information and distance information for each sample point are determined as training data. According to an embodiment, the training data for each sample point may be represented as a one-dimensional array or vector (vs', hs', ds1, ds2, ds3, ds4, ds).
By performing the process of extracting training data using FEM described with reference to fig. 5, a plurality of training data for neural network model training based on a plurality of detection points may be obtained, and the obtained training data may be a one-dimensional array or vector. Although 4 detection points are used as an example in fig. 5, the present disclosure is not limited thereto.
Referring back to fig. 4, an RBF neural network model is constructed in step S402. The structure of the neural network model is described below with reference to fig. 4B.
Fig. 4B is a diagram illustrating an example structure of a neural network model according to an embodiment of the present disclosure.
Referring to fig. 4B, a neural network model according to an embodiment of the present disclosure may include a plurality of neural network layers. An example structure of the neural network model may be represented as a three-layer structure including an input layer, a hidden layer, and an output layer, where the transformation from the input layer to the hidden layer is nonlinear and the transformation from the hidden layer to the output layer is linear.
According to an embodiment, the input layer may receive a one-dimensional input vector x = (x1, x2, ..., xP)^T ∈ R^P representing distances, where R may represent the effective value range of x, R^P may represent the space of the input data, and P may be the number of radial basis functions, which is a positive integer corresponding to the number of detection points; x1 may represent the distance from the target object to the first detection point, x2 may represent the distance from the target object to the second detection point, and so on, with xP representing the distance from the target object to the P-th detection point. That is, when the neural network model is trained using the training data, x1 may correspond to ds1, x2 to ds2, x3 to ds3, and x4 to ds4. When azimuth information is calculated from a plurality of detected distances using the neural network model, x1 may correspond to d1, x2 to d2, x3 to d3, and x4 to d4.
According to an embodiment, the hidden layer may map the input vector from the low-dimensional space of dimension P to a high-dimensional space of dimension h, so that a problem that is linearly inseparable in the low-dimensional space becomes linearly separable in the high-dimensional space. The hidden layer may be composed of Gaussian (kernel) radial basis functions, which can be expressed by equation 1:

φ_i(x) = exp(−‖x − c_i‖² / (2σ_i²)), i = 1, 2, ..., h     (1)

where x represents the input vector, i.e., the x vector received by the input layer, whose p-th component is x_p (p being an integer between 1 and P); c_i is the center vector of the i-th Gaussian function, where i is an integer between 1 and h and h is the number of hidden-layer nodes; and σ_i is the width parameter of the Gaussian function, used to control its radial range of action.
Based on equation 1, the output of the RBF network can be expressed by equation 2:

y_j = Σ_{i=1}^{h} w_ij · φ_i(x), j = 1, 2     (2)

where y_j represents the j-th output, the output layer having two outputs (j = 1, 2); and w_ij represents the weighting coefficient between the i-th hidden node and the j-th output node.
The least-squares loss function can be represented by equation 3:

E = (1/2) Σ_{j=1}^{m} (d_j − y_j)²     (3)

where d_j is the desired (target) output corresponding to the j-th output; and m represents the number of outputs, where, according to an embodiment, the value of m is 2.
By using the output function shown in equation 2, the one-dimensional distance vector x = (x1, x2, ..., xP)^T is mapped to a two-dimensional value (v, h) representing the azimuth information.
The values of the neurons of the output layer represent the output results of the neural network, i.e., the azimuth information (v, h).
By the neural network model including the above function, the one-dimensional vector of the input distance can be up-scaled to a two-dimensional value representing the azimuth information, thereby realizing the conversion from the distance information to the view angle information.
Referring back to fig. 4, the RBF neural network model is tested using the training data in step S403. According to an embodiment, the RBF neural network model is tested by using parameter values in the training data. For example, the (d1, d2, d3, d4) portion of the training data (v', h', d1, d2, d3, d4, d) for each sample point is input as the x vector to the constructed RBF neural network model.
In step S404, it is determined whether the accuracy of the RBF neural network model meets the requirement. According to an embodiment, whether the accuracy of the output of the RBF neural network model tested using the training data meets the requirement may be determined based on a predetermined condition set according to empirical values. For example, whether the accuracy of the RBF neural network model meets the requirement may be determined according to a comparison between the (vo, ho) output by the RBF neural network model and the (v', h') of the training data, based on a predetermined condition set according to empirical values.
When it is determined in step S404 that the accuracy does not meet the requirement, parameters of the RBF neural network model are modified in step S406, and the process then returns to step S402. Steps S402 to S404 are repeated until the accuracy of the obtained neural network model meets the requirement.
When it is determined that the accuracy meets the requirement at step S404, a trained RBF neural network model is obtained at step S405.
Through the steps described above with reference to fig. 4, a trained RBF neural network model can be obtained, so that location information of an external device can be determined from detected distances using the RBF neural network model. Although fig. 4 is described using an RBF neural network model as an example, the present disclosure is not limited thereto, and embodiments of the present disclosure may be applied to a Back Propagation (BP) neural network model or any other suitable neural network model. Furthermore, the process of training a neural network model according to embodiments of the present disclosure is not limited to the training process described with reference to figs. 4 and 5, and embodiments of the present disclosure may be applied to any other applicable neural network model training process.
According to an embodiment of the present disclosure, the method for locating an external device may further include: identifying whether the external device is within a predetermined range for the electronic device based on the determined orientation information; based on the result of the recognition, related information about the external device is provided. This is described in detail below with reference to fig. 6 and 7.
Fig. 6 is a diagram illustrating an example of a horizontal predetermined range according to an embodiment of the present disclosure. Fig. 7 is a diagram illustrating an example of a vertical predetermined range according to an embodiment of the present disclosure. In fig. 6 and 7, a predetermined range for AR glasses is illustrated with AR glasses, and an external device (i.e., a target object) is represented with dots.
As described above, the space or the field of view for the electronic device may be divided into N regions in the horizontal viewing angle direction and the vertical viewing angle direction, respectively, N is 2 with reference to fig. 6 and 7, and the predetermined range is represented as a blue region and the non-predetermined range is represented as a gray region. For example, in the case where the electronic device is AR glasses, the predetermined range may represent a target area viewable by the AR glasses, i.e., a front view angle area.
Although shown as N being 2 and only one region being determined as a predetermined range in fig. 6 and 7, the present disclosure is not limited thereto.
Referring first to fig. 6, based on the horizontal angle of view information included in the determined azimuth information, it may be determined that the first and second external devices ED1 and ED2 and a part of the external devices (i.e., represented by blue dots in fig. 6) are included in a predetermined range, and another part of the external devices (i.e., represented by gray dots in fig. 6) are not included in the predetermined range.
Then, referring to fig. 7, based on the vertical angle of view information included in the determined azimuth information, it may be determined that the second external device ED2 and a part of the external devices (i.e., represented by blue dots in fig. 7) are included in the predetermined range, and that the first external device ED1 and another part of the external devices (i.e., represented by gray dots in fig. 7) are not included in the predetermined range.
Although it is illustrated herein that the horizontal viewing angle information is first compared with the horizontal predetermined range and the vertical viewing angle information is then compared with the vertical predetermined range to identify whether the external device is within the predetermined range for the electronic device, the identification order of the present disclosure is not limited thereto. Alternatively, the vertical viewing angle information may be compared with the vertical predetermined range first, and the horizontal viewing angle information then compared with the horizontal predetermined range. Alternatively, the azimuth information may be compared against a combined predetermined range based on both the vertical predetermined range and the horizontal predetermined range.
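The comparison described above can be sketched as a simple membership test over region indices. This sketch assumes the predetermined range is given as sets of allowed vertical and horizontal region indices; the function and variable names are hypothetical.

```python
def in_predetermined_range(azimuth, v_range, h_range):
    """Check whether azimuth information (v, h) falls inside the predetermined
    range, given as sets of allowed vertical and horizontal region indices."""
    v, h = azimuth
    return v in v_range and h in h_range

def valid_targets(devices, v_range, h_range):
    """Return names of external devices (valid target objects) in range.
    `devices` maps a device name to its (v, h) azimuth information."""
    return [name for name, az in devices.items()
            if in_predetermined_range(az, v_range, h_range)]
```

For AR glasses with N = 2 and only region 1 visible (as in figs. 6 and 7), v_range and h_range would each be {1}.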
As such, external devices within the predetermined range for the electronic device are identified by comparison; here, these external devices may be referred to as valid external devices or valid target objects.
According to an embodiment of the present disclosure, when an external device is not within a predetermined range for an electronic device, guide information guiding the electronic device toward the external device may be provided to change the predetermined range for the electronic device so that the external device enters the predetermined range. For example, when the electronic device is AR glasses, guiding movement of the head of a wearer of the AR glasses or movement of the field of view to change the field of view (FoV) may be provided such that the external device is within the field of view.
According to embodiments of the present disclosure, when an external device is within the predetermined range for the electronic device, information about the external device combined with environment information may be provided. For example, when the electronic device is AR glasses, combined information of environment information currently scanned by the AR glasses (e.g., an environment image) and information of the external device (e.g., an image of the external device displayed regardless of occlusion, etc.) may be provided.
According to the embodiments of the present disclosure, when an external device is within a predetermined range for an electronic device, control information for controlling the external device may be provided. For example, when the electronic device is AR glasses, related control information for performing a control operation on the external device (e.g., a control operation item for turning on/off the external device) may be provided.
According to an embodiment of the present disclosure, the related information of the external device includes at least one of the following information: the orientation information and distance information with respect to the electronic device, guide information guiding the electronic device toward the external device, information of the external device combined with the environment information, and control information for controlling the external device. Hereinafter, this will be described in detail in connection with an example of an application scenario.
According to the embodiments of the present disclosure, since the electronic device and the external device are movable, updating of the related information of the external device is also required.
According to an embodiment of the present disclosure, the method for locating an external device may further include: re-detecting a plurality of distances of the plurality of detection points from the external device in response to movement of at least one of the electronic device and the external device; based on the re-detected plurality of distances, orientation information and distance information of the external device with respect to the electronic device are updated. This is described in detail below with reference to fig. 8.
Fig. 8 is a diagram illustrating an example of electronic device movement according to an embodiment of the present disclosure. In fig. 8, taking a horizontal viewing angle direction having a specific distance range (i.e., a smaller circle) as an example, the horizontal viewing angle is divided into 4 areas, wherein the first area (1) is determined as a predetermined range. In fig. 8, red dots may represent electronic devices, and blue and gray dots may represent external devices.
Referring to fig. 8, before the electronic device moves, the third external device ED3 is within a predetermined range for the electronic device, and the fourth external device ED4 is outside the predetermined range for the electronic device.
As the electronic device moves, the fourth external device ED4 gradually moves into a predetermined range for the electronic device.
After the electronic device moves, both the third external device ED3 and the fourth external device ED4 are within the predetermined range.
According to an embodiment of the present disclosure, during and after movement of the electronic device, a plurality of distances of the plurality of detection points from the external device may be re-detected, and orientation information and distance information of the external device with respect to the electronic device may be updated based on the re-detected plurality of distances.
Here, the steps of re-detecting the plurality of distances of the plurality of detection points from the external device and determining the azimuth information and the distance information based on the re-detected plurality of distances are similar to the steps described above with reference to fig. 1, and a repetitive description thereof will not be made.
According to embodiments of the present disclosure, in addition to movement of at least one of the electronic device and the external device, the azimuth information and distance information of the external device relative to the electronic device may be updated in response to other events, such as: performing the updating step at predetermined time intervals; performing the positioning update operation for a specific external device in response to the movement distance of at least one of the electronic device and the external device satisfying a predetermined condition; or performing the positioning update operation for a specific external device in response to a user's request.
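The update triggers listed above can be sketched as a simple decision function. This is an illustrative sketch only: the event names, threshold, and interval values are hypothetical, not specified by the disclosure.

```python
import time

def should_update(event, moved_distance=0.0, threshold=0.5,
                  last_update=0.0, interval=1.0, now=None):
    """Decide whether to re-run positioning: on movement beyond a distance
    threshold, at predetermined time intervals, or on an explicit user request."""
    now = time.monotonic() if now is None else now
    if event == "user_request":
        return True                              # always honor a user request
    if event == "movement":
        return moved_distance >= threshold       # movement satisfies the condition
    if event == "timer":
        return now - last_update >= interval     # predetermined time interval elapsed
    return False
```

When this returns True, the distances are re-detected and the azimuth and distance information recomputed as in the steps of fig. 1.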
According to the embodiments of the present disclosure, when the position information and the distance information of the external device with respect to the electronic device are updated, the related information of the external device may be updated accordingly to provide information feedback about the external device in real time.
Fig. 9 is a diagram illustrating an electronic apparatus for locating an external device according to an embodiment of the present disclosure.
Referring to fig. 9, an electronic device 900 may include a detection module 901 and a positioning module 902.
According to an embodiment of the present disclosure, the detection module 901 may be configured to: at a plurality of detection points of the electronic device 900, a plurality of distances from the external device are detected, wherein at least four detection points among the plurality of detection points are not coplanar. That is, the detection module 901 may perform an operation corresponding to step S101 described above with reference to fig. 1, and detailed description thereof will not be repeated here.
According to an embodiment of the present disclosure, the positioning module 902 may be configured to: determine azimuth information and distance information of the external device relative to the electronic device 900 based on the detected plurality of distances. That is, the positioning module 902 may perform the operation corresponding to step S102 described above with reference to fig. 1, and detailed description thereof will not be repeated here.
The detection module 901 may be configured to detect a plurality of distances of a plurality of detection points from an external device at a plurality of detection points of the electronic device 900 by: by using Ultra Wideband (UWB) technology, the distance to the external device is detected at each of a plurality of detection points, respectively.
The positioning module 902 may be configured to determine position information of the external device relative to the electronic device 900 based on the detected plurality of distances by: based on the plurality of distances input, position information of the external device with respect to the electronic device 900 is obtained through the neural network model, wherein the position information includes horizontal view angle information and vertical view angle information.
According to embodiments of the present disclosure, the electronic device 900 may also include a model training module.
According to embodiments of the present disclosure, the model training module may be configured to pre-train the neural network model by using training data related to a plurality of detection points.
According to an embodiment of the present disclosure, the model training module is configured to obtain training data by: creating a space rectangular coordinate system with an estimated central point of the electronic device 900 as an origin; calculating reference azimuth information and distance information of each sample point relative to a plurality of detection points; correcting the coordinate value of the origin of the space rectangular coordinate system based on the calculated reference azimuth information and distance information; recalculating azimuth information of each sample point relative to a plurality of detection points based on the corrected coordinate values of the origin point; the recalculated position information and distance information for each sample point are determined as training data.
According to embodiments of the present disclosure, the electronic device 900 may also include a target recognition module.
According to embodiments of the present disclosure, the object recognition module may be configured to: identifying whether the external device is within a predetermined range for the electronic device 900 based on the determined orientation information; based on the result of the recognition, related information about the external device is provided.
According to an embodiment of the present disclosure, the related information of the external device includes at least one of the following information: the orientation information and distance information with respect to the electronic device 900, guide information guiding the electronic device 900 toward an external device, information of the external device combined with environment information, and control information for controlling the external device.
According to embodiments of the present disclosure, the electronic device 900 may further include an information update module.
According to an embodiment of the present disclosure, the information update module may be configured to: re-detecting a plurality of distances of the plurality of detection points from the external device in response to movement of at least one of the electronic device 900 and the external device; based on the re-detected plurality of distances, the position information and distance information of the external device with respect to the electronic device 900 are updated.
The specific manner in which the respective modules of the electronic device 900 in the above embodiments perform operations has been described in detail in the embodiments of the related methods, and will not be described in detail herein.
Further, it should be understood that various modules in electronic device 900 according to embodiments of the present disclosure may be implemented as hardware components and/or software components. The individual units may be implemented, for example, using a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC), depending on the processing performed by the defined individual modules.
Fig. 10 is a diagram illustrating a positioning system according to an embodiment of the present disclosure.
Referring to fig. 10, a positioning system 1000 may include one or more external devices 1010 and an electronic device 1020 for positioning the one or more external devices 1010.
According to an embodiment of the present disclosure, each of the one or more external devices 1010 is capable of communicating with the electronic device 1020.
According to an embodiment of the present disclosure, the electronic device 1020 is used to perform the method for positioning an external device as described above with reference to fig. 1 to 8, or may correspond to the electronic device 900 as described above with reference to fig. 9.
According to embodiments of the present disclosure, each of the one or more external devices 1010 may correspond to an external device in the various embodiments described above.
Examples of application scenarios of embodiments of the present disclosure will be described below with reference to fig. 11 to 16.
Fig. 11 to 16 are diagrams illustrating examples of application scenarios of the embodiments of the present disclosure.
Referring to fig. 11, an example is shown in an application scenario of home IoT control. As shown in fig. 11, when the user wears the AR glasses, external devices (i.e., effective target objects) within a predetermined range (i.e., blue area) are a television, a cleaner, and a refrigerator according to the FoV of the AR glasses. In this case, information about an effective external device (name information about a television, a cleaner, and a refrigerator, as shown in fig. 11) may be provided to the user through AR glasses for further operation by the user. In this case, when the user moves, the predetermined range corresponding to the FoV is changed, the effective target object is changed, and the information corresponding to the effective target object is changed.
Referring to fig. 12, an example is shown in an application scenario of home IoT control. Fig. 12 is an example of an application scenario in which a user selects a specific external device in the application scenario of fig. 11. As shown in fig. 12, when the user selects "tv", control information for controlling an active external device ("tv") may be provided to the user through AR glasses, as shown in fig. 12, with respect to interactive operation options of the tv (such as changing play contents, adjusting volume, etc.).
According to an embodiment, related information of the external device may be provided by using an AR User Interface (UI).
According to an embodiment, in this case, when it is determined by the above-described updating step that the external device is not already within the predetermined range for the electronic device, the related information of the provided external device may be changed, for example, the display of the UI or the like may be stopped.
Referring to fig. 13 (a) and (b), an example in an application scenario in which an external device is located is shown. As shown in fig. 13, when the user wears AR glasses, all external devices within the FoV are searched according to the FoV of the AR glasses, regardless of whether the external devices are blocked. In this case, information about the valid external device (azimuth information and distance information with respect to the electronic device, as shown in fig. 13 (a) and (b)) may be provided to the user through AR glasses (e.g., UI) for further operations by the user.
Referring to fig. 14, an example of a navigation application scenario is shown. As shown in (a) of fig. 14, when the user wears the AR glasses, related information of an effective external device (e.g., azimuth information and distance information about the external device, selection information about the external device, etc.) is automatically provided according to communication with surrounding external devices. As shown in (b) of fig. 14, when the user wears the AR glasses, related information of an external device that is not within the predetermined range (e.g., azimuth information and distance information about the external device, guide information guiding the user toward the external device, etc.) is automatically provided according to communication with surrounding external devices.
Referring to fig. 15, an example of an application scenario of precise positioning while moving is shown. As shown in fig. 15, when the user wears the AR glasses, related information of an external device (such as azimuth information and distance information about a person holding the external device) is provided according to communication with that specific external device. In this case, as the user continuously approaches the target object, the azimuth information and the distance information are updated in real time, and the positioning accuracy improves accordingly.
Referring to fig. 16, an example of an application scenario of finding a target object in a dim environment is shown. As shown in fig. 16, when the user wears the AR glasses, even if the light in the current environment is too dim for image recognition to be used, the external device can still be accurately positioned according to communication with the external device, thereby achieving AR guidance.
As such, by using the method for locating an external device, the electronic device, and the locating system according to the embodiments of the present disclosure, spatial positioning can be achieved without fixed-point pre-installed routers. Furthermore, by introducing altitude positioning on the basis of two-dimensional navigation, the user's precise positioning needs (e.g., altitude navigation needs in buildings with height differences) can be satisfied. Furthermore, by using the ranging technique, positioning of and subsequent operations on the external device are achieved without depending on image recognition, regardless of whether the external device is occluded.
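The ranging-based spatial positioning summarized above can be illustrated with a standard least-squares multilateration sketch: given at least four non-coplanar detection points and their measured distances to the target, the sphere equations can be linearized (by subtracting the first equation from the others) and solved. The anchor geometry below is a hypothetical example, not a layout from the patent:

```python
import numpy as np


def locate(anchors, dists):
    """Least-squares position of a target from >= 4 non-coplanar anchors.

    Subtracting the first sphere equation from the others linearizes the
    system: 2 (p_i - p_0) . x = |p_i|^2 - |p_0|^2 - (d_i^2 - d_0^2).
    The anchors stand in for the detection points on the glasses.
    """
    p = np.asarray(anchors, dtype=float)
    d = np.asarray(dists, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (np.sum(p[1:] ** 2, axis=1) - np.sum(p[0] ** 2)
         - (d[1:] ** 2 - d[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x


# Four non-coplanar detection points (metres) and exact distances to a
# target at (1, 2, 3): the solver recovers the target position.
anchors = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
target = np.array([1.0, 2.0, 3.0])
dists = [np.linalg.norm(target - np.array(a)) for a in anchors]
print(locate(anchors, dists))  # ≈ [1. 2. 3.]
```

With noisy real measurements, additional detection points simply add rows to the overdetermined system, and the least-squares solve averages out the ranging error.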
There is also provided, in accordance with an embodiment of the present disclosure, an electronic device comprising at least one processor and at least one memory storing computer-executable instructions, wherein the computer-executable instructions, when executed by the at least one processor, cause the at least one processor to perform the method as described above.
According to an embodiment, the electronic device may be a PC, a tablet device, a personal digital assistant, a smart phone, or another device capable of executing the above instruction set. Here, the electronic device is not necessarily a single electronic device, but may be any device or aggregate of circuits capable of executing the above-described instructions (or instruction set) singly or in combination. The electronic device may also be part of an integrated control system or system manager, or may be configured as a portable electronic device that interfaces locally or remotely (e.g., via wireless transmission).
According to an embodiment, each of the at least one processor may include a Central Processing Unit (CPU), a Graphics Processor (GPU), a programmable logic device, a special purpose processor system, a microcontroller, or a microprocessor. According to an embodiment, each of the at least one processor may further comprise an analog processor, a digital processor, a microprocessor, a multi-core processor, a processor array, a network processor, or the like.
According to an embodiment, each of the at least one processor may execute instructions or code stored in the memory, wherein the memory may also store data. The instructions and data may also be transmitted and received over a network via a network interface device, which may employ any known transmission protocol.
According to an embodiment, each of the at least one processor may be integrated with a memory, e.g. RAM or flash memory arranged within an integrated circuit microprocessor or the like. According to embodiments, the memory may comprise a stand-alone device, such as an external disk drive, a storage array, or any other storage device usable by a database system. Each of the at least one processor and the memory may be operatively coupled or may communicate with each other, such as through an I/O port, network connection, etc., such that each of the at least one processor is capable of reading files stored in the memory.
According to an embodiment, the electronic device may also include a video display (such as a liquid crystal display) and a user interaction interface (such as a keyboard, mouse, touch input device, etc.). According to an embodiment, all components of the electronic device may be connected to each other via a bus and/or a network.
According to an embodiment of the present disclosure, there may also be provided a computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by at least one processor, cause the at least one processor to perform the method as described above.
According to an embodiment of the present disclosure, a computer-readable storage medium may include: read-only memory (ROM), programmable read-only memory (PROM), electrically erasable programmable read-only memory (EEPROM), random-access memory (RAM), dynamic random-access memory (DRAM), static random-access memory (SRAM), flash memory, non-volatile memory, CD-ROM, CD-R, CD+R, CD-RW, CD+RW, DVD-ROM, DVD-R, DVD+R, DVD-RW, DVD+RW, DVD-RAM, BD-ROM, BD-R, BD-R LTH, BD-RE, Blu-ray or optical disk storage, hard disk drives (HDD), solid-state drives (SSD), card-type memories (such as multimedia cards, Secure Digital (SD) cards, or eXtreme Digital (XD) cards), magnetic tapes, floppy disks, magneto-optical data storage devices, hard disks, solid-state disks, and any other device configured to store instructions and any associated data, data files, and data structures in a non-transitory manner and to provide the instructions and any associated data, data files, and data structures to a processor or computer so that the processor or computer can execute the instructions. The instructions in the computer-readable storage medium described above may be run in an environment deployed in an electronic device such as a client, a host, a proxy, or a server. According to an embodiment, the instructions and any associated data, data files, and data structures may be distributed across networked computer systems so that they are stored, accessed, and executed in a distributed manner by one or more processors or computers.
According to an embodiment of the present disclosure, there may also be provided computer software, wherein instructions in the computer software are executable by at least one processor to perform the method as described above.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the claims.

Claims (17)

1. A method for locating an external device, comprising:
detecting, at a plurality of detection points of an electronic device, a plurality of distances of the plurality of detection points from the external device, wherein at least four detection points among the plurality of detection points are not coplanar;
determining, based on the detected plurality of distances, azimuth information and distance information of the external device relative to the electronic device.
2. The method of claim 1, wherein detecting, at a plurality of detection points of an electronic device, a plurality of distances of the plurality of detection points from the external device comprises:
detecting the distance from the external device at each of the plurality of detection points, respectively, by using ultra-wideband (UWB) technology.
3. The method of claim 1, wherein determining the azimuth information of the external device relative to the electronic device based on the detected plurality of distances comprises:
inputting the detected plurality of distances into a neural network model to obtain the azimuth information of the external device relative to the electronic device, wherein the azimuth information comprises horizontal view angle information and vertical view angle information.
4. The method of claim 3, wherein the neural network model is pre-trained by using training data associated with the plurality of detection points,
wherein the training data is obtained by:
creating a spatial rectangular coordinate system with an estimated central point of the electronic device as the origin;
calculating reference azimuth information and distance information of each sample point relative to the plurality of detection points, wherein the sample points are vertices of a plurality of unit cubes into which the space of the spatial rectangular coordinate system is divided;
correcting the coordinate value of the origin of the spatial rectangular coordinate system based on the calculated reference azimuth information and distance information;
recalculating the azimuth information of each sample point relative to the plurality of detection points based on the corrected coordinate value of the origin; and
determining the recalculated azimuth information and the distance information of each sample point as the training data.
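The training-data generation steps above can be sketched as follows; the grid extent, step size, and detection-point coordinates are illustrative assumptions, and the origin-correction step is omitted for brevity:

```python
import itertools
import math

import numpy as np


def sample_points(extent=2.0, step=1.0):
    """Vertices of the unit cubes tiling a cube of half-width `extent`
    centred at the origin (the estimated centre point of the device).
    Extent and step are illustrative assumptions."""
    axis = np.arange(-extent, extent + step, step)
    return np.array(list(itertools.product(axis, axis, axis)))


def label(point, detection_points):
    """Reference label for one sample point: its distance to every
    detection point, plus its azimuth (horizontal and vertical angles,
    in degrees) as seen from the origin."""
    dists = [float(np.linalg.norm(point - p)) for p in detection_points]
    x, y, z = point
    horizontal = math.degrees(math.atan2(y, x))
    vertical = math.degrees(math.atan2(z, math.hypot(x, y)))
    return dists, (horizontal, vertical)


# Hypothetical detection-point layout on the glasses frame (metres).
detection_points = np.array([[0.07, 0.0, 0.0], [-0.07, 0.0, 0.0],
                             [0.0, 0.02, 0.01], [0.0, -0.02, 0.01]])
training = [label(p, detection_points) for p in sample_points()]
```

Each `(distances, angles)` pair then serves as one input/target example for pre-training the network of claim 3.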
5. The method of claim 1, further comprising:
identifying, based on the determined azimuth information, whether the external device is within a predetermined range for the electronic device;
providing related information about the external device based on a result of the identifying.
6. The method of claim 5, wherein the related information about the external device includes at least one of: azimuth information and distance information relative to the electronic device, guide information for guiding the electronic device toward the external device, information of the external device combined with environment information, and control information for controlling the external device.
7. The method of any of claims 1 to 6, further comprising:
re-detecting, in response to movement of at least one of the electronic device and the external device, a plurality of distances of the plurality of detection points from the external device;
updating, based on the re-detected plurality of distances, the azimuth information and the distance information of the external device relative to the electronic device.
8. An electronic device for locating an external device, wherein the electronic device comprises:
a detection module configured to: detecting, at a plurality of detection points of the electronic device, a plurality of distances of the plurality of detection points from the external device, wherein at least four detection points among the plurality of detection points are not coplanar;
a positioning module configured to: determine, based on the detected plurality of distances, azimuth information and distance information of the external device relative to the electronic device.
9. The electronic device of claim 8, wherein the detection module is configured to detect a plurality of distances of a plurality of detection points from the external device at the plurality of detection points of the electronic device by:
detecting the distance from the external device at each of the plurality of detection points, respectively, by using ultra-wideband (UWB) technology.
10. The electronic device of claim 8, wherein the positioning module is configured to determine the azimuth information of the external device relative to the electronic device based on the detected plurality of distances by:
inputting the detected plurality of distances into a neural network model to obtain the azimuth information of the external device relative to the electronic device, wherein the azimuth information comprises horizontal view angle information and vertical view angle information.
11. The electronic device of claim 10, further comprising a model training module, wherein the model training module is configured to pre-train the neural network model by using training data associated with the plurality of detection points,
wherein the model training module is configured to obtain the training data by:
creating a spatial rectangular coordinate system with an estimated central point of the electronic device as the origin;
calculating reference azimuth information and distance information of each sample point relative to the plurality of detection points;
correcting the coordinate value of the origin of the spatial rectangular coordinate system based on the calculated reference azimuth information and distance information;
recalculating the azimuth information of each sample point relative to the plurality of detection points based on the corrected coordinate value of the origin; and
determining the recalculated azimuth information and the distance information of each sample point as the training data.
12. The electronic device of claim 8, further comprising a target identification module, wherein the target identification module is configured to:
identify, based on the determined azimuth information, whether the external device is within a predetermined range for the electronic device; and
provide related information about the external device based on a result of the identification.
13. The electronic device of claim 12, wherein the related information about the external device includes at least one of: azimuth information and distance information relative to the electronic device, guide information for guiding the electronic device toward the external device, information of the external device combined with environment information, and control information for controlling the external device.
14. The electronic device of any of claims 8-13, further comprising an information update module, wherein the information update module is configured to:
re-detect, in response to movement of at least one of the electronic device and the external device, a plurality of distances of the plurality of detection points from the external device; and
update, based on the re-detected plurality of distances, the azimuth information and the distance information of the external device relative to the electronic device.
15. A positioning system, wherein the positioning system comprises one or more external devices and an electronic device for positioning the one or more external devices,
wherein the one or more external devices are capable of communicating with the electronic device, and the electronic device performs the method of any one of claims 1 to 7.
16. An electronic device, comprising:
at least one processor;
at least one memory storing computer-executable instructions,
wherein the computer executable instructions, when executed by the at least one processor, cause the at least one processor to perform the method of any one of claims 1 to 7.
17. A computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by at least one processor, cause the at least one processor to perform the method of any one of claims 1 to 7.
CN202311226537.5A 2023-09-21 2023-09-21 Method for positioning external device, electronic device and positioning system Pending CN117289205A (en)

Publications (1)

Publication Number Publication Date
CN117289205A true CN117289205A (en) 2023-12-26



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination