CN115103439A - Ultra-wideband visual auxiliary positioning method and device and storage medium - Google Patents



Publication number
CN115103439A
Authority
CN
China
Prior art keywords
position information
ultra-wideband
target object
information
Prior art date
Legal status: Pending
Application number
CN202210635794.3A
Other languages
Chinese (zh)
Inventor
张超杰
张锐
管京龙
Current Assignee
Beijing Ironman Technology Co ltd
Original Assignee
Beijing Ironman Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Ironman Technology Co ltd filed Critical Beijing Ironman Technology Co ltd
Priority to CN202210635794.3A
Publication of CN115103439A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H04W 64/006 Locating users or terminals or network equipment for network management purposes, e.g. mobility management with additional information processing, e.g. for direction or speed determination
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/02 Services making use of location information
    • H04W 4/025 Services making use of location information using location based information parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/33 Services specially adapted for particular environments, situations or purposes for indoor environments, e.g. buildings
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 88/00 Devices specially adapted for wireless communication networks, e.g. terminals, base stations or access point devices
    • H04W 88/08 Access point devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses an ultra-wideband visual-assisted positioning method, device, and storage medium. The method comprises the following steps: performing initialization calibration on an ultra-wideband positioning base station and on a shooting device; acquiring first position information of a target object using the ultra-wideband positioning base station, and acquiring second position information of the target object using the shooting device; and fusing the first position information and the second position information to obtain target position information. The invention solves the technical problems in the prior art that the positioning accuracy has a large error range and the position information of the target object is inaccurate.

Description

Ultra-wideband visual auxiliary positioning method, device and storage medium
Technical Field
The invention relates to the technical field of assisted positioning, and in particular to an ultra-wideband visual-assisted positioning method and device and a storage medium.
Background
At present, the ultra-wideband (UWB) positioning technology is widely applied in indoor positioning scenes, particularly indoor sand-table scenes. It transmits data by sending and receiving extremely narrow pulses at the nanosecond level or below, measures the distance between nodes from the time of flight of the signal between two asynchronous transceivers, and thereby determines the position information of a target object.
However, in actual deployment, interference factors such as signal crosstalk and large-area occlusion in the scene enlarge the error range of the positioning accuracy of the ultra-wideband positioning technology, so that the position information of the target object jumps, is unstable, and is inaccurate.
In view of the above problems, no effective solution has been proposed.
Disclosure of Invention
The embodiment of the invention provides an ultra-wideband vision-assisted positioning method, an ultra-wideband vision-assisted positioning device and a storage medium, which are used for at least solving the technical problems that the positioning method in the prior art has a large error range of positioning accuracy and the position information of a target object is inaccurate.
According to an aspect of the embodiments of the present invention, there is provided an ultra-wideband visual-assisted positioning method, including: carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on the shooting equipment; acquiring first position information of a target object by adopting the ultra-wideband positioning base station, and acquiring second position information of the target object by adopting the shooting equipment; and performing fusion processing on the first position information and the second position information to obtain target position information.
Optionally, before the initial calibration of the ultra-wideband positioning base station and the initial calibration of the shooting device, the method further includes: acquiring scene information in a target scene; determining the installation position of the ultra-wideband positioning base station based on the scene information; and determining a shooting area of the shooting device based on the scene information.
Optionally, the initializing and calibrating the ultra-wideband positioning base station and the initializing and calibrating the shooting device include: carrying out error correction processing on the ultra-wideband positioning base station; and determining the shooting height and the shooting angle of the shooting equipment based on the shooting area.
Optionally, the obtaining of the first position information of the target object by using the ultra-wideband positioning base station includes: acquiring a plurality of distance information between a plurality of ultra-wideband positioning base stations and the target object; first position information of the target object is determined based on the plurality of distance information.
Optionally, the obtaining, by the shooting device, second position information of the target object includes: acquiring a plurality of target detection frames generated when a plurality of shooting devices shoot the target object; and determining second position information of the target object based on the sizes of the target detection frames and the center point offset distance.
Optionally, before the fusing the first location information and the second location information to obtain the target location information, the method further includes: acquiring historical first position information and historical second position information which are stored in a historical database; and screening the first position information based on the historical first position information, and screening the second position information based on the historical second position information.
Optionally, the fusing the first location information and the second location information to obtain the target location information includes: calculating average position information of the first position information and the second position information; the average position information is used as the target position information.
According to another aspect of the embodiments of the present invention, there is also provided an ultra-wideband visual assistance positioning apparatus, including: the calibration module is used for carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on the shooting equipment; the acquisition module is used for acquiring first position information of a target object by adopting the ultra-wideband positioning base station and acquiring second position information of the target object by adopting the shooting equipment; and the processing module is used for carrying out fusion processing on the first position information and the second position information to obtain target position information.
According to another aspect of the embodiments of the present invention, there is also provided a non-volatile storage medium storing a plurality of instructions adapted to be loaded by a processor to perform any one of the above ultra-wideband visual-assisted positioning methods.
According to another aspect of the embodiments of the present invention, there is also provided an electronic device, including a memory and a processor, where the memory stores a computer program, and the processor is configured to execute the computer program to perform any one of the above ultra-wideband visual aided location methods.
In the embodiment of the invention, the ultra-wideband positioning base station is initialized and calibrated, and the shooting equipment is initialized and calibrated; acquiring first position information of a target object by adopting the ultra-wideband positioning base station, and acquiring second position information of the target object by adopting the shooting equipment; the first position information and the second position information are fused to obtain target position information, and the purposes of slowing down jump, instability and inaccuracy of the position information of the target object through vision-assisted positioning of the shooting equipment are achieved, so that the technical effects of improving positioning accuracy and enhancing data stability are achieved, and the technical problems that the error range of the positioning accuracy is large and the position information of the target object is inaccurate in the positioning method in the prior art are solved.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of an ultra-wideband visual-assisted positioning method according to an embodiment of the invention;
FIG. 2 is a schematic overall flow chart of an alternative assisted positioning method according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an ultra-wideband visual-assisted positioning apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood by those skilled in the art, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
In accordance with an embodiment of the present invention, there is provided an embodiment of an ultra-wideband visual-assisted positioning method, it is noted that the steps illustrated in the flowchart of the drawings may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowchart, in some cases the steps illustrated or described may be performed in an order different than here.
Fig. 1 is a flowchart of an ultra-wideband visual-aided positioning method according to an embodiment of the present invention, as shown in fig. 1, the method includes the following steps:
step S102, carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on shooting equipment;
step S104, acquiring first position information of a target object by adopting the ultra-wideband positioning base station, and acquiring second position information of the target object by adopting the shooting equipment;
and step S106, fusing the first position information and the second position information to obtain target position information.
In an embodiment of the present invention, the execution subject of steps S102 to S106 is a positioning system that adds a vision-assisted positioning technology to the existing UWB positioning technology. As shown in the overall flow diagram of fig. 2, before the positioning system is used to obtain position information of a target object, the ultra-wideband positioning base station is initially calibrated, and the shooting device used for vision-assisted positioning is initially calibrated. After initialization calibration and installation of the positioning equipment are completed, first position information of the target object is acquired using the ultra-wideband positioning base station, and second position information is acquired using the shooting device. The obtained first position information and second position information are then fused to determine the target position information of the target object.
It should be noted that the application scenarios of the positioning system are not specifically limited; for example, it can be applied to indoor sand-table scenes and outdoor industrial scenes, and the target scene may be any of these. The number of ultra-wideband positioning base stations is not limited to one. Data are transmitted by sending and receiving extremely narrow pulses at the nanosecond level or below, the distance between nodes is measured from the signal time of flight between two asynchronous transceivers, and the two-dimensional or three-dimensional coordinates of the target object, i.e. its position information, are then determined using techniques such as trilateration and quadrilateration.
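The multilateration step described above can be sketched as follows. This is a minimal illustration under assumed anchor coordinates, not the patent's implementation: given ranges from four or more base stations at known positions, the sphere equations are linearized and solved by least squares.

```python
import numpy as np

def multilaterate(anchors, distances):
    """Estimate a 3D position from distances to known anchor points.

    Linearizes the sphere equations |x - a_i|^2 = d_i^2 by subtracting the
    first anchor's equation from the rest, then solves the resulting linear
    system with least squares. Needs at least 4 non-coplanar anchors.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    # 2 (a_i - a_0) . x = |a_i|^2 - |a_0|^2 - d_i^2 + d_0^2
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2)
         - d[1:] ** 2 + d[0] ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Four base stations at assumed positions; ranges simulated for a tag at (2, 1, 0.5)
anchors = [(0, 0, 3), (10, 0, 3), (0, 10, 3), (10, 10, 0)]
tag = np.array([2.0, 1.0, 0.5])
dists = [np.linalg.norm(np.array(a) - tag) for a in anchors]
print(multilaterate(anchors, dists))  # approximately [2, 1, 0.5]
```

With more than four base stations the same least-squares solve averages out individual ranging errors, which is why the text notes that multilateration with extra stations improves robustness.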
It should be noted that the shooting device is the camera used by the vision-assisted positioning technology. For example, the positioning system photographs the visible target object and generates a detection frame for it, and thereby determines the three-dimensional coordinates, i.e. position information, of the target object.
As an alternative embodiment, both UWB positioning and visual positioning carry certain errors in actual positioning. In the absence of obvious interference, position information can be acquired separately (by UWB positioning and by visual positioning) for a target object at the same position, and linear error balancing can be performed to finally determine the target position coordinates of the target object.
According to the embodiment of the invention, fusing the UWB positioning technology with the visual positioning technology makes the obtained position information more stable and more accurate, and the corresponding position curve smoother. Vision-assisted positioning by the shooting device mitigates the jumping, instability, and inaccuracy of the position information of the target object, thereby improving positioning accuracy and enhancing data stability, and solving the technical problems in the prior art that the positioning accuracy has a large error range and the position information of the target object is inaccurate.
In an optional embodiment, before performing the initial calibration on the ultra-wideband positioning base station and performing the initial calibration on the shooting device, the method further includes: acquiring scene information in a target scene; determining the installation position of the ultra-wideband positioning base station based on the scene information; and determining a shooting area of the shooting device based on the scene information.
In the embodiment of the invention, before the ultra-wideband positioning base station is used to measure the actual position of the target object, the UWB base-station modules and cameras in the actual scene must be reasonably deployed based on the scene information of the target scene: the UWB base-station modules should be placed so as to minimize the interference and influence of the sand-table material, and the cameras should be deployed evenly across all visible areas of the sand table to avoid shooting blind spots.
In the embodiment of the present invention, before a target object is photographed by a shooting device using the visual positioning technology, the camera needs to be initialized and calibrated, for example by adjusting its height and angle to be as consistent as possible with its factory settings, so that target distances within the visible range can be determined accurately.
It should be noted that the scene information in the target scene includes, but is not limited to: obstacle position, obstacle height, field inclination, etc.
In an optional embodiment, the performing initialization calibration on the ultra-wideband positioning base station and performing initialization calibration on the shooting device includes: carrying out error correction processing on the ultra-wideband positioning base station; and determining the shooting height and the shooting angle of the shooting equipment based on the shooting area.
In the embodiment of the invention, when the ultra-wideband positioning base station measures the actual position of the target object, there is a linear error y = ax + b between the measured distance and the actual distance. The scene information of the target scene therefore needs to be obtained first, and the values of a and b in the linear error determined from it, so that the error can be corrected; at the same time, the installation height and angle of the camera are determined based on the scene information of the target scene.
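As a sketch of how the a and b of the linear error y = ax + b might be estimated, one can fit ranging measurements against known reference distances by least squares and then invert the fit to correct future readings. The calibration pairs below are hypothetical, not values from the patent.

```python
import numpy as np

# Hypothetical calibration pairs: true distances to surveyed points vs. the
# distances reported by the UWB base station at those points.
true_d = np.array([1.0, 2.0, 4.0, 8.0])
measured = np.array([1.15, 2.25, 4.45, 8.85])  # behaves like 1.1 * true + 0.05

# Fit measured = a * true + b, then invert to correct raw measurements.
a, b = np.polyfit(true_d, measured, 1)

def correct(measured_distance):
    """Remove the fitted linear error from a raw UWB range measurement."""
    return (measured_distance - b) / a

print(round(correct(5.55), 3))  # 5.0
```

In a deployment the pairs would be re-collected per scene, since the patent ties a and b to the scene information (materials, occlusion) of the target scene.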
In an optional embodiment, the obtaining, by using the ultra-wideband positioning base station, first position information of a target object includes: acquiring a plurality of distance information between a plurality of ultra-wideband positioning base stations and the target object; first position information of the target object is determined based on the plurality of distance information.
In this embodiment of the present invention, several of the above ultra-wideband positioning base stations may be used to obtain the first position information of the target object. For example, four or more base stations can each measure distance information to the target object, and the three-dimensional coordinates of the target object are then determined by quadrilateration or multilateration to obtain the first position information.
In an optional embodiment, the acquiring, by using the shooting device, the second position information of the target object includes: acquiring a plurality of target detection frames generated when a plurality of shooting devices shoot the target object; and determining second position information of the target object based on the sizes of the plurality of target detection frames and the central point offset distance.
In an embodiment of the present invention, the target object may be photographed by multiple shooting devices, and the three-dimensional coordinates are determined from the target detection frames generated at the same time, using the size of each detection frame (which reflects distance) and the offsets of its center point to the left, right, up, and down, to obtain the second position information.
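One way such a computation could look, under a simple pinhole-camera model, is sketched below. The focal length, image size, and real target width are illustrative assumptions, not values from the patent; a deployed system would use calibrated camera intrinsics.

```python
def box_to_camera_xyz(box_w_px, cx_px, cy_px,
                      real_w_m, focal_px, img_w_px, img_h_px):
    """Estimate an object's camera-frame position from its detection box.

    Pinhole model: depth Z = f * W_real / w_pixels; the lateral offsets
    follow from the box centre's offset from the image centre (taken here
    as the principal point). All parameters are illustrative.
    """
    z = focal_px * real_w_m / box_w_px            # depth from apparent size
    x = (cx_px - img_w_px / 2.0) * z / focal_px   # metres right of optical axis
    y = (cy_px - img_h_px / 2.0) * z / focal_px   # metres below optical axis
    return x, y, z

# A 0.5 m wide target appearing 100 px wide, box centre 160 px right of centre
print(box_to_camera_xyz(100, 1120, 540, 0.5, 800, 1920, 1080))  # (0.8, 0.0, 4.0)
```

Positions from several cameras could then be transformed into a common scene frame and combined, matching the text's use of multiple shooting devices.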
It should be noted that the target object may be a fixed detection object or a moving detection object, and therefore, the first position information and the second position information may be position information in one detection cycle.
In an optional embodiment, before the fusing the first location information and the second location information to obtain the target location information, the method further includes: acquiring historical first position information and historical second position information stored in a historical database; the first location information is filtered based on the historical first location information, and the second location information is filtered based on the historical second location information.
In the embodiment of the invention, when a target is positioned in an actual scene, ranging between the UWB base station and the target object can suffer sudden increases in transmission delay owing to occasional signal interference and occlusion by the sand-table material, making the positioning information extremely unstable and causing data jumps. Likewise, accurate detection of the target object is a prerequisite for the camera to position it; interference from lighting or similar-looking objects in the scene may cause the camera to misrecognize the target, again making the positioning information extremely unstable and causing data jumps.
Optionally, a circular buffer, i.e. the above-mentioned historical database, may be defined to store recent ranging information for the target.
As an optional embodiment, data with large offsets are removed, the remaining position information is averaged, and the related three-dimensional coordinates are then computed from the averaged values, making the positioning information more stable.
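The circular buffer with outlier rejection and averaging described above might be sketched per coordinate axis as follows; the window size and deviation threshold are illustrative, not values from the patent.

```python
from collections import deque
import statistics

class PositionFilter:
    """Keep the last N readings in a ring buffer, drop outliers, average the rest."""

    def __init__(self, maxlen=10, threshold=0.3):
        self.buf = deque(maxlen=maxlen)   # circular buffer of recent readings
        self.threshold = threshold        # max deviation from the median (metres)

    def update(self, reading):
        self.buf.append(reading)
        med = statistics.median(self.buf)
        # Discard readings that jumped far from the median, then average.
        kept = [r for r in self.buf if abs(r - med) <= self.threshold] or [med]
        return sum(kept) / len(kept)

f = PositionFilter()
for r in [2.0, 2.1, 1.9, 9.7, 2.0]:   # 9.7 is a jump caused by interference
    smoothed = f.update(r)
print(smoothed)  # 2.0 -- the jump is rejected
```

One such filter per axis (x, y, z) and per source (UWB, vision) would suppress the jumps before the two sources are fused.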
In an optional embodiment, the fusing the first location information and the second location information to obtain the target location information includes: calculating average position information of the first position information and the second position information; the average position information is used as the target position information.
In the embodiment of the invention, both UWB positioning and visual positioning carry certain errors in actual positioning; in the absence of obvious interference, the error of each is within 10 cm. Therefore, linear error balancing can be performed on the position information acquired separately (by UWB positioning and by visual positioning) for a target object at the same position. For example, if the three-dimensional coordinates acquired by UWB are (X1, Y1, Z1) and the three-dimensional coordinates acquired by visual positioning are (X2, Y2, Z2), the actually adopted three-dimensional coordinates are ((X1+X2)/2, (Y1+Y2)/2, (Z1+Z2)/2).
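The equal-weight fusion above is a component-wise average of the two coordinate triples; a minimal sketch (rounding added only for display):

```python
def fuse(uwb_xyz, vision_xyz):
    """Average UWB and vision coordinates component-wise (equal-weight fusion)."""
    return tuple(round((u + v) / 2.0, 3) for u, v in zip(uwb_xyz, vision_xyz))

# Both sources within 10 cm of the true position, per the text's assumption
print(fuse((2.00, 1.10, 0.50), (2.06, 1.04, 0.54)))  # (2.03, 1.07, 0.52)
```

An unweighted mean is appropriate only because the text treats the two sources as comparably accurate; if one source were known to be noisier, a weighted average would be the natural extension.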
Through the above steps, the position information after data fusion is more stable and its corresponding curve is smoother, which improves the positioning accuracy of the target object and solves the technical problems in the prior art that the positioning accuracy has a large error range and the position information of the target object is inaccurate.
Example 2
According to an embodiment of the present invention, there is further provided an embodiment of an apparatus for implementing the above ultra-wideband visual aided location method, and fig. 3 is a schematic structural diagram of an ultra-wideband visual aided location apparatus according to an embodiment of the present invention, as shown in fig. 3, the apparatus includes: a calibration module 30, an acquisition module 32, and a processing module 34, wherein:
the calibration module 30 is used for performing initialization calibration on the ultra-wideband positioning base station and performing initialization calibration on the shooting equipment;
an obtaining module 32, configured to obtain first position information of a target object by using the ultra-wideband positioning base station, and obtain second position information of the target object by using the shooting device;
and a processing module 34, configured to perform fusion processing on the first location information and the second location information to obtain target location information.
It should be noted here that the calibration module 30, the obtaining module 32, and the processing module 34 correspond to steps S102 to S106 in embodiment 1, and the three modules are the same as the corresponding steps in the implementation example and the application scenario, but are not limited to the disclosure in embodiment 1.
It should be noted that, reference may be made to the relevant description in embodiment 1 for a preferred implementation of this embodiment, and details are not described here.
According to an embodiment of the present invention, there is also provided an embodiment of a computer-readable storage medium. Optionally, in this embodiment, the computer-readable storage medium may be used to store the program code executed by the ultra-wideband visual aided positioning method provided in embodiment 1.
Optionally, in this embodiment, the computer-readable storage medium may be located in any one of computer terminals in a computer terminal group in a computer network, or in any one of mobile terminals in a mobile terminal group.
Optionally, in this embodiment, the computer readable storage medium is configured to store program codes for performing the following steps: carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on the shooting equipment; acquiring first position information of a target object by adopting the ultra-wideband positioning base station, and acquiring second position information of the target object by adopting the shooting equipment; and performing fusion processing on the first position information and the second position information to obtain target position information.
Optionally, the computer-readable storage medium is configured to store program codes for performing the following steps: acquiring scene information in a target scene; determining the installation position of the ultra-wideband positioning base station based on the scene information; and determining a shooting area of the shooting device based on the scene information.
Optionally, the computer-readable storage medium is configured to store program codes for performing the following steps: carrying out error correction processing on the ultra-wideband positioning base station; and determining the shooting height and the shooting angle of the shooting equipment based on the shooting area.
Optionally, the computer-readable storage medium is configured to store program codes for performing the following steps: acquiring a plurality of distance information between a plurality of ultra-wideband positioning base stations and the target object; first position information of the target object is determined based on the plurality of distance information.
Optionally, the computer-readable storage medium is configured to store program codes for performing the following steps: acquiring a plurality of target detection frames generated when a plurality of shooting devices shoot the target object; and determining second position information of the target object based on the sizes of the target detection frames and the center point offset distance.
Optionally, the computer-readable storage medium is configured to store program codes for performing the following steps: acquiring historical first position information and historical second position information stored in a historical database; the first location information is filtered based on the historical first location information, and the second location information is filtered based on the historical second location information.
Optionally, the computer-readable storage medium is configured to store program codes for performing the following steps: calculating average position information of the first position information and the second position information; the average position information is used as the target position information.
Embodiments of a processor are also provided according to embodiments of the present invention. Optionally, in this embodiment, the processor may be configured to run the program code for performing the ultra-wideband visual-assisted positioning method provided in Embodiment 1.
An embodiment of the present application provides an electronic device, where the device includes a processor, a memory, and a program that is stored in the memory and can be run on the processor, and the processor implements the following steps when executing the program: carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on the shooting equipment; acquiring first position information of a target object by adopting the ultra-wideband positioning base station, and acquiring second position information of the target object by adopting the shooting equipment; and fusing the first position information and the second position information to obtain target position information.
The present application further provides a computer program product adapted to execute, when run on a data processing device, a program initialized with the following method steps: performing initialization calibration on the ultra-wideband positioning base station and on the shooting equipment; acquiring first position information of a target object using the ultra-wideband positioning base station, and acquiring second position information of the target object using the shooting equipment; and fusing the first position information and the second position information to obtain target position information.
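The first-position step in the method steps above can be sketched, under assumptions, as linearized least-squares trilateration from ranges to three or more UWB base stations at known positions. The patent states only that the first position information is determined from a plurality of distance measurements, so the specific solver below is illustrative:

```python
def uwb_first_position(anchors, distances):
    """Least-squares trilateration: recover (x, y) from measured ranges
    to N >= 3 UWB base stations at known 2-D positions.

    Subtracting the first range equation from the others linearizes the
    system to A p = b; the normal equations (A^T A) p = A^T b are then
    solved by Cramer's rule, so no external library is needed.
    """
    (x0, y0), d0 = anchors[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        rows.append((2.0 * (xi - x0), 2.0 * (yi - y0)))
        rhs.append(d0 ** 2 - di ** 2 + xi ** 2 + yi ** 2 - x0 ** 2 - y0 ** 2)
    saa = sum(a * a for a, _ in rows)
    sab = sum(a * b for a, b in rows)
    sbb = sum(b * b for _, b in rows)
    sar = sum(a * r for (a, _), r in zip(rows, rhs))
    sbr = sum(b * r for (_, b), r in zip(rows, rhs))
    det = saa * sbb - sab * sab
    return ((sar * sbb - sbr * sab) / det,
            (sbr * saa - sar * sab) / det)
```

With base stations at (0, 0), (10, 0), and (0, 10) and ranges measured to a target at (3, 4), the solver returns (3.0, 4.0) up to floating-point error.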
The serial numbers of the above embodiments of the present invention are for description only and do not represent their relative merits.
In the above embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in a given embodiment, reference may be made to the related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed technology may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units is only a logical functional division, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
The integrated unit, if implemented as a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various improvements and modifications without departing from the principle of the present invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. An ultra-wideband vision-assisted positioning method, comprising:
carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on the shooting equipment;
acquiring first position information of a target object by adopting the ultra-wideband positioning base station, and acquiring second position information of the target object by adopting the shooting equipment;
and fusing the first position information and the second position information to obtain target position information.
2. The method of claim 1, wherein before the initial calibration of the ultra-wideband positioning base station and the initial calibration of the photographing device, the method further comprises:
acquiring scene information in a target scene;
determining an installation position of the ultra-wideband positioning base station based on the scene information;
determining a photographing region of the photographing apparatus based on the scene information.
3. The method of claim 2, wherein the initializing calibration of the ultra-wideband positioning base station and the initializing calibration of the photographing device comprise:
carrying out error correction processing on the ultra-wideband positioning base station;
and determining the shooting height and the shooting angle of the shooting device based on the shooting area.
4. The method of claim 1, wherein obtaining the first location information of the target object using the ultra-wideband positioning base station comprises:
acquiring a plurality of distance information between a plurality of ultra-wideband positioning base stations and the target object;
first position information of the target object is determined based on the plurality of distance information.
5. The method according to claim 1, wherein acquiring the second position information of the target object with the photographing apparatus comprises:
acquiring a plurality of target detection frames generated when a plurality of shooting devices shoot the target object;
determining second position information of the target object based on the sizes of the plurality of target detection frames and the center point offset distance.
6. The method according to claim 1, wherein before the fusing the first location information and the second location information to obtain the target location information, the method further comprises:
acquiring historical first position information and historical second position information stored in a historical database;
the first location information is filtered based on the historical first location information, and the second location information is filtered based on the historical second location information.
7. The method according to any one of claims 1 to 6, wherein the fusing the first position information and the second position information to obtain target position information includes:
calculating average position information of the first position information and the second position information;
and taking the average position information as the target position information.
8. An ultra-wideband vision-assisted positioning device, comprising:
the calibration module is used for carrying out initialization calibration on the ultra-wideband positioning base station and carrying out initialization calibration on the shooting equipment;
the acquisition module is used for acquiring first position information of a target object by adopting the ultra-wideband positioning base station and acquiring second position information of the target object by adopting the shooting equipment;
and the processing module is used for carrying out fusion processing on the first position information and the second position information to obtain target position information.
9. A non-volatile storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to perform the ultra-wideband visual assisted positioning method of any of claims 1 to 7.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is configured to execute the computer program to perform the ultra-wideband visual assisted positioning method of any of claims 1 to 7.
CN202210635794.3A 2022-06-07 2022-06-07 Ultra-wideband visual auxiliary positioning method and device and storage medium Pending CN115103439A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210635794.3A CN115103439A (en) 2022-06-07 2022-06-07 Ultra-wideband visual auxiliary positioning method and device and storage medium


Publications (1)

Publication Number Publication Date
CN115103439A true CN115103439A (en) 2022-09-23

Family

ID=83288032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210635794.3A Pending CN115103439A (en) 2022-06-07 2022-06-07 Ultra-wideband visual auxiliary positioning method and device and storage medium

Country Status (1)

Country Link
CN (1) CN115103439A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116193581A (en) * 2023-05-04 2023-05-30 广东工业大学 Indoor unmanned aerial vehicle hybrid positioning method and system based on member-collecting filtering

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108550234A (en) * 2018-04-24 2018-09-18 成都恒高科技有限公司 Tag match, fence boundary management method, device and the storage medium of Dual base stations
CN110657803A (en) * 2018-06-28 2020-01-07 深圳市优必选科技有限公司 Robot positioning method, device and storage device
CN111225440A (en) * 2019-11-22 2020-06-02 三一重工股份有限公司 Cooperative positioning method and device and electronic equipment
CN111413970A (en) * 2020-03-18 2020-07-14 天津大学 Ultra-wideband and vision integrated indoor robot positioning and autonomous navigation method
CN113099529A (en) * 2021-03-29 2021-07-09 千寻位置网络(浙江)有限公司 Indoor vehicle navigation method, vehicle-mounted terminal, field terminal server and system
CN113516708A (en) * 2021-05-25 2021-10-19 中国矿业大学 Power transmission line inspection unmanned aerial vehicle accurate positioning system and method based on image recognition and UWB positioning fusion


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LI, Yujie (李宇杰): "A survey of vision-based 3D object detection algorithms", Computer Engineering and Applications, vol. 2020, no. 56, 12 October 2019 (2019-10-12), pages 2-4 *
LI, Kang (李康): "Research on composite positioning methods for unmanned aerial vehicles in indoor environments", Master's Thesis Electronic Journals, 15 March 2019 (2019-03-15) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116193581A (en) * 2023-05-04 2023-05-30 广东工业大学 Indoor unmanned aerial vehicle hybrid positioning method and system based on member-collecting filtering
CN116193581B (en) * 2023-05-04 2023-08-04 广东工业大学 Indoor unmanned aerial vehicle hybrid positioning method and system based on member-collecting filtering

Similar Documents

Publication Publication Date Title
CN108226906B (en) A kind of scaling method, device and computer readable storage medium
CN111060948B (en) Positioning method, positioning device, helmet and computer readable storage medium
CN104613930B (en) Method and device for measuring distance as well as mobile terminal
CN111352069B (en) Indoor positioning method, server, storage medium and program product
CN110868752B (en) Terminal positioning method and device
CN110012416B (en) User terminal positioning method and device
CN107980138A (en) A kind of false-alarm obstacle detection method and device
CN111080662A (en) Lane line extraction method and device and computer equipment
CN112816949B (en) Sensor calibration method and device, storage medium and calibration system
CN109658497B (en) Three-dimensional model reconstruction method and device
CN116958146B (en) Acquisition method and device of 3D point cloud and electronic device
CN113658263B (en) Visual scene-based electromagnetic interference source visual labeling method
WO2019165632A1 (en) Indoor positioning method, apparatus and equipment
CN115359130B (en) Radar and camera combined calibration method and device, electronic equipment and storage medium
CN115103439A (en) Ultra-wideband visual auxiliary positioning method and device and storage medium
CN110673092A (en) Ultra-wideband-based time-sharing positioning method, device and system
JP2019502115A (en) Positioning of mobile equipment
CN115953483A (en) Parameter calibration method and device, computer equipment and storage medium
CN115134741A (en) UWB base station anomaly detection method and electronic equipment
CN109218961B (en) Multi-station cooperative interference positioning method and system based on virtual nodes
CN112946612B (en) External parameter calibration method and device, electronic equipment and storage medium
CN113077523B (en) Calibration method, calibration device, computer equipment and storage medium
CN109141344A (en) A kind of method and system based on the accurate ranging of binocular camera
CN115022805A (en) UWB base station calibration method, device, electronic device and medium
CN114782496A (en) Object tracking method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination