CN109711398B - Sensor spatial relationship acquisition method and device and electronic equipment

Sensor spatial relationship acquisition method and device and electronic equipment

Info

Publication number
CN109711398B
CN109711398B
Authority
CN
China
Prior art keywords
sensor
track
spatial relationship
movement
relative spatial
Prior art date
Legal status
Active
Application number
CN201811653358.9A
Other languages
Chinese (zh)
Other versions
CN109711398A (en)
Inventor
杨大业 (Yang Daye)
宋建华 (Song Jianhua)
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201811653358.9A priority Critical patent/CN109711398B/en
Publication of CN109711398A publication Critical patent/CN109711398A/en
Application granted granted Critical
Publication of CN109711398B publication Critical patent/CN109711398B/en

Abstract

The present disclosure provides a sensor spatial relationship acquisition method, including: obtaining a first movement trajectory acquired by a first sensor and a second movement trajectory acquired by a second sensor; predicting a first extension trajectory of the first movement trajectory; and determining a relative spatial relationship between the first sensor and the second sensor according to the second movement trajectory and the first movement trajectory, at least based on the coincidence degree of the second movement trajectory and the first extension trajectory meeting a specific condition. The disclosure also provides a sensor spatial relationship acquisition apparatus, an electronic device and a computer-readable storage medium.

Description

Sensor spatial relationship acquisition method and device and electronic equipment
Technical Field
The disclosure relates to a sensor spatial relationship acquisition method and device and electronic equipment.
Background
Existing sensor networks can provide a large amount of useful information. In particular, when real-time sensor data is interfaced with artificial intelligence systems, such as face recognition and gait recognition, valuable information can be found quickly, and the relative positions of the sensors are key data for realizing this value. However, existing sensors such as cameras often carry no position information, because they were installed at different times for different applications: most systems were initially designed for local security monitoring or for special applications such as traffic violation photography. Recording camera positions manually wastes a large amount of manpower and material resources.
Disclosure of Invention
One aspect of the present disclosure provides a sensor spatial relationship acquisition method, including: obtaining a first movement track acquired by a first sensor and a second movement track acquired by a second sensor; predicting a first extension track of the first movement track; and determining a relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on the coincidence degree of the second movement track and the first extension track meeting a specific condition.
Optionally, the relative spatial relationship at least includes a relative position relationship and a relative view angle relationship.
Optionally, the first sensor acquires a plurality of first movement tracks and the second sensor acquires a plurality of second movement tracks. The determining the relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on the coincidence degree of the second movement track and the first extension track meeting a specific condition, includes: determining the coincidence degree of each first extension track and each second movement track; and, when the coincidence degree of at least one first extension track and at least one second movement track is greater than a first threshold, determining the relative spatial relationship between the first sensor and the second sensor based on the first movement track corresponding to the at least one first extension track and the at least one second movement track.
Optionally, the method further includes: predicting, when the coincidence degree of at least one first extension trajectory and at least one second movement trajectory is greater than a first threshold, a time when the first extension trajectory occurs, based on the relative spatial relationship of the first sensor and the second sensor and a time when the first movement trajectory corresponding to the first extension trajectory occurs; and, when a difference between the predicted time when the first extension trajectory occurs and the time when the second movement trajectory occurs is greater than a second threshold, excluding the correspondence between the first extension trajectory and the second movement trajectory and re-determining the relative positional relationship between the first sensor and the second sensor.
Optionally, the determining the relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track based on at least the coincidence degree of the second movement track and the first extension track conforming to a specific condition includes determining a probability distribution of the relative spatial relationship between the first sensor and the second sensor based on at least the coincidence degree of the second movement track and the first extension track conforming to a specific condition, and determining the relative spatial relationship between the first sensor and the second sensor based on the probability distribution of the relative spatial relationship between the first sensor and the second sensor.
Optionally, the determining the relative spatial relationship between the first sensor and the second sensor based on the probability distribution of the relative spatial relationship between the first sensor and the second sensor includes obtaining a probability distribution of the relative spatial relationship between a plurality of sensor pairs, wherein the plurality of sensor pairs include a plurality of sensor pairs obtained by combining a plurality of sensors two by two, the plurality of sensors include the first sensor and the second sensor, and determining the relative spatial relationship between the plurality of sensors through global optimization based on the probability distribution of the relative spatial relationship between the plurality of sensor pairs, wherein the relative spatial relationship between the plurality of sensors includes the relative spatial relationship between the first sensor and the second sensor.
Optionally, the method further comprises obtaining position and orientation information of at least one sensor of the plurality of sensors, and determining the position and orientation information of the plurality of sensors based on the position and orientation information of the at least one sensor and the relative spatial relationship between the plurality of sensors.
Another aspect of the disclosure provides a sensor spatial relationship acquisition apparatus, including a first obtaining module, a predicting module, and a first determining module. The first obtaining module is used for obtaining a first moving track acquired by the first sensor and a second moving track acquired by the second sensor. And the prediction module is used for predicting a first extension track of the first movement track. And the first determining module is used for determining the relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track at least based on the coincidence degree of the second movement track and the first extension track meeting a specific condition.
Optionally, the relative spatial relationship at least includes a relative position relationship and a relative view angle relationship.
Optionally, the first sensor acquires a plurality of first movement trajectories, the second sensor acquires a plurality of second movement trajectories, and the first determining module includes a first determining submodule and a second determining submodule. And the first determining submodule is used for determining the coincidence degree of each first extension track and each second movement track. And the second determining submodule is used for determining the relative spatial relationship between the first sensor and the second sensor based on the first movement track corresponding to the first extension track and the second movement track when the coincidence degree of the at least one first extension track and the at least one second movement track is greater than a first threshold value.
Optionally, the first determination module further comprises a prediction sub-module and an exclusion sub-module. The prediction sub-module is used for predicting the time when the first extension track occurs, based on the relative spatial relationship between the first sensor and the second sensor and the time when the first movement track corresponding to the first extension track occurs, when the coincidence degree of at least one first extension track and at least one second movement track is greater than a first threshold. The exclusion sub-module is used for excluding the correspondence between the first extension track and the second movement track and re-determining the relative position relationship between the first sensor and the second sensor when the difference between the predicted time when the first extension track occurs and the time when the second movement track occurs is greater than a second threshold.
Optionally, the first determination module comprises a third determination submodule and a fourth determination submodule. A third determining submodule, configured to determine a probability distribution of a relative spatial relationship between the first sensor and the second sensor based on at least that a degree of coincidence of the second movement trajectory and the first extension trajectory meets a certain condition. A fourth determination submodule to determine a relative spatial relationship of the first sensor and the second sensor based on a probability distribution of the relative spatial relationship of the first sensor and the second sensor.
Optionally, the fourth determining submodule includes an obtaining unit and a determining unit. An obtaining unit, configured to obtain probability distribution of relative spatial relationships of a plurality of sensor pairs, where the plurality of sensor pairs include a plurality of sensor pairs obtained by combining a plurality of sensors pairwise, and the plurality of sensors include the first sensor and the second sensor. A determining unit, configured to determine a relative spatial relationship among the plurality of sensors through global optimization based on a probability distribution of the relative spatial relationship among the plurality of sensor pairs, where the relative spatial relationship among the plurality of sensors includes a relative spatial relationship of the first sensor and the second sensor.
Optionally, the apparatus further comprises a second obtaining module and a second determining module. A second obtaining module for obtaining position and attitude information of at least one of the plurality of sensors. A second determination module for determining the position and orientation information of the plurality of sensors based on the position and orientation information of the at least one sensor and the relative spatial relationship between the plurality of sensors.
Another aspect of the disclosure provides an electronic device comprising a processor and a memory. The memory has stored thereon a computer program which, when executed by the processor, causes the processor to perform the method as described above.
Another aspect of the present disclosure provides a computer-readable storage medium storing computer-executable instructions for implementing the method as described above when executed.
Another aspect of the disclosure provides a computer program comprising computer executable instructions for implementing the method as described above when executed.
Drawings
For a more complete understanding of the present disclosure and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
FIGS. 1A and 1B schematically illustrate an application scenario of a sensor spatial relationship acquisition method according to an embodiment of the present disclosure;
FIG. 2 schematically shows a flow chart of a sensor spatial relationship acquisition method according to an embodiment of the present disclosure;
FIGS. 3A-3C schematically illustrate flow charts for determining a relative spatial relationship of a first sensor and a second sensor in accordance with various embodiments of the present disclosure;
FIG. 3D schematically illustrates a flow chart for determining the relative spatial relationship of a first sensor and a second sensor based on a probability distribution of the relative spatial relationship of the first sensor and the second sensor, according to an embodiment of the disclosure;
FIG. 3E schematically illustrates a flow chart of a sensor spatial relationship acquisition method according to another embodiment of the present disclosure;
FIG. 4 schematically shows a block diagram of a sensor spatial relationship acquisition apparatus according to an embodiment of the present disclosure;
FIGS. 5A-5C schematically illustrate block diagrams of a first determination module according to various embodiments of the present disclosure;
FIG. 5D schematically illustrates a block diagram of a fourth determination submodule according to an embodiment of the present disclosure;
FIG. 5E schematically shows a block diagram of a sensor spatial relationship acquisition apparatus according to another embodiment of the present disclosure; and
FIG. 6 schematically shows a block diagram of an electronic device according to an embodiment of the present disclosure.
Detailed Description
Hereinafter, embodiments of the present disclosure will be described with reference to the accompanying drawings. It should be understood that the description is illustrative only and is not intended to limit the scope of the present disclosure. In the following detailed description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the embodiments of the disclosure. It may be evident, however, that one or more embodiments may be practiced without these specific details. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present disclosure.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. The terms "comprises," "comprising," and the like, as used herein, specify the presence of stated features, steps, operations, and/or components, but do not preclude the presence or addition of one or more other features, steps, operations, or components.
All terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art unless otherwise defined. It is noted that the terms used herein should be interpreted as having a meaning that is consistent with the context of this specification and should not be interpreted in an idealized or overly formal sense.
Where a convention analogous to "at least one of A, B and C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B and C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.). Where a convention analogous to "at least one of A, B or C, etc." is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., "a system having at least one of A, B or C" would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B and C together, etc.).
Some block diagrams and/or flow diagrams are shown in the figures. It will be understood that some blocks of the block diagrams and/or flowchart illustrations, or combinations thereof, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the instructions, which execute via the processor, create means for implementing the functions/acts specified in the block diagrams and/or flowchart block or blocks. The techniques of this disclosure may be implemented in hardware and/or software (including firmware, microcode, etc.). In addition, the techniques of this disclosure may take the form of a computer program product on a computer-readable storage medium having instructions stored thereon for use by or in connection with an instruction execution system.
FIGS. 1A and 1B schematically show application scenarios of a sensor spatial relationship acquisition method according to an embodiment of the present disclosure. It should be noted that FIGS. 1A and 1B are only examples of scenarios to which the embodiments of the present disclosure may be applied, provided to help those skilled in the art understand the technical content of the present disclosure; they do not mean that the embodiments of the present disclosure cannot be applied to other devices, systems, environments or scenarios.
As shown in FIG. 1A, the scene includes multiple sensors X1, X2, X3, X4 and X5, for example cameras for capturing images. Each camera can acquire images in real time and store them. The cameras may, for example, be arranged at different positions on a street to monitor urban traffic or security, and can acquire the movement trajectories of objects such as pedestrians and vehicles. With the development of technology, many functions could be automated if camera positions were accurately known; however, because some cameras were installed early, their installation positions and view angles were not recorded at installation time.
The embodiment of the present disclosure provides a sensor spatial relationship acquisition method, which includes obtaining a first movement trajectory acquired by a first sensor and a second movement trajectory acquired by a second sensor, predicting a first extension trajectory of the first movement trajectory, and determining the relative spatial relationship between the first sensor and the second sensor according to the second movement trajectory and the first movement trajectory, at least based on the coincidence degree of the second movement trajectory and the first extension trajectory meeting a specific condition.
As shown in fig. 1B, between two cameras (e.g., between cameras X1 and X2) there may be trajectories generated by the same objects. The method can determine the relative spatial relationship between cameras by predicting the extension trajectory of a trajectory acquired by one camera and matching it against the trajectories acquired by other cameras.
Fig. 2 schematically shows a flow chart of a sensor spatial relationship acquisition method according to an embodiment of the present disclosure.
As shown in fig. 2, the method includes operations S210 to S230.
In operation S210, a first movement trace acquired by a first sensor and a second movement trace acquired by a second sensor are obtained.
In operation S220, a first extension trajectory of the first movement trajectory is predicted.
In operation S230, a relative spatial relationship between the first sensor and the second sensor is determined according to the second movement track and the first movement track based on at least that a degree of coincidence of the second movement track and the first extension track meets a specific condition.
According to this method, the relative positional relationship between sensors is determined by matching the movement trajectories acquired by the sensors against extension trajectories predicted from an existing model, so the sensors can be positioned automatically without manual measurement, greatly saving labor cost.
According to the embodiment of the disclosure, the relative spatial relationship at least includes a relative position relationship and a relative view angle relationship.
For example, a pedestrian A leaves a first movement trajectory when passing through the area covered by the first sensor, and leaves a second movement trajectory when passing through the area covered by the second sensor. If the coincidence degree of the first extension trajectory predicted from the first movement trajectory and the second movement trajectory meets a specific condition, for example is higher than a threshold, i.e., the first extension trajectory matches the second movement trajectory, the relative spatial relationship between the first sensor and the second sensor may be estimated based on the matching relationship.
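By way of illustration only, the following sketch shows one possible realization of predicting an extension trajectory and testing its coincidence degree against another sensor's trajectory (operations S220 and S230). The constant-velocity extrapolation, the distance tolerance and the 0.5 threshold are assumptions made for the example, not features required by the present disclosure, and the two trajectories are assumed to be expressed in a common ground-plane frame; in practice the unknown relative transform is itself what is being estimated.

```python
import numpy as np

def predict_extension(track: np.ndarray, steps: int = 5) -> np.ndarray:
    """Extrapolate a trajectory (N x 2 ground-plane points) with a
    constant-velocity model fitted to its most recent samples."""
    velocity = np.mean(np.diff(track[-5:], axis=0), axis=0)
    return track[-1] + velocity * np.arange(1, steps + 1)[:, None]

def coincidence_degree(extension: np.ndarray, track: np.ndarray) -> float:
    """Fraction of extension points lying within a distance tolerance
    of the other sensor's trajectory (an assumed coincidence measure)."""
    dists = np.min(np.linalg.norm(extension[:, None] - track[None, :], axis=2), axis=1)
    return float(np.mean(dists < 1.0))

first_track = np.array([[0, 0], [1, 0.1], [2, 0.2], [3, 0.3], [4, 0.4]])
second_track = np.array([[5, 0.5], [6, 0.6], [7, 0.7], [8, 0.8], [9, 0.9]])
extension = predict_extension(first_track)
if coincidence_degree(extension, second_track) > 0.5:  # the "specific condition"
    print("trajectories match: estimate the relative spatial relationship")
```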
Of course, the relative spatial relationship determined by a single trajectory is not sufficiently reliable. According to the embodiment of the present disclosure, the first sensor acquires a plurality of first movement trajectories, and the second sensor acquires a plurality of second movement trajectories. The relative spatial relationship between the sensors may be determined by matching between multiple traces.
FIG. 3A schematically illustrates a flow chart for determining a relative spatial relationship of a first sensor and a second sensor, in accordance with an embodiment of the disclosure.
As shown in fig. 3A, the method includes operations S311 and S312.
In operation S311, a coincidence degree of each of the first extended trajectories and each of the second moving trajectories is determined. For example, where the plurality of first extended trajectories includes A1, A2 and A3 and the plurality of second moving trajectories includes B1, B2 and B3, the coincidence degree of A1 and B1 (hereinafter denoted A1-B1) and the coincidence degrees A1-B2, A1-B3, A2-B1, A2-B2, A2-B3, A3-B1, A3-B2 and A3-B3 can each be determined.
In operation S312, in a case that the coincidence degree of at least one first extended trajectory and at least one second moving trajectory is greater than a first threshold, the relative spatial relationship between the first sensor and the second sensor is determined based on the first moving trajectory corresponding to the first extended trajectory and the second moving trajectory. For example, in the above example, if the coincidence degrees A1-B2 and A3-B3 are both greater than the first threshold, the relative spatial relationship of the first sensor and the second sensor can be determined from A1-B2 and A3-B3.
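A sketch of operations S311 and S312 under the same assumptions, reusing predict_extension and coincidence_degree from the previous example; the pair indices play the role of A1-B1, A1-B2 and so on:

```python
def match_trajectories(first_tracks, second_tracks, first_threshold=0.5):
    """Compute the coincidence degree of every first extended trajectory
    with every second moving trajectory (operation S311) and keep the
    pairs exceeding the first threshold (operation S312)."""
    matches = []
    for i, first in enumerate(first_tracks):
        extension = predict_extension(first)
        for j, second in enumerate(second_tracks):
            degree = coincidence_degree(extension, second)  # e.g. A1-B2
            if degree > first_threshold:
                matches.append((i, j, degree))
    return matches  # trajectory pairs used to estimate the relationship
```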
FIG. 3B schematically illustrates a flow chart for determining a relative spatial relationship of a first sensor and a second sensor according to another embodiment of the present disclosure.
As shown in fig. 3B, the method may further include operations S321 to S323 based on the foregoing embodiment.
After operation S312 is completed, the determined relative spatial relationship of the first sensor and the second sensor may be taken as a candidate relative spatial relationship, and the process continues with operation S321.
In operation S321, the time when the first extended trajectory occurs is predicted based on the relative spatial relationship between the first sensor and the second sensor and the time when the first moving trajectory corresponding to the first extended trajectory occurs. For example, from the moving rate of the first moving trajectory (i.e., the moving speed of the object generating it), the time when the first moving trajectory occurs, and the relative spatial relationship determined in operation S312, the time when the first extended trajectory should occur can be predicted. If the second moving trajectory really is the predicted first extended trajectory, the time when it occurs should be substantially the same as the predicted time; if the two differ significantly, the second moving trajectory is unrelated to the predicted first extended trajectory.
In operation S322, it is determined whether the difference between the predicted time when the first extended trajectory appears and the time when the second moving trajectory appears is greater than a second threshold; if so, operation S323 is performed; otherwise, the process jumps to operation S324 and ends.
In operation S323, in a case where the difference between the predicted time when the first extended trajectory appears and the time when the second moving trajectory appears is greater than the second threshold, the correspondence between the first extended trajectory and the second moving trajectory is excluded, the process returns to operation S312, and the relative positional relationship between the first sensor and the second sensor is re-determined.
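One way the timing check of operations S321 to S323 could look, assuming timestamped trajectories and a roughly constant object speed; the 12-metre gap and the 5-second tolerance are invented for the example:

```python
import numpy as np

def predicted_arrival_time(track_xy, track_t, gap_distance):
    """Predict when the first extended trajectory should appear at the
    second sensor from the object's average speed and the inter-sensor
    gap implied by the candidate relative spatial relationship."""
    path_length = np.linalg.norm(np.diff(track_xy, axis=0), axis=1).sum()
    speed = path_length / (track_t[-1] - track_t[0])
    return track_t[-1] + gap_distance / speed

first_xy = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
first_t = np.array([0.0, 1.0, 2.0])                 # object moves at 1 m/s
t_predicted = predicted_arrival_time(first_xy, first_t, gap_distance=12.0)

second_threshold = 5.0      # assumed timing tolerance, in seconds
second_track_start = 30.0   # observed appearance time at the second sensor
if abs(t_predicted - second_track_start) > second_threshold:
    print("exclude this trajectory pair and re-determine (operation S323)")
```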
FIG. 3C schematically illustrates a flow chart for determining a relative spatial relationship of a first sensor and a second sensor according to yet another embodiment of the present disclosure.
As shown in fig. 3C, the method includes operations S331 and S332.
In operation S331, a probability distribution of the relative spatial relationship between the first sensor and the second sensor is determined, at least based on the coincidence degree of the second movement trajectory and the first extension trajectory meeting a specific condition. According to the embodiment of the present disclosure, each camera collects a plurality of trajectories, and the relative spatial relationships determined from different trajectories may differ; the probability distribution of the relative spatial relationship between two cameras can therefore be determined statistically. Each matched pair of a first movement trajectory (or its first extension trajectory) and a second movement trajectory is referred to as a trajectory pair. A relative spatial relationship supported by more trajectory pairs is more probable than one supported by fewer trajectory pairs.
In operation S332, a relative spatial relationship of the first sensor and the second sensor is determined based on a probability distribution of the relative spatial relationship of the first sensor and the second sensor. For example, the relative spatial relationship with the highest probability may be determined as the relative spatial relationship of the first sensor and the second sensor.
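The statistics of operations S331 and S332 could, for instance, be realized by letting every matched trajectory pair vote for a quantized relative pose; reducing the relationship to a 2D offset and the grid size are simplifying assumptions made for the example:

```python
from collections import Counter

def relation_distribution(pair_estimates, grid=1.0):
    """Quantize the relative-offset estimate from each matched trajectory
    pair onto a grid and count votes, giving the probability distribution
    of the relative spatial relationship (operation S331)."""
    cells = Counter((round(dx / grid) * grid, round(dy / grid) * grid)
                    for dx, dy in pair_estimates)
    total = sum(cells.values())
    return {cell: count / total for cell, count in cells.items()}

estimates = [(10.1, 0.2), (9.8, -0.1), (10.3, 0.0), (4.0, 7.0)]  # one outlier
distribution = relation_distribution(estimates)
best = max(distribution, key=distribution.get)  # (10.0, 0.0), probability 0.75
```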
Fig. 3D schematically illustrates a flow chart for determining the relative spatial relationship of a first sensor and a second sensor based on a probability distribution of the relative spatial relationship of the first sensor and the second sensor, according to an embodiment of the disclosure.
As shown in fig. 3D, the method includes operations S341 and S342.
In operation S341, probability distributions of the relative spatial relationships of a plurality of sensor pairs are obtained, where the plurality of sensor pairs includes sensor pairs obtained by combining a plurality of sensors pairwise, and the plurality of sensors includes the first sensor and the second sensor. For example, in the case of four sensors X1, X2, X3 and X4, combining them pairwise yields the pairs X1-X2, X1-X3, X1-X4, X2-X3, X2-X4 and X3-X4, and the probability distribution of the relative spatial relationship of each pair is determined according to the method of the previous embodiment.
In operation S342, a relative spatial relationship between the plurality of sensors is determined through global optimization based on a probability distribution of the relative spatial relationship of the plurality of sensor pairs, wherein the relative spatial relationship between the plurality of sensors includes the relative spatial relationship of the first sensor and the second sensor.
Global optimization finds a set of parameter values under a series of constraints that optimizes the value of an objective function or a set of objective functions. When the objective function is nonlinear, this is nonlinear global optimization. All parameter values satisfying the constraints constitute the feasible domain. At least one local optimal solution in the feasible domain can be determined by a traditional method such as Newton's method or a gradient method, and a global optimal solution is then obtained by comparison. Intelligent global optimization methods include evolutionary algorithms such as genetic algorithms, evolution strategies, evolutionary programming, ant colony algorithms and particle swarm algorithms, as well as penalty function methods.
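As one concrete instance of such an optimization, a gradient-based least-squares refinement (one of the traditional methods mentioned above) can be sketched as follows; summarizing each pairwise distribution as a mean offset with a confidence weight is an assumption made for the example, and any of the evolutionary methods named above could be substituted for the optimizer:

```python
import numpy as np
from scipy.optimize import minimize

# Pairwise relative-position estimates distilled from the probability
# distributions of operation S341 (mean offset, confidence weight).
pairs = {
    (0, 1): (np.array([10.0, 0.0]), 1.0),
    (1, 2): (np.array([0.0, 8.0]), 1.0),
    (0, 2): (np.array([10.5, 7.5]), 0.5),  # weaker, slightly inconsistent
}
n_sensors = 3

def cost(positions_flat):
    pos = positions_flat.reshape(n_sensors, 2)
    return sum(w * np.sum((pos[j] - pos[i] - mean) ** 2)
               for (i, j), (mean, w) in pairs.items())

# Anchor sensor 0 at the origin so the solution is well defined, then
# optimize the remaining positions jointly (operation S342).
def anchored_cost(free_flat):
    return cost(np.concatenate([np.zeros(2), free_flat]))

result = minimize(anchored_cost, x0=np.zeros(2 * (n_sensors - 1)))
positions = np.vstack([np.zeros(2), result.x.reshape(-1, 2)])
print(positions)  # globally consistent relative sensor positions
```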
According to this method, the relative spatial relationships among the sensors can be accurately determined by globally optimizing over the relative spatial relationships of all sensor pairs.
Fig. 3E schematically shows a flow chart of a sensor spatial relationship acquisition method according to another embodiment of the present disclosure.
As shown in fig. 3E, the method may further include operations S351 and S352 on the basis of the foregoing embodiment.
In operation S351, position and orientation information of at least one of the plurality of sensors is obtained.
In operation S352, position and orientation information of the plurality of sensors is determined based on the position and orientation information of the at least one sensor and the relative spatial relationship between the plurality of sensors.
According to the embodiment of the present disclosure, once the relative spatial relationships of a plurality of sensors have been determined, the position and attitude information of all the other sensors can be determined as soon as the absolute position and attitude information of at least one sensor is obtained.
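For instance, once pairwise relative translations and heading changes are known, the absolute poses can be obtained by chaining transforms from the one surveyed sensor; the planar pose representation and the numeric values below are assumptions for illustration:

```python
import numpy as np

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Relative spatial relationships from sensor i to sensor i+1, expressed
# in sensor i's frame as (translation, heading change).
relative = [(np.array([10.0, 0.0]), np.pi / 2),
            (np.array([8.0, 0.0]), 0.0)]

# Absolute position and attitude of the one surveyed sensor (S351).
position, heading = np.array([100.0, 50.0]), 0.0

poses = [(position.copy(), heading)]
for translation, delta in relative:        # operation S352: chain transforms
    position = position + rotation(heading) @ translation
    heading += delta
    poses.append((position.copy(), heading))
# poses now holds position and attitude for every sensor in the network
```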
Based on the same inventive concept, the present disclosure also provides a sensor spatial relationship acquisition apparatus, and the following describes the sensor spatial relationship acquisition apparatus according to the embodiment of the present disclosure with reference to fig. 4 and 5A to 5E.
Fig. 4 schematically shows a block diagram of a sensor spatial relationship acquisition apparatus 400 according to an embodiment of the present disclosure.
As shown in fig. 4, the sensor spatial relationship obtaining apparatus 400 includes a first obtaining module 410, a predicting module 420 and a first determining module 430. The sensor spatial relationship acquisition apparatus 400 may perform the various methods described above.
The first obtaining module 410, for example, performs operation S210 described with reference to fig. 2 above, for obtaining a first movement trajectory acquired by the first sensor and a second movement trajectory acquired by the second sensor.
The prediction module 420, for example, performs the operation S220 described with reference to fig. 2 above, for predicting a first extended trajectory of the first movement trajectory.
The first determining module 430, for example, performs operation S230 described with reference to fig. 2 above, and is configured to determine a relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, based on at least that a degree of coincidence of the second movement track and the first extension track meets a specific condition.
According to the embodiment of the disclosure, the relative spatial relationship at least includes a relative position relationship and a relative view angle relationship.
Fig. 5A schematically illustrates a block diagram of the first determination module 510 according to an embodiment of the disclosure.
According to the embodiment of the present disclosure, the first sensor acquires a plurality of first movement trajectories, and the second sensor acquires a plurality of second movement trajectories. As shown in fig. 5A, the first determination module 510 may include a first determination submodule 511 and a second determination submodule 512.
The first determining sub-module 511, for example, performs the operation S311 described with reference to fig. 3A above, and is configured to determine a coincidence degree of each of the first extension tracks and each of the second movement tracks.
The second determining submodule 512, for example, executes the operation S312 described with reference to fig. 3A above, and is configured to determine the relative spatial relationship between the first sensor and the second sensor based on the first movement track and the second movement track corresponding to the first extension track when the coincidence degree of the at least one first extension track and the at least one second movement track is greater than the first threshold.
Fig. 5B schematically illustrates a block diagram of the first determination module 520 according to another embodiment of the present disclosure.
As shown in fig. 5B, the first determining module 520 may further include a predicting sub-module 521 and an excluding sub-module 522 based on the foregoing embodiments.
The predicting sub-module 521, for example, performs operation S321 described with reference to fig. 3B above, and is configured to predict the time when the first extended trajectory occurs, based on the relative spatial relationship between the first sensor and the second sensor and the time when the first moving trajectory corresponding to the first extended trajectory occurs, when the coincidence degree of at least one first extended trajectory and at least one second moving trajectory is greater than the first threshold.
The excluding sub-module 522, for example, performs operation S323 described with reference to fig. 3B, and is configured to exclude the correspondence between the first extended trajectory and the second moving trajectory and re-determine the relative position relationship between the first sensor and the second sensor when the difference between the predicted time of occurrence of the first extended trajectory and the time of occurrence of the second moving trajectory is greater than the second threshold.
Fig. 5C schematically illustrates a block diagram of the first determining module 530 according to yet another embodiment of the present disclosure.
As shown in fig. 5C, the first determination module 530 may include a third determination submodule 531 and a fourth determination submodule 532.
The third determining submodule 531, for example performing operation S331 described with reference to fig. 3C above, is configured to determine a probability distribution of the relative spatial relationship between the first sensor and the second sensor based on at least that the degree of coincidence of the second movement trajectory and the first extension trajectory meets a certain condition.
A fourth determination submodule 532, for example performing operation S332 described with reference to fig. 3C above, is configured to determine the relative spatial relationship of the first sensor and the second sensor based on the probability distribution of the relative spatial relationship of the first sensor and the second sensor.
Fig. 5D schematically illustrates a block diagram of a fourth determination submodule 540 according to an embodiment of the present disclosure.
As shown in fig. 5D, the fourth determination submodule 540 includes an obtaining unit 541 and a determination unit 542.
The obtaining unit 541, for example, performs operation S341 described with reference to fig. 3D above, to obtain a probability distribution of relative spatial relationships of a plurality of sensor pairs, where the plurality of sensor pairs includes a plurality of sensor pairs obtained by combining two by two of a plurality of sensors, and the plurality of sensors includes the first sensor and the second sensor.
The determining unit 542, for example, performs operation S342 described with reference to fig. 3D above, and is configured to determine a relative spatial relationship between the plurality of sensors through global optimization based on a probability distribution of the relative spatial relationship between the plurality of sensor pairs, where the relative spatial relationship between the plurality of sensors includes the relative spatial relationship between the first sensor and the second sensor.
Fig. 5E schematically shows a block diagram of a sensor spatial relationship acquisition apparatus 550 according to another embodiment of the present disclosure.
As shown in fig. 5E, the sensor spatial relationship obtaining apparatus 550 may further include a second obtaining module 551 and a second determining module 552 on the basis of the foregoing embodiments.
The second obtaining module 551, for example, performs the operation S351 described with reference to fig. 3E above, for obtaining the position and orientation information of at least one sensor of the plurality of sensors.
The second determining module 552, for example, performs operation S352 described with reference to fig. 3E above, for determining the position and orientation information of the plurality of sensors based on the position and orientation information of the at least one sensor and the relative spatial relationship between the plurality of sensors.
Any number of modules, sub-modules, units, sub-units, or at least part of the functionality of any number thereof according to embodiments of the present disclosure may be implemented in one module. Any one or more of the modules, sub-modules, units, and sub-units according to the embodiments of the present disclosure may be implemented by being split into a plurality of modules. Any one or more of the modules, sub-modules, units, sub-units according to embodiments of the present disclosure may be implemented at least in part as a hardware circuit, such as a Field Programmable Gate Array (FPGA), a Programmable Logic Array (PLA), a system on a chip, a system on a substrate, a system on a package, an Application Specific Integrated Circuit (ASIC), or may be implemented in any other reasonable manner of hardware or firmware by integrating or packaging a circuit, or in any one of or a suitable combination of software, hardware, and firmware implementations. Alternatively, one or more of the modules, sub-modules, units, sub-units according to embodiments of the disclosure may be at least partially implemented as a computer program module, which when executed may perform the corresponding functions.
For example, any plurality of the first obtaining module 410, the prediction module 420, the first determining module 430, the first determining submodule 511, the second determining submodule 512, the prediction submodule 521, the excluding submodule 522, the third determining submodule 531, the fourth determining submodule 532, the obtaining unit 541, the determining unit 542, the second obtaining module 551, and the second determining module 552 may be combined to be implemented in one module, or any one of them may be split into a plurality of modules. Alternatively, at least part of the functionality of one or more of these modules may be combined with at least part of the functionality of the other modules and implemented in one module. According to an embodiment of the present disclosure, at least one of the first obtaining module 410, the prediction module 420, the first determining module 430, the first determining submodule 511, the second determining submodule 512, the prediction submodule 521, the excluding submodule 522, the third determining submodule 531, the fourth determining submodule 532, the obtaining unit 541, the determining unit 542, the second obtaining module 551 and the second determining module 552 may be at least partially implemented as a hardware circuit, such as Field Programmable Gate Arrays (FPGAs), Programmable Logic Arrays (PLAs), systems on a chip, systems on a substrate, systems on a package, Application Specific Integrated Circuits (ASICs), or may be implemented in hardware or firmware in any other reasonable way of integrating or packaging circuits, or in any one of three implementations, software, hardware and firmware, or in any suitable combination of any of them. Alternatively, at least one of the first obtaining module 410, the prediction module 420, the first determining module 430, the first determining submodule 511, the second determining submodule 512, the prediction submodule 521, the excluding submodule 522, the third determining submodule 531, the fourth determining submodule 532, the obtaining unit 541, the determining unit 542, the second obtaining module 551 and the second determining module 552 may be at least partially implemented as a computer program module which, when executed, may perform a corresponding function.
Fig. 6 schematically shows a block diagram of an electronic device 600 according to an embodiment of the disclosure. The electronic device illustrated in FIG. 6 is only an example and should not impose any limitation on the functionality or scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 600 includes a processor 610 and a computer-readable storage medium 620. The electronic device 600 may perform a method according to an embodiment of the present disclosure.
In particular, the processor 610 may comprise, for example, a general purpose microprocessor, an instruction set processor and/or related chip set and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)), or the like. The processor 610 may also include onboard memory for caching purposes. The processor 610 may be a single processing unit or a plurality of processing units for performing the different actions of the method flows according to embodiments of the present disclosure.
Computer-readable storage medium 620, for example, may be a non-volatile computer-readable storage medium, specific examples including, but not limited to: magnetic storage devices, such as magnetic tape or Hard Disk Drives (HDDs); optical storage devices, such as compact disks (CD-ROMs); a memory, such as a Random Access Memory (RAM) or a flash memory; and so on.
The computer-readable storage medium 620 may include a computer program 621, which computer program 621 may include code/computer-executable instructions that, when executed by the processor 610, cause the processor 610 to perform a method according to an embodiment of the disclosure, or any variation thereof.
The computer program 621 may be configured with, for example, computer program code comprising computer program modules. For example, in an example embodiment, the code of the computer program 621 may include one or more program modules, for example modules 621A, 621B, etc. It should be noted that the division and number of modules are not fixed, and those skilled in the art may use suitable program modules or combinations of program modules according to the actual situation, so that when these program modules are executed by the processor 610, the processor 610 can perform the method according to the embodiment of the present disclosure or any variation thereof.
According to an embodiment of the present disclosure, at least one of the first obtaining module 410, the prediction module 420, the first determining module 430, the first determining submodule 511, the second determining submodule 512, the prediction submodule 521, the excluding submodule 522, the third determining submodule 531, the fourth determining submodule 532, the obtaining unit 541, the determining unit 542, the second obtaining module 551 and the second determining module 552 may be implemented as a computer program module described with reference to FIG. 6, which, when executed by the processor 610, may implement the corresponding operations described above.
The present disclosure also provides a computer-readable storage medium, which may be contained in the apparatus/device/system described in the above embodiments; or may exist separately and not be assembled into the device/apparatus/system. The computer-readable storage medium carries one or more programs which, when executed, implement the method according to an embodiment of the disclosure.
According to embodiments of the present disclosure, the computer-readable storage medium may be a non-volatile computer-readable storage medium, which may include, for example but is not limited to: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Those skilled in the art will appreciate that various combinations and/or combinations of features recited in the various embodiments and/or claims of the present disclosure can be made, even if such combinations or combinations are not expressly recited in the present disclosure. In particular, various combinations and/or combinations of the features recited in the various embodiments and/or claims of the present disclosure may be made without departing from the spirit or teaching of the present disclosure. All such combinations and/or associations are within the scope of the present disclosure.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (10)

1. A sensor spatial relationship acquisition method, comprising:
obtaining a first movement track acquired by a first sensor and a second movement track acquired by a second sensor;
predicting a first extension track of the first movement track; and
determining a relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on a coincidence degree of the second movement track and the first extension track meeting a specific condition.
2. The method of claim 1, wherein the relative spatial relationship comprises at least a relative position relationship and a relative view angle relationship.
3. The method of claim 1, wherein the first sensor acquires a plurality of first movement tracks, the second sensor acquires a plurality of second movement tracks, and the determining the relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on the coincidence degree of the second movement track and the first extension track meeting a specific condition, comprises:
determining the coincidence degree of each first extension track and each second movement track; and
when the coincidence degree of at least one first extension track and at least one second movement track is greater than a first threshold, determining the relative spatial relationship between the first sensor and the second sensor based on the first movement track corresponding to the first extension track and the second movement track.
4. The method of claim 3, further comprising:
when the coincidence degree of at least one first extension track and at least one second movement track is greater than the first threshold, predicting a time when the first extension track occurs, based on the relative spatial relationship of the first sensor and the second sensor and a time when the first movement track corresponding to the first extension track occurs; and
when a difference between the predicted time when the first extension track occurs and the time when the second movement track occurs is greater than a second threshold, excluding the correspondence between the first extension track and the second movement track, and re-determining the relative position relationship between the first sensor and the second sensor.
5. The method of claim 1, wherein the determining the relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on the coincidence degree of the second movement track and the first extension track meeting a specific condition, comprises:
determining a probability distribution of the relative spatial relationship of the first sensor and the second sensor, at least based on the coincidence degree of the second movement track and the first extension track meeting the specific condition; and
determining the relative spatial relationship of the first sensor and the second sensor based on the probability distribution of the relative spatial relationship of the first sensor and the second sensor.
6. The method of claim 5, wherein the determining the relative spatial relationship of the first and second sensors based on the probability distribution of the relative spatial relationship of the first and second sensors comprises:
obtaining probability distributions of the relative spatial relationships of a plurality of sensor pairs, wherein the plurality of sensor pairs comprises sensor pairs obtained by combining a plurality of sensors pairwise, and the plurality of sensors comprises the first sensor and the second sensor; and
determining relative spatial relationships among the plurality of sensors through global optimization based on the probability distributions of the relative spatial relationships of the plurality of sensor pairs, wherein the relative spatial relationships among the plurality of sensors include the relative spatial relationship of the first sensor and the second sensor.
7. The method of claim 6, further comprising:
obtaining position and attitude information of at least one sensor of the plurality of sensors; and
determining position and attitude information of the plurality of sensors based on the position and attitude information of the at least one sensor and the relative spatial relationship between the plurality of sensors.
8. A sensor spatial relationship acquisition apparatus, comprising:
a first obtaining module, configured to obtain a first movement track acquired by a first sensor and a second movement track acquired by a second sensor;
a prediction module, configured to predict a first extension track of the first movement track; and
a first determining module, configured to determine a relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on a coincidence degree of the second movement track and the first extension track meeting a specific condition.
9. An electronic device, comprising:
a processor; and
a memory having stored thereon a computer program that, when executed by the processor, causes the processor to:
obtain a first movement track acquired by a first sensor and a second movement track acquired by a second sensor;
predict a first extension track of the first movement track; and
determine a relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on a coincidence degree of the second movement track and the first extension track meeting a specific condition.
10. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, causes the processor to perform:
obtaining a first movement track acquired by a first sensor and a second movement track acquired by a second sensor;
predicting a first extension track of the first movement track; and
determining a relative spatial relationship between the first sensor and the second sensor according to the second movement track and the first movement track, at least based on a coincidence degree of the second movement track and the first extension track meeting a specific condition.
CN201811653358.9A 2018-12-29 2018-12-29 Sensor spatial relationship acquisition method and device and electronic equipment Active CN109711398B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811653358.9A CN109711398B (en) 2018-12-29 2018-12-29 Sensor spatial relationship acquisition method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN109711398A CN109711398A (en) 2019-05-03
CN109711398B (en) 2021-02-19

Family

ID=66259740

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811653358.9A Active CN109711398B (en) 2018-12-29 2018-12-29 Sensor spatial relationship acquisition method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN109711398B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106383679B * 2016-08-31 2020-03-24 Lenovo (Beijing) Ltd Positioning method and terminal equipment thereof
US20180130226A1 * 2016-11-07 2018-05-10 Lincoln Global, Inc. System and method for calibrating a welding trainer
KR101996419B1 * 2016-12-30 2019-07-04 Hyundai Motor Company Sensor integration based pedestrian detection and pedestrian collision prevention apparatus and method
CN106952299B * 2017-03-14 2019-07-16 Dalian University of Technology Three-dimensional light field implementation method suitable for intelligent mobile devices
CN108280442B * 2018-02-10 2020-07-28 Xi'an Jiaotong University Multi-source target fusion method based on track matching
CN108492583A * 2018-03-15 2018-09-04 Zhang Zhongyi Method for determining the installation position of a license plate recognition camera

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1865842A * 2004-05-14 2006-11-22 Canon Inc. Method and device for determining position and direction

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PTZ camera based position tracking in IP-surveillance system; Chu-Sing Yang et al.; 2008 3rd International Conference on Sensing Technology; 2008-12-31; pp. 1-10 *
Semantic understanding of spatio-temporal data: technologies and applications; Yao Di et al.; Journal of Software; April 2018 (No. 7); pp. 2018-2045 *

Also Published As

Publication number Publication date
CN109711398A (en) 2019-05-03

Similar Documents

Publication Publication Date Title
CN111091708B (en) Vehicle track prediction method and device
US10796201B2 (en) Fusing predictions for end-to-end panoptic segmentation
US20220343138A1 (en) Analysis of objects of interest in sensor data using deep neural networks
WO2021180130A1 (en) Trajectory prediction
EP3278317B1 (en) Method and electronic device
Srikanth et al. Infer: Intermediate representations for future prediction
CN108388834A (en) The object detection mapped using Recognition with Recurrent Neural Network and cascade nature
EP3729397A1 (en) System, device and method for detecting abnormal traffic events in a geographical location
US10919543B2 (en) Learning method and learning device for determining whether to switch mode of vehicle from manual driving mode to autonomous driving mode by performing trajectory-based behavior analysis on recent driving route
US11475671B2 (en) Multiple robots assisted surveillance system
CN110795813A (en) Traffic simulation method and device
JP2013537654A (en) Method and system for semantic label propagation
JP6700373B2 (en) Apparatus and method for learning object image packaging for artificial intelligence of video animation
GB2562018A (en) A method and system for analyzing the movement of bodies in a traffic system
Wang et al. Realtime wide-area vehicle trajectory tracking using millimeter-wave radar sensors and the open TJRD TS dataset
Giese et al. Road course estimation using deep learning on radar data
IL257092A (en) A method and system for tracking objects between cameras
EP3828866A1 (en) Method and device for determining the trajectories of mobile elements
WO2023064693A1 (en) Verifying reproducibility for a vehicle controller
Naveed et al. Deep introspective SLAM: Deep reinforcement learning based approach to avoid tracking failure in visual SLAM
EP4020427A1 (en) Transportation environment data service
Liu et al. A survey on autonomous driving datasets: Data statistic, annotation, and outlook
St-Aubin Driver behaviour and road safety analysis using computer vision and applications in roundabout safety
CN109711398B (en) Sensor spatial relationship acquisition method and device and electronic equipment
Prabu et al. Scendd: A scenario-based naturalistic driving dataset

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant