CN116088580A - Flying object tracking method and device - Google Patents
Flying object tracking method and device
- Publication number
- CN116088580A (Application CN202310180137.9A)
- Authority
- CN
- China
- Prior art keywords
- smoothing
- smooth
- tracking
- result
- tracking target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/106—Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses a flying object tracking method and device. The method comprises the following steps: acquiring a tracking target source; fitting the smooth data in the tracking target source to a smoothing judgment model to obtain a smoothing calculation result; performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result; and tracking the smoothing processing result as the final tracking target. The invention solves the technical problem that, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; such an approach cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking.
Description
Technical Field
The invention relates to the field of image processing and analysis, and in particular to a flying object tracking method and device.
Background
With the continuous development of intelligent science and technology, intelligent devices are increasingly used in people's daily life, work and study; intelligent technological means improve people's quality of life and increase their learning and working efficiency.
At present, flying birds and other fast-moving objects are generally tracked with a bird recognition device running a bird tracking algorithm, and the tracked point location information is used to judge whether a dangerous incident occurs. However, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; this cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking.
In view of the above problems, no effective solution has been proposed at present.
Disclosure of Invention
The embodiments of the invention provide a flying object tracking method and device, which at least solve the technical problem that, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; such an approach cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking.
According to an aspect of an embodiment of the present invention, there is provided a flying object tracking method, including: acquiring a tracking target source; fitting the smooth data in the tracking target source to a smoothing judgment model to obtain a smoothing calculation result; performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result; and tracking the smoothing processing result as the final tracking target.
Optionally, after acquiring the tracking target source, the method further includes: obtaining a smooth extraction model; and inputting information of the tracking target source into the smooth extraction model to obtain the smooth data.
Optionally, performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result includes: obtaining the smooth variation parameter in the smoothing calculation result; and applying an exponential smoothing algorithm to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises: W = α² · (x, y), where W is the smoothing processing result, α is the smoothing index factor, and x and y are the pixel coordinate parameters of the smoothed image.
Optionally, before tracking the smoothing processing result as the final tracking target, the method further includes: judging whether abnormal data exist according to the smoothing calculation result; and, if abnormal data exist, calculating the smoothing processing result by using an elastic damping interpolation algorithm together with the abnormal data.
According to another aspect of the embodiment of the present invention, there is also provided a flying-object tracking device including: the acquisition module is used for acquiring a tracking target source; the fitting module is used for fitting the smooth data in the tracking target source with the smooth judgment model to obtain a smooth calculation result; the processing module is used for carrying out smoothing processing according to the smoothing calculation result to obtain a smoothing processing result; and the tracking module is used for tracking the smoothing processing result as a final tracking target.
Optionally, the apparatus further includes: the acquisition module is also used for acquiring the smooth extraction model; and the input module is used for inputting the information of the tracking target source into the smooth extraction model to obtain the smooth data.
Optionally, the processing module includes: an obtaining unit, configured to obtain the smooth variation parameter in the smoothing calculation result; and a computing unit, configured to apply an exponential smoothing algorithm to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises: W = α² · (x, y), where W is the smoothing processing result, α is the smoothing index factor, and x and y are the pixel coordinate parameters of the smoothed image.
Optionally, the apparatus further includes: the judging module is used for judging whether abnormal data exist according to the smooth calculation result; and the calculating module is used for calculating the smoothing processing result by using an elastic damping interpolation algorithm and the abnormal data if the abnormal data exist.
According to another aspect of the embodiment of the present invention, there is also provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the program controls a device in which the nonvolatile storage medium is located to execute a flying object tracking method.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer readable instructions, and the processor is configured to execute the computer readable instructions, where the computer readable instructions execute a flying object tracking method when executed.
In the embodiment of the invention, a tracking target source is acquired; the smooth data in the tracking target source is fitted to a smoothing judgment model to obtain a smoothing calculation result; smoothing processing is performed according to the smoothing calculation result to obtain a smoothing processing result; and the smoothing processing result is tracked as the final tracking target. This solves the technical problem that, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; such an approach cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiments of the invention and together with the description serve to explain the invention and do not constitute a limitation on the invention. In the drawings:
FIG. 1 is a flow chart of a method of flying object tracking according to an embodiment of the present invention;
FIG. 2 is a block diagram of a flying-object tracking device according to an embodiment of the present invention;
fig. 3 is a block diagram of a terminal device for performing the method, according to an embodiment of the invention;
fig. 4 is a block diagram of a memory unit for holding or carrying program code implementing the method, according to an embodiment of the invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an embodiment of the present invention, there is provided a method embodiment of a flying-object tracking method, it being noted that the steps shown in the flowchart of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical sequence is shown in the flowchart, in some cases, the steps shown or described may be performed in a different order than what is shown or described herein.
Example 1
Fig. 1 is a flowchart of a flying-object tracking method according to an embodiment of the present invention, as shown in fig. 1, the method including the steps of:
step S102, a tracking target source is acquired.
Specifically, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; this cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking. To solve this technical problem, the embodiment of the invention first acquires a tracking target source as the input for the subsequent smoothing and tracking steps.
Optionally, after acquiring the tracking target source, the method further includes: obtaining a smooth extraction model; and inputting information of the tracking target source into the smooth extraction model to obtain the smooth data.
Specifically, after the tracking source data is acquired, the embodiment of the invention needs to extract the smooth data according to a related algorithm, for example, after the tracking target source is acquired, the method further includes: obtaining a smooth extraction model; and inputting the information of the tracking target source into the smooth extraction model to obtain the smooth data.
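The patent does not describe the internals of the smooth extraction model, so the sketch below stands in a centred moving average over the detected (x, y) points; the function name, window size, and moving-average stand-in are illustrative assumptions, not the patented model:

```python
def extract_smooth_data(target_source, window=3):
    """Feed a tracking target source (a list of (x, y) detections)
    through a minimal stand-in for the 'smooth extraction model':
    a centred moving average over each coordinate.
    The moving-average form is an assumption; the patent does not
    describe the extraction model's internals."""
    n = len(target_source)
    half = window // 2
    smooth = []
    for i in range(n):
        # clamp the averaging window at the sequence boundaries
        lo, hi = max(0, i - half), min(n, i + half + 1)
        xs = [p[0] for p in target_source[lo:hi]]
        ys = [p[1] for p in target_source[lo:hi]]
        smooth.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smooth
```

Each output point blends its neighbours, so isolated jitter in the raw detections is attenuated before the smoothing judgment step.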
And step S104, fitting the smooth data in the tracking target source with a smooth judgment model to obtain a smooth calculation result.
Specifically, after the CPU of the high-precision image capture system receives the smoothed data of the tracking target, the smoothed data can be fitted against the smoothing judgment model to improve smoothness recognition of the target. This yields, for each tracking target, a result indicating whether smoothing is needed; with this result, the smoothed data and the tracking targets can be further processed to obtain data that is convenient to track.
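The smoothing judgment model itself is not specified in the patent. As one hedged reading, fitting the smooth data against the model reduces to a per-target smoothness score compared with a threshold; the midpoint-deviation criterion and the threshold value below are assumptions made for illustration:

```python
def needs_smoothing(points, threshold=2.0):
    """Decide whether a trajectory's smooth data calls for smoothing:
    score each interior point by its deviation from the midpoint of
    its two neighbours, and compare the mean score to a threshold.
    Both the criterion and the threshold are illustrative assumptions;
    the patent does not specify the judgment model."""
    if len(points) < 3:
        return False
    deviations = []
    for prev, cur, nxt in zip(points, points[1:], points[2:]):
        mid_x = (prev[0] + nxt[0]) / 2.0
        mid_y = (prev[1] + nxt[1]) / 2.0
        deviations.append(abs(cur[0] - mid_x) + abs(cur[1] - mid_y))
    return sum(deviations) / len(deviations) > threshold
```

A straight or gently curving track scores near zero and skips smoothing; a jagged track exceeds the threshold and is passed on to the smoothing step.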
And step S106, carrying out smoothing processing according to the smoothing calculation result to obtain a smoothing processing result.
Optionally, performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result includes: obtaining the smooth variation parameter in the smoothing calculation result; and applying an exponential smoothing algorithm to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises: W = α² · (x, y), where W is the smoothing processing result, α is the smoothing index factor, and x and y are the pixel coordinate parameters of the smoothed image.
Specifically, smoothing the smoothing calculation result with a smoothing calculation algorithm allows irregular tracking targets to be further smoothed. For example, the smooth variation parameter in the smoothing calculation result is obtained, and an exponential smoothing algorithm is applied to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, where the exponential smoothing algorithm comprises: W = α² · (x, y), with W the smoothing processing result, α the smoothing index factor, and x and y the pixel coordinate parameters of the smoothed image.
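Read literally, the exponential smoothing formula above scales each pixel coordinate pair by α², i.e. W = α² · (x, y). A direct sketch of that step (the value α = 0.5 is an assumption; the patent does not fix the smoothing index factor):

```python
def exponential_smooth(points, alpha=0.5):
    """Apply the patent's exponential smoothing W = alpha**2 * (x, y)
    to each pixel coordinate pair of the tracked target.
    alpha is the smoothing index factor; its default here is an
    illustrative assumption."""
    w = alpha ** 2
    # scale both pixel coordinates by the squared index factor
    return [(w * x, w * y) for x, y in points]
```

With α < 1 the squared factor shrinks the coordinate magnitudes, damping frame-to-frame variation in the tracked position.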
And step S108, tracking the smoothing processing result as a final tracking target.
Specifically, the final smoothing processing result can be used as the final tracking target of the high-precision real-time camera system, so that the system can track in real time, which reduces the probability of erroneous tracking and improves overall tracking efficiency.
Optionally, before tracking the smoothing processing result as the final tracking target, the method further includes: judging whether abnormal data exist according to the smoothing calculation result; and, if abnormal data exist, calculating the smoothing processing result by using an elastic damping interpolation algorithm together with the abnormal data.
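The elastic damping interpolation algorithm is named but not defined in the patent. The sketch below assumes a damped pull of each abnormal sample toward the midpoint of its neighbours, with an exponential damping weight; both the blending form and the damping constant are assumptions:

```python
import math

def elastic_damping_interpolate(values, i, damping=0.8):
    """Replace the abnormal sample values[i] with an elastically
    damped interpolation toward its neighbours' midpoint.
    The exponential damping weight (1 - e**(-damping)) is an
    illustrative assumption; the patent names the algorithm
    without giving its formula."""
    left, right = values[i - 1], values[i + 1]
    k = 1.0 - math.exp(-damping)      # damping weight in (0, 1)
    midpoint = (left + right) / 2.0
    # pull the abnormal value part of the way toward the midpoint
    return values[i] + k * (midpoint - values[i])
```

A larger damping constant pulls the outlier harder toward its neighbours; as damping → ∞ the abnormal sample is replaced by the midpoint outright.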
This embodiment solves the problem that, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; such an approach cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking.
Example two
Fig. 2 is a block diagram of a flying-object tracking device according to an embodiment of the present invention, as shown in fig. 2, the device including:
an acquisition module 20, configured to acquire a tracking target source.
Specifically, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; this cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking. To solve this technical problem, the device of this embodiment first acquires a tracking target source through the acquisition module 20.
Optionally, the apparatus further includes: the acquisition module is also used for acquiring the smooth extraction model; and the input module is used for inputting the information of the tracking target source into the smooth extraction model to obtain the smooth data.
Specifically, after the tracking source data is acquired, the embodiment of the invention needs to extract the smooth data according to a related algorithm, for example, after the tracking target source is acquired, the method further includes: obtaining a smooth extraction model; and inputting the information of the tracking target source into the smooth extraction model to obtain the smooth data.
And the fitting module 22 is configured to fit the smoothed data in the tracking target source to a smoothing judgment model to obtain a smoothing calculation result.
Specifically, after the CPU of the high-precision image capture system receives the smoothed data of the tracking target, the smoothed data can be fitted against the smoothing judgment model to improve smoothness recognition of the target. This yields, for each tracking target, a result indicating whether smoothing is needed; with this result, the smoothed data and the tracking targets can be further processed to obtain data that is convenient to track.
And the processing module 24 is used for performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result.
Optionally, the processing module includes: an obtaining unit, configured to obtain the smooth variation parameter in the smoothing calculation result; and a computing unit, configured to apply an exponential smoothing algorithm to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises: W = α² · (x, y), where W is the smoothing processing result, α is the smoothing index factor, and x and y are the pixel coordinate parameters of the smoothed image.
Specifically, smoothing the smoothing calculation result with a smoothing calculation algorithm allows irregular tracking targets to be further smoothed. For example, the smooth variation parameter in the smoothing calculation result is obtained, and an exponential smoothing algorithm is applied to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, where the exponential smoothing algorithm comprises: W = α² · (x, y), with W the smoothing processing result, α the smoothing index factor, and x and y the pixel coordinate parameters of the smoothed image.
And the tracking module 26 is used for tracking the smoothing processing result as a final tracking target.
Specifically, the final smoothing processing result can be used as the final tracking target of the high-precision real-time camera system, so that the system can track in real time, which reduces the probability of erroneous tracking and improves overall tracking efficiency.
Optionally, the apparatus further includes: the judging module is used for judging whether abnormal data exist according to the smooth calculation result; and the calculating module is used for calculating the smoothing processing result by using an elastic damping interpolation algorithm and the abnormal data if the abnormal data exist.
This embodiment solves the problem that, in the prior art, the coordinate position of the tracked flying object is judged only from a real-time picture, so that the tracking parameters are derived from the object's profile data and motion data; such an approach cannot handle the case where the flying object lacks a smooth structure, which greatly reduces the efficiency and accuracy of flying object tracking.
According to another aspect of the embodiment of the present invention, there is also provided a nonvolatile storage medium, where the nonvolatile storage medium includes a stored program, and when the program runs, the program controls a device in which the nonvolatile storage medium is located to execute a flying object tracking method.
Specifically, the method comprises the following steps: acquiring a tracking target source; fitting the smooth data in the tracking target source to a smoothing judgment model to obtain a smoothing calculation result; performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result; and tracking the smoothing processing result as the final tracking target. Optionally, after acquiring the tracking target source, the method further includes: obtaining a smooth extraction model; and inputting information of the tracking target source into the smooth extraction model to obtain the smooth data. Optionally, performing smoothing processing according to the smoothing calculation result includes: obtaining the smooth variation parameter in the smoothing calculation result; and applying an exponential smoothing algorithm to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises: W = α² · (x, y), where W is the smoothing processing result, α is the smoothing index factor, and x and y are the pixel coordinate parameters of the smoothed image. Optionally, before tracking the smoothing processing result as the final tracking target, the method further includes: judging whether abnormal data exist according to the smoothing calculation result; and, if abnormal data exist, calculating the smoothing processing result by using an elastic damping interpolation algorithm together with the abnormal data.
According to another aspect of the embodiment of the present invention, there is also provided an electronic device including a processor and a memory; the memory stores computer readable instructions, and the processor is configured to execute the computer readable instructions, where the computer readable instructions execute a flying object tracking method when executed.
Specifically, the method comprises the following steps: acquiring a tracking target source; fitting the smooth data in the tracking target source to a smoothing judgment model to obtain a smoothing calculation result; performing smoothing processing according to the smoothing calculation result to obtain a smoothing processing result; and tracking the smoothing processing result as the final tracking target. Optionally, after acquiring the tracking target source, the method further includes: obtaining a smooth extraction model; and inputting information of the tracking target source into the smooth extraction model to obtain the smooth data. Optionally, performing smoothing processing according to the smoothing calculation result includes: obtaining the smooth variation parameter in the smoothing calculation result; and applying an exponential smoothing algorithm to the tracking target source according to the smooth variation parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises: W = α² · (x, y), where W is the smoothing processing result, α is the smoothing index factor, and x and y are the pixel coordinate parameters of the smoothed image. Optionally, before tracking the smoothing processing result as the final tracking target, the method further includes: judging whether abnormal data exist according to the smoothing calculation result; and, if abnormal data exist, calculating the smoothing processing result by using an elastic damping interpolation algorithm together with the abnormal data.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
In the foregoing embodiments of the present invention, the descriptions of the embodiments are emphasized, and for a portion of this disclosure that is not described in detail in this embodiment, reference is made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed technology content may be implemented in other manners. The above-described embodiments of the apparatus are merely exemplary, and the division of the units, for example, may be a logic function division, and may be implemented in another manner, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be through some interfaces, units or modules, or may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, fig. 3 is a schematic hardware structure of a terminal device according to an embodiment of the present application. As shown in fig. 3, the terminal device may include an input device 30, a processor 31, an output device 32, a memory 33, and at least one communication bus 34. The communication bus 34 is used to enable communication connections between the elements. The memory 33 may comprise a high-speed RAM memory or may further comprise a non-volatile memory NVM, such as at least one magnetic disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the processor 31 may be implemented as, for example, a central processing unit (Central Processing Unit, abbreviated as CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 31 is coupled to the input device 30 and the output device 32 through wired or wireless connections.
Alternatively, the input device 30 may include a variety of input devices, for example, may include at least one of a user-oriented user interface, a device-oriented device interface, a programmable interface of software, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware insertion interface (such as a USB interface, a serial port, etc.) for data transmission between devices; alternatively, the user-oriented user interface may be, for example, a user-oriented control key, a voice input device for receiving voice input, and a touch-sensitive device (e.g., a touch screen, a touch pad, etc. having touch-sensitive functionality) for receiving user touch input by a user; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, for example, an input pin interface or an input interface of a chip, etc.; optionally, the transceiver may be a radio frequency transceiver chip, a baseband processing chip, a transceiver antenna, etc. with a communication function. An audio input device such as a microphone may receive voice data. The output device 32 may include a display, audio, or the like.
In this embodiment, the processor of the terminal device may include functions for executing each module of the data processing apparatus in each device, and specific functions and technical effects may be referred to the above embodiments and are not described herein again.
Fig. 4 is a schematic hardware structure of a terminal device according to another embodiment of the present application. Fig. 4 is a specific embodiment of the implementation of fig. 3. As shown in fig. 4, the terminal device of the present embodiment includes a processor 41 and a memory 42.
The processor 41 executes the computer program code stored in the memory 42 to implement the methods of the above-described embodiments.
The memory 42 is configured to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, and video. The memory 42 may include a random access memory (RAM) and may also include a non-volatile memory, such as at least one disk storage.
Optionally, a processor 41 is provided in the processing component 40. The terminal device may further include: a communication component 43, a power supply component 44, a multimedia component 45, an audio component 46, an input/output interface 47, and/or a sensor component 48. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 40 generally controls the overall operation of the terminal device. The processing component 40 may include one or more processors 41 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 40 may include one or more modules that facilitate interactions between the processing component 40 and other components. For example, processing component 40 may include a multimedia module to facilitate interaction between multimedia component 45 and processing component 40.
The power supply component 44 provides power to the various components of the terminal device. The power supply component 44 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia component 45 includes a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a liquid crystal display (LCD) and a touch panel (TP). If the display screen includes a touch panel, it may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation.
The audio component 46 is configured to output and/or input audio signals. For example, the audio component 46 includes a microphone (MIC) configured to receive external audio signals when the terminal device is in an operational mode, such as a speech recognition mode. The received audio signals may be further stored in the memory 42 or transmitted via the communication component 43. In some embodiments, the audio component 46 further includes a speaker for outputting audio signals.
The input/output interface 47 provides an interface between the processing component 40 and peripheral interface modules, which may be click wheels, buttons, and the like. These buttons may include, but are not limited to, a volume button, a start button, and a lock button.
The sensor component 48 includes one or more sensors for providing status assessments of various aspects of the terminal device. For example, the sensor component 48 may detect the open/closed state of the terminal device, the relative positioning of components, and the presence or absence of user contact with the terminal device. The sensor component 48 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor component 48 may also include a camera or the like.
The communication component 43 is configured to facilitate wired or wireless communication between the terminal device and other devices. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot for inserting a SIM card, so that the terminal device can log into a GPRS network and establish communication with a server through the Internet.
From the above, it will be appreciated that the communication component 43, the audio component 46, the input/output interface 47, and the sensor component 48 referred to in the embodiment of Fig. 4 may be implemented as the input device in the embodiment of Fig. 3.
In the several embodiments provided in the present application, it should be understood that the disclosed technical content may be implemented in other manners. The apparatus embodiments described above are merely exemplary; for example, the division of the units may be a division by logical function, and other divisions are possible in actual implementation: a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be implemented through some interfaces, units, or modules, and may be electrical or take other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in whole or in part in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make improvements and modifications without departing from the principles of the present invention, and such improvements and modifications shall also fall within the protection scope of the present invention.
Claims (10)
1. A method of tracking a flying object, comprising:
acquiring a tracking target source;
fitting the smooth data in the tracking target source with a smooth judgment model to obtain a smooth calculation result;
carrying out smoothing treatment according to the smoothing calculation result to obtain a smoothing treatment result;
and tracking the smoothing processing result as a final tracking target.
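Read as a data flow, the four steps of claim 1 can be sketched as follows. This is a minimal, hypothetical illustration: the claim does not specify the smoothness judgment model or the smoothing algorithm, so a mean-step-length fit and a step-clamping smoother stand in for them here, and all function names are assumptions rather than the patent's own.

```python
# Hypothetical sketch of the claim-1 pipeline: fit the track against a
# smoothness model, then smooth according to the calculation result.

def fit_smoothness(track):
    """Fit the track against a simple smoothness model: mean step length."""
    steps = [abs(b - a) for a, b in zip(track, track[1:])]
    return sum(steps) / len(steps) if steps else 0.0

def smooth_track(track, mean_step):
    """Clamp any step larger than the fitted mean toward its predecessor."""
    out = [track[0]]
    for v in track[1:]:
        step = v - out[-1]
        if abs(step) > mean_step:
            step = mean_step if step > 0 else -mean_step
        out.append(out[-1] + step)
    return out

raw = [0.0, 1.0, 2.0, 10.0, 3.0]
result = smooth_track(raw, fit_smoothness(raw))
print(result)  # the 10.0 spike is clamped to the mean step size
```

The smoothed result, not the raw source, is then handed to the tracker as the final tracking target.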
2. The method of claim 1, wherein after the acquisition of the tracking target source, the method further comprises:
obtaining a smooth extraction model;
and inputting the information of the tracking target source into the smooth extraction model to obtain the smooth data.
3. The method of claim 1, wherein performing the smoothing process according to the smoothing calculation result to obtain a smoothing process result comprises:
obtaining a smooth variation parameter in the smooth calculation result;
and carrying out exponential smoothing algorithm processing on the tracking target source according to the smooth change parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises:
W = α² * (x, y)
where W is the smoothing processing result, α is the smoothing exponential factor, and x and y represent the pixel coordinate parameters of the smoothed image.
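As a concrete illustration of claim 3's smoothing step: the formula gives a single weighting of the coordinates, but exponential smoothing is conventionally applied as a recursion over successive (x, y) positions. The sketch below is a plausible reading, not the patent's exact implementation; `alpha` plays the role of the smoothing exponential factor.

```python
# Hypothetical exponential smoothing of tracked (x, y) pixel coordinates.
# Higher alpha follows the raw measurements closely; lower alpha smooths harder.

def exponential_smooth(points, alpha=0.5):
    """Return an exponentially smoothed copy of a list of (x, y) points."""
    if not points:
        return []
    sx, sy = points[0]
    smoothed = [(sx, sy)]
    for x, y in points[1:]:
        # Blend the new measurement with the previous smoothed value.
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed

track = [(100, 100), (104, 99), (130, 140), (108, 103)]
print(exponential_smooth(track, alpha=0.5))
```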
4. The method of claim 1, wherein prior to said tracking said smoothing result as a final tracking target, said method further comprises:
judging whether abnormal data exist according to the smooth calculation result;
if the abnormal data exists, calculating the smoothing result by using an elastic damping interpolation algorithm and the abnormal data.
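Claim 4's outlier handling can be pictured as follows. "Elastic damping interpolation" is the patent's own term and its internals are not disclosed; the damped pull toward neighboring samples below is one plausible reading under that name, with the threshold test standing in for the abnormal-data judgment.

```python
# Hypothetical sketch of claim 4: when a sample is judged abnormal,
# replace it by a damped interpolation toward the surrounding track
# instead of using the raw outlier. Names and constants are assumptions.

def damped_interpolate(prev, nxt, stiffness=0.3):
    """Pull the replacement value from prev toward nxt, with damping."""
    return prev + stiffness * (nxt - prev)

def repair_outliers(samples, threshold=50.0):
    """Replace interior samples that jump more than `threshold` from the
    previous sample with a damped interpolation of their neighbors."""
    repaired = list(samples)
    for i in range(1, len(repaired) - 1):
        if abs(repaired[i] - repaired[i - 1]) > threshold:
            repaired[i] = damped_interpolate(repaired[i - 1], repaired[i + 1])
    return repaired

print(repair_outliers([10.0, 12.0, 300.0, 14.0]))
```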
5. A flying-object tracking device, comprising:
the acquisition module is used for acquiring a tracking target source;
the fitting module is used for fitting the smooth data in the tracking target source with the smooth judgment model to obtain a smooth calculation result;
the processing module is used for carrying out smoothing processing according to the smoothing calculation result to obtain a smoothing processing result;
and the tracking module is used for tracking the smoothing processing result as a final tracking target.
6. The apparatus of claim 5, wherein the apparatus further comprises:
the acquisition module is also used for acquiring the smooth extraction model;
and the input module is used for inputting the information of the tracking target source into the smooth extraction model to obtain the smooth data.
7. The apparatus of claim 5, wherein the processing module comprises:
an obtaining unit, configured to obtain a smooth variation parameter in the smooth calculation result;
the computing unit is used for carrying out the processing of an exponential smoothing algorithm on the tracking target source according to the smooth change parameter to obtain the smoothing processing result, wherein the exponential smoothing algorithm comprises:
W = α² * (x, y)
where W is the smoothing processing result, α is the smoothing exponential factor, and x and y represent the pixel coordinate parameters of the smoothed image.
8. The apparatus of claim 5, wherein the apparatus further comprises:
the judging module is used for judging whether abnormal data exist according to the smooth calculation result;
and the calculating module is used for calculating the smoothing processing result by using an elastic damping interpolation algorithm and the abnormal data if the abnormal data exist.
9. A non-volatile storage medium, characterized in that the non-volatile storage medium comprises a stored program, wherein the program, when run, controls a device in which the non-volatile storage medium is located to perform the method of any one of claims 1 to 4.
10. An electronic device comprising a processor and a memory; the memory has stored therein computer readable instructions to be executed by the processor, wherein the computer readable instructions, when executed, perform the method of any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310180137.9A CN116088580B (en) | 2023-02-15 | 2023-02-15 | Flying object tracking method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310180137.9A CN116088580B (en) | 2023-02-15 | 2023-02-15 | Flying object tracking method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116088580A true CN116088580A (en) | 2023-05-09 |
CN116088580B CN116088580B (en) | 2023-11-07 |
Family
ID=86206540
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310180137.9A Active CN116088580B (en) | 2023-02-15 | 2023-02-15 | Flying object tracking method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116088580B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101676744A (en) * | 2007-10-31 | 2010-03-24 | 北京航空航天大学 | Method for tracking small target with high precision under complex background and low signal-to-noise ratio |
CN106530328A (en) * | 2016-11-04 | 2017-03-22 | 深圳维周机器人科技有限公司 | Method for detecting and smoothly following moving object based on video images |
US20170186165A1 (en) * | 2015-12-29 | 2017-06-29 | Microsoft Technology Licensing, Llc | Tracking rigged smooth-surface models of articulated objects |
US20180204343A1 (en) * | 2017-01-17 | 2018-07-19 | Thomson Licensing | Method and device for determining a trajectory within a 3d scene for a camera |
CN109035304A (en) * | 2018-08-07 | 2018-12-18 | 北京清瑞维航技术发展有限公司 | Method for tracking target, calculates equipment and device at medium |
CN110488850A (en) * | 2019-08-02 | 2019-11-22 | 南京理工大学 | A kind of quadrotor drone vision navigation system and method based on raspberry pie |
JP2020005147A (en) * | 2018-06-28 | 2020-01-09 | 株式会社リコー | Information processing apparatus, movable body, remote control system, information processing method, and program |
CN110796010A (en) * | 2019-09-29 | 2020-02-14 | 湖北工业大学 | Video image stabilization method combining optical flow method and Kalman filtering |
CN110992378A (en) * | 2019-12-03 | 2020-04-10 | 湖南大学 | Dynamic update visual tracking aerial photography method and system based on rotor flying robot |
CN112286230A (en) * | 2020-11-13 | 2021-01-29 | 国家电网有限公司 | Unmanned aerial vehicle visual image algorithm, obstacle avoidance step and information fusion processing system thereof |
WO2021227519A1 (en) * | 2020-05-15 | 2021-11-18 | 深圳市优必选科技股份有限公司 | Target tracking method and apparatus, and computer-readable storage medium and robot |
CN114003064A (en) * | 2021-09-14 | 2022-02-01 | 阳光电源股份有限公司 | Tracking method and device of tracking support and photovoltaic system |
Non-Patent Citations (1)
Title |
---|
XIE, Zheng et al.: "Research on Electronic Image Stabilization Technology for Imaging Matching on Flying Mobile Platforms" (飞行移动平台成像匹配的电子稳像技术研究), Computer Simulation (计算机仿真), vol. 33, no. 02, pages 138-141 *
Also Published As
Publication number | Publication date |
---|---|
CN116088580B (en) | 2023-11-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN115623336B (en) | Image tracking method and device for hundred million-level camera equipment | |
CN115984126A (en) | Optical image correction method and device based on input instruction | |
CN116614453B (en) | Image transmission bandwidth selection method and device based on cloud interconnection | |
CN116261044B (en) | Intelligent focusing method and device for hundred million-level cameras | |
CN115409869B (en) | Snow field track analysis method and device based on MAC tracking | |
CN111814840A (en) | Method, system, equipment and medium for evaluating quality of face image | |
CN116088580B (en) | Flying object tracking method and device | |
CN116664413B (en) | Image volume fog eliminating method and device based on Abbe convergence operator | |
CN116468883B (en) | High-precision image data volume fog recognition method and device | |
CN115984333B (en) | Smooth tracking method and device for airplane target | |
CN116389915B (en) | Method and device for reducing flicker of light field camera | |
CN116744102B (en) | Ball machine tracking method and device based on feedback adjustment | |
CN116723419B (en) | Acquisition speed optimization method and device for billion-level high-precision camera | |
CN116228593B (en) | Image perfecting method and device based on hierarchical antialiasing | |
CN116579964B (en) | Dynamic frame gradual-in gradual-out dynamic fusion method and device | |
CN115546053B (en) | Method and device for eliminating diffuse reflection of graphics on snow in complex terrain | |
CN115914819B (en) | Picture capturing method and device based on orthogonal decomposition algorithm | |
CN116030501B (en) | Method and device for extracting bird detection data | |
CN115809006B (en) | Method and device for controlling manual instructions through picture | |
CN116543013B (en) | Ball movement track analysis method and device | |
CN116797479B (en) | Image vertical distortion conversion method | |
CN116485841A (en) | Motion rule identification method and device based on multiple wide angles | |
CN117367455A (en) | Deep learning algorithm unmanned aerial vehicle route design method and device for photovoltaic power station | |
CN115994872A (en) | Smooth adjustment method and device for objects around aircraft | |
CN116468751A (en) | High-speed dynamic image detection method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||