CN115480275B - Motion state acquisition method and device, computer equipment and storage medium - Google Patents

Motion state acquisition method and device, computer equipment and storage medium

Info

Publication number
CN115480275B
CN115480275B CN202211119632.0A
Authority
CN
China
Prior art keywords
vector
identified
motion
segment
determining
Prior art date
Legal status
Active
Application number
CN202211119632.0A
Other languages
Chinese (zh)
Other versions
CN115480275A (en)
Inventor
阮佳
高宇
郑宏辉
张新建
Current Assignee
GUANGDONG MARITIME SAFETY ADMINISTRATION OF PEOPLE'S REPUBLIC OF CHINA
Original Assignee
GUANGDONG MARITIME SAFETY ADMINISTRATION OF PEOPLE'S REPUBLIC OF CHINA
Priority date
Filing date
Publication date
Application filed by GUANGDONG MARITIME SAFETY ADMINISTRATION OF PEOPLE'S REPUBLIC OF CHINA filed Critical GUANGDONG MARITIME SAFETY ADMINISTRATION OF PEOPLE'S REPUBLIC OF CHINA
Priority to CN202211119632.0A priority Critical patent/CN115480275B/en
Publication of CN115480275A publication Critical patent/CN115480275A/en
Application granted granted Critical
Publication of CN115480275B publication Critical patent/CN115480275B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/393Trajectory determination or predictive tracking, e.g. Kalman filtering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/203Specially adapted for sailing ships
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/46Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being of a radio-wave signal type

Abstract

The application relates to a method, an apparatus and computer equipment for acquiring the motion state of a moving object. The method comprises: acquiring motion parameters of a moving object in a segment to be identified, the segment to be identified being at least one section of the motion track of the moving object and the motion parameters comprising position information of a plurality of track points of the segment to be identified; determining a plurality of sequentially adjacent sub-segments of the segment to be identified according to its motion parameters; determining the vector of each sub-segment according to the motion parameters of that sub-segment; determining the vector included angle of each two adjacent sub-segments according to the vectors of the sub-segments; and determining the motion state of the moving object in the segment to be identified according to the vector included angles of each two adjacent sub-segments. The method can accurately acquire the motion state of a moving object and, when applied to the field of navigation, can judge whether a ship is turning or turning around.

Description

Motion state acquisition method and device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of navigation technologies, and in particular, to a method and an apparatus for acquiring a motion state, a computer device, and a storage medium.
Background
With the development of the economy and society, maritime transportation has become an important carrier of trade between China and the rest of the world. Effective monitoring of the navigation state of moving objects on the water makes it easier for maritime authorities to manage maritime affairs.
At present, the position and speed of a moving target on the water are obtained by analyzing message information, from which it is judged only whether the target is in motion or stopped; this cannot meet the requirements of high-precision navigation control.
Disclosure of Invention
Based on the above, it is necessary to provide a motion state acquisition method, a motion state acquisition device and a computer device capable of accurately acquiring a specific motion state of each segment of a motion track of a motion target, so as to provide a data basis for high-precision motion control.
In a first aspect, a method for acquiring a motion state is provided, including:
acquiring motion parameters of a moving target in a section to be identified; the section to be identified is at least one section of track on the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the section to be identified;
determining a plurality of sub-segments which are adjacent in sequence of the segment to be identified according to the motion parameters of the segment to be identified;
determining the vector of each sub-segment according to the motion parameters of the sub-segment; the vector of each sub-segment is a vector of a first track point pointing to a second track point on the sub-segment, and the sampling time corresponding to the first track point is earlier than the sampling time corresponding to the second track point;
According to the vector of each sub-segment, determining the vector included angle of each two adjacent sub-segments;
and determining the motion state of the moving object in the segment to be identified according to the vector included angle of each two adjacent subsections.
In one implementation, the sub-segments include at least a first segment, a second segment, and a third segment.
In one implementation, determining the vector included angle of each two adjacent subsections according to the vector of each subsection includes:
if it is determined, according to the vectors of the subsections, that the directions of any two adjacent subsections are consistent, determining the vector included angle of each two adjacent subsections according to the vector of each subsection.
In one implementation, determining the vector included angle of each two adjacent subsections according to the vector of each subsection further comprises:
if it is determined, according to the vectors of the subsections, that the vector directions of any two adjacent subsections are inconsistent, determining a new segment to be identified and returning to the step of acquiring the motion parameters of the moving object in the segment to be identified.
In one implementation, determining the motion state of the moving object in the segment to be identified according to the vector included angle of each two adjacent subsections includes:
when the sum of the vector included angles of every two adjacent subsections is greater than or equal to 90 degrees and smaller than 180 degrees, determining that the moving object is in a turning state in the segment to be identified;
when the sum of the vector included angles of every two adjacent subsections is greater than or equal to 180 degrees, determining that the moving object is in a turning-around state in the segment to be identified.
In one implementation, the motion state acquisition method further includes:
and under the condition that the moving object is determined to be in a turning state or a turning-around state in the segment to be identified, acquiring the running speed of the moving object in the segment to be identified.
In one implementation, acquiring a motion parameter of a moving object in a section to be identified includes:
acquiring the motion parameters in real time from a Kafka message queue; or,
acquiring historical motion parameters of the moving object from a distributed database.
In a second aspect, there is provided a motion state acquisition apparatus, the apparatus comprising:
the acquisition module is used for acquiring the motion parameters of the moving object in the section to be identified; the section to be identified is at least one section of track on the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the section to be identified;
the subsection determining module is used for determining a plurality of sequentially adjacent subsections of the section to be identified according to the motion parameters of the section to be identified;
the vector determining module is used for determining the vector of each sub-segment according to the motion parameters of the sub-segment; the vector of each sub-segment is a vector of a first track point pointing to a second track point on the sub-segment, and the sampling time corresponding to the first track point is earlier than the sampling time corresponding to the second track point;
The included angle determining module is used for determining the vector included angle of each two adjacent subsections according to the vector of each subsection;
and the motion state determining module is used for determining the motion state of the moving object in the segment to be identified according to the vector included angle of each two adjacent subsections.
In a third aspect, a computer device is provided, comprising a memory storing a computer program and a processor implementing the steps of any one of the above methods for acquiring a motion state of a moving object when the computer program is executed.
In a fourth aspect, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of any one of the motion state acquisition methods of a moving object described above.
The motion state acquisition method, the motion state acquisition device, the computer equipment and the storage medium have at least the following beneficial effects:
The motion parameters of a moving object in a segment to be identified are obtained; the segment to be identified is at least one section of the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the segment to be identified. A plurality of sequentially adjacent sub-segments of the segment to be identified are determined according to the motion parameters of the segment to be identified. The vector of each sub-segment is determined according to the motion parameters of that sub-segment; the vector of each sub-segment is the vector pointing from a first track point to a second track point on the sub-segment, the sampling time corresponding to the first track point being earlier than the sampling time corresponding to the second track point. The vector included angle of each two adjacent sub-segments is then determined from the vectors of the sub-segments; this included angle reflects how the motion of the moving object changes between the preceding and the following sub-segment. On this basis, the specific motion state of the moving object in the segment to be identified can be determined according to the vector included angles of each two adjacent sub-segments, providing strong data support for further motion control of the moving object. Furthermore, when the method is applied to the field of navigation, it can identify the turning or turning-around state of a navigation target on the water.
Drawings
FIG. 1 is an application environment diagram of a method for acquiring a motion state of a moving object in one embodiment;
FIG. 2 is a flow chart of a method for acquiring motion status of a moving object according to an embodiment;
FIG. 3 is a schematic diagram of a segment to be identified in a method for acquiring a motion state of a moving object according to an embodiment;
FIG. 4 is a schematic diagram of a segment to be identified in a method for acquiring a motion state of a moving object according to another embodiment;
FIG. 5 is a flowchart of a method for acquiring a motion state of a moving object according to another embodiment;
FIG. 6 is a schematic diagram of a segment to be identified in a method for acquiring a motion state of a moving object according to still another embodiment;
fig. 7 is a flowchart of a method for acquiring a motion state of a moving object according to still another embodiment;
fig. 8 is a flowchart of a method for acquiring a motion state of a moving object according to still another embodiment;
fig. 9 is a flowchart of a method for acquiring a motion state of a moving object according to still another embodiment;
fig. 10 is a flowchart of a method for acquiring a motion state of a moving object according to still another embodiment;
FIG. 11 is a block diagram showing a configuration of a moving state acquisition device of a moving object in one embodiment;
fig. 12 is an internal structural diagram of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The method for acquiring the motion state of a moving object 2 provided by the embodiments of the application can be applied to the application environment shown in fig. 1, in which the moving object 2 communicates with the satellite 10 and also communicates with the base station 4 via a network to access the server 6. The data storage system 8 may store data that the server 6 needs to process, such as historical motion parameters of the moving object 2. The data storage system 8 may be integrated on the server 6, or may be placed on a cloud or other network server. The motion state recognition system installed on the moving object 2 outputs the motion parameters (position information) of the moving object in the segment to be identified with the positioning assistance of the satellite 10, and can directly determine a plurality of sequentially adjacent sub-segments of the segment to be identified according to the motion parameters of the segment to be identified; determine the vector of each sub-segment according to the motion parameters of the sub-segment; determine the vector included angle of each two adjacent sub-segments according to the vectors of the sub-segments; and determine the motion state of the moving object 2 in the segment to be identified according to the vector included angles of each two adjacent sub-segments. Alternatively, the motion parameters of the moving object in the segment to be identified can be transmitted through the base station 4 and the communication network to the data storage system 8 in the server 6, and a motion state recognition system installed at the base station 4 can likewise perform the above-described method to determine the motion state of the moving object 2. The moving object 2 may be, but is not limited to, a movable object such as an automobile or a ship. The server 6 may be implemented as a stand-alone server 6 or as a server cluster consisting of a plurality of servers 6.
In one embodiment, as shown in fig. 2, a motion state obtaining method is provided, and an application of the method to the motion state recognition system in the application environment description is taken as an example for explanation, and the method includes the following steps:
step S202, obtaining motion parameters of a moving object in a section to be identified; the section to be identified is at least one section of track on the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the section to be identified.
The segment to be identified may be one or more sections of the motion track divided based on the time dimension, one or more sections divided based on the space dimension, or one or more sections divided based on both the time and space dimensions. The segment to be identified may be at least one section (or all) of a real-time motion track acquired in real time, or at least one section (or all) of a historical motion track acquired in advance and stored in a storage space. Further, the segment to be identified contains a plurality of motion track points, that is, a series of virtual points recorded by the sampling device in the form of motion parameters (including position information) as the moving object passes through the segment to be identified. The position information may include the coordinates of the moving object and the time at which the moving object passes those coordinates. Still further, the motion parameters may also include motion direction information (where the direction is determined relative to a reference direction, for example due north in the world coordinate system) and motion speed information of the moving object in the same reference coordinate system.
In one embodiment, the segment to be identified may refer to a motion track formed by arranging the plurality of track points based on information of two dimensions, namely a space dimension and a time dimension, and then connecting the track points according to the information. For example, as shown in fig. 3, under the same reference coordinate system, the moving object passes through the P1, P2, P3, and P4 trajectory points for a certain period of time. Wherein, the P1 track point comprises time information (10:00) and coordinates (0, 1). In this case, the four track points P1 to P4 are arranged in positions based on spatial dimensions (coordinates), and then sequentially connected based on time dimensions (time information), so as to finally form a segment to be identified as shown in fig. 3, namely, form a motion track to be identified.
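To make the data format concrete, the following minimal Python sketch shows one possible representation of a track point; the TrackPoint class, its field names and the placeholder date are illustrative assumptions and do not appear in the disclosure, which only requires coordinates plus the time at which the moving object passes them.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class TrackPoint:
    """One sampled track point: position plus the sampling time."""
    x: float           # first coordinate in the reference frame
    y: float           # second coordinate in the reference frame
    time: datetime     # time at which the moving object passed this point

# Track point P1 of FIG. 3: coordinates (0, 1) and sampling time 10:00
# (the date is an arbitrary placeholder; the figure states only the clock time).
p1 = TrackPoint(x=0.0, y=1.0, time=datetime(2021, 1, 18, 10, 0))
print(p1)
```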
Step S204, determining a plurality of sub-segments adjacent in sequence of the segment to be identified according to the motion parameters of the segment to be identified.
A sub-segment is a portion of the segment to be identified, that is, a portion of the motion track; specifically, it may be the section of the motion track pointing from one track point to the adjacent track point in the order of the space and time dimensions. A sub-segment may be a motion track determined by pointing from one track point to the adjacent track point in the order of the space and time dimensions; for example, as shown in fig. 3, a sub-segment may be the motion track section P1P2 from track point P1 to track point P2, or the section P2P3 from P2 to P3. A sub-segment may also be determined by sequentially connecting at least three mutually adjacent track points in the order of the space and time dimensions; for example, as shown in fig. 3, a sub-segment may be the motion track determined by connecting track point P1 to P2 and then to P3 in sequence.
Specifically, after the motion parameters of the segment to be identified, that is, the position information (including coordinates and time information) of the track points on the segment to be identified, are obtained, the segment to be identified is divided according to the coordinates and time information of the track points, so as to obtain a plurality of sub-segments which are adjacent in sequence.
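A minimal sketch of how this division into sequentially adjacent sub-segments might be implemented is given below; the function name, the tuple-based point format and the `step` parameter are assumptions made for illustration, covering both the two-point and the three-point sub-segment variants described above.

```python
def split_into_subsegments(points, step=1):
    """Sort track points by sampling time, then pair each point with the point
    `step` positions later to form sequentially adjacent sub-segments.
    A point is an (x, y, t) tuple, where t is any sortable timestamp."""
    ordered = sorted(points, key=lambda p: p[2])
    return [(ordered[i], ordered[i + step])
            for i in range(0, len(ordered) - step, step)]

# FIG. 3 style input P1..P4 (coordinates and timestamps are illustrative).
pts = [(0.0, 1.0, 0), (1.0, 2.0, 60), (2.0, 2.5, 120), (3.0, 2.0, 180)]
print(split_into_subsegments(pts))           # (P1,P2), (P2,P3), (P3,P4)
print(split_into_subsegments(pts, step=2))   # three-point sub-segment: (P1,P3)
```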
Step S206, determining the vector of each sub-segment according to the motion parameters of the sub-segment; the vector of each sub-segment is a vector of a first track point pointing to a second track point on the sub-segment, and the sampling time corresponding to the first track point is earlier than the sampling time corresponding to the second track point.
The vector of a sub-segment may be a directed section of the motion track determined by pointing from one track point to the adjacent track point in the order of the space and time dimensions; for example, as shown in fig. 3, the vector of a sub-segment may be the vector P1P2 determined by pointing from P1 to P2. The first track point is the start point of the vector of the sub-segment and the second track point is its end point; for example, as shown in fig. 3, in the vector P1P2 track point P1 is the first track point and track point P2 is the second track point; similarly, in the vector P2P3 track point P2 is the first track point and track point P3 is the second track point. The sampling time is the time at which the moving object passes the corresponding track point; for example, as shown in fig. 3, for the vector P1P2 the sampling time of the first track point is the time at which the moving object passes track point P1, i.e. the recorded sampling time 10:00. The vector of a sub-segment may also be the displacement determined after at least three mutually adjacent track points are connected in sequence in the order of the space and time dimensions. For example, as shown in fig. 3, on the motion track in which track point P1 is connected in sequence to P2 and then P3, the vector of the sub-segment may be the vector P1P3 determined from track point P1 to track point P3; in this case P1 is the first track point and P3 is the second track point.
Specifically, in one embodiment, the directionality of the sub-segments is determined according to the motion parameters of the sub-segments (i.e., the time information and coordinates of the trajectory points of each sub-segment), thereby determining the displacement of the moving object on each sub-segment.
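A possible implementation of the sub-segment vector is sketched below, assuming each track point is given as a coordinate pair (optionally followed by a timestamp) already ordered by sampling time; names and values are illustrative only.

```python
def subsegment_vector(first_point, second_point):
    """Vector of a sub-segment: the displacement from the earlier-sampled
    (first) track point to the later-sampled (second) track point."""
    (x1, y1), (x2, y2) = first_point[:2], second_point[:2]
    return (x2 - x1, y2 - y1)

# Vector P1P2 for FIG. 3 style coordinates (values are illustrative).
print(subsegment_vector((0.0, 1.0), (1.0, 2.0)))   # (1.0, 1.0)
```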
Step S208, according to the vectors of the sub-segments, determining the vector included angle of each two adjacent sub-segments.
The vector included angle of each two adjacent sub-segments is the angle determined by the vectors of any two adjacent sub-segments in the segment to be identified. For example, as shown in fig. 3, the vector P1P2 is the vector of one sub-segment and the vector P2P3 is the vector of the adjacent sub-segment; the included angle determined by these two vectors is α. Similarly, the included angle β is the angle determined by the vectors P2P3 and P3P4 of adjacent sub-segments. Alternatively, where sub-segments are formed from three track points, the vector of such a sub-segment (for example P1P3) and the vector of its adjacent sub-segment determine the included angle γ.
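The included angle itself can be computed from the dot product of the two sub-segment vectors; the sketch below assumes planar 2-D vectors and illustrative values.

```python
import math

def vector_angle_deg(v1, v2):
    """Included angle, in degrees, between the vectors of two adjacent sub-segments."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    norm = math.hypot(*v1) * math.hypot(*v2)
    # Clamp to guard against floating-point values just outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Angle alpha between the vectors P1P2 and P2P3 (illustrative values).
print(vector_angle_deg((1.0, 1.0), (1.0, 0.5)))    # about 18.4 degrees
```

An atan2-based signed angle would work equally well and additionally yields the rotation sense used in the direction-consistency check below.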
Step S210, determining the motion state of the moving object in the segment to be identified according to the vector included angle of each two adjacent subsections.
The motion states may include a turning state and a turning-around state of the moving object.
Specifically, once the vector of each sub-segment is determined, the vector included angle of each two adjacent sub-segments follows naturally. The motion track of the moving object is obtained, the track is divided into sub-segments based on the motion parameters of its track points, and the vectors of all sub-segments are determined; the motion direction and moving distance of the moving object in each sub-segment can then be read directly from the sub-segment vectors, and the included angles between adjacent sub-segment vectors show how the motion direction changes from one sub-segment to the next. From this it can be determined whether the moving object turns or turns around in the segment to be identified, and the specific motion state of the moving object in each sub-segment can be acquired accurately, providing a solid data basis for motion control of the moving object. When the method is applied to the field of navigation, it can be clearly known whether a ship turns or turns around during a voyage, and on this basis it can further be determined whether the ship will collide with other ships after turning around, so that navigation accidents are avoided.
To help those skilled in the art better understand the practice of the present application, the following examples are given by way of illustration only and are not intended to limit the scope of the present application. In the case of a segment to be identified having only two sub-segments, the motion state of the moving object can be determined by comparing the vector included angle of the two sub-segments with a threshold angle, where the threshold angle is a value used to characterize the motion state of the moving object. For example, as shown in fig. 4, in a segment to be identified composed of two sub-segment vectors, the included angle θ is the angle determined by the vectors of the two adjacent sub-segments, and the motion state of the moving object can be determined simply by comparing θ with the threshold angle. For another example, in the case where the segment to be identified includes at least three sub-segments, the included angles determined by the vectors of each two adjacent sub-segments need to be accumulated, and the motion state of the moving object can be determined by comparing the accumulated value with the threshold angle. For example, as shown in fig. 3, in the segment to be identified composed of the vectors P1P2, P2P3 and P3P4, the included angles determined by the vectors of every two adjacent sub-segments are α and β; α and β are added, and the motion state of the moving object can be determined by comparing the accumulated value with the threshold angle.
In one embodiment, as shown in FIG. 3, the sub-segments include at least a first segment, a second segment, and a third segment. For example, the segment to be identified determined by the track points P1-P4 includes at least a first sub-segment P1P2, a second sub-segment P2P3 and a third sub-segment P3P4. In a segment to be identified composed of at least three sub-segments, the motion state of the moving object can be acquired more accurately than in a segment to be identified of only two sub-segments.
In one implementation, as shown in fig. 5, determining the vector included angle of each two adjacent subsections according to the vector of each subsection includes:
in step S502, if it is determined that the vector directions of any two adjacent sub-segments are identical according to the vector of the sub-segment, the vector included angle of each two adjacent sub-segments is determined according to the vector of each sub-segment.
Consistent direction means that, for the vectors of any two adjacent sub-segments in the segment to be identified, the direction of the vector of the later sub-segment is deflected in the same sense — clockwise or counterclockwise — relative to the direction of the vector of the earlier sub-segment. For example, as shown in fig. 3, in the segment to be identified composed of the vectors P1P2, P2P3 and P3P4, under the same reference coordinate system the vector P2P3 is deflected clockwise relative to the vector P1P2 and the vector P3P4 is deflected clockwise relative to the vector P2P3; the vector directions of any two adjacent sub-segments are therefore judged to be consistent. The vector included angle of each two adjacent sub-segments can then be determined from the vectors of the sub-segments, the total deflection angle of the moving object in the segment to be identified can be determined from the sum of these included angles, and whether the motion state in the segment to be identified is specifically a turn or a turn-around can be determined from the total deflection angle; the specific deflection angle of the moving object in each sub-segment can likewise be known precisely from the included angle of its adjacent sub-segments. Compared with determining the motion state only from the included angles of every two adjacent sub-segments, this embodiment adds the criterion of whether the directions of any two adjacent sub-segments are consistent and fully considers the case in which the sub-segment directions are not consistent: if the angles of two adjacent sub-segment vectors were simply added to obtain the total deflection angle in that case, the result would be wrong. The true motion state of the moving object can therefore be acquired more accurately by judging the motion state only for segments to be identified with consistent directions.
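One common way to test whether two adjacent sub-segment vectors deflect in the same sense is the sign of their 2-D cross product; the sketch below is an assumption about how such a consistency check could be implemented and is not taken from the disclosure.

```python
def turn_direction(v1, v2):
    """Sign of the 2-D cross product: +1 if v2 is deflected counterclockwise
    relative to v1, -1 if clockwise, 0 if the vectors are collinear."""
    cross = v1[0] * v2[1] - v1[1] * v2[0]
    return (cross > 0) - (cross < 0)

def directions_consistent(vectors):
    """True when every pair of adjacent sub-segment vectors deflects the same way."""
    signs = [turn_direction(a, b) for a, b in zip(vectors, vectors[1:])]
    signs = [s for s in signs if s != 0]     # ignore collinear pairs
    return len(set(signs)) <= 1

# Vectors P1P2, P2P3, P3P4 of a track that always bends clockwise (illustrative).
print(directions_consistent([(1.0, 0.0), (1.0, -0.5), (0.5, -1.0)]))   # True
```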
In one implementation, as shown in fig. 5, determining the vector included angle of each two adjacent subsections according to the vector of each subsection further includes:
step S504, if it is judged that the vector directions of any two adjacent subsections are inconsistent according to the vectors of the subsections, a new section to be identified is determined, and a step of acquiring the motion parameters of the moving object in the section to be identified is entered.
Inconsistent direction means that, for the vectors of any two adjacent sub-segments in the segment to be identified, the vector of the later sub-segment does not change in the same sense — clockwise or counterclockwise — relative to the vector of the earlier sub-segment. The new segment to be identified refers to the motion track that can be determined, once the judgment of inconsistent direction holds, by taking the second track point of the last sub-segment participating in the judgment (i.e. the end point of that sub-segment's vector in the original segment to be identified) as the starting point and connecting it with subsequent track points (track points not in the original segment to be identified). For example, in the segment to be identified shown in fig. 6, the vector of one sub-segment is deflected clockwise relative to the vector of the preceding sub-segment, while the vector P10P11 is deflected counterclockwise relative to the vector P9P10; at this point the vector directions of two adjacent sub-segments are judged to be inconsistent. The new segment to be identified can then take P11 as its starting point and connect it with the subsequent track points P12 and P13 to determine the motion track. After the new segment to be identified is determined, the step of acquiring the motion parameters of the moving object in the segment to be identified is entered again, and the motion state is acquired anew. Considering that what the user expects to know is the motion state of the original segment to be identified, selecting the last track point that participated in the judgment (which may be the last track point in the time dimension) as the starting point of the new segment prevents the newly selected segment from being too far away from the original one, which would make the motion state of the moving object near the original segment unknowable.
Of course, in one embodiment, when it is determined that the vector directions of any two adjacent subsections are inconsistent according to the vectors of the subsections, a new section to be identified may be determined according to the starting point of the original section to be identified as the end point of the new section to be identified.
Alternatively, when it is determined from the vectors of the sub-segments that only two adjacent sub-segments in the segment to be identified have inconsistent vector directions, a new segment to be identified may be determined based on the track points that are not shared by those two sub-segments. For example, as shown in fig. 6, the adjacent sub-segments P9P10 and P10P11 have inconsistent directions; either P9 or P11 can be taken as one end of the new segment to be identified, and the new segment does not include P10. The new segment to be identified determined in this way reflects the motion state of the original segment to be identified to the greatest extent.
In this embodiment, when it is determined from the vectors of the sub-segments that the vector directions of any two adjacent sub-segments are inconsistent, a new segment to be identified is re-determined for acquiring the motion state of the moving object, which avoids the inaccurate recognition that would result from determining the motion state directly from the angles when the segment to be identified contains many sub-segments.
In one implementation, as shown in fig. 7, determining a motion state of a moving object in a segment to be identified according to a vector included angle of each two adjacent subsections includes:
step S702, when the sum of vector included angles of every two adjacent subsections is more than or equal to 90 degrees and less than 180 degrees, determining that a moving object is in a turning state in a section to be identified; when the sum of vector included angles of every two adjacent subsections is larger than or equal to 180 degrees, determining that the moving object is in a turning state in the section to be identified.
Specifically, after the vector included angles of each two adjacent subsections determined in the above embodiment are accumulated and calculated, the calculated result is compared with a threshold angle, so as to determine the motion state of the moving object. For example, as shown in FIG. 3, in the case of the vectorAnd->In the composed segment to be identified, the sum of vector angles of every two adjacent subsections is alpha+beta, and if the value of alpha+beta is larger than or equal to 90 degrees and smaller than 180 degrees, the moving object is determined to be in a turning state in the segment to be identified; if the value of alpha plus beta is larger than or equal to 180 degrees, determining that the moving object is in a turning state at the section to be identified. As another example, as shown in FIG. 4, the vector +. >Sum vector->In the composed segment to be identified, the sum of the vector included angles of every two adjacent subsections is theta,at this time, if the value of θ is greater than or equal to 90 ° and smaller than 180 °, determining that the moving object is in a turning state at the section to be identified; if the value of theta is larger than or equal to 180 degrees, determining that the moving object is in a turning state in the section to be identified.
In the embodiment, the motion state of the moving object is judged by setting the threshold angles of 90 degrees and 180 degrees, so that the acquired motion state can be more finely divided, and the real motion state of the moving object is further accurately determined.
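A minimal sketch of this threshold classification is given below; the function and label names are assumptions, and the thresholds follow the 90° and 180° values stated above.

```python
def classify_motion_state(adjacent_angles_deg):
    """Classify the segment by the sum of the vector included angles of its
    adjacent sub-segments: >= 180 deg is a turning-around (U-turn) state,
    >= 90 deg and < 180 deg is a turning state, anything less is neither."""
    total = sum(adjacent_angles_deg)
    if total >= 180.0:
        return "turning-around"
    if total >= 90.0:
        return "turning"
    return "none"

print(classify_motion_state([50.0, 55.0]))    # alpha + beta = 105 deg -> "turning"
print(classify_motion_state([95.0, 90.0]))    # 185 deg -> "turning-around"
```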
In one implementation, as shown in fig. 7, the method further includes:
step S704, acquiring the running speed of the moving object in the section to be identified under the condition that the moving object is determined to be in a turning state or a turning state in the section to be identified.
The running speed in the segment to be identified may refer to the average speed over the segment to be identified when the moving object is determined to be in a turning state or a turning-around state, or may refer to the average speed of each sub-segment.
Specifically, under the condition that the moving object is determined to be in a turning state or a turning-around state in the segment to be identified, the distances of all sub-segments of the segment to be identified are accumulated, and the total time taken by the moving object to cover that distance — i.e. the time from the starting point of the segment to be identified to its end point — is determined; the running speed of the moving object in the segment to be identified is then obtained from speed = distance / time. For example, as shown in fig. 3, in the segment to be identified composed of the vectors P1P2, P2P3 and P3P4, suppose that α+β is greater than 90° and less than 180°, so the segment is determined to be in a turning state. The distances of the sub-segments P1P2, P2P3 and P3P4 are then accumulated to obtain the distance of the segment to be identified, the time taken by the moving object to pass through the segment is determined from the time difference between track points P1 and P4, and the running speed of the moving object in the segment to be identified is calculated. For a single sub-segment of a segment determined to be in a turning state, the motion speed is the average speed over that one sub-segment, for example over P1P2 or P2P3.
In this embodiment, in addition to determining the motion state, the turning or turning-around speed of the moving object is also determined, so that the actual motion state of the moving object is reflected truthfully from several kinds of information, the motion state judgment is more accurate, and a solid data basis is provided for subsequent judgment operations.
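A sketch of the speed calculation under the assumption of planar (x, y) coordinates and timestamps in seconds is given below; for real longitude/latitude data a geodesic distance would be substituted.

```python
import math

def running_speed(points):
    """Average running speed over the segment to be identified: the accumulated
    sub-segment distance divided by the elapsed time between the first and the
    last track point. Each point is an (x, y, t) tuple with t in seconds."""
    distance = sum(math.hypot(b[0] - a[0], b[1] - a[1])
                   for a, b in zip(points, points[1:]))
    elapsed = points[-1][2] - points[0][2]
    return distance / elapsed if elapsed > 0 else 0.0

# FIG. 3 style track points P1..P4 (coordinates and timestamps are illustrative).
print(running_speed([(0.0, 1.0, 0), (1.0, 2.0, 60), (2.0, 2.5, 120), (3.0, 2.0, 180)]))
```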
In one embodiment, as shown in fig. 8, acquiring motion parameters of a moving object in a section to be identified includes:
step S802, acquiring motion parameters from a Kaff card message queue in real time; or, acquiring the motion parameters of the motion object history from the distributed database.
The Kaff card message queue is used for storing the position information of the track points generated by the moving object in real time, and the distributed database is used for storing the position information of the history track points of the moving object.
Specifically, the moving object generates the position information of each track point in real time in the moving process, and the generated track points are directly input into the Kaff card message queue, so that the quick acquisition of the real-time information is ensured in the processing flow of actually determining the moving state of the moving object. Furthermore, by storing the historical data in a distributed database with mass storage characteristics, the historical data generated by the moving target can be guaranteed to be stored completely, a data base is provided for subsequent data acquisition and processing, and the algorithm is convenient for data query and judgment processing.
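For the real-time path, a minimal consumer sketch is given below. It assumes the kafka-python client, a hypothetical topic name and broker address, and JSON-encoded track points; none of these choices are fixed by the disclosure, which only states that real-time track points are read from a Kafka message queue (the historical path would instead query the distributed database).

```python
import json
from kafka import KafkaConsumer   # kafka-python client, assumed here for illustration

# Topic name and broker address are hypothetical placeholders.
consumer = KafkaConsumer(
    "track-points",
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:           # blocks and yields records as they arrive
    point = message.value          # one track point's motion parameters
    # ...feed the point into the sub-segment / vector-angle pipeline described above...
```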
For further explanation of the present application, a specific example is described below, taking the determination of the motion state of a ship in the field of navigation as an example. In this case, as shown in figs. 1 to 10, the moving object is a ship; the segment to be identified is a motion track determined by the position information of track points generated during the voyage and stored in real time in the Kafka message queue, or a motion track determined by the position information of track points in the historical voyage data stored in the distributed database; and the motion state refers to a turning state or a turning-around state of the ship. The determination of the ship's motion state can accordingly be divided into real-time determination and historical determination.
Under the judgment of the real-time motion state of the ship, the specific implementation steps are as follows:
step one, the FLINK real-time stream calculation engine obtains the position information of the track point in the Kafka message queue. The position information format of the track point is as follows: position information 1: (ais-id: 2, lon:111.2, lat,22.3, sog:5.6, cog:105, time: 20210118091500), wherein ais-id is a track point number, lon is longitude, lat is latitude, sog is the speed of the ship to ground, cog is the heading of the ship to ground, and time is the time the ship passes the track point.
Step two, obtain the latest track point P1 of a navigation target A, and store it into the sequence of the latest N track points of navigation target A kept in the in-memory database.
Step three, starting from point P1, compute backwards along the track of navigation target A and judge whether the consecutive track points turn in a consistent direction.
Step four, record the sailing distance L, initialized to 0, and the sailing duration T, initialized to 0.
Step five, assume that the latest track point of target A is P1 and that the track points that immediately follow it in the in-memory database are P2, P3, P4, …; turning is judged by connecting P1 and P2 into a vector D1 and P2 and P3 into a vector D2, and determining whether D2 is rotated clockwise or counterclockwise relative to D1.
Step six, calculating the distance S between the P1 point and the P2 point, and calculating the timestamp difference K between the P1 point and the P2 point, wherein L=L+S and T=T+K.
Step seven, continue with the vector D2 formed by connecting P2 and P3 and the vector D3 formed by connecting P3 and P4, and judge whether D3 rotates clockwise or counterclockwise relative to D2 (the rotation direction must be consistent with that calculated in step five); if the rotation direction has changed, terminate the calculation and restart the calculation from the track point at which the direction changed.
Step eight, calculating the distance S between the P2 point and the P3 point, and calculating the timestamp difference K between the P2 point and the P3 point, wherein l=l+s, and t=t+k.
Step nine, poll the track points of navigation target A uniformly and judge whether the accumulated turn angle of consecutive track points rotating in the same direction exceeds 90 degrees or 180 degrees: when the accumulated angle is greater than or equal to 90 degrees and less than 180 degrees, the ship is judged to be in a turning state; when it is greater than or equal to 180 degrees, the ship is judged to be in a turning-around state.
And step ten, storing a series of track points from the P1 point to the successfully judged track points into a turning result sequence.
Step eleven, calculate the average speed D = L / T of the turning process.
Step twelve, output the turning track point sequence set S and the turning average speed D (a code sketch of this procedure is given below).
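The following self-contained sketch ties the steps above together under simplifying assumptions (planar coordinates, timestamps in seconds, a single pass that stops when the rotation direction changes); it is an illustration of the procedure, not the patented implementation.

```python
import math

def detect_turn(points):
    """Walk time-ordered track points (x, y, t) and accumulate the turn angle
    while the rotation direction stays the same, in the spirit of steps one to
    twelve above. Returns (state, used_points, average_speed)."""
    def vec(a, b):
        return (b[0] - a[0], b[1] - a[1])

    def angle(v1, v2):
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm)))) if norm else 0.0

    def sign(v1, v2):
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        return (cross > 0) - (cross < 0)

    vecs = [vec(points[i], points[i + 1]) for i in range(len(points) - 1)]
    total_angle, direction, kept = 0.0, 0, len(vecs)
    for i in range(len(vecs) - 1):
        s = sign(vecs[i], vecs[i + 1])
        if direction and s and s != direction:
            kept = i + 1                      # rotation direction changed: stop here
            break
        direction = direction or s
        total_angle += angle(vecs[i], vecs[i + 1])

    used = points[:kept + 1]                              # points that took part in the judgment
    distance = sum(math.hypot(*v) for v in vecs[:kept])   # sailing distance L
    elapsed = used[-1][2] - used[0][2]                    # sailing duration T
    speed = distance / elapsed if elapsed > 0 else 0.0    # D = L / T
    if total_angle >= 180.0:
        state = "turning-around"
    elif total_angle >= 90.0:
        state = "turning"
    else:
        state = "none"
    return state, used, speed

# A roughly U-shaped track (coordinates and timestamps are illustrative).
track = [(0.0, 0.0, 0), (1.0, 0.0, 30), (1.7, 0.7, 60), (1.7, 1.7, 90),
         (1.0, 2.4, 120), (0.0, 2.4, 150), (-0.7, 1.7, 180)]
state, used, speed = detect_turn(track)
print(state, len(used), round(speed, 3))      # turning-around, 7 points, average speed
```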
Also, under the determination of the historical motion state of the ship, the specific implementation steps are as follows:
step one, according to input parameters (ID information of a navigation target and a start-stop time period), all track points of the navigation target ID in the time period are screened out from an HBase distributed database.
And step two, sequencing all track points of the navigation target ID and the time period in the distributed database according to time sequence.
And thirdly, obtaining a position point P1 of the navigation target A according to the ordered data.
Step four, starting from point P1, compute backwards along the track of target A and judge whether the consecutive points turn in a consistent direction.
Step five, record the sailing distance L, initialized to 0, and the sailing duration T, initialized to 0.
Step six, assume that the starting track point of target A is P1 and that the ordered track points returned by the distributed database are P2, P3, P4, …; turning is judged by connecting P1 and P2 into a vector D1 and P2 and P3 into a vector D2, and determining whether D2 rotates clockwise or counterclockwise relative to D1.
Step seven, the distance S between the P1 point and the P2 point is calculated, and the timestamp difference K between the P1 point and the P2 point is calculated, l=l+s, and t=t+k.
Step eight, continue with the vector D2 formed by P2 and P3 and the vector D3 formed by P3 and P4, and judge whether D3 rotates clockwise or counterclockwise relative to D2 (the rotation direction must be consistent with that calculated in step six); if the rotation direction has changed, reset the track point at which the change occurred to be P1 and return to step four to recalculate.
Step nine, calculating the distance S between the P2 point and the P3 point, and calculating the timestamp difference K between the P2 point and the P3 point, l=l+s, t=t+k.
Step ten, poll the track points of target A uniformly and judge whether the accumulated turn angle of consecutive track points rotating in the same direction exceeds 90 degrees or 180 degrees.
Step eleven, store the series of track points from point P1 up to the track point at which the judgment succeeds into a turning result sequence.
Step twelve, calculate the average speed D = L / T of the turning process.
Step thirteen, after one turning state has been judged to be finished, if there are subsequent track points, mark the next track point as P1 and return to step four to recalculate, to see whether the target has more turning records in this time period.
Step fourteen, output the set of turning track point sequences S and the average turning speed D (since the data cover a historical period, multiple turning result records may be returned; a sketch of this batch extraction is given below).
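Since the historical variant may return several turning records for one time period, an outer loop that repeatedly applies a single-turn detector is sketched below; it assumes a detector with the same return signature as the real-time sketch above and is an illustration only.

```python
def extract_turn_records(points, detect_turn, min_points=3):
    """Scan a time-ordered historical track and collect every turning or
    turning-around record, restarting from the last examined point after each
    pass, in the spirit of steps one to fourteen above. `detect_turn` is
    assumed to return (state, used_points, average_speed)."""
    records, start = [], 0
    while len(points) - start >= min_points:
        state, used, speed = detect_turn(points[start:])
        if state in ("turning", "turning-around"):
            records.append({"state": state, "points": used, "speed": speed})
        # The last point of the examined portion becomes the new P1.
        start += max(len(used) - 1, 1)
    return records

# Usage (with the detect_turn sketch given above):
#   records = extract_turn_records(sorted_points, detect_turn)
#   for r in records:
#       print(r["state"], len(r["points"]), r["speed"])
```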
The method finally realizes the judgment of the sailing state of the ship, and provides a simple and quick method for judging the turning or turning around of the ship.
It should be understood that, although the steps in the flowcharts related to the embodiments described above are sequentially shown as indicated by arrows, these steps are not necessarily sequentially performed in the order indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of steps or a plurality of stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of the steps or stages is not necessarily performed sequentially, but may be performed alternately or alternately with at least some of the other steps or stages.
Based on the same inventive concept, the embodiment of the present application also provides a motion state acquisition device of a moving object for implementing the motion state acquisition method of a moving object related to the above. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in the embodiments of the motion state acquisition device for one or more moving objects provided below may refer to the limitation of the motion state acquisition method for a moving object hereinabove, and will not be described herein.
In one embodiment, as shown in fig. 11, there is provided a motion state acquisition apparatus 110 of a moving object, including: an acquisition module 1102, a subsection determination module 1104, a vector determination module 1106, an included angle determination module 1108, and a motion state determination module 11010, wherein:
an obtaining module 1102, configured to obtain a motion parameter of a moving target in a section to be identified; the section to be identified is at least one section of track on the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the section to be identified;
the subsection determining module 1104 is configured to determine a plurality of subsections of the section to be identified that are adjacent in sequence according to the motion parameters of the section to be identified;
a vector determining module 1106, configured to determine a vector of each sub-segment according to the motion parameter of the sub-segment; the vector of each sub-segment is a vector of a first track point pointing to a second track point on the sub-segment, and the sampling time corresponding to the first track point is earlier than the sampling time corresponding to the second track point;
an included angle determining module 1108, configured to determine, according to the vectors of the sub-segments, a vector included angle of each two adjacent sub-segments;
the motion state determining module 11010 is configured to determine a motion state of the moving object in the segment to be identified according to a vector included angle of each two adjacent subsections.
In one embodiment, the included angle determining module 1108 is further configured to, if it is determined from the vectors of the subsections that the vector directions of any two adjacent subsections are consistent, determine the vector included angle of each two adjacent subsections according to the vector of each subsection.
In one embodiment, the included angle determining module 1108 is further configured to, if it is determined from the vectors of the subsections that the vector directions of any two adjacent subsections are inconsistent, determine a new segment to be identified and return to the step of obtaining the motion parameters of the moving object in the segment to be identified.
In one embodiment, the motion state determining module 11010 is further configured to determine that the moving object is in a turning state in the segment to be identified when the sum of the vector included angles of each two adjacent subsections is greater than or equal to 90° and less than 180°, and to determine that the moving object is in a turning-around state in the segment to be identified when the sum of the vector included angles of each two adjacent subsections is greater than or equal to 180°.
In one embodiment, the motion state acquisition device 110 of the moving object further includes a running speed acquisition module 1102, configured to acquire the running speed of the moving object in the segment to be identified when it is determined that the moving object is in a turning state or a turning-around state in the segment to be identified.
In one embodiment, the obtaining module 1102 is further configured to acquire the motion parameters in real time from a Kafka message queue, or to acquire historical motion parameters of the moving object from a distributed database.
The respective modules in the above-described moving object moving state acquisition means may be realized in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 12. The computer device includes a processor, a memory, and a network interface connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is used for storing position information data of the moving object. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by a processor, implements a method of motion state acquisition of a moving object.
It will be appreciated by those skilled in the art that the structure shown in fig. 12 is merely a block diagram of some of the structures associated with the present application and is not limiting of the computer device to which the present application may be applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided, including a memory and a processor, where the memory stores a computer program, and the processor implements the steps of the method for acquiring a motion state of a moving object in any of the above embodiments when the computer program is executed.
It should be noted that, user information (including but not limited to user equipment information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum computing-based data processing logic units, etc., without being limited thereto.
The technical features of the above embodiments may be combined in any manner. For brevity, not all possible combinations of these technical features have been described; nevertheless, any combination of these technical features that contains no contradiction should be considered within the scope of this description.
The above examples represent only a few embodiments of the present application; although they are described in some detail, they are not to be construed as limiting the scope of the present application. It should be noted that those skilled in the art may make various modifications and improvements without departing from the spirit of the present application, all of which fall within the scope of protection of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (8)

1. A method for acquiring a motion state, comprising:
acquiring motion parameters of a moving object in a segment to be identified; the segment to be identified is at least one section of track on the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the segment to be identified;
determining a plurality of sequentially adjacent sub-segments of the segment to be identified according to the motion parameters of the segment to be identified;
determining the vector of each sub-segment according to the motion parameters of the sub-segment; the vector of each sub-segment is a vector pointing from a first track point to a second track point on the sub-segment, and the sampling time of the first track point is earlier than the sampling time of the second track point;
determining the vector included angle of every two adjacent sub-segments according to the vector of each sub-segment;
determining the motion state of the moving object in the segment to be identified according to the vector included angle of every two adjacent sub-segments;
wherein determining the vector included angle of every two adjacent sub-segments according to the vector of each sub-segment comprises:
if it is determined from the vectors of the sub-segments that the vector directions of any two adjacent sub-segments are inconsistent, determining a new segment to be identified and returning to the step of acquiring the motion parameters of the moving object in the segment to be identified; the new segment to be identified refers to the motion track formed, when the judgment of inconsistent vector directions holds, by taking the second track point of the last sub-segment involved in the judgment as the starting point, together with the subsequent track points;
and wherein determining the motion state of the moving object in the segment to be identified according to the vector included angle of every two adjacent sub-segments comprises:
when the sum of the vector included angles of every two adjacent sub-segments is greater than or equal to 90 degrees and less than 180 degrees, determining that the moving object is in a turning state in the segment to be identified;
and when the sum of the vector included angles of every two adjacent sub-segments is greater than or equal to 180 degrees, determining that the moving object is in a turn-around state in the segment to be identified.
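To make the geometry of claim 1 concrete, the following is a minimal Python sketch of the sub-segment vectors, included angles, and angle-sum thresholds, assuming the track points are treated as planar (x, y) coordinates; the function and variable names, and the "straight" label used for sums below 90 degrees, are illustrative assumptions rather than terms from the patent.

```python
# Minimal sketch of the state logic in claim 1 (planar coordinates assumed;
# all names are illustrative and not taken from the patent).
import math
from typing import List, Tuple

Point = Tuple[float, float]   # (x, y) position of one track point
Vector = Tuple[float, float]

def sub_segment_vectors(points: List[Point]) -> List[Vector]:
    # Vector of each sub-segment: the earlier track point pointing to the later one.
    return [(b[0] - a[0], b[1] - a[1]) for a, b in zip(points, points[1:])]

def included_angle(v1: Vector, v2: Vector) -> float:
    # Unsigned angle, in degrees, between two adjacent sub-segment vectors.
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    if n1 == 0.0 or n2 == 0.0:
        return 0.0
    cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

def motion_state(points: List[Point]) -> str:
    vectors = sub_segment_vectors(points)
    total = sum(included_angle(a, b) for a, b in zip(vectors, vectors[1:]))
    if total >= 180.0:
        return "turn-around"   # sum of included angles >= 180 degrees
    if total >= 90.0:
        return "turning"       # 90 degrees <= sum < 180 degrees
    return "straight"          # below 90 degrees; this label is assumed, not claimed

# A track whose heading changes by 90 degrees in total is classified as turning.
print(motion_state([(0, 0), (1, 0), (2, 1), (2, 2)]))  # -> "turning"
```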
2. The method of claim 1, wherein the sub-segments comprise at least a first sub-segment, a second sub-segment, and a third sub-segment.
3. The method of claim 1, wherein determining the vector included angle of every two adjacent sub-segments according to the vector of each sub-segment further comprises:
if it is determined from the vectors of the sub-segments that the vector directions of any two adjacent sub-segments are consistent, determining the vector included angle of every two adjacent sub-segments according to the vector of each sub-segment.
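Claims 1 and 3 branch on whether the vector directions of two adjacent sub-segments are consistent, but the claims shown here do not define that test. The sketch below is therefore only one plausible reading, stated as an assumption: adjacent vectors are treated as consistent when their dot product is positive, i.e. the heading change between them is less than 90 degrees.

```python
# One plausible reading of "vector directions are consistent" (an assumption,
# not a definition taken from the claims): the dot product of adjacent
# sub-segment vectors is positive.
def directions_consistent(v1, v2) -> bool:
    return (v1[0] * v2[0] + v1[1] * v2[1]) > 0.0

# Example: a 45-degree bend is consistent, a reversal is not.
print(directions_consistent((1, 0), (1, 1)))   # True
print(directions_consistent((1, 0), (-1, 0)))  # False
```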
4. The method according to claim 1, further comprising:
acquiring the running speed of the moving object in the segment to be identified when the moving object is determined to be in the turning state or the turn-around state in the segment to be identified.
5. The method according to claim 1, wherein acquiring the motion parameters of the moving object in the segment to be identified comprises:
acquiring the motion parameters from a Kafka message queue in real time; or
acquiring historical motion parameters of the moving object from a distributed database.
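Claim 5 names two possible sources for the motion parameters. The following is a hedged sketch of the real-time path using the kafka-python client; the topic name, broker address, and JSON message layout are assumptions made for illustration, not details from the patent.

```python
# Sketch of the real-time source in claim 5 using the kafka-python client.
# Topic name, broker address, and the JSON message schema are all assumed.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "ship-track-points",                 # hypothetical topic carrying track points
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)

for message in consumer:
    point = message.value  # assumed schema, e.g. {"lon": ..., "lat": ..., "ts": ...}
    # append the point to the buffer for the current segment to be identified
```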
6. A motion state acquisition device, the device comprising:
the acquisition module is used for acquiring motion parameters of a moving object in a segment to be identified; the segment to be identified is at least one section of track on the motion track of the moving object, and the motion parameters comprise position information of a plurality of track points of the segment to be identified;
the sub-segment determining module is used for determining a plurality of sequentially adjacent sub-segments of the segment to be identified according to the motion parameters of the segment to be identified;
the vector determining module is used for determining the vector of each sub-segment according to the motion parameters of the sub-segment; the vector of each sub-segment is a vector pointing from a first track point to a second track point on the sub-segment, and the sampling time of the first track point is earlier than the sampling time of the second track point;
the included angle determining module is used for determining the vector included angle of every two adjacent sub-segments according to the vector of each sub-segment; the module is further used for determining a new segment to be identified if it is determined from the vectors of the sub-segments that the vector directions of any two adjacent sub-segments are inconsistent, and returning to the step of acquiring the motion parameters of the moving object in the segment to be identified; the new segment to be identified refers to the motion track formed, when the judgment of inconsistent vector directions holds, by taking the second track point of the last sub-segment involved in the judgment as the starting point, together with the subsequent track points;
the motion state determining module is used for determining the motion state of the moving object in the segment to be identified according to the vector included angle of every two adjacent sub-segments; the module is further used for determining that the moving object is in a turning state in the segment to be identified when the sum of the vector included angles of every two adjacent sub-segments is greater than or equal to 90 degrees and less than 180 degrees, and determining that the moving object is in a turn-around state in the segment to be identified when the sum of the vector included angles of every two adjacent sub-segments is greater than or equal to 180 degrees.
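For orientation only, the module split in claim 6 can be mirrored as a single Python class. All names below are invented for this sketch, each consecutive pair of track points is treated as one sub-segment, and the geometry and thresholds deliberately repeat the claim 1 sketch so that the block stays self-contained.

```python
# Illustrative mirror of the module split in claim 6; every name is invented.
import math
from typing import List, Tuple

Point = Tuple[float, float]

class MotionStateAcquisitionDevice:
    def acquire(self, segment: List[Point]) -> List[Point]:
        # acquisition module: here the track points are simply passed through
        return segment

    def vectors(self, points: List[Point]):
        # vector determining module: earlier track point pointing to the later one
        return [(b[0] - a[0], b[1] - a[1]) for a, b in zip(points, points[1:])]

    def included_angles(self, vectors) -> List[float]:
        # included-angle determining module
        angles = []
        for v1, v2 in zip(vectors, vectors[1:]):
            n1, n2 = math.hypot(*v1), math.hypot(*v2)
            cos_a = (v1[0] * v2[0] + v1[1] * v2[1]) / (n1 * n2) if n1 and n2 else 1.0
            angles.append(math.degrees(math.acos(max(-1.0, min(1.0, cos_a)))))
        return angles

    def decide(self, angles: List[float]) -> str:
        # motion state determining module: thresholds as recited in the claim
        total = sum(angles)
        if total >= 180.0:
            return "turn-around"
        if total >= 90.0:
            return "turning"
        return "straight"  # assumed label for sums below 90 degrees

    def run(self, segment: List[Point]) -> str:
        points = self.acquire(segment)
        return self.decide(self.included_angles(self.vectors(points)))

# Example: the same 90-degree bend as before is classified as turning.
print(MotionStateAcquisitionDevice().run([(0, 0), (1, 0), (2, 1), (2, 2)]))
```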
7. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 5 when the computer program is executed.
8. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 5.
CN202211119632.0A 2022-09-15 2022-09-15 Motion state acquisition method and device, computer equipment and storage medium Active CN115480275B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211119632.0A CN115480275B (en) 2022-09-15 2022-09-15 Motion state acquisition method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211119632.0A CN115480275B (en) 2022-09-15 2022-09-15 Motion state acquisition method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115480275A CN115480275A (en) 2022-12-16
CN115480275B true CN115480275B (en) 2023-08-08

Family

ID=84423394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211119632.0A Active CN115480275B (en) 2022-09-15 2022-09-15 Motion state acquisition method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115480275B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8315749B2 (en) * 2010-01-11 2012-11-20 Che-Hang Charles Ih Innovative optimal spacecraft safing methodology

Patent Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3685159A (en) * 1969-01-03 1972-08-22 Bofors Ab Method and system for establishing a correct lead when firing at a moving target
JP2013111407A (en) * 2011-11-30 2013-06-10 Fujitsu Ltd Trajectory calculation unit and trajectory calculation method
CN105989224A (en) * 2015-02-04 2016-10-05 南京乐行天下智能科技有限公司 Mode recognition technique for different travel modes
JP2016176863A (en) * 2015-03-20 2016-10-06 カシオ計算機株式会社 Electronic apparatus, position correction method, and program
CN106530688A (en) * 2016-10-14 2017-03-22 浙江工业大学 Hadoop-based massive traffic data processing method
CN108106623A (en) * 2017-09-08 2018-06-01 同济大学 A kind of unmanned vehicle paths planning method based on flow field
CN109831744A (en) * 2017-11-23 2019-05-31 腾讯科技(深圳)有限公司 It is a kind of exception track recognizing method, device and storage equipment
CN107966953A (en) * 2017-11-24 2018-04-27 上海维宏电子科技股份有限公司 For the method for line segment processing of turning back in numerical control processing track
CN109490923A (en) * 2018-11-19 2019-03-19 西安交通大学 Vehicle driving camber angle real-time detection method based on adaptive least square fitting
CN109827582A (en) * 2019-03-29 2019-05-31 深圳市鹏途交通科技有限公司 A kind of method and system of quick determining road network disease relative position
CN110609881A (en) * 2019-08-28 2019-12-24 中山大学 Vehicle trajectory deviation detection method, system and storage medium
CN110715664A (en) * 2019-11-05 2020-01-21 大连理工大学 Intelligent unmanned aerial vehicle track rapid planning method under multi-constraint condition
CN213238939U (en) * 2020-09-30 2021-05-18 西安雷华测控技术有限公司 Image motion compensation test device
CN112415536A (en) * 2020-11-11 2021-02-26 南京市测绘勘察研究院股份有限公司 Method for automatically acquiring abnormal area of vehicle-mounted laser point cloud driving track
CN112749622A (en) * 2020-11-30 2021-05-04 浙江大华技术股份有限公司 Emergency lane occupation identification method and device
CN112800349A (en) * 2021-02-02 2021-05-14 中华人民共和国广东海事局 Method, device, equipment and medium for acquiring motion state of aquatic moving target
CN112947516A (en) * 2021-02-02 2021-06-11 三亚海兰寰宇海洋信息科技有限公司 Ship motion state discrimination method and system
CN113239719A (en) * 2021-03-29 2021-08-10 深圳元戎启行科技有限公司 Track prediction method and device based on abnormal information identification and computer equipment
CN113344265A (en) * 2021-05-28 2021-09-03 深圳市无限动力发展有限公司 Track closure judging method and device, computer equipment and storage medium
CN113345228A (en) * 2021-06-01 2021-09-03 星觅(上海)科技有限公司 Driving data generation method, device, equipment and medium based on fitted track

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Finite element simulation of ship collision accident inversion; Zhang Lei; Zhao Xiaobo; Gan Langxiong; Li Hui; Zheng Yuanzhou; Zhou Chunhui; Navigation of China (No. 01); 81-86 *

Also Published As

Publication number Publication date
CN115480275A (en) 2022-12-16

Similar Documents

Publication Publication Date Title
CN111627065B (en) Visual positioning method and device and storage medium
CN114299303A (en) Ship target detection method, terminal device and storage medium
CN112118537B (en) Method and related device for estimating movement track by using picture
CN115620252A (en) Trajectory rectification method and device, computer equipment and storage medium
CN115480275B (en) Motion state acquisition method and device, computer equipment and storage medium
CN113835078B (en) Signal level joint detection method and device based on local three-dimensional grid
CN113516682B (en) Loop detection method of laser SLAM
CN112823378A (en) Image depth information determination method, device, equipment and storage medium
CN116701492A (en) Track matching degree verification method and device, computer equipment and storage medium
CN114022518B (en) Method, device, equipment and medium for acquiring optical flow information of image
CN114501615B (en) Terminal positioning method, terminal positioning device, computer equipment, storage medium and program product
CN112800349B (en) Method, device, equipment and medium for acquiring motion state of aquatic moving target
CN113640760B (en) Radar discovery probability evaluation method and equipment based on air situation data
CN113903016B (en) Bifurcation point detection method, bifurcation point detection device, computer equipment and storage medium
CN113465616B (en) Track abnormal point detection method and device, electronic equipment and storage medium
CN113033578B (en) Image calibration method, system, terminal and medium based on multi-scale feature matching
CN115294280A (en) Three-dimensional reconstruction method, apparatus, device, storage medium, and program product
CN115035190A (en) Pose positioning method and device, computer equipment and storage medium
CN110889979B (en) Inland waterway data fusion method and device
CN110413716B (en) Data storage and data query method and device and electronic equipment
WO2019041271A1 (en) Image processing method, integrated circuit, processor, system and movable device
CN115544191A (en) Three-dimensional point cloud crowdsourcing type semantic map updating method and device
CN115905343B (en) Water shipping data processing method, apparatus, device, medium and program product
CN116012376B (en) Target detection method and device and vehicle
CN115454676B (en) Position information fusion method, device, equipment, storage medium and program product

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant