CN108845321A - Target object recognition method and apparatus, and unmanned smart device - Google Patents
- Publication number
- CN108845321A (application CN201810353757.7A)
- Authority
- CN
- China
- Prior art keywords
- target object
- characteristic information
- millimetre-wave radar
- echo
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
Abstract
The application provides a target object recognition method and apparatus, and an unmanned smart device. The method is applied to an unmanned smart device provided with a millimetre-wave radar, and a specific embodiment of the method includes: at least partially in response to detecting a target object, extracting characteristic information from the echo signal received by the millimetre-wave radar; inputting the characteristic information into a pre-trained classification model; and determining the category of the target object according to the result output by the classification model. When a target object is detected, the embodiment can further determine its category accurately, which facilitates control of the unmanned smart device and improves its operating efficiency.
Description
Technical field
This application relates to the field of unmanned driving technology, and in particular to a target object recognition method and apparatus, and an unmanned smart device.
Background art
With the continuous development of unmanned driving technology, unmanned smart devices have been widely applied in many fields, providing various conveniences in people's lives. At present, however, an unmanned smart device can only detect surrounding target objects; it is difficult for it to identify the category of a target object. Since the category of a target object largely determines the control strategy and operating efficiency of an unmanned smart device, identifying the category of a target object is of great significance.
Summary of the invention
The application provides a target object recognition method and apparatus, and an unmanned smart device.
According to a first aspect of the embodiments of the application, a target object recognition method is provided, applied to an unmanned smart device provided with a millimetre-wave radar, the method including:
at least partially in response to detecting a target object, extracting characteristic information from the echo signal received by the millimetre-wave radar;
inputting the characteristic information into a pre-trained classification model; and
determining the category of the target object according to the result output by the classification model.
Optionally, the characteristic information includes one or more of the following:
characteristic information of the centroid position of the target object;
characteristic information of the running speed of the target object;
characteristic information of the size of the target object;
characteristic information of the Doppler bandwidth of the echo signal;
characteristic information of the intensity of the echo signal.
Optionally, the extracting of characteristic information from the echo signal received by the millimetre-wave radar, at least partially in response to detecting a target object, includes:
in response to detecting a target object, determining position information of the target object; and
if it is determined from the position information that the target object meets a preset condition, extracting the characteristic information from the echo signal received by the millimetre-wave radar.
Optionally, determining from the position information that the target object meets the preset condition includes:
determining, from the position information, that the distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance; and/or
determining, from the position information, that the target object was not detected in the previous frame of the echo signal received by the millimetre-wave radar.
Optionally, the classification model is a nonlinear classifier.
According to a second aspect of the embodiments of the application, a target object recognition apparatus is provided, applied to an unmanned smart device provided with a millimetre-wave radar, the apparatus including:
an extraction module, configured to extract, at least partially in response to detecting a target object, characteristic information from the echo signal received by the millimetre-wave radar;
an input module, configured to input the characteristic information into a pre-trained classification model; and
an output module, configured to determine the category of the target object according to the result output by the classification model.
Optionally, the characteristic information includes one or more of the following:
characteristic information of the centroid position of the target object;
characteristic information of the running speed of the target object;
characteristic information of the size of the target object;
characteristic information of the Doppler bandwidth of the echo signal;
characteristic information of the intensity of the echo signal.
Optionally, the extraction module includes:
a determining submodule, configured to determine position information of the target object in response to detecting a target object; and
an extracting submodule, configured to extract the characteristic information from the echo signal received by the millimetre-wave radar when it is determined from the position information that the target object meets a preset condition.
Optionally, the extracting submodule determines from the position information that the target object meets the preset condition by:
determining, from the position information, that the distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance; and/or
determining, from the position information, that the target object was not detected in the previous frame of the echo signal received by the millimetre-wave radar.
According to a third aspect of the embodiments of the application, a computer-readable storage medium is provided. The storage medium stores a computer program which, when executed by a processor, implements the method described in any one of the above first aspect.
According to a fourth aspect of the embodiments of the application, an unmanned smart device is provided. The unmanned smart device is provided with a millimetre-wave radar and includes a memory, a processor, and a computer program stored in the memory and runnable on the processor, the processor implementing the method described in any one of the above first aspect when executing the program.
The technical solutions provided by the embodiments of the application can include the following beneficial effects:
In the target object recognition method and apparatus provided by the embodiments of the application, at least partially in response to detecting a target object, characteristic information is extracted from the echo signal received by the millimetre-wave radar, the characteristic information is input into a pre-trained classification model, and the category of the target object is determined according to the result output by the classification model. Thus, when a target object is detected, its category can further be determined accurately, which facilitates control of the unmanned smart device and improves its operating efficiency.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only and do not limit the application.
Brief description of the drawings
The accompanying drawings, which are incorporated into and form part of this specification, show embodiments consistent with the application and, together with the specification, serve to explain the principles of the application.
Fig. 1 is a flowchart of a target object recognition method according to an exemplary embodiment of the application;
Fig. 2 is a flowchart of another target object recognition method according to an exemplary embodiment of the application;
Fig. 3 is a flowchart of yet another target object recognition method according to an exemplary embodiment of the application;
Fig. 4 is a flowchart of yet another target object recognition method according to an exemplary embodiment of the application;
Fig. 5 is a block diagram of a target object recognition apparatus according to an exemplary embodiment of the application;
Fig. 6 is a block diagram of another target object recognition apparatus according to an exemplary embodiment of the application;
Fig. 7 is a schematic structural diagram of an unmanned smart device according to an exemplary embodiment of the application.
Detailed description
Exemplary embodiments are described in detail here, with examples illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings indicate the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the application; rather, they are merely examples of apparatus and methods consistent with some aspects of the application as detailed in the appended claims.
The terminology used in this application is for the purpose of describing particular embodiments only and is not intended to limit the application. The singular forms "a", "said", and "the" used in this application and the appended claims are also intended to include the plural forms, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It will be appreciated that although the terms first, second, third, etc. may be used in this application to describe various information, such information should not be limited by these terms. These terms are only used to distinguish information of the same type from each other. For example, without departing from the scope of the application, first information could also be called second information, and similarly, second information could be called first information. Depending on the context, the word "if" as used here can be construed as "when", "upon", or "in response to determining".
As shown in Fig. 1, Fig. 1 is a flowchart of a target object recognition method according to an exemplary embodiment. The method can be applied to an unmanned smart device provided with a millimetre-wave radar. Those skilled in the art will appreciate that the unmanned smart device can include, but is not limited to, an intelligent robot, a driverless vehicle, an unmanned aerial vehicle, and the like. The method includes the following steps:
In step 101, at least partially in response to detecting a target object, characteristic information is extracted from the echo signal received by the millimetre-wave radar.
In this embodiment, the unmanned smart device is provided with a millimetre-wave radar, that is, a radar that performs detection in the millimetre-wave band. The wavelength of millimetre waves lies between those of centimetre waves and light waves, so millimetre-wave radar combines some of the advantages of microwave radar and electro-optical radar. Millimetre-wave radar can resolve very small targets, can identify multiple targets simultaneously, and has the advantages of strong imaging capability, small size, good mobility, and good concealment. At present, vehicle-mounted millimetre-wave radar mainly serves obstacle avoidance: it only performs target detection and outputs information such as the distance, bearing, and speed of a target, without realizing target recognition. The inventors recognized and studied this problem, and propose a target object recognition method applied to an unmanned smart device provided with a millimetre-wave radar.
In this embodiment, objects around the unmanned smart device can be detected using the millimetre-wave radar. Specifically, the millimetre-wave radar can periodically transmit electromagnetic wave signals; when an electromagnetic wave signal encounters an obstacle, it is reflected by the obstacle, and the millimetre-wave radar can receive each frame of the echo signal reflected by the obstacle. When the millimetre-wave radar receives an echo signal, it can be determined that a target object has been detected.
In one implementation, when it is determined that a target object has been detected, the characteristic information can be extracted directly from the echo signal received by the millimetre-wave radar.
In another implementation, when it is determined that a target object has been detected, it can further be judged whether the detected target object meets a preset condition; only when the target object meets the preset condition is the characteristic information extracted from the echo signal received by the millimetre-wave radar. The preset condition can be any reasonable condition, and the application does not limit its specific content.
In this embodiment, the characteristic information can include one or more of the following: characteristic information of the centroid position of the target object; characteristic information of the running speed of the target object; characteristic information of the size of the target object; characteristic information of the Doppler bandwidth of the echo signal; and characteristic information of the intensity of the echo signal. Since this characteristic information is closely related to the category of the target object, the category can be determined more accurately on its basis. It can be appreciated that the characteristic information can also include any other reasonable characteristic information; the application does not limit its specific content.
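As a rough illustration only, the kinds of characteristic information named above could be collected into a single feature vector per echo frame. The sketch below is not the patent's implementation: the function name `extract_features`, the synthetic echo, and the simple spectral estimates of centroid, speed, size, and Doppler bandwidth are all assumptions made for illustration.

```python
import numpy as np

def extract_features(echo, sample_rate=1000.0):
    """Build one feature vector from a frame of (synthetic) echo samples,
    covering the five feature kinds named in the text."""
    spectrum = np.abs(np.fft.rfft(echo))
    freqs = np.fft.rfftfreq(len(echo), d=1.0 / sample_rate)
    power = spectrum ** 2
    total = power.sum()
    mean_f = (freqs * power).sum() / total  # spectral centroid (Hz)
    bandwidth = np.sqrt(((freqs - mean_f) ** 2 * power).sum() / total)
    return {
        "centroid": float(mean_f),          # stand-in for centroid position
        "speed": float(mean_f * 0.01),      # Doppler shift -> speed, toy scaling
        "size": int(np.count_nonzero(spectrum > 0.1 * spectrum.max())),
        "doppler_bandwidth": float(bandwidth),
        "intensity": float(np.sqrt(np.mean(echo ** 2))),  # RMS echo intensity
    }

# Synthetic echo frame: a 50 Hz Doppler tone plus noise.
rng = np.random.default_rng(0)
t = np.arange(1024) / 1000.0
echo = np.sin(2 * np.pi * 50.0 * t) + 0.1 * rng.standard_normal(1024)
features = extract_features(echo)
```

A real system would of course derive centroid position and size from range/angle measurements rather than a single spectrum; the point is only that each frame yields one fixed-length vector for the classifier.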
In step 102, the characteristic information is input into a pre-trained classification model.
In step 103, the category of the target object is determined according to the result output by the classification model.
In this embodiment, the category of a target object can be an attribute of the target object; for example, categories can include, but are not limited to, animal, person, bicycle, car, railing, and the like. It can be appreciated that the application does not limit the specific categories of target objects.
In this embodiment, the characteristic information can be input into the pre-trained classification model, and the category of the target object is determined according to the result output by the classification model. For example, the classification model can directly output the category of the target object, in which case the category can be determined directly from the model's output. As another example, the classification model can output multiple categories together with a weight for each category, the weight indicating the probability that the target object belongs to that category; in this case, the category with the largest weight can be taken as the category of the target object.
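The second case above — picking the category with the largest weight from a weighted model output — amounts to a simple argmax. A minimal sketch (the function name and the example labels and weights are illustrative assumptions, not values from the patent):

```python
def pick_class(class_weights):
    """Given {category label: weight}, where each weight is the model's
    estimate of the probability that the target belongs to that category,
    return the label with the largest weight."""
    return max(class_weights, key=class_weights.get)

# Hypothetical weighted output from the classification model.
weights = {"pedestrian": 0.12, "bicycle": 0.23, "car": 0.61, "railing": 0.04}
label = pick_class(weights)  # -> "car"
```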
In this embodiment, the pre-trained classification model can be a nonlinear classifier, including but not limited to a logistic regression model, an SVM (Support Vector Machine), an AdaBoost model, and the like. Specifically, the classification model can be trained as follows. First, an unmanned smart device provided with a millimetre-wave radar is run in a preset scenario and detects surrounding objects. When an object is detected, the echo signal received by the millimetre-wave radar is acquired, and the category of the object is annotated (by another device or by hand) as the object's label.
Then, characteristic information is extracted as sample characteristic information from the echo signals acquired when the objects were detected; the sample characteristic information and the labels of the corresponding objects form the sample data, which is divided into a training set and a verification set. The sample characteristic information can include one or more of the following: characteristic information of the centroid position of the object; characteristic information of the running speed of the object; characteristic information of the size of the object; characteristic information of the Doppler bandwidth of the echo signal; and characteristic information of the intensity of the echo signal.
Finally, the parameters of the current classification model are adjusted using the training set, and the classification model trained so far is verified using the verification set. When the verification result meets the requirements, the current classification model is taken as the trained classification model.
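The train-then-verify procedure just described can be sketched with one of the nonlinear classifiers the text names (an SVM). This is a toy illustration under stated assumptions: scikit-learn stands in for whatever training framework the inventors used, and the sample data is synthetic — two well-separated clusters of 5-dimensional feature vectors with hypothetical "pedestrian"/"car" labels.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# Synthetic sample characteristic information: 5 features per frame
# (centroid, speed, size, Doppler bandwidth, intensity), two categories.
n = 400
X_ped = rng.normal(loc=[1.0, 1.5, 0.5, 2.0, 0.3], scale=0.3, size=(n // 2, 5))
X_car = rng.normal(loc=[3.0, 8.0, 4.0, 0.5, 1.2], scale=0.3, size=(n // 2, 5))
X = np.vstack([X_ped, X_car])
y = np.array(["pedestrian"] * (n // 2) + ["car"] * (n // 2))

# Divide the sample data into a training set and a verification set.
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)

model = SVC(kernel="rbf")             # a nonlinear classifier, as the text allows
model.fit(X_tr, y_tr)                 # adjust parameters with the training set
val_acc = model.score(X_val, y_val)   # verify with the verification set
```

If `val_acc` meets the requirement (e.g. a threshold chosen for the application), the current model would be taken as the trained classification model.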
The recognition methods of the target object provided by the above embodiment of the application, is detected by being at least partially in response to
Target object extracts characteristic information from the echo-signal that millimetre-wave radar receives, by features described above information input in advance
In trained disaggregated model, and the classification of target object is determined according to the result that the disaggregated model exports.So as to detect
When to target object, further accurately determines the classification of target object, is conducive to control unmanned smart machine,
Improve the operational efficiency of unmanned smart machine.
As shown in Fig. 2, Fig. 2 is a flowchart of another target object recognition method according to an exemplary embodiment. This embodiment describes the process of extracting characteristic information from the echo signal received by the millimetre-wave radar at least partially in response to detecting a target object. The method can be applied to an unmanned smart device provided with a millimetre-wave radar and includes the following steps:
In step 201, in response to detecting a target object, position information of the target object is determined.
In step 202, if it is determined from the position information that the target object meets a preset condition, characteristic information is extracted from the echo signal received by the millimetre-wave radar.
In this embodiment, when a target object is detected, its position information can first be determined from the echo signal received by the millimetre-wave radar. Then, whether the target object meets the preset condition is determined from its position information. If the target object meets the preset condition, the characteristic information is further extracted from the echo signal received by the millimetre-wave radar.
In one implementation, whether the target object meets the preset condition can be determined from its position information as follows: determine, from the position information, whether the distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance; if it is, the target object can be determined to meet the preset condition.
In another implementation, whether the target object meets the preset condition can be determined as follows: determine, from the position information, whether the target object was detected in the previous frame of the echo signal received by the millimetre-wave radar; if it was not, the target object can be determined to meet the preset condition. Specifically, the millimetre-wave radar receives echo signals frame by frame. When a target object is detected in the current frame, radar target-tracking techniques can be used to judge whether the same target object appeared in the previous frame of the echo signal. If it is determined that the target object was not detected in the previous frame, the target object can be determined to meet the preset condition.
In yet another implementation, both checks can be combined: first determine, from the position information, whether the distance between the target object and the millimetre-wave radar is greater than or equal to the preset distance; if it is, further determine whether the target object was detected in the previous frame of the echo signal received by the millimetre-wave radar; if it was not, the target object can be determined to meet the preset condition.
It can be appreciated that whether the target object meets the preset condition can also be determined in any other reasonable manner; the application does not limit this.
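The combined check described in the implementations above can be sketched as a small predicate. The function name, the 2-D positions, and the 5 m default distance are illustrative assumptions; the text leaves the preset distance and position representation open.

```python
import math

def meets_preset_condition(target_pos, radar_pos, seen_last_frame,
                           min_distance=5.0):
    """Combined preset-condition check: the target qualifies for
    classification when it is at least `min_distance` away AND was not
    detected in the previous echo frame.  The text also allows using
    either sub-condition on its own ("and/or")."""
    far_enough = math.dist(target_pos, radar_pos) >= min_distance
    newly_detected = not seen_last_frame
    return far_enough and newly_detected

# A target 10 m ahead, seen for the first time -> extract features.
ok = meets_preset_condition((10.0, 0.0), (0.0, 0.0), seen_last_frame=False)
```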
In step 203, the characteristic information is input into a pre-trained classification model.
In step 204, the category of the target object is determined according to the result output by the classification model.
It should be noted that steps identical to those of the Fig. 1 embodiment are not repeated in the Fig. 2 embodiment above; for related content, see the Fig. 1 embodiment.
In the target object recognition method provided by the above embodiment of the application, in response to detecting a target object, position information of the target object is determined; if it is determined from the position information that the target object meets the preset condition, characteristic information is extracted from the echo signal received by the millimetre-wave radar, input into a pre-trained classification model, and the category of the target object is determined according to the result output by the classification model. Since this embodiment further identifies the category of a detected target object only when the target object is determined, from its position information, to meet the preset condition, it improves the precision of the control of the unmanned smart device without wasting computing resources.
As shown in Fig. 3, Fig. 3 is a flowchart of yet another target object recognition method according to an exemplary embodiment. This embodiment details the process of determining that a target object meets the preset condition. The method can be applied to an unmanned smart device provided with a millimetre-wave radar and includes the following steps:
In step 301, in response to detecting a target object, position information of the target object is determined.
In step 302, if it is determined from the position information that the distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance, characteristic information is extracted from the echo signal received by the millimetre-wave radar.
In step 303, the characteristic information is input into a pre-trained classification model.
In step 304, the category of the target object is determined according to the result output by the classification model.
In step 305, if it is determined from the position information that the distance between the target object and the millimetre-wave radar is less than the preset distance, an obstacle-avoidance operation is performed.
In this embodiment, the obstacle-avoidance operation can be any operation in a proper form performed to avoid an obstacle; the application does not limit this.
It should be noted that steps identical to those of the Fig. 1 and Fig. 2 embodiments are not repeated in the Fig. 3 embodiment above; for related content, see the Fig. 1 and Fig. 2 embodiments.
In the target object recognition method provided by the above embodiment of the application, in response to detecting a target object, position information of the target object is determined. When it is determined from the position information that the distance between the target object and the millimetre-wave radar is greater than or equal to the preset distance, characteristic information is extracted from the echo signal received by the millimetre-wave radar, input into a pre-trained classification model, and the category of the target object is determined according to the result output by the classification model. When the distance is less than the preset distance, an obstacle-avoidance operation is performed. Since this embodiment, after detecting a target object, further considers the distance between the target object and the millimetre-wave radar — directly performing obstacle avoidance when the distance is less than the preset distance, and identifying the category only when the distance is greater than or equal to the preset distance — the identification of the target object's category can be skipped in favor of emergency obstacle avoidance when the unmanned smart device is close to the obstacle, giving more optimized control of the unmanned smart device.
As shown in Fig. 4, Fig. 4 is a flowchart of yet another target object recognition method according to an exemplary embodiment. This embodiment details the process of determining that a target object meets the preset condition. The method can be applied to an unmanned smart device provided with a millimetre-wave radar and includes the following steps:
In step 401, in response to detecting a target object, position information of the target object is determined.
In step 402, if it is determined from the position information that the target object was not detected in the previous frame of the echo signal received by the millimetre-wave radar, characteristic information is extracted from the echo signal received by the millimetre-wave radar.
In step 403, the characteristic information is input into a pre-trained classification model.
In step 404, the category of the target object is determined according to the result output by the classification model.
In step 405, if it is determined from the position information that the target object was detected in the previous frame of the echo signal received by the millimetre-wave radar, a target-tracking operation is performed.
In this embodiment, if it is determined that the target object was detected in the previous frame of the echo signal received by the millimetre-wave radar, the target object has already been classified; therefore, an operation such as tracking the target object can be performed instead.
It should be noted that steps identical to those of the Fig. 1 and Fig. 2 embodiments are not repeated in the Fig. 4 embodiment above; for related content, see the Fig. 1 and Fig. 2 embodiments.
In the target object recognition method provided by the above embodiment of the application, in response to detecting a target object, position information of the target object is determined. When it is determined from the position information that the target object was not detected in the previous frame of the echo signal received by the millimetre-wave radar, characteristic information is extracted from the echo signal, input into a pre-trained classification model, and the category of the target object is determined according to the result output by the classification model. When the target object was detected in the previous frame, a target-tracking operation is performed. Since this embodiment, after detecting a target object, further considers whether the target object has been detected for the first time, and identifies its category only when it has, repeated identification operations are avoided, computing resources are saved, and the control of the unmanned smart device is likewise optimized.
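Taken together, the Fig. 3 and Fig. 4 embodiments suggest a per-target decision flow: avoid when too close, track when already seen, otherwise classify. The sketch below is an assumed combination for illustration — the patent presents the two checks as separate embodiments, and the callback-style interface and 5 m threshold are inventions of this example.

```python
import math

def handle_target(target_pos, radar_pos, seen_last_frame,
                  classify, track, avoid, min_distance=5.0):
    """Per-target decision flow combining Figs. 3 and 4: perform obstacle
    avoidance when the target is closer than `min_distance`; track it when
    it appeared in the previous echo frame (it is assumed to have been
    classified then); otherwise classify it."""
    if math.dist(target_pos, radar_pos) < min_distance:
        return avoid()
    if seen_last_frame:
        return track()
    return classify()

actions = []
for pos, seen in [((2.0, 0.0), False),   # close -> emergency avoidance
                  ((10.0, 0.0), True),   # far, already seen -> track
                  ((10.0, 0.0), False)]: # far, first detection -> classify
    handle_target(pos, (0.0, 0.0), seen,
                  classify=lambda: actions.append("classify"),
                  track=lambda: actions.append("track"),
                  avoid=lambda: actions.append("avoid"))
# actions == ["avoid", "track", "classify"]
```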
It should be noted that although the operations of the method of the application are described above in a particular order, this does not require or imply that the operations must be performed in that particular order, or that all of the illustrated operations must be performed to achieve the desired result. On the contrary, the steps depicted in the flowcharts can be executed in a different order. Additionally or alternatively, certain steps can be omitted, multiple steps can be merged into one step, and/or one step can be decomposed into multiple steps.
Corresponding to the foregoing embodiments of the target object recognition method, the application also provides embodiments of a target object recognition apparatus.
As shown in Fig. 5, Fig. 5 is a block diagram of a target object recognition apparatus according to an exemplary embodiment. The apparatus is applied to an unmanned smart device provided with a millimetre-wave radar and can include: an extraction module 501, an input module 502, and an output module 503.
The extraction module 501 is configured to extract, at least partially in response to detecting a target object, characteristic information from the echo signal received by the millimetre-wave radar.
The input module 502 is configured to input the characteristic information into a pre-trained classification model.
The output module 503 is configured to determine the category of the target object according to the result output by the classification model.
In some optional embodiments, the characteristic information may include one or more of the following: characteristic information of the centroid position of the target object; characteristic information of the running speed of the target object; characteristic information of the size of the target object; characteristic information of the Doppler bandwidth of the echo signal; and characteristic information of the intensity of the echo signal.
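The optional features listed above could be flattened into a single vector before being input to the classification model. The following is a minimal sketch under the assumption that the radar measurements have already been processed into plain values; the dictionary keys and function name are hypothetical, not from the patent:

```python
def build_feature_vector(target, echo):
    """Flatten the five optional features into one list for the classifier.

    target: dict with the target object's processed measurements.
    echo:   dict with properties of the received echo signal.
    """
    return [
        target["centroid_x"], target["centroid_y"],  # centroid position
        target["speed"],                             # running speed
        target["size"],                              # estimated size
        echo["doppler_bandwidth"],                   # Doppler bandwidth of the echo
        echo["intensity"],                           # intensity of the echo
    ]
```

Any subset of these features may be used, per the "one or more of the following" language of the embodiment.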
As shown in Fig. 6, Fig. 6 is a block diagram of another device for recognizing a target object according to an exemplary embodiment of the present application. On the basis of the embodiment shown in Fig. 5, the extraction module 501 may include a determining submodule 601 and an extracting submodule 602.

The determining submodule 601 is configured to determine location information of the target object in response to detecting the target object.

The extracting submodule 602 is configured to extract, when it is determined according to the location information that the target object meets a preset condition, the characteristic information from the echo signal received by the millimetre-wave radar.
In other optional embodiments, the extracting submodule 602 determines, according to the location information, that the target object meets the preset condition in the following ways:

determining, according to the location information, that the distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance; and/or

determining, according to the location information, that the target object was not detected in the previous frame of the echo signal received by the millimetre-wave radar.
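The preset condition just described can be sketched as a small predicate. Since the embodiment says "and/or", either condition alone or a combination may be used; the sketch below combines them with a logical OR, which is one permitted reading. The function name, argument names, and the default threshold value are all illustrative assumptions:

```python
def meets_preset_condition(distance, previous_frame_targets, target_id,
                           preset_distance=1.0):
    """Return True when the target should have features extracted.

    distance: measured distance between target and radar (same unit
              as preset_distance; the 1.0 default is arbitrary).
    previous_frame_targets: identifiers detected in the previous echo frame.
    """
    far_enough = distance >= preset_distance           # first condition
    newly_detected = target_id not in previous_frame_targets  # second condition
    return far_enough or newly_detected
```

A target that is both close and already known would thus be skipped, while a far-away or newly appearing target triggers feature extraction.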
In other optional embodiments, the classification model trained in advance is a nonlinear classifier.
It should be appreciated that the above device may be preset in the unmanned smart device, or may be loaded into the unmanned smart device by means such as downloading. The corresponding modules in the above device may cooperate with modules in the unmanned smart device to realize the scheme for recognizing a target object.

Since the device embodiments essentially correspond to the method embodiments, reference may be made to the relevant descriptions of the method embodiments. The device embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the scheme of the present application. Those of ordinary skill in the art can understand and implement the scheme without creative effort.
The embodiments of the present application also provide a computer-readable storage medium storing a computer program, and the computer program may be used to execute the method for recognizing a target object provided by any of the embodiments of Fig. 1 to Fig. 4 above.
Corresponding to the above method for recognizing a target object, the embodiments of the present application also propose a schematic structural diagram, shown in Fig. 7, of an unmanned smart device according to an exemplary embodiment of the present application. Referring to Fig. 7, at the hardware level, the unmanned smart device includes a processor, an internal bus, a network interface, a memory, and a nonvolatile memory, and may of course also include hardware required for other services. The processor reads the corresponding computer program from the nonvolatile memory into the memory and then runs it, forming the device for recognizing a target object at the logic level. Of course, in addition to the software implementation, the present application does not exclude other implementations, such as a logic device or a combination of software and hardware; that is, the executing subject of the following processing flow is not limited to the logic units, and may also be hardware or a logic device.
Those skilled in the art, after considering the specification and practicing the invention disclosed herein, will readily conceive of other embodiments of the present application. The present application is intended to cover any variations, uses, or adaptations of the present application that follow its general principles and include common knowledge or conventional techniques in the art not disclosed herein. The specification and embodiments are to be considered illustrative only, and the true scope and spirit of the present application are indicated by the following claims.

It should be understood that the present application is not limited to the precise structure described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present application is limited only by the appended claims.
Claims (10)
1. A method for recognizing a target object, applied to an unmanned smart device, the unmanned smart device being provided with a millimetre-wave radar, the method comprising:
extracting, at least partially in response to detecting a target object, characteristic information from an echo signal received by the millimetre-wave radar;
inputting the characteristic information into a classification model trained in advance; and
determining a category of the target object according to a result output by the classification model.
2. The method according to claim 1, wherein the characteristic information includes one or more of the following:
characteristic information of a centroid position of the target object;
characteristic information of a running speed of the target object;
characteristic information of a size of the target object;
characteristic information of a Doppler bandwidth of the echo signal; and
characteristic information of an intensity of the echo signal.
3. The method according to claim 1, wherein the extracting, at least partially in response to detecting a target object, characteristic information from an echo signal received by the millimetre-wave radar includes:
determining location information of the target object in response to detecting the target object; and
extracting, if it is determined according to the location information that the target object meets a preset condition, the characteristic information from the echo signal received by the millimetre-wave radar.
4. The method according to claim 3, wherein the determining according to the location information that the target object meets a preset condition includes:
determining, according to the location information, that a distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance; and/or
determining, according to the location information, that the target object was not detected in a previous frame of the echo signal received by the millimetre-wave radar.
5. A device for recognizing a target object, applied to an unmanned smart device, the unmanned smart device being provided with a millimetre-wave radar, the device comprising:
an extraction module, configured to extract, at least partially in response to detecting a target object, characteristic information from an echo signal received by the millimetre-wave radar;
an input module, configured to input the characteristic information into a classification model trained in advance; and
an output module, configured to determine a category of the target object according to a result output by the classification model.
6. The device according to claim 5, wherein the characteristic information includes one or more of the following:
characteristic information of a centroid position of the target object;
characteristic information of a running speed of the target object;
characteristic information of a size of the target object;
characteristic information of a Doppler bandwidth of the echo signal; and
characteristic information of an intensity of the echo signal.
7. The device according to claim 5, wherein the extraction module includes:
a determining submodule, configured to determine location information of the target object in response to detecting the target object; and
an extracting submodule, configured to extract, when it is determined according to the location information that the target object meets a preset condition, the characteristic information from the echo signal received by the millimetre-wave radar.
8. The device according to claim 7, wherein the extracting submodule determines, according to the location information, that the target object meets the preset condition in the following ways:
determining, according to the location information, that a distance between the target object and the millimetre-wave radar is greater than or equal to a preset distance; and/or
determining, according to the location information, that the target object was not detected in a previous frame of the echo signal received by the millimetre-wave radar.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 5.
10. An unmanned smart device, the unmanned smart device being provided with a millimetre-wave radar and including a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the program, implements the method according to any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810353757.7A CN108845321A (en) | 2018-04-19 | 2018-04-19 | Recognition methods, device and the unmanned smart machine of target object |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108845321A true CN108845321A (en) | 2018-11-20 |
Family
ID=64212112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810353757.7A Pending CN108845321A (en) | 2018-04-19 | 2018-04-19 | Recognition methods, device and the unmanned smart machine of target object |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108845321A (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101189533A (en) * | 2005-05-30 | 2008-05-28 | 罗伯特·博世有限公司 | Method and apparatus for identifying and classifying objects |
JP2012043364A (en) * | 2010-08-23 | 2012-03-01 | Toyota Motor Corp | Object recognition device |
CN103577834A (en) * | 2012-08-06 | 2014-02-12 | 现代自动车株式会社 | Method and system for producing classifier for recognizing obstacle |
CN105453157A (en) * | 2013-08-01 | 2016-03-30 | 本田技研工业株式会社 | Vehicle periphery monitoring device |
CN105512635A (en) * | 2015-12-15 | 2016-04-20 | 鲁东大学 | Category attribute fused deep network underground target identification method and system |
JP2016223780A (en) * | 2015-05-27 | 2016-12-28 | いすゞ自動車株式会社 | Method for determining discriminant, and discrimination device |
CN106371080A (en) * | 2016-08-24 | 2017-02-01 | 电子科技大学 | A radar target identification method based on geometrical structure characteristics and multi-feature combination |
CN106597439A (en) * | 2016-12-12 | 2017-04-26 | 电子科技大学 | Synthetic aperture radar target identification method based on incremental learning |
CN106814351A (en) * | 2017-01-10 | 2017-06-09 | 西安电子科技大学 | Aircraft Targets sorting technique based on three rank LPC techniques |
CN107169469A (en) * | 2017-06-02 | 2017-09-15 | 南京理工大学 | A kind of material identification method of the MIMO radar based on machine learning |
CN107870322A (en) * | 2017-11-03 | 2018-04-03 | 北京清瑞维航技术发展有限公司 | Target identification method, apparatus and system based on single band radar |
CN107886121A (en) * | 2017-11-03 | 2018-04-06 | 北京清瑞维航技术发展有限公司 | Target identification method, apparatus and system based on multiband radar |
2018-04-19: Application CN201810353757.7A filed (published as CN108845321A); status: Pending
Cited By (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111507361B (en) * | 2019-01-30 | 2023-11-21 | 富士通株式会社 | Action recognition device, method and system based on microwave radar |
CN111507361A (en) * | 2019-01-30 | 2020-08-07 | 富士通株式会社 | Microwave radar-based action recognition device, method and system |
CN111226132A (en) * | 2019-03-18 | 2020-06-02 | 深圳市大疆创新科技有限公司 | Target detection method and device, millimeter wave radar and movable platform |
CN111699406A (en) * | 2019-03-29 | 2020-09-22 | 深圳市大疆创新科技有限公司 | Tracking detection method of millimeter wave radar, millimeter wave radar and vehicle |
CN111699406B (en) * | 2019-03-29 | 2024-04-30 | 深圳市大疆创新科技有限公司 | Millimeter wave radar tracking detection method, millimeter wave radar and vehicle |
WO2020199010A1 (en) * | 2019-03-29 | 2020-10-08 | 深圳市大疆创新科技有限公司 | Millimeter wave radar-based tracking detection method, millimeter wave radar and vehicle |
CN110009869A (en) * | 2019-04-01 | 2019-07-12 | 珠海格力电器股份有限公司 | Monitoring method, the device and system of action message |
CN110091871A (en) * | 2019-04-30 | 2019-08-06 | 广州小鹏汽车科技有限公司 | Control method, apparatus, medium and the control equipment of vehicle distances |
CN110727277A (en) * | 2019-08-23 | 2020-01-24 | 珠海格力电器股份有限公司 | Control method and device of car washer with millimeter wave radar and intelligent car washer |
CN110728701A (en) * | 2019-08-23 | 2020-01-24 | 珠海格力电器股份有限公司 | Control method and device for walking stick with millimeter wave radar and intelligent walking stick |
CN110632849A (en) * | 2019-08-23 | 2019-12-31 | 珠海格力电器股份有限公司 | Intelligent household appliance, control method and device thereof and storage medium |
CN110728701B (en) * | 2019-08-23 | 2022-07-12 | 珠海格力电器股份有限公司 | Control method and device for walking stick with millimeter wave radar and intelligent walking stick |
CN110632849B (en) * | 2019-08-23 | 2020-11-17 | 珠海格力电器股份有限公司 | Intelligent household appliance, control method and device thereof and storage medium |
CN111025241A (en) * | 2019-10-17 | 2020-04-17 | 珠海格力电器股份有限公司 | Boundary area detection method and device, electronic equipment and storage medium |
CN113093176A (en) * | 2019-12-23 | 2021-07-09 | 北京三快在线科技有限公司 | Linear obstacle detection method, linear obstacle detection device, electronic apparatus, and storage medium |
CN111273268B (en) * | 2020-01-19 | 2022-07-19 | 北京百度网讯科技有限公司 | Automatic driving obstacle type identification method and device and electronic equipment |
CN111273268A (en) * | 2020-01-19 | 2020-06-12 | 北京百度网讯科技有限公司 | Obstacle type identification method and device and electronic equipment |
CN112505646A (en) * | 2020-11-18 | 2021-03-16 | 安洁无线科技(苏州)有限公司 | Foreign matter shielding judgment method and system based on millimeter wave radar |
CN112505646B (en) * | 2020-11-18 | 2024-04-12 | 安洁无线科技(苏州)有限公司 | Foreign matter shielding judging method and system based on millimeter wave radar |
CN113591677A (en) * | 2021-07-28 | 2021-11-02 | 厦门熵基科技有限公司 | Contraband identification method and device, storage medium and computer equipment |
CN113589254A (en) * | 2021-08-23 | 2021-11-02 | 东莞正扬电子机械有限公司 | Radar-based moving target detection method and device and radar detection equipment |
CN114325677A (en) * | 2021-12-30 | 2022-04-12 | 北京深思数盾科技股份有限公司 | Intelligent monitoring equipment and control method thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108845321A (en) | Recognition methods, device and the unmanned smart machine of target object | |
Muhammad et al. | Deep learning for safe autonomous driving: Current challenges and future directions | |
Chen et al. | An edge traffic flow detection scheme based on deep learning in an intelligent transportation system | |
US11840239B2 (en) | Multiple exposure event determination | |
CN110753892A (en) | Method and system for instant object tagging via cross-modality verification in autonomous vehicles | |
KR20170140214A (en) | Filter specificity as training criterion for neural networks | |
CN110869559A (en) | Method and system for integrated global and distributed learning in autonomous vehicles | |
US11807269B2 (en) | Method for vehicle avoiding obstacle, electronic device, and computer storage medium | |
CN112193252A (en) | Driving risk early warning method and device, computing equipment and storage medium | |
CN111539425A (en) | License plate recognition method, storage medium and electronic equipment | |
CN109883438A (en) | Automobile navigation method, device, medium and electronic equipment | |
CN113642474A (en) | Hazardous area personnel monitoring method based on YOLOV5 | |
CN112560580A (en) | Obstacle recognition method, device, system, storage medium and electronic equipment | |
CN108960074A (en) | Small size pedestrian target detection method based on deep learning | |
US20230206652A1 (en) | Systems and methods for utilizing models to detect dangerous tracks for vehicles | |
Khnissi et al. | Implementation of a compact traffic signs recognition system using a new squeezed YOLO | |
CN112671487A (en) | Vehicle testing method, server and testing vehicle | |
Rahman et al. | Predicting driver behaviour at intersections based on driver gaze and traffic light recognition | |
Wang et al. | Vehicle key information detection algorithm based on improved SSD | |
CN115757828A (en) | Radiation source knowledge graph-based aerial target intention identification method | |
Wu et al. | Infrared target detection based on deep learning | |
Banerjee et al. | Automated parking system in smart campus using computer vision technique | |
Venkatesh et al. | An intelligent traffic management system based on the Internet of Things for detecting rule violations | |
CN112712061B (en) | Method, system and storage medium for recognizing multidirectional traffic police command gestures | |
WO2022257112A1 (en) | Improved object detection on reflective surfaces |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20181120 |