CN105975967A - Target positioning method and system - Google Patents

Target positioning method and system


Publication number
CN105975967A
CN105975967A (application CN201610279062.XA; granted publication CN105975967B)
Authority
CN
China
Prior art keywords
information
target
identity
candidate
candidate target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610279062.XA
Other languages
Chinese (zh)
Other versions
CN105975967B (en)
Inventor
殳南
蒋冶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201610279062.XA priority Critical patent/CN105975967B/en
Publication of CN105975967A publication Critical patent/CN105975967A/en
Application granted granted Critical
Publication of CN105975967B publication Critical patent/CN105975967B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/255: Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02: Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04: Interpretation of pictures
    • G01C11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C11/12: Interpretation of pictures by comparison of two or more pictures of the same area, the pictures being supported in the same relative position as when they were taken
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00, with correlation of navigation data from several sources, e.g. map or contour matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a target positioning method and system. The method includes the following steps: motion parameter information of a target object is determined through the target object's own positioning system; the region in which the target object is located is determined from the motion parameter information; imaging information of the target region is obtained from the image acquisition devices monitoring that region, and computer vision processing is performed on the imaging information to determine candidate targets in the target region; the candidate targets are tracked, and their motion states are determined from the tracking result, map information, and the location and direction information of the image acquisition devices; the candidate target whose parameter information best matches the target parameter information is taken as the target object; and the motion state corresponding to that candidate target is output as the current positioning result. Because the motion state of the target object is determined from the tracking result, map information, and the location and direction information of the image acquisition devices, positioning is not disturbed by buildings, and the method therefore achieves high precision.

Description

Target positioning method and system
Technical field
The present invention relates to the technical field of tracking and positioning, and in particular to a target positioning method and system that combine a wireless system with computer vision.
Background technology
Current satellite navigation technology handles outdoor positioning and navigation well, but indoors the signal is blocked by buildings, so satellite navigation cannot effectively support indoor navigation applications, which limits practical applications such as indoor maps. Many technical schemes for indoor positioning have been proposed, for example using the strength and measured values of wireless signals such as WiFi, BT, ultra-wideband and ZigBee (e.g. patent WO/2012/106075A1).
Although indoor positioning systems using wireless signals such as WiFi and BT have been developed, wireless-signal positioning systems are affected by indoor multipath effects and cannot provide highly accurate positioning.
Summary of the invention
In view of this, embodiments of the present invention provide a target positioning method and system, so as to achieve accurate positioning of a target object indoors.
To achieve the above object, embodiments of the present invention provide the following technical scheme:
A target positioning method, including:
obtaining target parameter information and motion parameter information of a target object, the motion parameter information being obtained by the target object through measurements from its own sensors and a positioning system;
determining, from the three-dimensional coordinates and the precision parameter in the motion parameter information, the target region in which the target object is located;
determining the target image acquisition devices required to cover the target region;
obtaining the imaging data of the target image acquisition devices;
performing computer vision processing on the imaging data to determine the candidate targets in the imaging data;
tracking the candidate targets, and determining the motion state of each candidate target from the tracking result, map information, and the position and direction information of the image acquisition devices;
obtaining the parameter information of the candidate targets of the type consistent with the target parameter information;
taking as the target object the candidate target whose parameter information has the highest matching degree with the target parameter information;
sending the motion state matched with the target object to the target object.
Preferably, in the above target positioning method, the target parameter information is the motion parameter information of the target object, and the candidate target parameter information is the motion state of the candidate target.
Preferably, in the above target positioning method, taking as the target object the candidate target whose parameter information has the highest matching degree with the target parameter information includes:
establishing, based on the motion state of each candidate target, a motion state model corresponding to the candidate targets, the motion state model storing a motion state template corresponding to each candidate target; and
matching the motion parameter information against the motion state templates of the candidate targets in the motion state model, and taking the candidate target corresponding to the template with the highest matching degree as the target object.
Preferably, in the above target positioning method, the target parameter information is: a target motion profile or motion data group formed from the motion parameter information of the target object over a continuous preset time period; and
the candidate target parameter information is: the candidate motion profile or motion data group of the candidate target over the same preset time period, generated from the tracking result, map information, and the position and direction of the image acquisition devices.
Preferably, in the above target positioning method, the target parameter information is: a target time-motion-parameter set corresponding to the target object, built from the discrete motion parameter information of the target object obtained within a preset time period, the set including the discrete moments within the preset time period and the corresponding motion parameter information of the target object; and
the candidate target parameter information is: a candidate time-motion-state set built from the tracking result, map information, and the position and direction information of the image acquisition devices, the set including the discrete moments within the preset time period and the corresponding motion state information of the candidate target.
Preferably, in the above target positioning method, the target parameter information is: the obtained kind information, status information and visual feature information of the target object; and
the candidate target parameter information is: the kind information, status information and visual feature information of the candidate target obtained by performing computer vision processing on the imaging data.
Preferably, in the above target positioning method:
the target parameter information is the obtained wireless device identifier or identity of the target object; and
the candidate target parameter information is the obtained wireless device identifier or identity of the candidate target.
Preferably, the above target positioning method further includes:
judging whether a calibration request uploaded by an image acquisition device has been received, and if so, generating calibration information, the calibration information at least including the number information and the position information of the image acquisition devices requiring calibration.
Preferably, the above target positioning method further includes:
predicting the moving route of the target object from the route information on the map and the motion direction information in the motion parameter information, and predicting the target region corresponding to the next moment from the moving route and the velocity information in the motion parameter information.
Preferably, in the above target positioning method, the motion parameter information includes:
the acceleration, speed, direction and orientation data of the target object obtained from measurements by the sensor devices carried by the target object.
Preferably, the above target positioning method further includes:
obtaining the kind information, status information, visual feature information, wireless device identifier and identity of the target object;
obtaining the kind information, status information and visual feature information of the candidate target and the wireless device identifier and identity it reports;
after obtaining the wireless device identifier or identity reported by the target object or the candidate target, confirming the kind information, status information and visual feature information of the matching target object or candidate target, and binding that kind information, status information and visual feature information to the wireless device identifier or identity; and
storing the binding information.
Preferably, the above target positioning method further includes:
obtaining the wireless device identifier or identity of the target object;
obtaining, from the stored binding information, the kind information, status information and visual feature information matching the wireless device identifier or identity, and taking the kind information, status information and visual feature information as the target parameter information;
the candidate target parameter information being the kind information, status information and visual feature information of the candidate target;
or
obtaining the wireless device identifier or identity of a candidate target;
obtaining, from the stored binding information, the kind information, status information and visual feature information matching the wireless device identifier or identity, and taking the kind information, status information and visual feature information as the candidate target parameter information;
the target parameter information being kind information, status information and visual feature information;
or
performing computer vision processing on the imaging data to obtain the kind information, status information and visual feature information of a candidate target;
obtaining, from the stored binding information, the wireless device identifier or identity matching the kind information, status information and visual feature information, and taking the wireless device identifier or identity as the candidate target parameter information;
the target parameter information being a wireless device identifier or identity.
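The identifier-binding logic described in the claims above can be sketched as a two-way lookup between an identifier and a (kind, status, visual-feature) tuple. The sketch below is purely illustrative: the class, field names and sample values are assumptions, not part of the claimed implementation.

```python
class BindingStore:
    """Stores bindings between a wireless device identifier (or identity)
    and (kind, status, visual feature) information, supporting lookup
    in both directions as the claims describe."""

    def __init__(self):
        self.by_id = {}        # identifier -> (kind, status, visual_feature)
        self.by_features = {}  # (kind, status, visual_feature) -> identifier

    def bind(self, identifier, kind, status, visual_feature):
        features = (kind, status, visual_feature)
        self.by_id[identifier] = features
        self.by_features[features] = identifier

    def features_for(self, identifier):
        # identifier -> features: used when the target reports its ID
        return self.by_id.get(identifier)

    def id_for(self, kind, status, visual_feature):
        # features -> identifier: used when vision processing yields the features
        return self.by_features.get((kind, status, visual_feature))

store = BindingStore()
store.bind("dev-42", kind="pedestrian", status="walking", visual_feature="red-coat")
```

Either side of the binding can then serve as the target or candidate parameter information, depending on which branch of the claim applies.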
A target positioning system, including:
a data acquisition unit, configured to obtain target parameter information and motion parameter information of a target object, the motion parameter information being obtained by the target object through measurements from its own sensors and a positioning system;
a region locating unit, configured to determine, from the three-dimensional coordinates and the precision parameter in the motion parameter information, the target region in which the target object is located;
an imaging acquisition unit, configured to determine the target image acquisition devices required to cover the target region and obtain the imaging data of the target image acquisition devices;
a vision processing unit, configured to identify candidate target objects from the imaging data and obtain the feature information of the candidate objects;
a motion state computing unit, configured to track the candidate targets and determine the motion state of each candidate target from the tracking result, map information, and the position and direction information of the image acquisition devices;
a candidate parameter collecting unit, configured to obtain the candidate target parameter information of the type consistent with the target parameter information;
a matching unit, configured to take as the target object the candidate target whose parameter information has the highest matching degree with the target parameter information; and
a sending unit, configured to send the motion state matched with the target object to the target object.
Preferably, in the above target positioning system, the target parameter information is the motion parameter information of the target object, and the candidate target parameter information is the motion state of the candidate target.
Preferably, in the above target positioning system, the matching unit includes:
a model building unit, configured to establish, based on the motion state of each candidate target, a motion state model corresponding to the candidate targets, the motion state model storing a motion state template corresponding to each candidate target; and
a first sub-matching unit, configured to match the motion parameter information against the motion state templates of the candidate targets in the motion state model and take the candidate target corresponding to the template with the highest matching degree as the target object.
Preferably, the above target positioning system further includes:
a target motion profile building unit, configured to form a target motion profile or motion data group from the motion parameter information of the target object over a continuous preset time period; and
a candidate motion profile building unit, configured to generate, from the tracking result, map information, and the position and direction of the image acquisition devices, the candidate motion profile or motion data group of the candidate target over the preset time period;
the target parameter information being the target motion profile or motion data group; and
the candidate target parameter information being the candidate motion profile or motion data group.
Preferably, the above target positioning system further includes:
a target time and motion information set building unit, configured to build, from the discrete motion parameter information of the target object obtained within a preset time period, a target time-motion-parameter set corresponding to the target object, the set including the discrete moments within the preset time period and the corresponding motion parameter information of the target object; and
a candidate time and motion state set building unit, configured to build, from the tracking result, map information, and the position and direction information of the image acquisition devices, a discrete candidate time-motion-state set, the set including the discrete moments within the preset time period and the corresponding motion state information of the candidate target;
the target parameter information being the target time-motion-parameter set; and
the candidate target parameter information being the candidate time-motion-state set.
Preferably, in the above target positioning system, the vision processing unit is further configured to perform computer vision processing on the imaging data to obtain the kind information, status information and visual feature information of the candidate target;
the target parameter information being the kind information, status information and visual feature information uploaded by the target object; and
the candidate target parameter information being the kind information, status information and visual feature information of the candidate target.
Preferably, in the above target positioning system, the target parameter information is the wireless device identifier or identity of the target object, and the candidate target parameter information is the wireless device identifier or identity of the candidate target.
Preferably, the above target positioning system further includes:
a failure locating unit, configured to judge whether a calibration request uploaded by an image acquisition device has been received, and if so, generate calibration information, the calibration information at least including the number information and the position information of the image acquisition devices requiring calibration.
Preferably, the above target positioning system further includes:
a target region predicting unit, configured to predict the moving route of the target object from the route information on the map and the motion direction information in the motion parameter information, and to predict the target region corresponding to the next moment from the moving route and the velocity information in the motion parameter information.
Preferably, in the above target positioning system, the motion parameter information includes:
the acceleration, speed, direction and orientation data of the target object obtained from measurements by the sensor devices carried by the target object.
Preferably, the above target positioning system further includes:
an identity information storage unit, configured to determine the kind information, status information and visual feature information of the candidate target, request the candidate target to report its wireless device identifier and identity, and, after obtaining the wireless device identifier or identity reported by the candidate target, bind the kind information, status information and visual feature information of the candidate target to the wireless device identifier or identity and store the binding information in a preset database.
Preferably, in the above target positioning system, the data acquisition unit is further configured to obtain the wireless device identifier, identity, kind information, status information and visual feature information of the target object, and the wireless device identifier and identity of the candidate target;
the target positioning system further includes: a target parameter information acquiring unit configured, when the target parameter information is kind information, status information and visual feature information, to obtain from the binding information in the preset database the kind information, status information and visual feature information matching the wireless device identifier or identity of the target object and take them as the target parameter information; and, when the target parameter information is a wireless device identifier or identity, to obtain from the binding information in the preset database the wireless device identifier or identity matching the kind information, status information and visual feature information of the target object and take it as the target parameter information;
and a candidate target parameter information acquiring unit configured, when the candidate target parameter information is kind information, status information and visual feature information, to obtain from the binding information in the preset database the kind information, status information and visual feature information matching the wireless device identifier or identity of the candidate target and take them as the candidate target parameter information; and, when the candidate target parameter information is a wireless device identifier or identity, to obtain from the binding information in the preset database the wireless device identifier or identity matching the kind information, status information and visual feature information of the candidate target and take it as the candidate target parameter information.
Based on the above technical scheme, the scheme provided by embodiments of the present invention first determines the motion parameter information of the target object through its own positioning system, determines from the motion parameter information the region in which the target object is located, obtains the imaging information of that region from the image acquisition devices monitoring the target region and performs computer vision processing on it, determines the candidate targets in the target region, tracks each candidate target, and determines the motion state of each candidate target from the tracking result, map information, and the position and direction information of the image acquisition devices. It then determines the candidate target that is the target object from the matching result of the target parameter information and the candidate target parameter information, and finally outputs the motion state corresponding to that candidate target as the current positioning result. As can be seen, in this process, when the motion state of the target object is determined from the tracking result, map information, and the position and direction information of the image acquisition devices, it is not disturbed by buildings and the like, and therefore has high precision.
Brief description of the drawings
To explain the technical schemes in the embodiments of the present invention or in the prior art more clearly, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings described below are only embodiments of the present invention; for those of ordinary skill in the art, other drawings can be obtained from the provided drawings without creative effort.
Fig. 1 is a schematic flowchart of a target positioning method disclosed in an embodiment of the present application;
Fig. 2 is a schematic diagram of an application scenario of the positioning method disclosed in an embodiment of the present application;
Fig. 3 is a schematic diagram of the process of determining the motion state of a candidate target in the positioning method disclosed in an embodiment of the present application;
Fig. 4 is a schematic diagram of the process of determining the target object by matching, disclosed in an embodiment of the present application;
Fig. 5 is a schematic diagram of an application scenario of the positioning method disclosed in another embodiment of the present application;
Fig. 6 is a schematic structural diagram of a target positioning system disclosed in an embodiment of the present application.
Detailed description of the invention
The technical schemes in the embodiments of the present invention are described below clearly and completely with reference to the drawings in the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present invention, not all of them. Based on the embodiments in the present invention, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present invention.
To solve the prior-art problem of low positioning precision when positioning a target object, the present application discloses a target positioning method and system. Referring to Fig. 1, the target positioning method includes:
Step S101: obtaining the target parameter information matched with the target object and the motion parameter information, both uploaded by the target object, the motion parameter information being obtained by the target object through measurements from its own sensors and a positioning system;
Here, the target object (a vehicle, a mobile phone, etc.) can, through its own sensor measurements and positioning system, initiate a location request to an external system (a network location server, etc.) using satellite, wireless or geomagnetic positioning, and obtain its motion parameter information. The motion parameter information may include a motion state (three-dimensional coordinate information, velocity information, acceleration information, orientation information, etc.) and the precision parameter of that motion state (its measurement precision, i.e. error-range data), and an estimate of this motion state is maintained by the target object's own positioning and navigation system. The target parameter information later serves as the reference information for determining the target object among the candidate targets; it contains characteristic parameters of the target object, which are unique to it, and which characteristic parameters are chosen as the target parameter information can be decided according to user requirements.
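As a concrete illustration, the motion parameter information described here, a motion state plus a precision parameter, might be represented as follows. This is a sketch under assumed field names and sample values; the patent does not prescribe any data format.

```python
from dataclasses import dataclass

@dataclass
class MotionParameters:
    """Motion parameter information reported by the target object:
    a motion state estimate plus the precision of that estimate."""
    x: float           # three-dimensional coordinates (metres)
    y: float
    z: float
    speed: float       # velocity magnitude (m/s)
    acceleration: float  # acceleration magnitude (m/s^2)
    heading_deg: float   # orientation, degrees from north
    precision_m: float   # error radius of the position estimate

# Hypothetical report from a pedestrian's phone:
report = MotionParameters(x=12.0, y=-3.5, z=0.0,
                          speed=1.4, acceleration=0.1,
                          heading_deg=90.0, precision_m=10.0)
```

The precision field is what step S102 below uses to turn a rough coordinate into a region.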
Step S102: determining, from the three-dimensional coordinate information and the precision parameter in the motion parameter information, the target region in which the target object is located;
In this step, because the three-dimensional coordinate information reported by the target object is relatively rough (for example, its precision parameter may indicate that the error lies within a circle of 10-metre radius), the extent of the target region in which the target object is located can be determined from this information;
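For illustration, the target region can be modelled as a circle centred on the reported position with the error radius given by the precision parameter, and membership tested directly. A simplified sketch; the function names and coordinates are assumed, and the 10 m radius follows the example in the text.

```python
import math

def target_region(x, y, precision_m):
    """Model the target region as a circle: centre = reported position,
    radius = the error radius given by the precision parameter."""
    return (x, y, precision_m)

def in_region(region, px, py):
    """True if point (px, py) lies within the circular target region."""
    cx, cy, r = region
    return math.hypot(px - cx, py - cy) <= r

# Reported position (12, -3.5) with a 10 m error radius, as in the example:
region = target_region(12.0, -3.5, 10.0)
```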
Step S103: determining the target image acquisition devices required to cover the target region;
After the target region is determined, the image acquisition devices required to cover it are screened out according to the target region; that is, the monitoring regions of the known image acquisition devices are compared against the target region to confirm the list of image acquisition devices required to cover it. For example, referring to Fig. 2, two image acquisition devices 301 and 302 exist within the target region, so 301 and 302 can be taken as the target image acquisition devices, and analysing and processing the imaging data of 301 and 302 is sufficient to complete the identification, positioning and tracking of the target object described in the subsequent method examples.
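The screening step can be sketched as an overlap test between the target region and each device's monitoring region, here both simplified to circles. The device list, positions and coverage radii below are hypothetical, chosen so that devices 301 and 302 cover the example region as in Fig. 2.

```python
import math

# Known image acquisition devices and their monitoring regions,
# each simplified to a circle (cx, cy, coverage_radius). Hypothetical data.
CAMERAS = {
    301: (10.0, 0.0, 15.0),
    302: (20.0, -5.0, 15.0),
    303: (80.0, 40.0, 15.0),
}

def covering_cameras(target_cx, target_cy, target_r):
    """Return the IDs of devices whose monitoring circle intersects the
    circular target region (two circles intersect when the distance
    between centres is at most the sum of radii)."""
    selected = []
    for cam_id, (cx, cy, r) in CAMERAS.items():
        if math.hypot(cx - target_cx, cy - target_cy) <= r + target_r:
            selected.append(cam_id)
    return selected

# A target region centred at (12, -3.5) with a 10 m error radius is
# covered by devices 301 and 302, mirroring Fig. 2.
```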
Step S104: obtaining the imaging data produced by the target image acquisition devices capturing images of the target region;
Step S105: performing computer vision processing on the imaging data to determine the candidate targets in the imaging data;
Here, when computer vision processing is performed on the imaging data, foreground objects are extracted after the background is separated, and the kind (vehicle, pedestrian, etc.) and status (standing, walking, lying down, a given posture) of the objects in the target region are recognised. Visual features of the objects (face contour, three-dimensional shape profile, vehicle licence plate, etc.) can also be extracted by the computer vision processing, and the recognised objects can further be tracked based on the images.
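The background-separation step can be sketched with simple frame differencing, a stand-in for a real background-subtraction model (e.g. a mixture-of-Gaussians subtractor). Frames here are tiny grids of pixel intensities, purely for illustration; a real system would group the foreground pixels into objects and classify their kind and status.

```python
def foreground_mask(frame, background, threshold=30):
    """Mark pixels that differ from the background model by more than
    `threshold` as foreground (candidate-object pixels)."""
    return [[abs(p - b) > threshold for p, b in zip(row, brow)]
            for row, brow in zip(frame, background)]

def candidate_pixels(mask):
    """Collect foreground pixel coordinates; a real system would then
    group them into candidate objects (vehicle, pedestrian, ...)."""
    return [(i, j) for i, row in enumerate(mask)
                   for j, on in enumerate(row) if on]

background = [[10, 10, 10],
              [10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 200, 10],
              [10, 200, 10],
              [10, 10, 10]]   # a bright "object" in the middle column

mask = foreground_mask(frame, background)
```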
Step S106: tracking the candidate targets, and determining the motion state of each candidate target according to the tracking result, the map information, the position information of the image capture devices and the orientation information of the image capture devices;

In this step, by combining the tracking result of each candidate target with the given map information and the position and orientation information of the image capture devices, the motion state of the candidate target (three-dimensional coordinates, velocity, acceleration, heading) can be obtained. Taking Fig. 3 as an example, camera 301 yields the tracking results of two candidate targets after computer vision processing; combining the position and orientation information of camera 301 with the map information of the target area, the motion states of the two candidate targets can be obtained by a preset algorithm.
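The motion-state computation could be sketched as follows, assuming the tracking result has already been projected into ground-plane coordinates (the projection itself depends on the camera's position and orientation and is omitted here):

```python
import math

def motion_state(track, dt):
    """track: list of (x, y) ground-plane positions sampled every dt seconds.
    Returns (position, speed in m/s, heading in degrees) estimated from the
    last two samples."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt
    speed = math.hypot(vx, vy)
    heading = math.degrees(math.atan2(vy, vx)) % 360
    return (x1, y1), speed, heading

track = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
pos, speed, heading = motion_state(track, dt=1.0)
print(pos, speed, heading)  # → (2.0, 0.0) 1.0 0.0
```

Acceleration could be estimated analogously from consecutive speed samples.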
Step S107: obtaining candidate target parameter information of the same type as the target parameter information;

In this step, the type of the candidate target parameter information can be determined according to the given target parameter information, so as to determine which candidate target parameter information of the candidate targets needs to be retrieved.
Step S108: taking the candidate target corresponding to the candidate target parameter information with the highest degree of match to the target parameter information as the target object;

In this step, the target parameter information is matched one by one against each piece of candidate target parameter information to obtain the degree of match between them, and the candidate target corresponding to the candidate parameter information with the highest matching degree is taken as the target object. Of course, to ensure the correctness of the matching result, a preset value can also be set during matching: only when the highest matching degree exceeds this preset value is the corresponding candidate target taken as the target object; otherwise, information indicating that positioning is impossible is output to the user. A second preset value can also be set in the matching process: when a matching degree exceeds this value, the corresponding candidate target can immediately be regarded as the target object without performing further matching, thereby improving positioning speed.
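A sketch of the two-threshold matching logic described above; the threshold values and the similarity function are illustrative assumptions:

```python
def match_target(target_param, candidates, score_fn,
                 min_accept=0.6, early_accept=0.9):
    """Match target_param against each candidate's parameter info.
    early_accept: accept immediately once a score exceeds this value.
    min_accept:   the best score must exceed this, else report failure (None)."""
    best_id, best_score = None, -1.0
    for cand_id, cand_param in candidates:
        score = score_fn(target_param, cand_param)
        if score > early_accept:
            return cand_id          # good enough, skip remaining candidates
        if score > best_score:
            best_id, best_score = cand_id, score
    return best_id if best_score > min_accept else None

# toy similarity on scalar parameters
sim = lambda a, b: 1.0 - abs(a - b)
print(match_target(0.5, [(1, 0.1), (2, 0.45), (3, 0.8)], sim))  # → 2
```

Candidate 2 scores 0.95 and clears the early-accept threshold, so candidate 3 is never evaluated; a `None` result corresponds to the "cannot position" message to the user.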
Step S109: sending the motion state matched with the target object to the target object.

In the target positioning method disclosed in the above embodiment of the present application, the moving parameter information of the target object is first determined through its positioning system, and the regional range in which the target object is located is determined according to the moving parameter information; the imaging information of this region obtained by the image capture devices monitoring the target area is subjected to computer vision processing to determine the candidate targets within the target area; each candidate target is tracked, and its motion state is determined according to the tracking result, the map information, the position information of the image capture devices and the orientation information of the image capture devices; the candidate target serving as the target object is then determined from the result of matching the target parameter information against the candidate target parameter information, and finally the motion state corresponding to that candidate target is output as the positioning result. It can be seen that in this process, when the motion state of the target object is determined from the tracking result, the map information, and the position and orientation information of the image capture devices, it is not subject to interference from buildings and the like, and therefore has higher precision.
In the method disclosed in the above embodiment of the present application, the target parameter information can be set by the user according to need. For example, in the technical solution disclosed in one embodiment of the present application, the target parameter information can be the moving parameter information of the target object, in which case the candidate target parameter information is the motion state of the candidate target.

In this case, the matching process identifies, among the motion states of all candidate targets within the target range, the candidate target whose motion state is most similar to the motion state in the target object's motion parameters; when the matching degree between the two reaches a set threshold, this candidate target is taken as the target. In Fig. 4, the motion state reported by the target object includes: a position, a non-zero speed and a heading. There are two candidate targets within the target range. The motion state of one candidate target is stationary, so the speed matching item is not satisfied; the motion state of the other candidate target matches successfully with a matching degree greater than the set threshold, so this candidate target is taken as the target object.
Of course, to further optimize the matching process, the method can establish motion state models for different types of target object (pedestrian, vehicle, etc.), including cadence, leg speed, maximum speed, maximum acceleration and so on. During matching, the moving parameter information reported by the target object can be matched against the motion feature templates of the candidate targets in these models, and a match whose goodness of fit exceeds a threshold is regarded as an effective match. This improves the accuracy of object identification and matching and reduces erroneous matches.
That is, the above step S108 may specifically include:

establishing, based on the motion state of each candidate target, a motion state model corresponding to the candidate targets, the motion state model storing a motion state template corresponding to each candidate target;

matching the moving parameter information against the motion state templates of the candidate targets in the motion state model, and taking the candidate target corresponding to the motion state template with the highest matching degree as the target object.
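The template matching of step S108 might be sketched like this, with hypothetical cadence/speed templates standing in for the motion state model:

```python
import math

# hypothetical motion-state templates, one per candidate
# (cadence in steps/s, speed in m/s)
templates = {
    "cand_A": {"cadence": 0.0, "speed": 0.0},   # stationary
    "cand_B": {"cadence": 1.8, "speed": 1.4},   # walking pedestrian
}

def template_score(reported, template):
    """Simple goodness of fit: inverse of Euclidean distance in feature space."""
    d = math.hypot(reported["cadence"] - template["cadence"],
                   reported["speed"] - template["speed"])
    return 1.0 / (1.0 + d)

reported = {"cadence": 1.7, "speed": 1.3}
best = max(templates, key=lambda c: template_score(reported, templates[c]))
print(best)  # → cand_B
```

The reported walking parameters fit the walking-pedestrian template far better than the stationary one, so `cand_B` is selected as the target object.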
Of course, in addition to using the above moving parameter information as the target parameter information, in order to solve the problem that the target object cannot be effectively matched from a single report (i.e. a single report of the target parameter information and moving parameter information), the target parameter information can also be: a target motion profile or motion data group formed from the moving parameter information of the target object over a continuous preset time period;

The candidate target parameter information is then: the candidate motion profile or motion data group of the candidate target within the preset time period, generated based on the tracking result, the map information, the position information of the image capture devices and their orientation. The elements contained in the motion data group can be: the measurement data of the sensors on the target object, and the features of the motion state obtained after processing the measurement data.

In this case, the method can record the motion states of a same candidate target at consecutive time points within a time period, forming the candidate motion profile or motion data group of that candidate target within the period, and form the target motion profile or motion data group of the target object from the moving parameter information reported by the target object within the same time period. By matching the target motion profile or motion data group of the target object against the candidate motion profiles or motion data groups, the problem that the target object cannot be effectively matched from a single report can be solved.
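Profile matching over a time window could be sketched as a simple per-sample distance between equally sampled speed profiles (the sampling scheme and distance measure are illustrative assumptions):

```python
def profile_distance(target_profile, candidate_profile):
    """Mean absolute difference between two equally sampled speed profiles;
    a smaller distance means a better profile match."""
    return sum(abs(a - b) for a, b in zip(target_profile, candidate_profile)) \
        / len(target_profile)

# speeds (m/s) sampled once per second over a 5-second window
target = [0.0, 0.5, 1.2, 1.3, 1.3]
candidates = {"cand_A": [0.0, 0.0, 0.0, 0.0, 0.0],   # stationary the whole time
              "cand_B": [0.1, 0.6, 1.1, 1.3, 1.2]}   # accelerating like the target
best = min(candidates, key=lambda c: profile_distance(target, candidates[c]))
print(best)  # → cand_B
```

A single-instant snapshot could not separate two momentarily similar candidates, but the five-second profile does, which is the point of this variant.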
It is understood that in addition to the matching modes mentioned in the above embodiments, the present application can also match by contrasting multiple discrete groups of movement information of the target object over a period of time: first obtain, from the images collected by multiple image capture devices, the multiple or single motion state information sets belonging to a same candidate target, then match the multiple or single motion state information sets corresponding to the candidate target against the multiple or single motion state information reported by the target object, finding the optimal match whose matching degree exceeds a threshold. Such matching can likewise be used to solve the effective-matching problem.
Concretely, the multiple or single motion state information sets can be motion information-time sets of objects within a preset time. That is, in the method disclosed in the above embodiment of the present application, the target parameter information can be: an object time-moving parameter information set corresponding to the target object, established based on the discrete moving parameter information of the target object acquired within a preset time period; the object time-moving parameter information set includes: the discrete moments within the preset time period and the corresponding moving parameter information of the target object;

The candidate target parameter information can be: a candidate time-motion state set established based on the tracking result, the map information, the position information of the image capture devices and their orientation information; the candidate time-motion state set includes: the discrete moments within the preset time period and the corresponding motion state information of the candidate target.
Further, when matching, in addition to the above matching modes, matching can also be performed through comprehensive feature information combining kind information, state information, motion features, and visual feature information such as face, contour and vehicle license plate. The target object can report, through a wireless communication system, the comprehensive feature information consisting of its kind information, state information, motion features and visual feature information such as face, contour and vehicle license plate; of course, this information can be actively reported by the target object or actively collected by the system applying this method. The method disclosed in the embodiment of the present application can identify, by means of computer vision processing, the kind information, state information, motion state information and visual feature information of the candidate targets in the target area. Therefore, by contrasting the kind information, state information, motion features and visual comprehensive features reported by the target object against the kind information, state information, motion state information and visual feature information of the candidate targets obtained by computer vision processing, and regarding a matching degree above a threshold as an effective match, the matching performance can be improved, or positioning can be performed directly through such information matching.

Concretely, the target parameter information in the present embodiment can be: the acquired kind information, state information and visual feature information of the target object;

The candidate target parameter information can be: the kind information, state information and visual feature information of the candidate targets obtained by performing computer vision processing on the imaging data.
Of course, in addition to the target parameter information and candidate target parameter information taking the above data forms, the present application can also use unique identification data possessed by the target object and the candidate targets as the target parameter information and candidate target parameter information for matching. Which identification data to select can be set by the user according to need; any identification data with uniqueness can serve as the target parameter information and candidate target parameter information. For example, in the technical solution disclosed in the above embodiment of the present application, the target parameter information is the acquired wireless device identifier or identity identifier of the target object, and the candidate target parameter information is the acquired wireless device identifier or identity identifier of the candidate target.
As a further optimization, when the located object is a pedestrian, the pedestrian is likely to be observing the phone screen while using the terminal positioning function. When reporting its own state, the target object (the mobile phone) can additionally judge whether it is in a standing state; during matching, this effectively distinguishes the pedestrian requiring positioning from nearby pedestrians, as shown in Fig. 5. The method includes:

detecting whether the map display interface software of the target object is currently running; if so, judging through the camera of the target object whether the user is viewing the screen; if so, judging whether the inertial motion parameters of the target object satisfy the angle required for a user standing and viewing the screen; if so, determining that the state of the target object is the "standing" state.
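The chain of checks for the "standing" state could be sketched as follows; the pitch range for a standing user viewing the screen is an illustrative assumption:

```python
def is_standing_viewing(map_app_running, face_detected, pitch_deg,
                        min_pitch=30.0, max_pitch=70.0):
    """Chain of checks from the method above: map UI running, user's face seen
    by the front camera, and device pitch in a range typical of a standing
    user looking at the screen (the range values are illustrative)."""
    if not map_app_running:
        return False
    if not face_detected:
        return False
    return min_pitch <= pitch_deg <= max_pitch

print(is_standing_viewing(True, True, 45.0))   # → True
print(is_standing_viewing(True, True, 5.0))    # → False (phone lying flat)
```

Only when all three checks pass is the "standing" state reported alongside the moving parameter information.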
As a further optimization of the above method, in order to ensure the accuracy of the capture regions of the image capture devices, each image capture device disclosed in the above embodiment of the present application can be one capable of detecting changes in its own position and angle: an inertial measurement system (comprising a magnetic compass, accelerometer and gyroscope) is fixed on the image capture device and made to measure periodically or on trigger, and self-calibration (such as calibrating the mounting angle information of the imaging device) is achieved by judging and processing the change in the measurement results; naturally, a calibration request can also be output. The method disclosed in the above embodiment of the present application may therefore also include:

judging whether a calibration request uploaded by an image capture device has been acquired, and if so, generating calibration information, the calibration information at least including the number information and position information of the image capture device requiring calibration.
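The self-calibration trigger could be sketched as a drift check against the heading recorded at installation (the tolerance and field names are illustrative assumptions):

```python
def check_drift(cam_id, reference_heading, measured_heading, position,
                tol_deg=2.0):
    """Compare the IMU-measured mounting heading against the reference recorded
    at installation; beyond the tolerance, emit a calibration record carrying
    the camera's number and position, else None."""
    drift = abs(measured_heading - reference_heading)
    if drift > tol_deg:
        return {"camera_id": cam_id, "position": position, "drift_deg": drift}
    return None

print(check_drift(301, 90.0, 95.5, (10.0, 5.0)))
# → {'camera_id': 301, 'position': (10.0, 5.0), 'drift_deg': 5.5}
```

The returned record corresponds to the calibration information described above, containing at least the device number and position.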
As a further optimization of the above method, when tracking the target object, the movement trajectory of the target object can be constrained by the paths on the map information, the moving route of the target object can be predicted, and the image capture devices within a suitable range can be selected as the target image capture devices for the next moment. To this end, the method disclosed in the above embodiment of the present application may also include:

predicting the moving route of the target object through the path information on the map information and the direction-of-motion information in the moving parameter information, and predicting the target area corresponding to the next moment according to the moving route and the speed information in the moving parameter information.
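Route prediction along a map path could be sketched as follows, assuming a polyline path and a single-segment step (the area-radius formula is an illustrative assumption):

```python
import math

def predict_next_area(path, current_index, speed, dt, radius_margin=5.0):
    """Advance `speed * dt` metres along a polyline `path` of (x, y) waypoints
    starting from `current_index`, returning the predicted target area as
    (centre, radius). A single-segment step keeps the sketch short."""
    x0, y0 = path[current_index]
    x1, y1 = path[current_index + 1]
    seg = math.hypot(x1 - x0, y1 - y0)
    t = min(1.0, (speed * dt) / seg)  # clamp to the segment end
    centre = (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    # grow the area with distance travelled, as a simple uncertainty proxy
    return centre, radius_margin + speed * dt * 0.1

path = [(0.0, 0.0), (100.0, 0.0)]   # a straight road segment on the map
centre, radius = predict_next_area(path, 0, speed=1.5, dt=10.0)
print(centre)  # → (15.0, 0.0)
```

The predicted area would then be fed back into step S103 to pre-select the image capture devices for the next moment.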
As a further optimization, the above method can obtain the movement trajectory or profile of the target object by processing its periodically reported moving parameter data. To reduce the power consumption and complexity of the target object, the moving parameter information can be measured mainly by the sensor elements of the device carried by the target object, for example: acceleration, velocity direction and heading data. That is, the moving parameter information includes: the acceleration, velocity direction and heading data of the target object obtained from measurement by the sensor elements carried by the target object.
In order to position the target object still more accurately, when calculating the motion state of the candidate targets in the above method, object identification and positioning of the candidate targets can be performed by means of acoustic ranging. This requires the image capture devices to be equipped with acoustic equipment, including a microphone or microphone array and a loudspeaker or tone generator. Since the coordinates of these image capture devices are themselves known, the acoustic equipment installed on them can perform acoustic ranging and extract the voiceprint features of the measured object, and object identification and positioning can be carried out through the microphone array together with an analysis system for processing.
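Acoustic ranging via a microphone pair could be sketched with a cross-correlation delay estimate; the synthetic impulse signals and the setup are illustrative assumptions, not the disclosure's analysis system:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s

def tdoa_delay(sig_a, sig_b, fs):
    """Estimate the arrival-time difference between two microphone signals by
    cross-correlation; a positive delay means the sound reached mic A first."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = np.argmax(corr) - (len(sig_a) - 1)
    return lag / fs

fs = 8000
pulse = np.zeros(400)
pulse[100] = 1.0          # impulse heard by mic A
delayed = np.zeros(400)
delayed[116] = 1.0        # same impulse arriving 16 samples later at mic B
dt = tdoa_delay(pulse, delayed, fs)
print(round(dt * SPEED_OF_SOUND, 3))  # → 0.686 (metres of extra path to mic B)
```

With the microphone positions known (they sit on the image capture devices), such path differences constrain the source position, complementing the vision-based estimate.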
To optimize the above method further, when the determination of a candidate target's motion state is completed, the kind, state and motion features of the measured object (including cadence, leg speed, maximum speed, maximum acceleration, etc.) and the comprehensive feature information composed of visual feature information such as face, contour and vehicle license plate can be recorded, and the candidate target's device can be asked to report its wireless device identifier (such as the IMSI or IMEI of the wireless device, or MAC address) and identity identifier (such as the user's ID card number or driver's license number), completing the binding of the measured object's comprehensive features to the wireless device identifier or identity identifier. This binding relationship can be used to improve effectiveness in subsequent identification and positioning. Further, this binding relationship, together with the wireless device identifier, identity identifier and visual features, can be saved and passed to other systems as required. When determining the target object, the extracted comprehensive features of the target device can be compared against the binding relationships in a known feature database, completing confirmation of the target device's identity.
In the above solutions, the target parameter information or candidate target parameter information can be obtained via the above binding content, i.e. the above method can also include:

acquiring the wireless device identifier or identity identifier of the target object;

acquiring, according to the stored binding information, the kind information, state information and visual feature information matching the wireless device identifier or identity identifier, and taking the kind information, state information and visual feature information as the target parameter information;

the candidate target parameter information being the kind information, state information and visual feature information of the candidate targets;

or

acquiring the wireless device identifier or identity identifier of a candidate target;

acquiring, according to the stored binding information, the kind information, state information and visual feature information matching the wireless device identifier or identity identifier, and taking the kind information, state information and visual feature information as the candidate target parameter information;

the target parameter information being kind information, state information and visual feature information;

or

performing computer vision processing on the imaging data to obtain the kind information, state information and visual feature information of the candidate targets;

acquiring, according to the stored binding information, the wireless device identifier or identity identifier matching the kind information, state information and visual feature information, and taking the acquired wireless device identifier or identity identifier as the candidate target parameter information;

the target parameter information being a wireless device identifier or identity identifier.
Concretely, the above method can also include:

acquiring the kind information, state information, visual feature information, wireless device identifier and identity identifier of the target object;

acquiring the kind information, state information and visual feature information of the candidate targets and the wireless device identifiers and identity identifiers they report;

after acquiring the wireless device identifier or identity identifier reported by the target object or a candidate target, confirming the kind information, state information and visual feature information of the target object or candidate target to which that wireless device identifier or identity identifier is matched, and binding the confirmed kind information, state information and visual feature information with the wireless device identifier or identity identifier;

storing the binding information. Using the stored binding information, the user can obtain the corresponding kind information, state information and visual feature information from a wireless device identifier and identity identifier, and conversely obtain the wireless device identifier and identity identification information from kind information, state information and visual feature information.
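The binding store could be sketched as a two-way lookup table; all field names and the key scheme are illustrative:

```python
class BindingStore:
    """Two-way binding sketch: wireless/identity identifier <-> feature record.
    Real systems would use fuzzy feature matching; exact keys keep this short."""
    def __init__(self):
        self.by_id = {}
        self.by_features = {}

    def bind(self, device_id, features):
        key = tuple(sorted(features.items()))
        self.by_id[device_id] = features
        self.by_features[key] = device_id

    def features_for(self, device_id):
        return self.by_id.get(device_id)

    def id_for(self, features):
        return self.by_features.get(tuple(sorted(features.items())))

store = BindingStore()
store.bind("IMSI-460001234567890",
           {"kind": "pedestrian", "state": "walking", "plate": None})
print(store.id_for({"kind": "pedestrian", "state": "walking", "plate": None}))
# → IMSI-460001234567890
```

Both lookup directions mirror the two uses described above: identifier to features, and features back to identifier.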
Corresponding to the above method, the present application also discloses a target positioning system, see Fig. 6, including:

a data acquisition unit 100, for obtaining the target parameter information and moving parameter information of the target object, the moving parameter information being obtained by the target object through measurement with its own sensors and positioning with a positioning system;

a zone positioning unit 200, for determining the target area in which the target object is located through the three-dimensional coordinates and precision parameter in the moving parameter information;

an imaging acquisition unit 300, for determining the target image capture devices required to cover the target area and obtaining the imaging data of the target image capture devices;

a vision processing unit 400, for identifying candidate target objects according to the imaging data and obtaining the feature information of the candidate objects;

a motion state computing unit 500, for tracking the candidate targets and determining the motion state of each candidate target according to the tracking result, the map information, the position information of the image capture devices and the orientation information of the image capture devices;

a candidate parameter collecting unit 600, for obtaining candidate target parameter information of the same type as the target parameter information;

a matching unit 700, for taking the candidate target corresponding to the candidate target parameter information with the highest degree of match to the target parameter information as the target object;

a transmitting unit 800, for sending the motion state matched with the target object to the target object.
In the target positioning system disclosed in the above embodiment of the present application, the moving parameter information of the target object is first determined through its positioning system, and the regional range in which the target object is located is determined according to the moving parameter information; the imaging information of this region obtained by the image capture devices monitoring the target area is subjected to computer vision processing to determine the candidate targets within the target area; each candidate target is tracked, and its motion state is determined according to the tracking result, the map information, the position information of the image capture devices and the orientation information of the image capture devices; the candidate target serving as the target object is then determined from the result of matching the target parameter information against the candidate target parameter information, and finally the motion state corresponding to that candidate target is output as the positioning result. It can be seen that in this process, when the motion state of the target object is determined from the tracking result, the map information, and the position and orientation information of the image capture devices, it is not subject to interference from buildings and the like, and therefore has higher precision.
Corresponding to the above method, the target parameter information is the moving parameter information of the target object, and the candidate target parameter information is the motion state of the candidate target.
Corresponding to the above method, the matching unit in the system disclosed in the above embodiment of the present application includes:

a model establishing unit, for establishing, based on the motion state of each candidate target, a motion state model corresponding to the candidate targets, the motion state model storing a motion state template corresponding to each candidate target;

a first sub-matching unit, for matching the moving parameter information against the motion state templates of the candidate targets in the motion state model and taking the candidate target corresponding to the motion state template with the highest matching degree as the target object.
Corresponding to the above method, the above system can also include: a target motion profile establishing unit and a candidate motion profile establishing unit;

the target motion profile establishing unit, for forming the target motion profile or motion data group based on the moving parameter information of the target object over a continuous preset time period;

the candidate motion profile establishing unit, for forming the candidate motion profile or motion data group of the candidate target within the preset time period, generated based on the tracking result, the map information, the position information of the image capture devices and their orientation;

in the present embodiment, the target parameter information is: the target motion profile or motion data group; the candidate target parameter information is: the candidate motion profile or motion data group.
Corresponding to the above method, the system disclosed in the above embodiment of the present application can also include an object time-moving parameter information set establishing unit and a candidate time-motion state set establishing unit;

the object time-moving parameter information set establishing unit, for establishing the object time-moving parameter information set corresponding to the target object based on the discrete moving parameter information of the target object acquired within the preset time period, the object time-moving parameter information set including: the discrete moments within the preset time period and the corresponding moving parameter information of the target object;

the candidate time-motion state set establishing unit, for establishing the candidate time-motion state set based on the tracking result, the map information, the position information of the image capture devices and their orientation information, the candidate time-motion state set including: the discrete moments within the preset time period and the corresponding motion state information of the candidate target;

in the present embodiment, the target parameter information is: the object time-moving parameter information set; the candidate target parameter information is: the candidate time-motion state set.
Corresponding to the above method, in the above system of the present application, the vision processing unit is further configured to: perform computer vision processing on the imaging data to obtain the kind information, state information and visual feature information of the candidate targets;

in the present embodiment, the target parameter information is: the kind information, state information and visual feature information uploaded by the target object; the candidate target parameter information is: the kind information, state information and visual feature information of the candidate targets.
Corresponding to the above method, in the above system of the present application, in addition to the target parameter information and candidate target parameter information taking the above data forms, the present application can also use unique identification data possessed by the target object and the candidate targets as the target parameter information and candidate target parameter information for matching. Which identification data to select can be set by the user according to need; any identification data with uniqueness can serve as the target parameter information and candidate target parameter information. For example, in the technical solution disclosed in the above embodiment of the present application, the target parameter information is the acquired wireless device identifier or identity identifier of the target object, and the candidate target parameter information is the acquired wireless device identifier or identity identifier of the candidate target.
Corresponding to the above method, the system disclosed in the above embodiment of the present application also includes:

a calibration unit, for judging whether a calibration request uploaded by an image capture device has been acquired, and if so, generating calibration information, the calibration information at least including the number information and position information of the image capture device requiring calibration.
Corresponding to the above method, the system disclosed in the above embodiment of the present application also includes:

a target area prediction unit, for predicting the moving route of the target object through the path information on the map information and the direction-of-motion information in the moving parameter information, and predicting the target area corresponding to the next moment according to the moving route and the speed information in the moving parameter information.
Corresponding to the above method, in the system disclosed in the above embodiment of the present application, the moving parameter information includes:

the acceleration, velocity direction and heading data of the target object obtained from measurement by the sensor elements carried by the target object.
Corresponding to the above method, the system disclosed in the above embodiment of the present application also includes:

a feature information storage unit, for determining the kind information, state information and visual feature information of the candidate targets, requesting the candidate targets to report their wireless device identifiers and identity identifiers, and, after acquiring the wireless device identifier or identity identifier reported by a candidate target, binding the kind information, state information and visual feature information of the candidate target with the wireless device identifier or identity identifier, and storing the binding information in a preset database. The binding information is used to obtain kind information, state information and visual feature information from a wireless device identifier and identity identifier, and to obtain the wireless device identifier and identity identification information from kind information, state information and visual feature information.
Corresponding to the above method, in the above system, the data acquisition unit is further configured to obtain the wireless device identifier and identity identifier, kind information, state information, and visual feature information of the target object, and the wireless device identifier and identity identifier of the candidate target;
The target positioning system further includes a target parameter information acquisition unit, configured to: when the target parameter information is kind information, state information, and visual feature information, obtain, according to the binding information in the preset database, the kind information, state information, and visual feature information matched with the wireless device identifier or identity identifier of the target object, and take them as the target parameter information; and when the target parameter information is a wireless device identifier or identity identifier, obtain, according to the binding information in the preset database, the wireless device identifier or identity identifier matched with the kind information, state information, and visual feature information of the target object, and take it as the target parameter information;
Candidate target parameter information acquisition unit, configured to: when the candidate target parameter information is kind information, state information, and visual feature information, obtain, according to the binding information in the preset database, the kind information, state information, and visual feature information matched with the wireless device identifier or identity identifier of the candidate target, and take them as the candidate target parameter information; and when the candidate target parameter information is a wireless device identifier or identity identifier, obtain, according to the binding information in the preset database, the wireless device identifier or identity identifier matched with the kind information, state information, and visual feature information of the candidate target, and take it as the candidate target parameter information.
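The matching step that these units feed into (selecting the candidate whose parameter information has the highest degree of matching with the target parameter information) can be sketched generically. The scoring function below is a placeholder assumption; the patent does not fix a particular similarity measure.

```python
def best_match(target_params, candidates, score):
    """Return the id of the candidate whose parameter information scores
    highest against the target parameter information.

    candidates: dict mapping candidate id -> parameter vector
    score: callable (target, candidate) -> float, higher is better
    """
    best_id, best_score = None, float("-inf")
    for cand_id, cand_params in candidates.items():
        s = score(target_params, cand_params)
        if s > best_score:
            best_id, best_score = cand_id, s
    return best_id

def neg_euclidean(a, b):
    # Higher is better: negated Euclidean distance between vectors.
    return -sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
```

In practice the score could compare movement profiles, time-movement sets, or visual feature records; the selection logic stays the same.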
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and for identical or similar parts the embodiments may be referred to one another. Since an apparatus disclosed in an embodiment corresponds to a method disclosed in an embodiment, its description is relatively brief, and for relevant details reference may be made to the description of the method.
The above description of the disclosed embodiments enables those skilled in the art to implement or use the present invention. Various modifications to these embodiments will be apparent to those skilled in the art, and the general principles defined herein may be implemented in other embodiments without departing from the spirit or scope of the present invention. Therefore, the present invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (24)

1. A target positioning method, characterized by comprising:
obtaining target parameter information and movement parameter information of a target object, wherein the movement parameter information is obtained by the target object through measurement with its own sensor devices and positioning with a positioning system;
determining, from the three-dimensional coordinates and the precision parameter in the movement parameter information, the target area in which the target object is located;
determining the target image capture devices required to cover the target area;
obtaining imaging data of the target image capture devices;
performing computer vision processing on the imaging data to determine the candidate targets in the imaging data;
tracking the candidate targets, and determining the movement state of each candidate target according to the tracking result, map information, the position information of the image capture devices, and the direction information of the image capture devices;
obtaining parameter information of the candidate targets of the same type as the target parameter information;
taking, as the target object, the candidate target corresponding to the candidate target parameter information with the highest degree of matching with the target parameter information; and
sending the movement state matched with the target object to the target object.
2. The target positioning method according to claim 1, characterized in that the target parameter information is the movement parameter information of the target object; and
the candidate target parameter information is the movement state of the candidate target.
3. The target positioning method according to claim 2, characterized in that taking, as the target object, the candidate target corresponding to the candidate target parameter information with the highest degree of matching with the target parameter information comprises:
establishing, based on the movement state of each candidate target, a movement state model corresponding to the candidate targets, wherein the movement state model stores a movement state template corresponding to each candidate target; and
matching the movement parameter information against the movement state templates of the candidate targets in the movement state model, and taking the candidate target corresponding to the movement state template with the highest degree of matching as the target object.
4. The target positioning method according to claim 1, characterized in that the target parameter information is a target movement profile or movement data group formed from the movement parameter information of the target object over a continuous preset time period; and
the candidate target parameter information is a candidate movement profile or movement data group of the candidate target within the preset time period, generated based on the tracking result, the map information, the position information of the image capture devices, and the direction of the image capture devices.
5. The target positioning method according to claim 1, characterized in that the target parameter information is a target time-movement parameter information set corresponding to the target object, established from the discrete movement parameter information of the target object acquired within a preset time period, the target time-movement parameter information set comprising the discrete moments within the preset time period and the corresponding movement parameter information of the target object; and
the candidate target parameter information is a candidate time-movement state set established based on the tracking result, the map information, the position information of the image capture devices, and the direction information of the image capture devices, the candidate time-movement state set comprising the discrete moments within the preset time period and the corresponding movement state information of the candidate target.
6. The target positioning method according to claim 1, characterized in that the target parameter information is the acquired kind information, state information, and visual feature information of the target object; and
the candidate target parameter information is the kind information, state information, and visual feature information of the candidate target obtained by performing computer vision processing on the imaging data.
7. The target positioning method according to claim 1, characterized in that the target parameter information is the acquired wireless device identifier or identity identifier of the target object; and
the candidate target parameter information is the acquired wireless device identifier or identity identifier of the candidate target.
8. The target positioning method according to claim 1, characterized by further comprising:
judging whether a calibration request uploaded by an image capture device has been received, and if so, generating calibration information, wherein the calibration information at least includes the number information and position information of the image capture device requiring calibration.
9. The target positioning method according to claim 1, characterized by further comprising:
predicting the movement route of the target object from the route information on the map information and the movement direction information in the movement parameter information, and predicting the target area corresponding to the next moment according to the movement route and the speed information in the movement parameter information.
10. The target positioning method according to claim 1, characterized in that the movement parameter information includes:
the acceleration, speed direction, and orientation data of the target object obtained by measurement with the sensor elements carried by the target object.
11. The target positioning method according to claim 1, characterized by further comprising:
obtaining the kind information, state information, and visual feature information of the target object together with its wireless device identifier and identity identifier;
obtaining the kind information, state information, and visual feature information of the candidate target and the wireless device identifier and identity identifier it reports;
after the wireless device identifier or identity identifier reported by the target object or the candidate target is received, confirming the kind information, state information, and visual feature information of the target object or candidate target it matches, and binding that kind information, state information, and visual feature information with the wireless device identifier or identity identifier; and
storing the binding information.
12. The target positioning method according to claim 11, characterized by further comprising:
obtaining the wireless device identifier or identity identifier of the target object;
obtaining, according to the stored binding information, the kind information, state information, and visual feature information matched with the wireless device identifier or identity identifier, and taking the kind information, state information, and visual feature information as the target parameter information;
wherein the candidate target parameter information is the kind information, state information, and visual feature information of the candidate target;
or
obtaining the wireless device identifier or identity identifier of the candidate target;
obtaining, according to the stored binding information, the kind information, state information, and visual feature information matched with the wireless device identifier or identity identifier, and taking the kind information, state information, and visual feature information as the candidate target parameter information;
wherein the target parameter information is kind information, state information, and visual feature information;
or
performing computer vision processing on the imaging data to obtain the kind information, state information, and visual feature information of the candidate target;
obtaining, according to the stored binding information, the wireless device identifier or identity identifier matched with the kind information, state information, and visual feature information, and taking the wireless device identifier or identity identifier as the candidate target parameter information;
wherein the target parameter information is a wireless device identifier or identity identifier.
13. A target positioning system, characterized by comprising:
a data acquisition unit, configured to obtain target parameter information and movement parameter information of a target object, wherein the movement parameter information is obtained by the target object through measurement with its own sensor devices and positioning with a positioning system;
an area positioning unit, configured to determine, from the three-dimensional coordinates and the precision parameter in the movement parameter information, the target area in which the target object is located;
an imaging acquisition unit, configured to determine the target image capture devices required to cover the target area and to obtain imaging data of the target image capture devices;
a vision processing unit, configured to identify candidate target objects from the imaging data and obtain feature information of the candidate objects;
a movement state computing unit, configured to track the candidate targets, and to determine the movement state of each candidate target according to the tracking result, map information, the position information of the image capture devices, and the direction information of the image capture devices;
a candidate parameter collection unit, configured to obtain the candidate target parameter information of the same type as the target parameter information;
a matching unit, configured to take, as the target object, the candidate target corresponding to the candidate target parameter information with the highest degree of matching with the target parameter information; and
a sending unit, configured to send the movement state matched with the target object to the target object.
14. The target positioning system according to claim 13, characterized in that the target parameter information is the movement parameter information of the target object; and
the candidate target parameter information is the movement state of the candidate target.
15. The target positioning system according to claim 13, characterized in that the matching unit comprises:
a model establishing unit, configured to establish, based on the movement state of each candidate target, a movement state model corresponding to the candidate targets, wherein the movement state model stores a movement state template corresponding to each candidate target; and
a first sub-matching unit, configured to match the movement parameter information against the movement state templates of the candidate targets in the movement state model, and to take the candidate target corresponding to the movement state template with the highest degree of matching as the target object.
16. The target positioning system according to claim 13, characterized by further comprising:
a target movement profile establishing unit, configured to form a target movement profile or movement data group based on the movement parameter information of the target object over a continuous preset time period; and
a candidate movement profile establishing unit, configured to generate, based on the tracking result, the map information, the position information of the image capture devices, and the direction of the image capture devices, a candidate movement profile or movement data group of the candidate target within the preset time period;
wherein the target parameter information is the target movement profile or movement data group; and
the candidate target parameter information is the candidate movement profile or movement data group.
17. The target positioning system according to claim 13, characterized by further comprising:
a target time-movement information set establishing unit, configured to establish, from the discrete movement parameter information of the target object acquired within a preset time period, a target time-movement parameter information set corresponding to the target object, the target time-movement parameter information set comprising the discrete moments within the preset time period and the corresponding movement parameter information of the target object; and
a candidate time-movement state set establishing unit, configured to establish, based on the tracking result, the map information, the position information of the image capture devices, and the direction information of the image capture devices, a discrete candidate time-movement state set comprising the discrete moments within the preset time period and the corresponding movement state information of the candidate target;
wherein the target parameter information is the target time-movement parameter information set; and
the candidate target parameter information is the candidate time-movement state set.
18. The target positioning system according to claim 13, characterized in that the vision processing unit is further configured to perform computer vision processing on the imaging data to obtain the kind information, state information, and visual feature information of the candidate target;
the target parameter information is the kind information, state information, and visual feature information uploaded by the target object; and
the candidate target parameter information is the kind information, state information, and visual feature information of the candidate target.
19. The target positioning system according to claim 13, characterized in that the target parameter information is the wireless device identifier or identity identifier of the target object; and
the candidate target parameter information is the wireless device identifier or identity identifier of the candidate target.
20. The target positioning system according to claim 13, characterized by further comprising:
a fault location unit, configured to judge whether a calibration request uploaded by an image capture device has been received, and if so, to generate calibration information, wherein the calibration information at least includes the number information and position information of the image capture device requiring calibration.
21. The target positioning system according to claim 13, characterized by further comprising:
a target area prediction unit, configured to predict the movement route of the target object from the route information on the map information and the movement direction information in the movement parameter information, and to predict the target area corresponding to the next moment according to the movement route and the speed information in the movement parameter information.
22. The target positioning system according to claim 13, characterized in that the movement parameter information includes:
the acceleration, speed direction, and orientation data of the target object obtained by measurement with the sensor elements carried by the target object.
23. The target positioning system according to claim 13, characterized by further comprising:
a secure information storage unit, configured to determine the kind information, state information, and visual feature information of the candidate target; to request the candidate target to report a wireless device identifier and identity identifier; when the wireless device identifier or identity identifier reported by the candidate target is received, to bind the kind information, state information, and visual feature information of the candidate target with the wireless device identifier or identity identifier; and to store the binding information in a preset database.
24. The target positioning system according to claim 13, characterized in that the data acquisition unit is further configured to obtain the wireless device identifier and identity identifier, kind information, state information, and visual feature information of the target object, and the wireless device identifier and identity identifier of the candidate target;
the target positioning system further comprises: a target parameter information acquisition unit, configured to: when the target parameter information is kind information, state information, and visual feature information, obtain, according to the binding information in the preset database, the kind information, state information, and visual feature information matched with the wireless device identifier or identity identifier of the target object, and take the kind information, state information, and visual feature information as the target parameter information; and when the target parameter information is a wireless device identifier or identity identifier, obtain, according to the binding information in the preset database, the wireless device identifier or identity identifier matched with the kind information, state information, and visual feature information of the target object, and take the wireless device identifier or identity identifier as the target parameter information; and
a candidate target parameter information acquisition unit, configured to: when the candidate target parameter information is kind information, state information, and visual feature information, obtain, according to the binding information in the preset database, the kind information, state information, and visual feature information matched with the wireless device identifier or identity identifier of the candidate target, and take the kind information, state information, and visual feature information as the candidate target parameter information; and when the candidate target parameter information is a wireless device identifier or identity identifier, obtain, according to the binding information in the preset database, the wireless device identifier or identity identifier matched with the kind information, state information, and visual feature information of the candidate target, and take the wireless device identifier or identity identifier as the candidate target parameter information.
CN201610279062.XA 2016-04-29 2016-04-29 A kind of object localization method and system Expired - Fee Related CN105975967B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610279062.XA CN105975967B (en) 2016-04-29 2016-04-29 A kind of object localization method and system

Publications (2)

Publication Number Publication Date
CN105975967A true CN105975967A (en) 2016-09-28
CN105975967B CN105975967B (en) 2019-04-23

Family

ID=56994205

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610279062.XA Expired - Fee Related CN105975967B (en) 2016-04-29 2016-04-29 A kind of object localization method and system

Country Status (1)

Country Link
CN (1) CN105975967B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107024208A (en) * 2017-05-18 2017-08-08 上海逍森自动化科技有限公司 Positioning method and positioning device
CN107063189A (en) * 2017-01-19 2017-08-18 上海勤融信息科技有限公司 Vision-based positioning system and method
CN107133583A (en) * 2017-04-27 2017-09-05 深圳前海弘稼科技有限公司 Plant information acquisition method and device
CN107356229A (en) * 2017-07-07 2017-11-17 中国电子科技集团公司电子科学研究院 Indoor positioning method and device
CN108344416A (en) * 2018-02-01 2018-07-31 感知智能科技新加坡有限公司 Positioning method for automatically matching a target based on map information
CN108364314A (en) * 2018-01-12 2018-08-03 香港科技大学深圳研究院 Positioning method, system and medium
CN110100433A (en) * 2016-12-27 2019-08-06 尚飞运营有限公司 Presence control method and monitoring system
CN110231039A (en) * 2019-06-27 2019-09-13 维沃移动通信有限公司 Location information correction method and terminal device
CN112577475A (en) * 2021-01-14 2021-03-30 天津希格玛微电子技术有限公司 Video ranging method capable of effectively reducing power consumption

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101916437A (en) * 2010-06-18 2010-12-15 中国科学院计算技术研究所 Method and system for positioning target based on multi-visual information
CN103139700A (en) * 2011-11-28 2013-06-05 联想(北京)有限公司 Method and system of terminal positioning
CN103249142A (en) * 2013-04-26 2013-08-14 东莞宇龙通信科技有限公司 Locating method, locating system and mobile terminal
CN103841518A (en) * 2014-03-03 2014-06-04 联想(北京)有限公司 Information processing method and electronic device
US20140155104A1 (en) * 2012-11-30 2014-06-05 Cambridge Silicon Radio Limited Indoor positioning using camera and optical signal
CN104378735A (en) * 2014-11-13 2015-02-25 无锡儒安科技有限公司 Indoor positioning method, client side and server
CN104700408A (en) * 2015-03-11 2015-06-10 中国电子科技集团公司第二十八研究所 Indoor single target positioning method based on camera network
CN104936283A (en) * 2014-03-21 2015-09-23 中国电信股份有限公司 Indoor positioning method, server and system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
JONGBAE KIM AND HEESUNG JUN: "Vision-Based Location Positioning using Augmented Reality for Indoor Navigation", 《IEEE TRANSACTIONS ON CONSUMER ELECTRONICS》 *
THOMAS KITTENBERGER, ANDREAS FERNER, REINHARD SCHEIKL: "A Simple Computer Vision Based Indoor Positioning System for Educational Micro Air Vehicles", 《JOURNAL OF AUTOMATION, MOBILE ROBOTICS & INTELLIGENT SYSTEMS》 *
WANG QUN: "Human body tracking and motion parameter measurement based on computer vision", China Master's Theses Full-text Database, Information Science and Technology *

Also Published As

Publication number Publication date
CN105975967B (en) 2019-04-23

Similar Documents

Publication Publication Date Title
CN105975967A (en) Target positioning method and system
US10466056B2 (en) Trajectory matching using ambient signals
US11885900B2 (en) Method and system for tracking a mobile device
CN108632761B (en) Indoor positioning method based on particle filter algorithm
KR101728123B1 (en) Simultaneous Localization and Mapping by Using Earth's Magnetic Fields
CN108413968B (en) A kind of method and system of movement identification
CN108151747A (en) A kind of indoor locating system and localization method merged using acoustical signal with inertial navigation
CN107014375B (en) Indoor positioning system and method with ultra-low deployment
CN111879305B (en) Multi-mode perception positioning model and system for high-risk production environment
CN111698774B (en) Indoor positioning method and device based on multi-source information fusion
CN105143822A (en) Crowd sourced pathway maps
KR20180056675A (en) METHOD AND SYSTEM FOR GENERATING A DIGIT MAP
JP2016212675A (en) Object recognition system
WO2016068742A1 (en) Method and system for indoor positioning of a mobile terminal
CN105785989A (en) System for calibrating distributed network camera by use of travelling robot, and correlation methods
CN103312899A (en) Smart phone with blind guide function
CN110007327A (en) Method for determining the parking stall of vehicle
CN113532499B (en) Sensor security detection method and device for unmanned system and storage medium
CN106600652A (en) Panoramic camera positioning method based on artificial neural network
KR20170032147A (en) A terminal for measuring a position and method thereof
CN108981729A (en) Vehicle positioning method and device
CN110597077A (en) Method and system for realizing intelligent scene switching based on indoor positioning
CN109379716A (en) A kind of indoor orientation method and system for safety monitoring project
CN114608560A (en) Passive combined indoor positioning system and method based on intelligent terminal sensor
KR102594135B1 (en) Gnss-based vehicle driving test device

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20190423

Termination date: 20210429