WO2018214978A1 - Positioning device and method, and automatic walking device - Google Patents

Positioning device and method, and automatic walking device

Info

Publication number
WO2018214978A1
Authority
WO
WIPO (PCT)
Prior art keywords
positioning
carrier
positioning device
result
module
Prior art date
Application number
PCT/CN2018/088519
Other languages
English (en)
French (fr)
Inventor
杨洲
周昶
滕哲铭
Original Assignee
苏州宝时得电动工具有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201710385778.2A (published as CN108957512A)
Priority claimed from CN201710972337.2A (published as CN109682371A)
Priority claimed from CN201710978239.XA (published as CN109683604A)
Application filed by 苏州宝时得电动工具有限公司
Priority to US16/613,271 (granted as US11448775B2)
Priority to EP18806033.9A (published as EP3633410A4)
Publication of WO2018214978A1

Classifications

    • G05D1/0278: Control of position or course in two dimensions, specially adapted to land vehicles, using satellite positioning signals, e.g. GPS
    • G05D1/0221: Control of position or course in two dimensions, specially adapted to land vehicles, with means for defining a desired trajectory involving a learning process
    • G05D1/027: Control of position or course in two dimensions, specially adapted to land vehicles, using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D1/0274: Control of position or course in two dimensions, specially adapted to land vehicles, using internal positioning means using mapping information stored in a memory device
    • G01S19/14: Receivers specially adapted for specific applications
    • G01S19/43: Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
    • G01S19/46: Determining position by combining satellite measurements with a supplementary measurement of a radio-wave signal type
    • G01S19/47: Determining position by combining satellite measurements with a supplementary inertial measurement, e.g. tightly coupled inertial
    • G01S19/49: Determining position by combining or switching between satellite position solutions and solutions from an inertial position system, e.g. loosely coupled
    • G01C21/166: Mechanical, construction or arrangement details of inertial navigation systems
    • G01C21/188: Compensation of inertial measurements for accumulated errors, e.g. by coupling inertial systems with absolute positioning systems
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • A01D34/008: Mowers; control or measuring arrangements for automated or remotely controlled operation

Definitions

  • the present disclosure relates to the field of positioning technologies, and in particular, to a positioning device and method, and an autonomous walking device.
  • a physical boundary can be set by laying a boundary wire, but this approach adds inconvenience for the user of the automatic walking device.
  • a virtual boundary, or the position of the device, can also be established by a satellite positioning technology such as GPS (Global Positioning System) or by a wireless positioning technology such as UWB (Ultra-Wideband).
  • the present disclosure proposes a positioning device, a positioning method, and an automatic walking device that make it possible to determine the boundary of the working area of the automatic walking device easily and accurately.
  • a positioning device configured to be carried by a carrier is provided in one aspect of the present disclosure, the positioning device comprising: a first positioning module configured to acquire a first positioning result of the carrier of the positioning device; a sensor module configured to measure acceleration and angle parameters while the carrier of the positioning device walks; and a processing module configured to: in a first mode for determining a boundary of the working range of the automatic walking device, if the first positioning result satisfies a quality condition, determine the position of the carrier of the positioning device according to the first positioning result, and if the first positioning result does not satisfy the quality condition, determine the position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm according to the acceleration and angle parameters; and determine the boundary according to the position of the carrier of the positioning device.
  • a positioning device is provided in another aspect of the present disclosure, comprising: a first positioning module configured to acquire a first positioning result of a carrier of the positioning device; a sensor module configured to measure acceleration and angle parameters while the carrier of the positioning device walks; and a processing module configured to: in the first mode for determining the boundary of the working range of the automatic walking device, determine a third position of the carrier of the positioning device according to the first positioning result; determine a fourth position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm according to the acceleration and angle parameters; determine the position of the carrier of the positioning device according to the third position and the fourth position; and determine the boundary according to the position of the carrier of the positioning device.
  • an automatic walking device is provided in another aspect of the present disclosure, wherein the automatic walking device includes a device body and the above positioning device, and the positioning device is detachably mounted on the device body.
  • a positioning method is provided in another aspect of the present disclosure, comprising: acquiring a first positioning result of a carrier of a positioning device, and acceleration and angle parameters of the walking of the carrier; in a first mode for determining a boundary of the working range of the automatic walking device, if the first positioning result satisfies a quality condition, determining the position of the carrier of the positioning device according to the first positioning result, and if the first positioning result does not satisfy the quality condition, determining the position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm according to the acceleration and angle parameters; and determining the boundary according to the position of the carrier of the positioning device.
  • the method may further include: acquiring model parameters of a step-frequency model of the carrier of the positioning device, wherein the step-frequency model represents the relationship between the step frequency and the step length of the carrier; and determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm includes: determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model, the acceleration, and the angle parameters.
  • determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm may include: when the first positioning result does not satisfy the quality condition, acquiring the most recent first positioning result that satisfied the quality condition as the starting position of the pedestrian dead reckoning algorithm; determining a real-time heading of the carrier according to the angle parameters; determining a real-time step frequency of the carrier according to the acceleration; determining a real-time step length of the carrier using the step-frequency model according to the real-time step frequency and the model parameters of the step-frequency model; and determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the real-time heading, the real-time step length, and the starting position.
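The dead reckoning steps above (heading from the angle parameters, step length from the step-frequency model, advancing from the starting position) can be sketched as follows. This is a minimal illustration, not the disclosure's implementation: the linear form of the step-frequency model and the coefficient names `model_a`/`model_b` are assumptions.

```python
import math

def pdr_step(position, heading_rad, step_freq, model_a, model_b):
    """One pedestrian dead reckoning update.

    Assumes a linear step-frequency model: step_length = a * step_freq + b
    (an illustrative choice; the disclosure only states that the model
    relates step frequency to step length).
    """
    step_length = model_a * step_freq + model_b
    x, y = position
    # Advance one step along the real-time heading.
    return (x + step_length * math.cos(heading_rad),
            y + step_length * math.sin(heading_rad))

# Starting position = most recent first positioning result that met the
# quality condition; each detected step advances the estimate.
pos = (0.0, 0.0)
for heading, freq in [(0.0, 1.8), (0.0, 1.8), (math.pi / 2, 1.6)]:
    pos = pdr_step(pos, heading, freq, model_a=0.3, model_b=0.2)
```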
  • acquiring the model parameters of the step-frequency model of the carrier of the positioning device may include: determining the step frequency and the stride points of the carrier according to the acceleration measured by the sensor module; determining the step length of the carrier according to the first positioning results corresponding to the stride points; and determining the model parameters of the step-frequency model according to the step frequency and the step length, wherein a stride point is a feature point of each step of the carrier.
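The calibration described above can be sketched as a least-squares fit over per-step samples: step lengths come from distances between first-positioning fixes at consecutive stride points, and step frequencies from the accelerometer. The linear model form is an illustrative assumption.

```python
# Calibrate a linear step-frequency model step_length = a * f + b by
# ordinary least squares from per-step (frequency, length) samples.
def fit_step_model(freqs, lengths):
    n = len(freqs)
    mean_f = sum(freqs) / n
    mean_l = sum(lengths) / n
    # Standard simple-regression sums of squares/products.
    sxx = sum((f - mean_f) ** 2 for f in freqs)
    sxy = sum((f - mean_f) * (l - mean_l) for f, l in zip(freqs, lengths))
    a = sxy / sxx
    b = mean_l - a * mean_f
    return a, b

# Samples generated from step_length = 0.3 * f + 0.2 recover the parameters.
a, b = fit_step_model([1.5, 1.8, 2.0], [0.65, 0.74, 0.80])
```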
  • the method may further include: when the first positioning result changes from not satisfying the quality condition to satisfying it, acquiring a first position of the carrier of the positioning device determined by the pedestrian dead reckoning algorithm and a second position of the carrier of the positioning device determined according to the first positioning result of the first positioning module; and correcting, according to the first position and the second position, the positions of the carrier of the positioning device determined by the pedestrian dead reckoning algorithm during the period in which the first positioning result did not satisfy the quality condition.
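One way to apply that correction is to redistribute the final offset between the dead-reckoned first position and the satellite-derived second position over the intermediate track. Linear redistribution is an illustrative choice; the disclosure only requires correcting the track from the two positions.

```python
def correct_pdr_track(track, first_pos, second_pos):
    """Correct PDR positions accumulated while the quality condition failed.

    `track` is the list of PDR positions (oldest first); `first_pos` is the
    final PDR position and `second_pos` the satellite fix at recovery.
    The accumulated offset is spread linearly along the track, so the last
    corrected point coincides with the satellite fix.
    """
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    n = len(track)
    return [(x + dx * (i + 1) / n, y + dy * (i + 1) / n)
            for i, (x, y) in enumerate(track)]

corrected = correct_pdr_track([(1.0, 0.0), (2.0, 0.0)], (2.0, 0.0), (2.0, 0.4))
```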
  • determining the boundary according to the position of the carrier of the positioning device may include: interpolating the positions of the carrier of the positioning device determined by the pedestrian dead reckoning algorithm to obtain interpolated positions; and smoothing, with a filter, the interpolated positions together with the positions of the carrier of the positioning device determined according to the first positioning result, to determine the boundary.
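A minimal sketch of that densify-then-smooth step, assuming linear interpolation and a centered moving average (both illustrative choices; the disclosure does not name the interpolation method or filter):

```python
def interpolate(points, factor=2):
    # Insert (factor - 1) evenly spaced points between consecutive samples.
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for k in range(factor):
            t = k / factor
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    out.append(points[-1])
    return out

def smooth(points, window=3):
    # Centered moving average; the window width is an illustrative choice.
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out

boundary = smooth(interpolate([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]))
```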
  • the first positioning module may be a satellite positioning module, and the first positioning result is a satellite positioning result.
  • the method may further include: determining, according to one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module, whether the satellite positioning result of the satellite positioning module satisfies the quality condition.
  • determining whether the satellite positioning result of the satellite positioning module satisfies the quality condition according to one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module may include: when the positioning state is a specified state and the number of satellites is not less than a threshold, determining that the satellite positioning result satisfies the quality condition.
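That quality condition is a simple conjunction. In the sketch below, the specified state name and the satellite-count threshold are assumptions for illustration; the disclosure leaves both unspecified.

```python
# Quality condition: the fix state must equal a specified state and the
# satellite count must reach a threshold. "fixed" and 12 are assumptions.
SPECIFIED_STATE = "fixed"
MIN_SATELLITES = 12

def satisfies_quality(positioning_state, satellite_count):
    return (positioning_state == SPECIFIED_STATE
            and satellite_count >= MIN_SATELLITES)

ok = satisfies_quality("fixed", 14)
bad = satisfies_quality("float", 14)
```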
  • the method may further include: in a second mode for locating the position of the automatic walking device, determining the position of the automatic walking device according to at least one of the first positioning result and an inertial positioning result, wherein the inertial positioning result is determined based on an inertial navigation estimation algorithm according to at least the acceleration and angle parameters output by the sensor module.
  • the inertial navigation estimation algorithm includes an INS algorithm.
  • the pedestrian dead reckoning algorithm includes a PDR algorithm.
  • the first positioning module includes a UWB positioning module.
  • a positioning method is provided in another aspect of the present disclosure, comprising: acquiring a first positioning result of a carrier of a positioning device, and acceleration and angle parameters of the walking of the carrier; in a first mode for determining a boundary of the working range of the automatic walking device, determining a third position of the carrier of the positioning device according to the first positioning result; determining a fourth position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm according to the acceleration and angle parameters; determining the position of the carrier of the positioning device according to the third position and the fourth position; and determining the boundary according to the position of the carrier of the positioning device.
  • determining the position of the carrier of the positioning device according to the third position and the fourth position may include: determining the position of the carrier of the positioning device according to a fusion of the third position and the fourth position.
  • determining the position of the carrier of the positioning device according to the fusion of the third position and the fourth position may include: determining the position of the carrier of the positioning device according to a weighted sum of the third position and the fourth position.
  • the method further includes determining, according to a quality of the first positioning result, a weight of each of the third location and the fourth location in the weighted sum.
  • determining the weight of each of the third position and the fourth position in the weighted sum according to the quality of the first positioning result may include: as the quality of the first positioning result increases, increasing the weight of the third position and decreasing the weight of the fourth position in the weighted sum; and as the quality of the first positioning result decreases, decreasing the weight of the third position and increasing the weight of the fourth position in the weighted sum.
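The weighting rule above can be sketched as follows. Mapping the quality score directly to the satellite-side weight is an illustrative assumption; the disclosure only requires that higher quality raises the third position's weight and lowers the fourth position's.

```python
def fuse(third_pos, fourth_pos, quality):
    """Weighted sum of the satellite-derived third position and the
    PDR-derived fourth position; `quality` is normalized to [0, 1]."""
    w = max(0.0, min(1.0, quality))  # weight of the third position
    return (w * third_pos[0] + (1 - w) * fourth_pos[0],
            w * third_pos[1] + (1 - w) * fourth_pos[1])

# Good satellite quality pulls the fused point toward the satellite fix.
fused = fuse((10.0, 0.0), (12.0, 0.0), quality=0.75)
```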
  • in a possible implementation, the first positioning module is a satellite positioning module, the first positioning result is a satellite positioning result, and the method may further include: determining the quality of the satellite positioning result of the satellite positioning module according to one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module.
  • determining the position of the carrier of the positioning device according to the third position and the fourth position may include: determining the position of the carrier of the positioning device at the stride points according to the third position and the fourth position at the stride points; and interpolating the positions of the carrier of the positioning device at the stride points to obtain the positions of the carrier of the positioning device; wherein a stride point is a feature point of each step of the carrier.
  • determining the position of the carrier of the positioning device according to the third position and the fourth position may include: interpolating the fourth position at the stride points to obtain an interpolated fourth position; and determining the position of the carrier of the positioning device according to the third position and the interpolated fourth position; wherein a stride point is a feature point of each step of the carrier.
  • the method may further include: acquiring model parameters of a step-frequency model of the carrier of the positioning device, wherein the step-frequency model represents the relationship between the step frequency and the step length of the carrier; and determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the acceleration and angle parameters includes: determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model, the acceleration, and the angle parameters.
  • determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model, the acceleration, and the angle parameters may include: obtaining a starting position of the pedestrian dead reckoning algorithm; determining a real-time heading of the carrier according to the angle parameters; determining a real-time step frequency of the carrier according to the acceleration; determining a real-time step length of the carrier using the step-frequency model according to the real-time step frequency and the model parameters of the step-frequency model; and determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the real-time heading, the real-time step length, and the starting position.
  • acquiring the model parameters of the step-frequency model of the carrier of the positioning device may include: determining the step frequency and the stride points of the carrier according to the acceleration measured by the sensor module; determining the step length of the carrier according to the first positioning results corresponding to the stride points; and determining the model parameters of the step-frequency model according to the step frequency and the step length, wherein a stride point is a feature point of each step of the carrier.
  • the method may further include: in a second mode for locating the position of the automatic walking device, determining the position of the automatic walking device according to at least one of the first positioning result and an inertial positioning result, wherein the inertial positioning result is determined based on an inertial navigation estimation algorithm according to at least the acceleration and angle parameters output by the sensor module.
  • the inertial navigation estimation algorithm includes an INS algorithm.
  • the first positioning module includes a UWB positioning module.
  • the pedestrian dead reckoning algorithm includes a PDR algorithm.
  • the positioning device, positioning method, and automatic walking device of the aspects of the present disclosure introduce pedestrian dead reckoning, a technology independent of the external environment, into the boundary-determination process, so that pedestrian dead reckoning is fused with other positioning technologies to construct the virtual boundary. This gives high positioning accuracy and a precisely constructed boundary, and no physical boundary is required, which reduces the complexity of user operation.
  • the present invention also provides an automatic walking device, comprising: a vision module for acquiring visual data of the surrounding environment of the automatic walking device; an inertial navigation module for acquiring inertial data of the automatic walking device; a satellite navigation module for acquiring satellite positioning data of the automatic walking device; and a processing module electrically connected to the vision module, the inertial navigation module, and the satellite navigation module, the processing module being configured to: perform visual positioning according to the visual data to obtain a visual positioning result; perform inertial positioning according to the inertial data to obtain an inertial positioning result; fuse the visual positioning result and the inertial positioning result to obtain a first fusion result; and, if the satellite positioning data satisfies a quality condition, fuse the first fusion result and the satellite positioning data to obtain a second fusion result, and determine the second fusion result as the position of the automatic walking device.
  • the processing module is further configured to determine the first fusion result as a location of the autonomous walking device if the satellite positioning data does not satisfy a quality condition.
  • fusing the visual positioning result and the inertial positioning result to obtain the first fusion result includes: determining the inertial positioning result as the first fusion result.
  • the first fusion result and the satellite positioning data are fused to obtain the second fusion result.
  • the vision module comprises a monochrome CMOS vision sensor.
  • the inertial navigation module includes an inertial sensor comprising a gyroscope and an accelerometer, or an inertial sensor comprising a gyroscope and an accelerometer together with one or both of a geomagnetic sensor and an encoder (code wheel).
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • an optical axis of the vision module, an axis of the inertial navigation module, and a center of the satellite navigation module lie on a central axis of the automatic walking device, and the optical axis of the vision module is collinear with the axis of the inertial navigation module.
  • the center of the satellite navigation module is above the center point of the two drive wheels of the automatic walking device.
  • fusing the visual positioning result and the inertial positioning result comprises: fusing the visual positioning result and the inertial positioning result by means of extended Kalman filtering.
  • fusing the first fusion result and the satellite positioning data comprises: fusing the first fusion result and the satellite positioning data by means of extended Kalman filtering.
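A full extended Kalman filter is beyond a short sketch, but for a direct position measurement its update step reduces to a variance-weighted combination of the prediction (here, the first fusion result) and the measurement (the satellite fix). The scalar per-coordinate form below and all variance values are illustrative assumptions.

```python
def kalman_fuse(pred, pred_var, meas, meas_var):
    # Scalar Kalman measurement update applied per coordinate: the gain
    # weights the measurement by the relative confidence of the two inputs.
    gain = pred_var / (pred_var + meas_var)
    fused = tuple(p + gain * (m - p) for p, m in zip(pred, meas))
    fused_var = (1 - gain) * pred_var  # fused estimate is more confident
    return fused, fused_var

# First fusion result (vision + inertial) combined with a satellite fix.
first_fusion = (5.0, 5.0)
satellite_fix = (5.4, 5.0)
second_fusion, var = kalman_fuse(first_fusion, 0.04, satellite_fix, 0.04)
```

With equal variances the gain is 0.5 and the fused position is the midpoint; a less noisy satellite fix would pull the result further toward it.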
  • the present invention also provides a positioning method, comprising: performing visual positioning according to visual data to obtain a visual positioning result; performing inertial positioning according to inertial data to obtain an inertial positioning result; fusing the visual positioning result and the inertial positioning result to obtain a first fusion result; and, if the satellite positioning data satisfies a quality condition, fusing the first fusion result and the satellite positioning data to obtain a second fusion result, and determining the second fusion result as the position of the automatic walking device.
  • the method further includes determining the first fusion result as a location of the autonomous walking device if the satellite positioning data does not satisfy a quality condition.
  • the fusing of the visual positioning result and the inertial positioning result to obtain the first fusion result includes:
  • if the visual positioning result is valid, fusing the visual positioning result and the inertial positioning result to obtain the first fusion result; otherwise, determining the inertial positioning result as the first fusion result.
  • the first fusion result and the satellite positioning data are fused to obtain the fused second fusion result, including:
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • the fusing the visual positioning result and the inertial positioning result comprises:
  • the visual positioning result and the inertial positioning result are merged by means of extended Kalman filtering.
  • the fusing the first fusion result and the satellite positioning data includes:
  • the first fusion result and the satellite positioning data are fused by means of extended Kalman filtering.
  • the present invention also provides a positioning device, comprising: a visual positioning module for performing visual positioning according to visual data to obtain a visual positioning result; an inertial positioning module for performing inertial positioning according to inertial data to obtain an inertial positioning result; a first fusion module configured to fuse the visual positioning result and the inertial positioning result to obtain a first fusion result; and a second fusion module configured to, when the satellite positioning data meets the quality condition, fuse the first fusion result with the satellite positioning data to obtain a second fusion result and determine the second fusion result as the position of the automatic walking device.
  • the present invention also provides an automatic walking device, comprising: a laser module for acquiring laser data of the surrounding environment of the automatic walking device; an inertial navigation module for acquiring inertial data of the automatic walking device; a satellite navigation module for acquiring satellite positioning data of the automatic walking device; and a processing module electrically connected to the laser module, the inertial navigation module, and the satellite navigation module, the processing module being configured to: perform laser positioning according to the laser data to obtain a laser positioning result; perform inertial positioning according to the inertial data to obtain an inertial positioning result; fuse the laser positioning result and the inertial positioning result to obtain a first fusion result; and, if the satellite positioning data meets the quality condition, fuse the first fusion result and the satellite positioning data to obtain a second fusion result, and determine the second fusion result as the position of the automatic walking device.
  • the processing module is further configured to:
  • the first fusion result is determined as the location of the autonomous walking device if the satellite positioning data does not satisfy the quality condition.
  • the laser positioning result and the inertial positioning result are fused to obtain the first fusion result after the fusion, including:
  • if the laser positioning result is valid, the laser positioning result and the inertial positioning result are fused to obtain the first fusion result after fusion;
  • otherwise, the inertial positioning result is determined as the first fusion result.
  • the first fusion result and the satellite positioning data are fused to obtain the fused second fusion result, including:
  • the laser module comprises a circular laser radar.
  • the inertial navigation module includes an inertial sensor including a gyroscope and an accelerometer, or the inertial sensor includes a gyroscope and an accelerometer, and one of a geomagnetic sensor and a code wheel Or both.
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • an axis of the laser module, an axis of the inertial navigation module, and a center of the satellite navigation module are on a central axis of the autonomous traveling device, and the axis of the laser module is collinear with an axis of the inertial navigation module.
  • the center of the satellite navigation module is above the center point of the two drive wheels of the autonomous device.
  • the fusion of the laser positioning result and the inertial positioning result includes:
  • the laser positioning result and the inertial positioning result are fused by means of extended Kalman filtering.
  • the fusing the first fusion result and the satellite positioning data includes:
  • the first fusion result and the satellite positioning data are fused by means of extended Kalman filtering.
  • the invention also provides a positioning method, comprising: performing laser positioning according to laser data to obtain a laser positioning result; performing inertial positioning according to inertial data to obtain an inertial positioning result; fusing the laser positioning result and the inertial positioning result to obtain a first fusion result; and, if the satellite positioning data satisfies the quality condition, fusing the first fusion result and the satellite positioning data to obtain a second fusion result, and determining the second fusion result as the position of the autonomous walking device.
  • the method further includes:
  • the first fusion result is determined as the location of the autonomous walking device if the satellite positioning data does not satisfy the quality condition.
  • the laser positioning result and the inertial positioning result are fused to obtain the first fusion result after the fusion, including:
  • if the laser positioning result is valid, the laser positioning result and the inertial positioning result are fused to obtain the first fusion result after fusion;
  • otherwise, the inertial positioning result is determined as the first fusion result.
  • the first fusion result and the satellite positioning data are fused to obtain the fused second fusion result, including:
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • the fusion of the laser positioning result and the inertial positioning result includes:
  • the laser positioning result and the inertial positioning result are fused by means of extended Kalman filtering.
  • the fusing of the first fusion result and the satellite positioning data includes:
  • the first fusion result and the satellite positioning data are fused by means of extended Kalman filtering.
  • the invention also provides a positioning device, comprising: a laser positioning module for performing laser positioning according to laser data to obtain a laser positioning result; an inertial positioning module for performing inertial positioning according to inertial data to obtain an inertial positioning result; a first fusion module configured to fuse the laser positioning result and the inertial positioning result to obtain a first fusion result; and a second fusion module configured to, when the satellite positioning data meets the quality condition, fuse the first fusion result with the satellite positioning data to obtain a second fusion result and determine the second fusion result as the position of the automatic walking device.
  • FIG. 1 shows a schematic diagram of an exemplary application environment of an embodiment of the present disclosure.
  • FIG. 2 shows a block diagram of a positioning device in accordance with an embodiment of the present disclosure.
  • FIG. 3 shows a flow chart of one example of a processing procedure of a processing module.
  • FIG. 4 shows a flow chart of one example of a processing procedure of a processing module.
  • FIG. 5 shows a schematic diagram of one example of a process of determining a stride point according to acceleration.
  • FIG. 6 shows a flow chart of one example of a processing procedure of a processing module.
  • Fig. 7 shows a schematic diagram of an example of determining an initial heading by linear fitting.
  • FIG. 8 shows a flow chart of one example of a processing procedure of a processing module.
  • FIGS. 9a and 9b are diagrams showing examples of correcting the positioning result of the pedestrian dead reckoning algorithm.
  • FIG. 10 shows a flow chart of one example of a processing procedure of a processing module.
  • FIG. 11 shows a structural diagram of a positioning device according to an embodiment of the present disclosure.
  • FIG. 12 shows a flow chart of one example of a processing procedure of a processing module.
  • FIG. 13 illustrates a structural diagram of an autonomous walking apparatus according to an embodiment of the present disclosure.
  • FIG. 14 shows a flow chart of a positioning method in accordance with an embodiment of the present disclosure.
  • FIG. 15 shows a flow chart of a positioning method in accordance with an embodiment of the present disclosure.
  • FIG. 17 shows a schematic diagram of an exemplary application environment of an autonomous walking device in accordance with an embodiment of the present disclosure.
  • FIG. 18 shows a block diagram of an automatic walking device according to an embodiment of the present disclosure.
  • FIG. 19 shows a schematic diagram of an automatic walking device in accordance with an embodiment of the present disclosure.
  • FIG. 20 shows a flow chart of one application example of a processing procedure of a processing module according to an embodiment of the present disclosure.
  • FIG. 21 shows a flow chart of a positioning method in accordance with an embodiment of the present disclosure.
  • FIG. 22 shows a block diagram of a positioning device in accordance with an embodiment of the present disclosure.
  • FIG. 23 is a schematic diagram showing an exemplary application environment of an automatic walking device according to an embodiment of the present disclosure.
  • FIG. 24 shows a block diagram of an automatic walking device in accordance with an embodiment of the present disclosure.
  • FIG. 25 shows a schematic diagram of an autonomous walking device in accordance with an embodiment of the present disclosure.
  • FIG. 26 shows a flow chart of one application example of a processing procedure of a processing module according to an embodiment of the present disclosure.
  • FIG. 27 shows a flow chart of a positioning method according to an embodiment of the present disclosure.
  • FIG. 28 shows a block diagram of a positioning device in accordance with an embodiment of the present disclosure.
  • FIG. 1 shows a schematic diagram of an exemplary application environment of an embodiment of the present disclosure.
  • the positioning method or the positioning device of the embodiments of the present disclosure can be used to determine a virtual boundary 50' of the working area of the automatic lawn mower 10'; the boundary 50' delimits a working area 30' enclosed by the boundary 50' and a non-working area 70' located outside the boundary 50'.
  • when the automatic lawn mower 10' automatically walks in the working area 30', it can compare its own position with the position of the boundary 50' to determine whether it is located in the working area 30', or to determine its distance from the boundary 50', and adjust its movement according to the result of the judgment so as to keep itself in the working area 30'.
  • the positioning device 100 includes:
  • the first positioning module 101 is configured to acquire a first positioning result of the carrier of the positioning device 100;
  • a sensor module 102 for measuring acceleration and angle parameters of a carrier walking of the positioning device 100;
  • the processing module 103 is configured to:
  • in a first mode for determining a boundary of the working range of the autonomous walking device: if the first positioning result satisfies a quality condition, determining the position of the carrier of the positioning device according to the first positioning result; if the first positioning result does not satisfy the quality condition, determining the position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm according to the acceleration and the angle parameter;
  • the boundary is determined based on the location of the carrier of the positioning device.
  • by determining the position of the carrier of the positioning device either from the first positioning result or from the pedestrian dead reckoning algorithm according to the quality of the first positioning result, the positioning device integrates pedestrian dead reckoning technology with other positioning technologies.
  • because pedestrian dead reckoning is not easily affected by the external environment, it can compensate for the loss of precision when other positioning technologies are degraded by the environment, so that the positioning accuracy is high and the boundary is constructed accurately; moreover, no physical boundary is required, which reduces the complexity of user operation.
  • the pedestrian dead reckoning algorithm may include a positioning technique based on the step size (the length of each stride) and the heading. There is a correlation between the walking frequency of the pedestrian (the number of steps per unit time) and the step size.
  • the gait of pedestrian walking is periodic. In one gait cycle (one step), the foot passes through phases such as leaving the ground, swinging, touching the ground, and standing, and the pedestrian's acceleration differs between these phases. Therefore, by analyzing the characteristics of the acceleration, the gait and the step frequency can be obtained, the step size can be estimated from the step frequency, and the pedestrian can be positioned according to the step size and the heading.
  • the sensor module 102 in the positioning device 100 measures the acceleration and angle parameters of the carrier's walking and, following the principle above, performs positioning based on the pedestrian dead reckoning algorithm.
  • the pedestrian dead reckoning algorithm may be, for example, a PDR (Pedestrian Dead Reckoning) algorithm, or any other pedestrian dead reckoning algorithm.
  • the positioning device 100 can operate in a plurality of working modes including a first mode, a second mode, and the like.
  • the first mode is used to determine the boundary of the working range of the autonomous walking device.
  • the positioning device 100 can be detached from the autonomous walking device and carried (e.g., hand-held) by a carrier (e.g., a user); the carrier walks along the desired boundary (e.g., boundary 50'), and the position of the carrier is determined during the walk
  • the second mode is used to locate the position of the autonomous walking device.
  • the positioning device can be mounted on the autonomous walking device, position the automatic walking device, and determine whether the automatic walking device exceeds the boundary.
  • the positioning device can also have only the first mode, i.e., be used only for determining the position of the boundary.
  • the first positioning module 101 may be a positioning module that is not based on a pedestrian dead reckoning algorithm, such as any positioning module that is affected by an external environment.
  • the first positioning module 101 may be a satellite positioning module, such as a GPS module, a BeiDou positioning module, or a Galileo positioning module, or it may be a wireless positioning module such as a UWB positioning module.
  • when the first positioning module 101 is a GPS module, it can be any module capable of implementing GPS-based positioning, such as a GPS receiver capable of receiving GPS signals for positioning.
  • the first positioning module 101 can be implemented based on the prior art.
  • the sensor module 102 can include components capable of measuring acceleration, such as accelerometers, and components capable of measuring angular parameters, such as gyroscopes, electronic compasses, and the like.
  • the angle parameter may be, for example, an angular velocity, a heading angle, or the like.
  • the sensor module 102 may include a 6-axis inertial sensor composed of a gyroscope and an accelerometer, and may also include a gyroscope, an accelerometer, and an electronic compass, and may also include an accelerometer and an electronic compass.
  • the processing module 103 can be any processing component capable of performing data processing, such as a single-chip microcomputer, a CPU, an MPU, or an FPGA; its processing may be implemented by a dedicated hardware circuit, or by a general-purpose processing component executing corresponding logic instructions.
  • during operation, the first positioning module 101 and the sensor module 102 can remain in the working state to obtain the first positioning result, the acceleration, and the angle parameter in real time, and these real-time results can be communicated to the processing module 103 for processing.
  • data such as the acceleration and the angle parameter can be sampled using a sampling clock unified with that of the first positioning module 101; that is, the processing module 103 can obtain the first positioning result, the acceleration measurement, and the angle parameter measurement for the same sampling time point.
  • the positioning apparatus 100 may further include a storage module (not shown) to store data generated by the processing module 103, such as coordinate data of a boundary or the like.
  • the quality of the first positioning result can be determined in any suitable manner, for example according to the received signal or the output signal of the first positioning module, which the present disclosure does not limit.
  • when the first positioning result is a satellite positioning result, for example, the processing module 103 can determine whether the satellite positioning result satisfies the quality condition according to one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module.
  • the number of satellites received by the satellite positioning module reflects the strength of the satellite signal, which affects the accuracy of the satellite positioning result.
  • the positioning state of the satellite positioning module can reflect the accuracy of the satellite positioning result.
  • the positioning status may include, for example:
  • positioning state 0: not positioned;
  • positioning state 1: GPS single-point positioning (fixed solution);
  • positioning state 2: differential positioning;
  • positioning state 4: RTK fixed solution;
  • positioning state 5: RTK floating solution;
  • positioning state 7: manual input mode.
  • some positioning states (for example, positioning state 4) may indicate that the GPS positioning result quality is good, some (for example, positioning state 5) that the quality is fairly good, and some (for example, positioning states 1, 2, 3, 6, 7, 8) that the quality is poor.
  • a quality condition may be set: if the quality condition is met, the first positioning result is considered to be of good quality and is used for positioning; if the quality condition is not satisfied, the first positioning result is considered to be of poor quality, and the device switches to positioning with the pedestrian dead reckoning algorithm.
  • the processing module 103 may determine that the satellite positioning result satisfies the quality condition if the positioning state is the specified state and the number of satellites is not less than the threshold.
  • take the case in which the first positioning module is a GPS module as an example.
  • the positioning states 4 and 5 can be used as the designated states, and a threshold for the number of satellites, for example six, can be set.
  • the processing module 103 can determine the number of satellites and the positioning state according to the output signal of the GPS module.
  • the processing module 103 can be configured to determine that the GPS positioning result (the first positioning result) satisfies the quality condition if the positioning state is a designated state (for example, positioning state 4 or 5) and the number of satellites is not less than the threshold (for example, not less than 6).
  • if the positioning state is not a designated state (for example, neither positioning state 4 nor positioning state 5), and/or the number of satellites is less than the threshold (for example, less than 6), the GPS positioning result (the first positioning result) can be judged not to satisfy the quality condition.
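The quality judgment above is a simple conjunction of two tests. A minimal sketch (function name, defaults, and example values are assumptions, not from the patent):

```python
def satisfies_quality_condition(fix_state, num_satellites,
                                designated_states=(4, 5), min_satellites=6):
    """Trust the satellite positioning result only when the positioning
    state is a designated state (e.g. 4 or 5) AND the number of
    received satellites is not less than the threshold."""
    return fix_state in designated_states and num_satellites >= min_satellites

assert satisfies_quality_condition(4, 8)        # RTK-grade fix, 8 satellites
assert not satisfies_quality_condition(5, 5)    # too few satellites
assert not satisfies_quality_condition(1, 9)    # single-point fix state
```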
  • FIG. 3 shows a flow chart of one example of a processing procedure of a processing module.
  • as shown in FIG. 3, the processing module 103 can be configured to: determine the operating mode of the positioning device (S301); if the positioning device is in the first mode, determine whether the first positioning result of the first positioning module 101 satisfies the quality condition; if it does, determine the position of the carrier of the positioning device according to the first positioning result; otherwise, determine the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the acceleration and angle parameters measured by the sensor module 102 (S304). The position can be saved, and the boundary can be determined according to the position of the carrier of the positioning device.
  • FIG. 4 shows a flow chart of one example of a processing procedure of a processing module.
  • FIG. 4 illustrates an exemplary process for positioning based on a pedestrian dead reckoning algorithm in the event that the first positioning result does not satisfy the quality condition.
  • the same reference numerals as in FIG. 3 represent similar steps.
  • as shown in FIG. 4, the processing module 103 can be configured such that determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the acceleration and the angle parameter includes the following.
  • the model parameters of the step size model can be obtained in various appropriate ways, which the present disclosure does not limit.
  • gait detection can be performed based on acceleration, and model parameters are determined based on the results of gait detection.
  • because the periodicity of the steps causes a periodic change in the acceleration, the step frequency and the stride points of the carrier can be determined from the acceleration measured by the sensor module.
  • FIG. 5 shows a schematic diagram of one example of a process of determining a stride point according to acceleration.
  • as shown in FIG. 5, a threshold Ta can be set; if an acceleration sampling point a i is a local maximum (a i > a i-1 and a i > a i+1) within an acceleration change period (step period) and exceeds the threshold Ta, the sampling point is considered a stride point and the current period is considered a valid step period, thereby implementing gait detection.
  • the stride point, as the feature point of each step of the carrier, may also correspond to other feature points of the acceleration and is not limited to the maximum point.
  • the frequency at which strides occur may be used as the step frequency, or the step frequency may be determined from the period of the acceleration change.
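The stride-point test described above (local maximum of the acceleration that exceeds the threshold Ta) can be sketched as follows; the sample values are invented for illustration:

```python
def detect_stride_points(acc, threshold):
    """Mark sample i as a stride point when it is a local maximum of
    the acceleration sequence (acc[i] > acc[i-1] and acc[i] > acc[i+1])
    and exceeds the threshold Ta, per the gait detection described."""
    strides = []
    for i in range(1, len(acc) - 1):
        if acc[i] > acc[i - 1] and acc[i] > acc[i + 1] and acc[i] > threshold:
            strides.append(i)
    return strides

# Two acceleration peaks above Ta = 11.0 -> two stride points.
acc = [9.8, 10.1, 12.5, 10.3, 9.7, 10.0, 12.9, 10.2, 9.8]
stride_idx = detect_stride_points(acc, threshold=11.0)
```

The time interval between consecutive stride points then gives the step frequency.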
  • the step size l of the carrier may be determined according to the first positioning results corresponding to the stride points (at the same sampling time points).
  • the model parameters of the step-frequency model can be determined according to the step frequency f and the step size l.
  • the step-frequency model can be selected according to needs, either a linear model or a nonlinear model.
  • the present disclosure does not limit the specific form of the step-frequency model.
  • the step size model can be a linear model, for example of the form l = k·f + b (Equation 1), where f is the step frequency, l is the step size, and k and b are the model parameters.
  • the parameters k and b can be calculated according to the least squares method using multiple sets of step size l and step frequency f data.
  • for example, the first positioning result corresponding to the current sampling time point may be obtained; the step size l is determined from the first positioning results corresponding to adjacent stride points, and the step frequency f from the time interval between adjacent stride points. Multiple sets of step size and step frequency data can be stored, and the parameters k and b can be calculated from the stored sets by the least squares method.
  • the step size model can also be a nonlinear model, for example of the form l = k·(a_max - a_min)^(1/4) + b (Equation 2), where k and b are the model parameters and a_max and a_min are respectively the maximum and minimum values of the acceleration in the stride period.
  • the model parameters k and b can be calculated by referring to the above example using the least squares method.
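The nonlinear model's formula is likewise only an image in the source. A widely used nonlinear step model that takes exactly these inputs (a_max, a_min, parameters k and b) is the Weinberg form, sketched here as a plausible stand-in rather than the patent's actual equation:

```python
def nonlinear_step_length(a_max, a_min, k, b=0.0):
    """Weinberg-style step model l = k * (a_max - a_min)**0.25 + b,
    assumed as an example of the patent's (image-only) nonlinear model."""
    return k * (a_max - a_min) ** 0.25 + b

# Acceleration swing of 16 m/s^2 with k = 0.5: step of 0.5 * 2 = 1.0 m.
step = nonlinear_step_length(18.0, 2.0, k=0.5)
```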
  • FIG. 6 shows a flow chart of one example of a processing procedure of a processing module, illustrating a process of determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model and the acceleration and angle parameters, in the case where the first positioning result does not satisfy the quality condition. The process includes:
  • for example, the first positioning result at sampling time point t i-1 (the most recent first positioning result satisfying the quality condition) may be used as the starting position of the pedestrian dead reckoning algorithm.
  • the real-time heading may be the sum of the initial heading of the carrier of the positioning device and the integral of the real-time angular velocity measurements from the time at which the initial heading was determined.
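The heading integration just described can be sketched as a running sum of gyroscope samples (a rectangle-rule integral; the sample values and time step are invented for illustration):

```python
def integrate_heading(initial_heading, angular_rates, dt):
    """Real-time heading = initial heading + integral of angular-rate
    samples over time (simple rectangle rule), as described above."""
    heading = initial_heading
    headings = []
    for w in angular_rates:
        heading += w * dt          # accumulate rotation since t0
        headings.append(heading)
    return headings

# 0.5 rad/s for 4 samples at dt = 0.1 s adds 0.2 rad to a 1.0 rad start.
hs = integrate_heading(1.0, [0.5, 0.5, 0.5, 0.5], dt=0.1)
```

In practice the gyroscope bias makes this integral drift, which is why the text also allows an electronic compass to supply the heading directly.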
  • the processing module can be configured to obtain an initial heading of the carrier of the positioning device.
  • the initial heading may be the heading of the carrier of the positioning device when the positioning device enters the first mode; in other words, it may be the heading at the beginning of the carrier's boundary-positioning process.
  • the initial heading can be obtained simultaneously with the model parameters described above.
  • the initial heading may be determined according to the first positioning result of the first positioning module.
  • for example, the first positioning results corresponding to the stride points within a period of time (for example, 10 steps) may be linearly fitted, and the arctangent of the slope of the fitted line taken as the initial heading.
  • Fig. 7 shows a schematic diagram of an example of determining the initial heading by linear fitting: the dots represent the first positioning results corresponding to the stride points, and the angle θ0 is used as the initial heading.
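The line fit and arctangent can be sketched as below (function name and sample fixes are invented; atan2 is used so the quadrant follows the walking direction, a detail the text leaves open):

```python
import math

def initial_heading_from_fixes(points):
    """Least-squares line through early positioning fixes (x, y); the
    arctangent of its slope serves as the initial heading theta0."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points)
    sxx = sum((x - mx) ** 2 for x, _ in points)
    # slope = sxy / sxx; atan2(sxy, sxx) = atan(slope) with quadrant info
    return math.atan2(sxy, sxx)

# Fixes lying on the line y = x give a 45-degree initial heading.
theta0 = initial_heading_from_fixes([(0, 0), (1, 1), (2, 2), (3, 3)])
```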
  • in one example, the positioning apparatus 100 may include a reminder module: when the first positioning result satisfies the quality condition, it reminds the user that boundary positioning can begin, and when the quality condition is not satisfied, it reminds the user that boundary positioning cannot be performed. At the beginning of boundary positioning (for example, on entering the first mode), the first positioning result and the acceleration are obtained to determine the initial heading and/or the model parameters, ensuring that an accurate initial heading and/or accurate model parameters are obtained.
  • the manner of determining the initial heading or the model parameters of the step-frequency model is not limited thereto.
  • when entering the first mode, if the first positioning result does not satisfy the quality condition, the initial heading may be set to 0 and empirical values may be used for the model parameters of the step-frequency model; the initial heading and model parameters can then be corrected once the first positioning result satisfies the quality condition.
  • the positioning device may further include a component for determining heading, such as a compass. The compass may be used to determine the initial heading, empirical values may be used for the model parameters of the step-frequency model, and the initial heading and model parameters may be corrected once the quality of the first positioning result satisfies the condition. In this way, the user can begin boundary positioning with the positioning device 100 under any conditions, without searching or waiting for a starting point at which the first positioning result is of good quality.
  • alternatively, the real-time heading can be determined directly from a heading angle; for example, the heading angle measured by an electronic compass can be converted into the coordinate system and used as the real-time heading.
  • the real-time stride frequency can be determined from the acceleration by means of the gait detection exemplified above.
  • the execution order of S601, S602, and S603 is not limited.
  • the real-time step size can be determined by substituting the real-time step frequency into the step size model (e.g., Equation 1 or Equation 2) determined according to the examples above.
  • S605. Determine, according to the real-time heading, the real-time step size, and the starting position, the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm. For example, according to the pedestrian dead reckoning algorithm, the relative position between the carrier and the starting position after each stride can be determined from the real-time heading, the real-time step size, and the starting position, thereby realizing positioning.
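The per-stride position update of the dead reckoning step above can be sketched as below (headings in radians; the stride values are invented for illustration):

```python
import math

def pdr_positions(start, headings, step_lengths):
    """Pedestrian dead reckoning: from the starting position, advance
    by one step length along the real-time heading at each stride."""
    x, y = start
    track = []
    for theta, l in zip(headings, step_lengths):
        x += l * math.cos(theta)
        y += l * math.sin(theta)
        track.append((x, y))
    return track

# Two 0.7 m strides due east (heading 0), then one due north (pi/2).
track = pdr_positions((0.0, 0.0), [0.0, 0.0, math.pi / 2], [0.7, 0.7, 0.7])
```

Each track point is the carrier's position relative to the starting position, which is exactly what the boundary recording needs.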
  • in this way, the first positioning result is used when its quality is high, and the positioning result of the pedestrian dead reckoning algorithm is used when the quality of the first positioning result is low, thereby obtaining the position of the boundary.
  • the processing module 103 may further correct the positioning result obtained by the pedestrian dead reckoning algorithm according to the first positioning result to further improve the positioning accuracy of the boundary.
  • FIG. 8 shows a flow chart of one example of a processing procedure of a processing module.
  • FIG. 8 shows an example in which the processing module performs correction processing.
  • the processing module 103 is further configured to:
  • for example, suppose the positioning device 100 performs positioning based on the pedestrian dead reckoning algorithm at sampling time points t 1 to t i and obtains the positioning results of the pedestrian dead reckoning algorithm (for example, the coordinates of the carrier of the positioning device 100) P 1 to P i. At sampling time point t i+1, the processing module 103 determines that the first positioning result has changed from not satisfying the quality condition to satisfying the quality condition, and can acquire the first position P i+1 determined by the pedestrian dead reckoning algorithm and the second position P' i+1 given by the first positioning result at t i+1.
  • the positioning results P 1 to P i of the pedestrian dead reckoning algorithm at sampling time points t 1 to t i can then be corrected according to the first position P i+1 and the second position P' i+1 described above.
  • the specific manner of the correction can be selected as needed, and the disclosure does not limit this.
  • Fig. 9a shows a schematic diagram of an example of correcting the positioning result of the pedestrian dead reckoning algorithm.
  • For example, the deviation D between the first position P i+1 and the second position P′ i+1 can be calculated. The deviation D can be expressed as (Δx, Δy), where Δx and Δy are the coordinate differences between the second position and the first position:

    Δx = x′ i+1 − x i+1 , Δy = y′ i+1 − y i+1
  • The deviation D may then be equally distributed over the positions of the carrier of the positioning device determined based on the pedestrian dead reckoning algorithm during the period in which the first positioning result did not satisfy the quality condition, i.e., the positioning results P 1 -P i : each positioning result P j (j being one of 1 to i) is added to the quotient of the deviation D and the total number of positioning results i (that is, P j + D/i), yielding the corrected positioning results.
  • The corrected positioning results of the pedestrian dead reckoning algorithm can be saved to determine the boundary.
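The equal-division correction above can be sketched as follows; the function and variable names are assumptions, and coordinates are treated as planar (x, y) pairs.

```python
def correct_pdr_linear(pdr_positions, first_pos, second_pos):
    """Distribute the deviation D = P'_{i+1} - P_{i+1} equally over the i
    dead-reckoning positions P_1..P_i: each P_j is shifted by D / i."""
    i = len(pdr_positions)
    dx = second_pos[0] - first_pos[0]
    dy = second_pos[1] - first_pos[1]
    return [(x + dx / i, y + dy / i) for (x, y) in pdr_positions]

# Two PDR results, with a 0.4 m eastward deviation found at t_{i+1}:
corrected = correct_pdr_linear([(0.0, 0.0), (1.0, 0.0)], (2.0, 0.0), (2.4, 0.0))
```

Each saved position is shifted by the same fraction of the observed deviation, which is one simple realization of "the specific manner of the correction can be selected as needed".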
  • FIG. 9b is a schematic diagram showing another example of correcting the positioning result of the pedestrian dead reckoning algorithm. As shown in FIG. 9b, a first vector V 1 between the positioning start position P 0 of the pedestrian dead reckoning algorithm and the first position P i+1 , and a second vector V 2 between P 0 and the second position P′ i+1 , can be determined. The angular deviation (the magnitude and direction of the angle θ) between the first vector and the second vector is then divided among the positioning results P 1 -P i of the pedestrian dead reckoning algorithm: for example, the i vectors between P 0 and P 1 -P i are each rotated by an angle θ/i toward the direction of V 2 (direction A indicated by the arrow), completing the correction of the positioning results of the pedestrian dead reckoning algorithm.
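The angular correction of FIG. 9b can be sketched as a rotation of each vector P 0 → P j about the start point. The names and the sign convention (positive theta = counter-clockwise) are assumptions; the text specifies only that the vectors are rotated by θ/i toward V 2.

```python
import math

def correct_pdr_angular(p0, pdr_positions, theta):
    """Rotate each of the i vectors from the start position P_0 to the PDR
    results P_1..P_i by theta / i about P_0 (positive theta rotates
    counter-clockwise)."""
    i = len(pdr_positions)
    a = theta / i
    cos_a, sin_a = math.cos(a), math.sin(a)
    out = []
    for x, y in pdr_positions:
        vx, vy = x - p0[0], y - p0[1]   # vector from P_0 to P_j
        out.append((p0[0] + vx * cos_a - vy * sin_a,
                    p0[1] + vx * sin_a + vy * cos_a))
    return out

# A single PDR result 1 m east of P_0, corrected by a 90-degree deviation:
rotated = correct_pdr_angular((0.0, 0.0), [(1.0, 0.0)], math.pi / 2)
```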
  • FIG. 10 shows a flow chart of one example of a processing procedure of the processing module, illustrating an exemplary process by which the processing module determines the boundary based on the location of the carrier of the positioning device.
  • the processing module 103 is further configured to:
  • S1001 Perform interpolation processing on a position of a carrier of the positioning device determined by a pedestrian dead reckoning algorithm, and obtain a position of a carrier of the positioning device after the interpolation process;
  • S1002 Perform smoothing filtering on the position of the carrier of the positioning device after the interpolation process and the position of the carrier of the positioning device determined according to the first positioning result to determine the boundary.
  • The positioning results of the pedestrian dead reckoning algorithm can be further interpolated to obtain interpolated positioning results, so that the obtained positions are smooth and continuous.
  • The number of interpolation points between two adjacent positioning results of the pedestrian dead reckoning algorithm is not limited; for example, it may be equal to the number of first positioning results corresponding to the time interval between those two positioning results (i.e., the number of original first positioning results in that interval).
  • For example, interpolation can be performed according to the following formula to obtain the positions P m after the interpolation processing, where P k and P k+1 are two adjacent positioning results of the pedestrian dead reckoning algorithm, m = 1, ..., N−1, and N is the number of original first positioning results:

    P m = P k + (m/N)(P k+1 − P k )
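The interpolation above, which inserts intermediate positions between adjacent dead-reckoning results, can be sketched as linear interpolation. The linear form is an assumption here, since the exact formula is not reproduced in this text; the names are illustrative.

```python
def interpolate_positions(p_a, p_b, n):
    """Linearly insert n-1 evenly spaced positions between consecutive
    dead-reckoning results p_a and p_b."""
    return [(p_a[0] + m * (p_b[0] - p_a[0]) / n,
             p_a[1] + m * (p_b[1] - p_a[1]) / n) for m in range(1, n)]

# With n = 4 original first positioning results in the interval, three
# intermediate points are produced at 1/4, 1/2 and 3/4 of the segment:
pts = interpolate_positions((0.0, 0.0), (1.0, 1.0), 4)
```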
  • the position obtained by the first positioning result combined with the positioning result of the pedestrian dead reckoning algorithm may have jitter. Therefore, the position may be further smoothed to reduce data jitter and obtain a smooth boundary.
  • a dynamic sliding window filtering method can be used for smoothing filtering.
  • Different sizes of filtering windows can be used: for example, positions at the corners of the boundary (corner points) can be filtered with a smaller window (e.g., a window size of 2), while non-corner positions (non-corner points) can be filtered with a larger window (e.g., a window size of 5).
  • An example of smoothing filtering is shown in the following equation, where P″ k is the smoothed filtered position, W is the window size, and the sum runs over the W positions P n in the window around P k :

    P″ k = (1/W) Σ P n
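The dynamic sliding-window smoothing described above can be sketched as a mean filter. Both the mean kernel and the mapping of the text's window sizes (2 and 5) to centred spans of 3 and 5 samples are assumptions; the text specifies the window sizes but not the filter kernel.

```python
def smooth(positions, corner_flags):
    """Dynamic sliding-window mean filter: corner points get a narrow centred
    window, non-corner points a wide one. Windows are clipped at the ends of
    the track."""
    out = []
    for k, is_corner in enumerate(corner_flags):
        half = 1 if is_corner else 2          # half-width: narrow vs. wide
        lo = max(0, k - half)
        hi = min(len(positions), k + half + 1)
        window = positions[lo:hi]
        out.append((sum(p[0] for p in window) / len(window),
                    sum(p[1] for p in window) / len(window)))
    return out

# A small jitter at the middle sample is averaged out over the wide window:
track = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.1), (3.0, 0.0), (4.0, 0.0)]
smoothed = smooth(track, [False] * 5)
```

Using the narrow window at corner points preserves the sharp turns of the boundary while still suppressing jitter elsewhere.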
  • It should be noted that a person skilled in the art may perform one or both of S1001 and S1002 according to the needs of use, or may perform neither.
  • FIG. 11 shows a structural diagram of a positioning device according to an embodiment of the present disclosure.
  • the positioning device can be carried by the carrier.
  • the positioning device 1100 includes:
  • the first positioning module 1101 is configured to acquire a first positioning result of the carrier of the positioning device 1100.
  • a third position of the carrier of the positioning device determined according to the first positioning result; determining, based on the acceleration and angle parameters, a fourth position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm; determining a position of the carrier of the positioning device based on the third position and the fourth position;
  • the boundary is determined based on the location of the carrier of the positioning device.
  • The positioning device integrates the pedestrian dead reckoning technology with other positioning technologies by determining the position of the carrier of the positioning device based on both the position determined according to the first positioning result and the position determined based on the pedestrian dead reckoning algorithm, and the virtual boundary is thereby constructed. Because the pedestrian dead reckoning technology is not easily affected by the external environment, it can compensate for the lack of precision when other positioning technologies are affected by the environment, so that the positioning accuracy is high, the boundary is constructed accurately, physical boundaries are not required, and the complexity of the user operation is reduced.
  • For exemplary descriptions of the first positioning module 1101, the sensor module 1102, and the processing module 1103, reference may be made to the above descriptions of the first positioning module 101, the sensor module 102, and the processing module 103.
  • determining the location of the carrier of the positioning device according to the third location and the fourth location may include: according to the third location and the fourth location The fusion determines the location of the carrier of the positioning device.
  • the third position and the fourth position may be merged in any suitable manner, which may be a tight fusion or a relatively loose fusion. The disclosure does not limit this.
  • determining the location of the carrier of the positioning device according to the fusion of the third location and the fourth location may include: according to a weighted sum of the third location and the fourth location Determining the location of the carrier of the positioning device.
  • For example, when a third position P gj of the carrier of the positioning device is determined according to the first positioning result at the sampling time point t j , and a fourth position P rj of the carrier of the positioning device is determined based on the pedestrian dead reckoning algorithm according to the acceleration and the angle parameters, the position P j of the carrier of the positioning device at the sampling time point t j can be determined by the following Formula 8, where w 1 is the weight value of the third position and w 2 is the weight value of the fourth position (for example, with w 1 + w 2 = 1):

    P j = w 1 P gj + w 2 P rj (Formula 8)
  • w 1 and w 2 can be valued as needed, and the disclosure does not limit this.
  • w 1 , w 2 may be fixed values.
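The weighted sum of Formula 8 can be sketched as follows; the assumption that the weights sum to 1 (so the result is a valid position) and the names are illustrative.

```python
def fuse(p_g, p_r, w1, w2):
    """Weighted sum of the third position (first positioning result, p_g)
    and the fourth position (dead reckoning, p_r), per Formula 8."""
    return (w1 * p_g[0] + w2 * p_r[0],
            w1 * p_g[1] + w2 * p_r[1])

# Trusting the first positioning result more (fixed weights 0.8 / 0.2):
fused = fuse((10.0, 10.0), (10.4, 9.6), 0.8, 0.2)
```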
  • the processing module 1103 is further configured to: determine, according to a quality of the first positioning result, a weight of each of the third location and the fourth location in the weighted sum .
  • the first positioning module is a satellite positioning module
  • the first positioning result is a satellite positioning result, and the quality of the satellite positioning result of the satellite positioning module may be determined based on one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module.
  • In one possible implementation, determining the weight of each of the third position and the fourth position in the weighted sum according to the quality of the first positioning result may include: as the quality of the first positioning result increases, increasing the weight of the third position in the weighted sum and reducing the weight of the fourth position in the weighted sum; and as the quality of the first positioning result decreases, reducing the weight of the third position in the weighted sum and increasing the weight of the fourth position in the weighted sum.
  • For example, the quality of the first positioning result may be divided into multiple states such as good (a first quality state), fair (a third quality state), and poor (a second quality state), and w 1 and w 2 may be set to different values in different states.
  • the threshold of the number of satellites can be set, for example, six.
  • the processing module 103 can determine the number of satellites and the positioning state according to the output signal of the GPS module.
  • If the positioning state is 4 and the number of satellites is not less than 6, the processing module 103 can determine that the GPS positioning result is in the first quality state; if the positioning state is neither 4 nor 5, and/or the number of satellites is less than 6, the quality condition is considered not to be met, and the processing module 103 can determine that the GPS positioning result is in the second quality state; otherwise, that is, if the positioning state is 5 and the number of satellites is greater than or equal to 6, the processing module 103 can determine that the GPS positioning result is in the third quality state.
  • the conditions for distinguishing the quality states are merely examples, and those skilled in the art may set other conditions to distinguish different quality states as needed, and may also set one or any plurality of quality states.
  • the quality state may not be set, and the weights w 1 and w 2 may be calculated in real time according to the quality of the first positioning result, which is not limited in the disclosure.
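The quality classification and state-dependent weight selection discussed above can be sketched together as follows. The thresholds follow the GPS example given earlier (states 4 and 5, six-satellite threshold); the concrete weight values in the table are illustrative assumptions, not values from the disclosure.

```python
def gps_quality(fix_state, num_sats, sat_threshold=6):
    """Classify the GPS result into the example quality states:
    1 = first (good), 2 = second (poor), 3 = third (fair)."""
    if fix_state == 4 and num_sats >= sat_threshold:
        return 1                       # first quality state
    if fix_state not in (4, 5) or num_sats < sat_threshold:
        return 2                       # second quality state
    return 3                           # third: state 5 with enough satellites

# Assumed mapping of quality state -> (w1, w2) for Formula 8:
WEIGHTS = {1: (0.9, 0.1), 3: (0.6, 0.4), 2: (0.2, 0.8)}
w1, w2 = WEIGHTS[gps_quality(4, 8)]
```

The better the satellite fix, the larger w 1 (trust in the third position) becomes, matching the monotonic rule stated above.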
  • Considering that the positioning result of the pedestrian dead reckoning algorithm (the fourth position) may exist only at the stride points, while the first positioning result (the third position) exists at each sampling time point, interpolation processing may be performed during the determination of the position of the carrier of the positioning device based on the third position and the fourth position.
  • In one possible implementation, determining the location of the carrier of the positioning device according to the third position and the fourth position may include: determining the position of the carrier of the positioning device at each stride point according to the third position at the stride point and the fourth position at the stride point; and interpolating the positions of the carrier of the positioning device at the stride points to obtain the position of the carrier of the positioning device; wherein a stride point is a feature point of each step of the carrier.
  • That is, the position of the carrier at each stride point can be obtained according to Formula 8 from the fourth position at the stride point and the third position at the stride point (i.e., at the sampling time point corresponding to the stride point).
  • The carrier positions at adjacent stride points can then be interpolated, for example by the method shown in Formula 6 or in any suitable manner, and the number of interpolation points may be the number of first positioning results corresponding to the time interval between the stride points (i.e., the number of original first positioning results in that interval), so as to obtain a continuous position of the carrier of the positioning device and ensure the continuity of the results.
  • In one possible implementation, determining the location of the carrier of the positioning device according to the third position and the fourth position may include: interpolating the fourth positions at the stride points to obtain interpolated fourth positions; and determining the position of the carrier of the positioning device according to the third position and the interpolated fourth positions; wherein a stride point is a feature point of each step of the carrier.
  • For example, the fourth position at each stride point may be interpolated, for example by the method shown in Formula 6 or in any suitable manner, to obtain the interpolated fourth positions, and the number of interpolation points may be the number of first positioning results corresponding to the time interval between the stride points (i.e., the number of original first positioning results in that interval). The position of the carrier can then be obtained according to Formula 8 from the interpolated fourth position and the third position at each sampling time point, ensuring the continuity of the results.
  • The fourth position can be obtained based on the pedestrian dead reckoning algorithm in a manner similar to that in which the positioning result of the pedestrian dead reckoning algorithm is obtained above.
  • In one possible implementation, the processing module 1103 can acquire the model parameters of the step-frequency model of the carrier of the positioning device in a manner similar to that described above: for example, the step frequency and the stride points of the carrier can be determined according to the acceleration measured by the sensor module, the step size of the carrier can be determined according to the first positioning results corresponding to the stride points, and the model parameters of the step-frequency model can be determined according to the step frequency and the step size. Exemplary descriptions can be found above.
  • Then, the fourth position of the carrier of the positioning device may be determined based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model and the acceleration and angle parameters.
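The fitting of the step-frequency model parameters described above might be sketched as follows. The linear model L = a·f + b and the least-squares fit are assumptions, since the disclosure does not fix the model form; it only states that the parameters are derived from the measured step frequency and the step sizes obtained from the first positioning results at the stride points.

```python
def fit_step_model(freqs, lengths):
    """Least-squares fit of an assumed linear step-frequency model
    L = a * f + b from paired (step frequency, step size) samples."""
    n = len(freqs)
    mf = sum(freqs) / n
    ml = sum(lengths) / n
    a = (sum((f - mf) * (l - ml) for f, l in zip(freqs, lengths))
         / sum((f - mf) ** 2 for f in freqs))
    b = ml - a * mf
    return a, b

# Illustrative samples: faster cadence tends to mean longer strides.
a, b = fit_step_model([1.5, 1.8, 2.0], [0.60, 0.68, 0.75])
```

Once a and b are known, the real-time step size follows from the real-time step frequency, which is what S1204 below uses.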
  • FIG. 12 shows a flow chart of one example of a processing procedure of a processing module, showing an example of a process of determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm, according to the model parameters of the step-frequency model and the acceleration and angle parameters, including:
  • S1201 Acquire a starting position of a pedestrian dead reckoning algorithm.
  • the first positioning result when the positioning device enters the first mode may be used as the starting position.
  • Alternatively, the carrier may be reminded when the first positioning result satisfies a certain quality condition, and the positioning of the boundary is then started; it is also possible to use the first positioning result that satisfies a certain quality condition after the positioning device enters the first mode as the starting position.
  • the present disclosure does not limit the manner in which the starting position is acquired.
  • the execution order of S1201, S1202, and S1203 is not limited.
  • S1204. Determine the real-time step size of the carrier using the step-frequency model, according to the real-time step frequency and the model parameters of the step-frequency model. See the description of S604.
  • S1205. Determine, according to the real-time heading, the real-time step, and the starting position, a fourth position of the carrier of the positioning device based on a pedestrian dead reckoning algorithm. See the description about S605.
  • The positioning device in each of the above embodiments can be installed in an automatic walking device, and the processing module (for example, the processing module 103 or 1103) in the positioning device can be configured to: in a second mode for locating the position of the automatic walking device, determine the position of the automatic walking device according to at least one of the first positioning result and an inertial positioning result, wherein the inertial positioning result is determined based on an inertial navigation estimation algorithm according to at least the acceleration and angle parameters output by the sensor module.
  • That is, the automatic walking device itself may be positioned according to the first positioning result, or according to the inertial positioning result, or based on the fusion of the first positioning result and the inertial positioning result.
  • The fusion positioning of the first positioning result and the inertial positioning result may, for example, use a Kalman filter to fuse the first positioning result with inertial data such as the acceleration and angle parameters in real time: the acceleration and angle parameters collected by the sensor module and the first positioning result obtained by the first positioning module are sent to the Kalman filter for fusion, and the first positioning result is closed-loop corrected to obtain the corrected position, velocity and attitude, so as to obtain a higher-precision positioning result, that is, to realize the second mode. The fusion positioning counters the problem of poor positioning accuracy when the signal of the first positioning module is occluded. Fusion positioning can be implemented based on related technologies and will not be described in detail here.
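A toy, loosely coupled version of the Kalman-filter fusion described above is sketched below: one dimension, position/velocity state only, with the inertial acceleration used for prediction and the first positioning result used for the update. The noise variances q and r and all names are assumptions, and a real INS/GPS filter would also estimate attitude and use a full covariance matrix.

```python
def kalman_step(x, v, p_var, accel, z, dt, q=0.1, r=1.0):
    """One predict/update cycle of a minimal 1-D filter: propagate
    (position x, velocity v) with the measured acceleration, then
    correct the position with the absolute measurement z. Only the
    position variance p_var is tracked, as a scalar simplification."""
    # Predict with the inertial input.
    x = x + v * dt + 0.5 * accel * dt * dt
    v = v + accel * dt
    p_var = p_var + q
    # Update with the first positioning result.
    k = p_var / (p_var + r)          # Kalman gain for the position state
    x = x + k * (z - x)
    p_var = (1.0 - k) * p_var
    return x, v, p_var

# Stationary prediction corrected part-way toward a 1 m GPS measurement:
x, v, p_var = kalman_step(0.0, 0.0, 1.0, 0.0, 1.0, 1.0)
```

The closed-loop correction pulls the propagated state toward the absolute measurement by the gain k, which grows when the inertial uncertainty grows, mirroring the behavior described above.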
  • the inertial navigation estimation algorithm may include an INS (Inertial Navigation System) algorithm.
  • When the first positioning module is a GPS module, DGPS (Differential Global Positioning System) and CORS (Continuously Operating Reference Stations) technology can be used to further improve positioning accuracy.
  • In DGPS, the mobile station simultaneously receives the GPS satellite signals and the carrier-phase observations and position of the reference station (which is fixed and known), and forms differential observations with its own carrier-phase measurements; the technique of differencing carrier-phase observations is also called RTK (Real-Time Kinematic).
  • A CORS network is established using multi-base-station network RTK technology: several fixed, continuously operating GPS reference stations are combined with computer, data-communication and Internet technologies to form a differential network that corrects the position of a moving target and greatly improves the positioning accuracy of the mobile station.
  • FIG. 13 shows a block diagram of an automated walking apparatus 1300 comprising an apparatus body 200 and the positioning device described above, such as positioning device 100 or positioning device 1100, in accordance with an embodiment of the present disclosure, wherein The positioning device can be detachably mounted to the device body 200.
  • the autonomous walking apparatus of the present embodiment can operate in an application environment such as that shown in FIG.
  • the boundary of the working area of the autonomous walking device is positioned by a positioning device (e.g., boundary line 50' in Figure 1).
  • the automatic walking device of the embodiment of the present disclosure may be in various forms such as a lawn mower, a vacuum cleaner, an industrial robot, and the like.
  • When the automatic walking device is, for example, a lawn mower, it may further include a cutting mechanism, which may include a cutting motor and a cutting blade; the cutting motor drives the cutting blade to rotate to cut the lawn.
  • FIG. 13 only schematically shows a schematic diagram of an automatic walking device.
  • The shape of the automatic walking device, the mounting position of the positioning device on the automatic walking device, the shape of the positioning device, etc. are not limited in the present disclosure.
  • the mounting manner between the positioning device and the device body 200 is not limited.
  • the positioning device may be snapped into the card slot of the device body or may be housed in a mounting box provided on the device body.
  • When the positioning device is mounted on the device body, it can be electrically connected to other circuits in the device body, for example, for data communication or power supply.
  • FIG. 14 shows a flow chart of a positioning method in accordance with an embodiment of the present disclosure. This method can be applied to the positioning device 100 described above. The method includes:
  • S1401. Acquire a first positioning result of a carrier of the positioning device, and acceleration and angle parameters of the carrier while walking;
  • S1402 in a first mode for determining a boundary of an operating range of the automatic walking device, if the first positioning result satisfies a quality condition, determining a location of a carrier of the positioning device according to the first positioning result; Determining, according to the pedestrian dead reckoning algorithm, a location of a carrier of the positioning device according to the acceleration and the angle parameter if the first positioning result does not satisfy a quality condition;
  • S1403 Determine the boundary according to a location of a carrier of the positioning device.
  • The positioning method integrates the pedestrian dead reckoning technology with other positioning technologies by determining the position of the carrier of the positioning device either based on the first positioning result or based on the pedestrian dead reckoning algorithm, according to the quality of the first positioning result. Because the pedestrian dead reckoning technology is not easily affected by the external environment, it can make up for the lack of precision when other positioning technologies are affected by the environment, so that the positioning accuracy is high, the boundary is constructed accurately, physical boundaries are not required, and the complexity of user operations is reduced.
  • In one possible implementation, the method may further include: acquiring model parameters of a step-frequency model of the carrier of the positioning device, wherein the step-frequency model represents the relationship between the step frequency and the step size of the carrier; and determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm includes: determining the position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model and the acceleration and angle parameters.
  • In one possible implementation, determining the location of the carrier of the positioning device based on the pedestrian dead reckoning algorithm may include: when the first positioning result does not satisfy the quality condition, acquiring the most recent first positioning result that satisfies the quality condition as the starting position of the pedestrian dead reckoning algorithm; determining the real-time heading of the carrier according to the angle parameter; determining the real-time step frequency of the carrier according to the acceleration; determining the real-time step size of the carrier using the step-frequency model according to the real-time step frequency and the model parameters of the step-frequency model; and determining the location of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the real-time heading, the real-time step size, and the starting position.
  • In one possible implementation, acquiring the model parameters of the step-frequency model of the carrier of the positioning device may include: determining the step frequency and the stride points of the carrier according to the acceleration measured by the sensor module; determining the step size of the carrier according to the first positioning results corresponding to the stride points; and determining the model parameters of the step-frequency model according to the step frequency and the step size, wherein a stride point is a feature point of each step of the carrier.
  • In one possible implementation, the method may further include: when the first positioning result changes from not satisfying the quality condition to satisfying the quality condition, acquiring a first position of the carrier of the positioning device determined by the pedestrian dead reckoning algorithm and a second position of the carrier of the positioning device determined according to the first positioning result of the first positioning module; and correcting, according to the first position and the second position, the position of the carrier of the positioning device determined based on the pedestrian dead reckoning algorithm during the period in which the first positioning result did not satisfy the quality condition.
  • In one possible implementation, determining the boundary according to the location of the carrier of the positioning device may include: performing interpolation processing on the positions of the carrier of the positioning device determined by the pedestrian dead reckoning algorithm to obtain the interpolated positions of the carrier of the positioning device; and performing smoothing filtering on the interpolated positions of the carrier of the positioning device and the positions of the carrier of the positioning device determined according to the first positioning result, to determine the boundary.
  • the first positioning module may be a satellite positioning module, and the first positioning result is a satellite positioning result.
  • the method may further include: determining, according to one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module, whether the satellite positioning result of the satellite positioning module is Meet the quality conditions.
  • In one possible implementation, determining whether the satellite positioning result of the satellite positioning module meets the quality condition according to one or both of the number of satellites received by the satellite positioning module and the positioning state of the satellite positioning module may include: determining that the satellite positioning result satisfies the quality condition when the positioning state is the specified state and the number of satellites is not less than the threshold.
  • the method may further include: in a second mode for locating the position of the autonomous walking device, according to at least one of the first positioning result and the inertial positioning result, Determining a position of the autonomous walking device, wherein the inertial positioning result is determined based on an inertial navigation estimation algorithm according to at least an acceleration and an angle parameter output by the sensor module.
  • the inertial navigation estimation algorithm includes an INS algorithm.
  • the pedestrian dead reckoning algorithm includes a PDR algorithm.
  • the first positioning module includes a UWB positioning module.
  • the above method can be performed by the positioning device 100, for example by the processing module 103 in the positioning device.
  • the processing module 103 can be configured to execute a dedicated hardware circuit of the above method, and can also perform the above method by executing logic instructions.
  • An exemplary description of the above method can be referred to the above description for the positioning device 100, and will not be repeated here.
  • FIG. 15 shows a flow chart of a positioning method in accordance with an embodiment of the present disclosure. This method can be applied to the positioning device 1100 described above. The method includes:
  • S1501. Acquire a first positioning result of a carrier of the positioning device, and acceleration and angle parameters of the carrier while walking;
  • The positioning method integrates the pedestrian dead reckoning technology with other positioning technologies by determining the position of the carrier of the positioning device based on both the position determined according to the first positioning result and the position determined based on the pedestrian dead reckoning algorithm, and the virtual boundary is thereby constructed. Because the pedestrian dead reckoning technology is not easily affected by the external environment, it can compensate for the lack of precision when other positioning technologies are affected by the environment, so that the positioning accuracy is high, the boundary is constructed accurately, physical boundaries are not required, and the complexity of the user operation is reduced.
  • determining the location of the carrier of the positioning device according to the third location and the fourth location may include: according to the third location and the fourth location The fusion determines the location of the carrier of the positioning device.
  • determining the location of the carrier of the positioning device according to the fusion of the third location and the fourth location may include: according to the third location and the A weighted sum of four positions determining the position of the carrier of the positioning device.
  • the method further includes determining, according to a quality of the first positioning result, a weight of each of the third location and the fourth location in the weighted sum.
  • In one possible implementation, determining the weight of each of the third position and the fourth position in the weighted sum according to the quality of the first positioning result may include: as the quality of the first positioning result increases, increasing the weight of the third position in the weighted sum and reducing the weight of the fourth position in the weighted sum; and as the quality of the first positioning result decreases, reducing the weight of the third position in the weighted sum and increasing the weight of the fourth position in the weighted sum.
  • the first positioning module is a satellite positioning module
  • the first positioning result is a satellite positioning result
  • the method may further include: determining, according to one or both of a number of satellites received by the satellite positioning module and a positioning state of the satellite positioning module, the satellite positioning result of the satellite positioning module. quality.
  • In one possible implementation, determining the location of the carrier of the positioning device according to the third position and the fourth position may include: determining the position of the carrier of the positioning device at each stride point according to the third position at the stride point and the fourth position at the stride point; and interpolating the positions of the carrier of the positioning device at the stride points to obtain the position of the carrier of the positioning device; wherein a stride point is a feature point of each step of the carrier.
  • In one possible implementation, determining the location of the carrier of the positioning device according to the third position and the fourth position may include: interpolating the fourth positions at the stride points to obtain interpolated fourth positions; and determining the position of the carrier of the positioning device according to the third position and the interpolated fourth positions; wherein a stride point is a feature point of each step of the carrier.
  • In one possible implementation, the method may further include: acquiring model parameters of a step-frequency model of the carrier of the positioning device, wherein the step-frequency model represents the relationship between the step frequency and the step size of the carrier; and determining, according to the acceleration and angle parameters, the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm includes: determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model and the acceleration and angle parameters.
  • In one possible implementation, determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the model parameters of the step-frequency model and the acceleration and angle parameters may include: acquiring a starting position of the pedestrian dead reckoning algorithm; determining the real-time heading of the carrier according to the angle parameter; determining the real-time step frequency of the carrier according to the acceleration; determining the real-time step size of the carrier using the step-frequency model according to the real-time step frequency and the model parameters of the step-frequency model; and determining the fourth position of the carrier of the positioning device based on the pedestrian dead reckoning algorithm according to the real-time heading, the real-time step size, and the starting position.
  • acquiring the model parameters of the step-length/step-frequency model of the carrier of the positioning device may include: determining the step frequency and the stride points of the carrier according to the acceleration measured by the sensor module; determining the step length of the carrier according to the first positioning results corresponding to the stride points; and determining the model parameters of the step-length/step-frequency model according to the step frequency and the step length, wherein a stride point is a feature point of each step of the carrier.
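The step-length/step-frequency model and the PDR position update described above can be sketched as follows. This is a minimal illustration assuming the common linear model l = k·f + b between step length l and step frequency f; the stride samples and function names are hypothetical, not taken from the disclosure:

```python
import math

def fit_step_model(frequencies, lengths):
    """Least-squares fit of the linear step-length/step-frequency model l = k*f + b."""
    n = len(frequencies)
    mean_f = sum(frequencies) / n
    mean_l = sum(lengths) / n
    cov = sum((f - mean_f) * (l - mean_l) for f, l in zip(frequencies, lengths))
    var = sum((f - mean_f) ** 2 for f in frequencies)
    k = cov / var
    b = mean_l - k * mean_f
    return k, b

def pdr_step(x, y, heading, step_freq, k, b):
    """Advance one stride from (x, y) along `heading` (radians): the fourth position."""
    step_len = k * step_freq + b          # real-time step length from the model
    return x + step_len * math.cos(heading), y + step_len * math.sin(heading)

# Hypothetical stride samples: (step frequency in Hz, step length in m)
k, b = fit_step_model([1.6, 1.8, 2.0], [0.60, 0.70, 0.80])
x, y = pdr_step(0.0, 0.0, 0.0, 1.8, k, b)   # one stride along heading 0 at 1.8 Hz
```

In the disclosure's terms, the step lengths used for fitting would come from the first positioning results at the stride points, and the step frequencies from the sensor module's acceleration.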
  • in the second mode, for positioning the autonomous walking device, the method further includes: determining the position of the autonomous walking device according to at least one of the first positioning result and an inertial positioning result, wherein the inertial positioning result is determined based on an inertial navigation estimation algorithm from at least the acceleration and the angle parameter output by the sensor module.
  • the inertial navigation estimation algorithm includes an INS algorithm.
  • the first positioning module includes a UWB positioning module.
  • the pedestrian dead reckoning algorithm includes a PDR algorithm.
  • the above method can be performed by the positioning device 1100, for example by the processing module 1103 in the positioning device.
  • the processing module 1103 may be implemented as a dedicated hardware circuit that performs the above method, or may perform the above method by executing logic instructions.
  • for an exemplary description of the above method, reference may be made to the description of the positioning device 1100 above, which will not be repeated here.
  • an application example according to an embodiment of the present disclosure is described below in conjunction with FIGS. 16a and 16b. Those skilled in the art should understand that the application example is merely for ease of understanding and is not intended to limit the disclosure.
  • the application example illustrates an exemplary scenario in which the first positioning module is a GPS module, the angle parameter is an angular velocity, and positioning is performed by switching between GPS and PDR as the quality of the GPS positioning result changes.
  • when the virtual boundary position needs to be determined using the positioning device, the user carries the positioning device to the boundary 50' and walks along the boundary 50' until a starting point S0 is found where the GPS positioning result meets the quality condition; the positioning device can prompt the user that the GPS positioning quality is good (a strong GPS signal area, or high GPS positioning accuracy) and that positioning of the boundary can begin.
  • the user can command the positioning device to start the boundary positioning, for example by triggering a button on the positioning device, and the positioning device can take the GPS positioning result at the time the command is received and the measurement result of the sensor module as initial measurement values, with point S0 as the measurement starting point.
  • the positioning device may obtain and record the GPS positioning result (x_t, y_t) in real time and store it as the user's location; at the same time, the positioning device determines the quality state of the GPS positioning result in real time from the output of the GPS module, and measures the acceleration Acc and the angular velocity Gyr through the sensor module to determine the initial heading θ_0 and the model parameters k, b of the step-length/step-frequency model.
  • when the positioning device determines that the GPS positioning result no longer satisfies the quality condition (a weak GPS signal area, or low GPS positioning accuracy), the positioning device takes the last GPS positioning result that satisfied the quality condition as the starting point, obtains the real-time heading θ_t from the initial heading θ_0 and the angular velocity data ω, determines the real-time step length l_t from the real-time step frequency using the step-length/step-frequency model, calculates the real-time coordinate point (x'_t, y'_t) from the real-time heading and the real-time step length, and stores it as the user's location.
  • when the GPS positioning result satisfies the quality condition again, the positioning device can obtain both the PDR positioning result and the GPS positioning result at that time and, according to the deviation between the two, correct the PDR positioning results obtained while the quality condition was not satisfied.
  • the positioning device can automatically determine that the user has returned to the starting point, or the user can inform the positioning device of the return to the starting point, for example by issuing a command to the positioning device to end positioning.
  • the positioning device may perform interpolation processing on the stored PDR positioning results, smooth-filter the GPS positioning results and the interpolated PDR positioning results, obtain the final boundary position data (map boundary points), and store them in the positioning device.
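The interpolation and smoothing step above can be sketched as follows — a simplified illustration (linear interpolation between stride-point positions plus a moving-average filter), not necessarily the exact processing used by the positioning device:

```python
def interpolate(points, samples_per_segment=4):
    """Linearly interpolate extra positions between consecutive stride-point positions."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        for i in range(samples_per_segment):
            t = i / samples_per_segment
            out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    out.append(points[-1])
    return out

def smooth(points, window=3):
    """Moving-average smoothing of a boundary point sequence."""
    half = window // 2
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - half), min(len(points), i + half + 1)
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        out.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return out

# Densify and smooth a short hypothetical boundary segment
boundary = smooth(interpolate([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]))
```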
  • the user can then install the positioning device on the automatic walking device, and the positioning device can use GPS/INS fusion positioning technology to locate the position of the automatic walking device and compare it with the stored boundary positions to determine whether the automatic walking device is in the working area, or to determine the distance between the automatic walking device and the boundary, thereby controlling the movement mode of the automatic walking device.
  • an application example according to an embodiment of the present disclosure is described below in conjunction with FIGS. 16a and 16c. Those skilled in the art should understand that the application example is merely for ease of understanding and is not intended to limit the disclosure.
  • the application example illustrates an exemplary scenario in which the first positioning module is a GPS module, the angle parameter is an angular velocity, and the weights of the GPS positioning result and the PDR positioning result in the weighted sum are changed as the quality of the GPS positioning result changes.
  • when the virtual boundary position needs to be determined using the positioning device, the user carries the positioning device onto the boundary 50' and walks along the boundary 50' until a starting point S0 is found where the GPS positioning result satisfies the quality condition.
  • the positioning device can prompt the user that the GPS positioning quality is good (a strong GPS signal area, or high GPS positioning accuracy) and that positioning of the boundary can be started.
  • the user can command the positioning device to start the boundary positioning, for example by triggering a button on the positioning device, and the positioning device can use the GPS positioning result at the time the command is received and the measurement result of the sensor module as initial measurement values, wherein the GPS positioning result is used as the initial position of the PDR, with point S0 as the measurement starting point.
  • the positioning device can obtain and record the GPS positioning result as the third position in real time.
  • the positioning device can measure the acceleration Acc and the angular velocity Gyr through the sensor module to determine the initial heading θ_0 and the model parameters k and b of the step-length/step-frequency model; obtain the real-time heading θ_t from the initial position of the PDR, the initial heading θ_0, and the angular velocity data ω; determine the real-time step length l_t from the real-time step frequency using the step-length/step-frequency model; and calculate the real-time coordinate point as the fourth position from the real-time heading and the real-time step length.
  • the positioning device can calculate the weighted sum of the third position and the fourth position as the position of the carrier.
  • the positioning device can determine the quality status of the GPS positioning result in real time through the output of the GPS module, and adjust the weight of the third position and the fourth position in the weighted sum according to the quality status.
  • when the positioning device determines that the quality of the GPS positioning result has deteriorated, the positioning device lowers the weight of the third position and increases the weight of the fourth position; the weight of the third position can even be made zero, that is, the GPS positioning result is ignored and the fourth position is stored as the position of the carrier.
  • conversely, when the quality of the GPS positioning result improves, the positioning device can increase the weight of the third position and reduce the weight of the fourth position; if the quality of the GPS positioning result is good enough, the positioning device can even stop the PDR positioning process, or make the weight of the fourth position zero, and store the third position as the carrier's position.
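The quality-dependent weighting above can be sketched as a convex combination of the third (GPS) and fourth (PDR) positions; the scalar quality measure and its mapping to a weight are illustrative assumptions, not values from the disclosure:

```python
def fuse(gps_pos, pdr_pos, gps_quality):
    """Weighted sum of the third (GPS) and fourth (PDR) positions.

    gps_quality in [0, 1]: 1 means GPS is fully trusted (third position only),
    0 means GPS is ignored (fourth position only).
    """
    w = max(0.0, min(1.0, gps_quality))   # weight of the third position
    return tuple(w * g + (1.0 - w) * p for g, p in zip(gps_pos, pdr_pos))

fuse((10.0, 20.0), (11.0, 21.0), 1.0)   # good GPS: third position only
fuse((10.0, 20.0), (11.0, 21.0), 0.0)   # GPS ignored: fourth position only
fuse((10.0, 20.0), (11.0, 21.0), 0.5)   # equal weights
```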
  • the positioning device can automatically determine that the user has returned to the starting point, or the user can inform the positioning device of the return to the starting point, for example by issuing a command to the positioning device to end positioning.
  • the positioning device can obtain the final boundary position data (map boundary point) and store it in the positioning device.
  • the user can then install the positioning device on the automatic walking device, and the positioning device can use GPS/INS fusion positioning technology to locate the position of the automatic walking device and compare it with the stored boundary positions to determine whether the automatic walking device is in the working area, or to determine the distance between the automatic walking device and the boundary, thereby controlling the movement mode of the automatic walking device.
  • FIG. 17 shows a schematic diagram of an exemplary application environment of an autonomous walking device in accordance with an embodiment of the present disclosure.
  • the autonomous walking device 10' may be, for example, an automatic lawn mower, and the autonomous walking device 10' may automatically walk in a working area 30' within the boundary 50' to cut the vegetation on the work surface.
  • when the automatic walking device 10' automatically walks in the working area 30', it can autonomously locate its position, compare its position with the position of the boundary 50' to determine whether it is located in the working area 30' or to determine its distance from the boundary, and adjust its movement mode according to the result of the judgment so as to keep itself in the working area 30'.
  • FIG. 18 shows a block diagram of an autonomous walking device in accordance with an embodiment of the present disclosure.
  • the automatic walking device includes:
  • a visual module 11, configured to acquire visual data of the surrounding environment of the automatic walking device;
  • an inertial navigation module, specifically an IMU module 12, configured to acquire inertial data of the automatic walking device;
  • a satellite navigation module, specifically a GPS module 13, configured to acquire GPS positioning data of the automatic walking device;
  • the processing module 14 is electrically connected to the visual module 11, the IMU module 12, and the GPS module 13.
  • the processing module 14 is configured to:
  • if the GPS positioning data meets the quality condition, fuse the first fusion result and the GPS positioning data to obtain a second fusion result, and determine the second fusion result as the position of the automatic walking device.
  • in this way, visual data, inertial data, and GPS positioning data can be acquired by the visual module 11, the IMU module 12, and the GPS module 13 respectively, and the positioning results can be fused to determine the position of the automatic walking device, so that the automatic walking device achieves precise autonomous positioning.
  • FIG. 19 shows a schematic diagram of an automatic walking device according to an embodiment of the present disclosure.
  • a lawn mower is taken as an example of an automatic walking device.
  • the autonomous walking device may include a vision module 11, an IMU module 12, and a GPS module 13.
  • the vision module 11 can include a visual sensor, for example a global-shutter monochrome CMOS vision sensor, with which visual disturbance can be avoided and accuracy can be improved.
  • the lens of the vision sensor can be a fisheye lens, which increases the viewing angle of the vision sensor and captures more visual information.
  • the present disclosure does not limit the specific type of visual sensor of the vision module 11.
  • the IMU module 12 may include an inertial sensor, which may include a gyroscope and an accelerometer, or may additionally include a geomagnetic sensor and a code wheel on the basis of the gyroscope and the accelerometer.
  • the IMU module 12 may include a 6-axis inertial sensor composed of a gyroscope and an accelerometer, or a 9-axis inertial sensor composed of a gyroscope, an accelerometer, and a geomagnetic sensor.
  • the IMU module 12 can obtain the angular velocity data of the automatic walking device through the gyroscope; the acceleration data of the automatic walking device through the accelerometer; data related to the latitude and longitude of the position where the automatic walking device is located through the geomagnetic sensor; and the speed data of the automatic walking device through the code wheel.
  • the present disclosure does not limit the specific components of the inertial sensor of the IMU module 12.
  • the inertial data acquired by the IMU module 12 may include one or more of velocity, acceleration, angular velocity, and orientation angle.
  • the inertial data may be determined according to the configuration of the inertial sensors constituting the IMU module 12, which is not limited in the present disclosure.
  • the optical axis of the vision module 11, one axis of the IMU module 12, and the center of the GPS module 13 may be on the central axis of the autonomous device, and the optical axis of the vision module 11 may be collinear with one axis of the IMU module 12.
  • the CMOS chip of the vision sensor of the vision module 11 can be mounted back to back with the IMU module 12, ensuring that the optical axis of the CMOS chip is collinear with an axis of the IMU module 12; the optical axis of the CMOS chip, that axis of the IMU module 12, and the center of the GPS module 13 may all be on the central axis of the autonomous device. In this way, the degrees of freedom of the obtained data can be reduced and the algorithm simplified, thereby speeding up the processing flow.
  • the GPS module 13 can be any module capable of implementing GPS based positioning, such as a GPS receiver capable of receiving GPS signals for positioning.
  • the GPS module 13 can be implemented based on the prior art. The standard for whether the GPS positioning data meets the quality condition can be set according to requirements; for example, whether the GPS positioning data satisfies the quality condition can be determined according to the strength of the GPS signal, the number of received satellites, the positioning state, and the like, and the disclosure does not limit this.
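As one hedged example of such a quality condition — the satellite-count threshold and fix-type names here are arbitrary illustrations, not values from the disclosure:

```python
def gps_quality_ok(num_satellites, fix_type, min_sats=6):
    """Illustrative quality condition: enough satellites and a 3D (or RTK) fix."""
    return num_satellites >= min_sats and fix_type in ("3D", "RTK_FIX")

gps_quality_ok(8, "3D")     # enough satellites and a 3D fix
gps_quality_ok(3, "3D")     # too few satellites
gps_quality_ok(10, "2D")    # satellites fine, but only a 2D fix
```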
  • the center of the GPS module 13 can be above the center point of the two drive wheels of the autonomous vehicle. In this way, the positioning position given by the GPS module 13 remains unchanged, or changes only slightly, when the autonomous walking device rotates, thereby simplifying the processing of the data.
  • the processing module 14 may be any processing component capable of performing data processing, such as a single-chip microcomputer, a CPU, an MPU, an FPGA, etc.; the processing module 14 may be implemented by a dedicated hardware circuit, or by a general processing component executing logic instructions, to perform the processing of the processing module 14.
  • the autonomous device may further include a storage module (not shown) to store data generated by the processing module 14, such as visual data, inertial data, GPS positioning data, and the like.
  • processing module 14 can be configured to:
  • Step S501 performing visual positioning according to the visual data to obtain a visual positioning result
  • Step S502 performing inertial positioning according to the inertial data to obtain an inertial positioning result
  • Step S503 combining the visual positioning result and the inertial positioning result to obtain a first fusion result after the fusion;
  • Step S504 if the GPS positioning data meets the quality condition, the first fusion result and the GPS positioning data are merged to obtain a second fusion result after the fusion, and the second fusion result is determined as The position of the autonomous walking device.
  • the vision module 11 can capture a frame of visual image as visual data at intervals T, which can be set by the system according to the nature of the visual sensor.
  • the visual sensor can take 30 frames per minute.
  • the time interval T can be 2 seconds. The present disclosure does not limit the specific value of the time interval T.
  • image feature information and corresponding description information may be extracted from the visual data (e.g., a captured frame of a visual image), wherein the image feature information may be a plurality of feature points in the visual image, which may cover various regions of the entire visual image; the description information may be a descriptor corresponding to each feature point, the feature points being described by their descriptors. Those skilled in the art should understand that the image feature information and the description information can be extracted from the visual data by a feature extraction algorithm well known in the art (for example, ORB, FAST, SURF, SIFT, etc.), which is not limited in the present disclosure.
  • the visual module 11, the IMU module 12, and the GPS module 13 may be initialized first, and the initial positioning positions of the visual module 11, the IMU module 12, and the GPS module 13 are set to zero.
  • the vision module 11 can be initialized based on the visual data as well as the inertial data. For example, based on the visual data of the initial time T1 (for example, a frame of a visual image taken at the initial time T1), the image feature information of the initial time T1 and the corresponding description information may be extracted. Using the description information, the image feature information of the initial time T1 can be matched with the image feature information of the previous time T0, and the correspondence between them can be determined; from this correspondence, the fundamental matrix F between the visual data of the two adjacent moments T0 and T1 can be determined.
  • by integrating the inertial data of the IMU module 12 (for example, the velocity, acceleration, and angular velocity between the previous time T0 and the initial time T1), the baseline distance B between the positions of the automatic walking device at the previous time T0 and the initial time T1 is obtained. From the fundamental matrix F and the baseline distance B, the relative pose between the two image frames of the two adjacent moments T0 and T1 can be obtained.
  • based on the relative pose, the depth information of the image feature information (each feature point) at the initial time T1 can be determined, thereby determining the three-dimensional initial position of each feature point with respect to the initial positioning position of the autonomous walking device (the position at the initial time T1, that is, the coordinate origin).
  • the three-dimensional initial position may include three-dimensional coordinates of the feature points (eg, position and depth of the feature points).
  • in this way, initialization of the visual positioning can be achieved, and the three-dimensional initial positions of the image feature information at the initial time T1 can be determined.
  • step S501 visual positioning is performed based on the visual data to obtain a visual positioning result.
  • the visual positioning can be implemented in an appropriate manner in the related art, and the visual positioning result can be any appropriate result produced by the visual positioning process employed, and the disclosure does not limit this.
  • in step S501, the image feature information (a plurality of feature points and their corresponding coordinates) of the current time Tn and the corresponding description information (descriptors) may be extracted, where n represents the number of image frames taken from the initial time to the current time, n > 1. Using the description information, the image feature information of the current time Tn can be matched with the image feature information of the previous time Tn-1, and the correspondence between the two can be determined.
  • based on the correspondence, a PnP algorithm can be used to calculate the 3D position of the image feature information of the current time Tn relative to the image feature information of the previous time Tn-1; based on this 3D position, a calculation can be performed, for example by a triangulation algorithm, to determine the shooting position (the position of the automatic walking device) at the current time Tn.
  • the 3D position of the image feature information of the current time Tn and the 3D position of the automatic walking device at the current time Tn can also be determined by other well-known algorithms, which is not limited in the present disclosure.
  • the 3D position of the automatic walking device at the current time Tn can be used as a visual positioning result, and the process of visual positioning can be realized.
  • Step S502 performing inertial positioning according to the inertial data to obtain an inertial positioning result.
  • the inertial positioning can be achieved by a suitable method in the related art, and the inertial positioning result can be any suitable result produced by the inertial positioning process employed, which is not limited in the present disclosure.
  • the IMU module 12 may determine the inertial data according to sensor parameters and calibration information; the inertial data may include, for example, the velocity, acceleration, and angular velocity during the period from the previous time Tn-1 to the current time Tn, and the orientation angle (the attitude of the automatic walking equipment) at the current time Tn, and the like.
  • by integrating the inertial data, the displacement and the change in attitude of the autonomous walking device during the period from the previous time Tn-1 to the current time Tn can be acquired; thereby, the inertial positioning position of the autonomous traveling apparatus at the current time Tn relative to the previous time Tn-1 can be predicted.
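The integration step above can be sketched as a first-order planar dead-reckoning update — a simplified illustration assuming a planar model with a known speed and yaw rate, not the IMU module's actual strapdown algorithm:

```python
import math

def integrate_imu(x, y, heading, v, omega, dt):
    """Dead-reckon one IMU interval: integrate yaw rate, then advance along the heading.

    v     -- speed over the interval (m/s)
    omega -- yaw rate from the gyroscope (rad/s)
    dt    -- interval length (s)
    """
    heading += omega * dt                  # attitude change from the gyroscope
    x += v * dt * math.cos(heading)        # displacement along the new heading
    y += v * dt * math.sin(heading)
    return x, y, heading

# Straight travel at 1 m/s for 2 s with no rotation
x, y, h = integrate_imu(0.0, 0.0, 0.0, 1.0, 0.0, 2.0)
```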
  • the inertial positioning position of the autonomous walking device at the current time Tn can be used as the inertial positioning result, and the process of inertial positioning is realized.
  • Step S503 the visual positioning result and the inertial positioning result are merged to obtain a first fusion result after the fusion.
  • step 503 includes:
  • the inertial positioning result is determined as the first fusion result.
  • the extended Kalman filter is an efficient recursive filter (autoregressive filter) that can realize recursive filtering for nonlinear systems.
  • in the extended Kalman filter, the current state can be determined from the previous state and the current control amounts (e.g., an input control amount and an update control amount); the expression of the extended Kalman filter can be represented schematically as equation (1):
  • x_t = f(x_{t-1}, u_t, Δ_t)    (1)
  • where x_t represents the state at the current time t; x_{t-1} represents the state at the previous time t-1; u_t may, for example, represent the input control amount at the current time t; and Δ_t may, for example, represent the update control amount.
  • in the first fusion, the inertial positioning result (for example, the inertial positioning position of the autonomous walking device at the current time Tn) can be used as the input control amount (for example, u_t in equation (1)), and the visual positioning result (for example, the 3D position of the image feature information of the current time Tn relative to the image feature information of the previous time Tn-1) as the update control amount (for example, Δ_t in equation (1)); using equation (1), the state at the current moment (i.e., the first fusion result at the current moment, x_t) is calculated from the state at the previous moment (i.e., the first fusion result at the previous moment, x_{t-1}). Here the "state" (first fusion result) may include any state of interest, such as the 3D position of the autonomous walking device, the 3D positions of the image feature information relative to their three-dimensional initial positions, and the relative position between the IMU module 12 and the vision module 11; the state at the current moment can be used as the first fusion result of the fusion of visual and inertial positioning.
  • during the first extended Kalman filtering, the initial position obtained by the above initialization process may be used as the initial state of the extended Kalman filter, that is, the initial x_{t-1}.
  • the first fusion result may include the fused 3D position of the autonomous device at the current time Tn, the error of the IMU module 12, the relative position between the IMU module 12 and the vision module 11, and other information.
  • in this way, the positioning fusion of vision and inertia can be realized, the fused 3D position of the automatic walking device at the current time Tn determined, and the positioning accuracy improved.
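The fusion cycle above can be sketched in one dimension. This is a linear Kalman predict/update step written in the spirit of equation (1) — the inertial result enters as the input control u_t and the visual result as the update control — with illustrative noise values; it is not the disclosure's full multi-state EKF:

```python
def kf_fuse(x_prev, p_prev, u_t, delta_t, q=0.1, r=0.5):
    """One predict/update cycle.

    x_prev, p_prev -- previous fused state and its variance
    u_t            -- input control (e.g. inertial displacement increment)
    delta_t        -- update control (e.g. visual position measurement)
    q, r           -- illustrative process / measurement noise variances
    """
    # Predict: propagate the previous state with the inertial increment.
    x_pred = x_prev + u_t
    p_pred = p_prev + q
    # Update: correct the prediction with the visual measurement.
    gain = p_pred / (p_pred + r)
    x_new = x_pred + gain * (delta_t - x_pred)
    p_new = (1.0 - gain) * p_pred
    return x_new, p_new

x1, p1 = kf_fuse(0.0, 1.0, u_t=1.0, delta_t=1.2)  # fused state lies between 1.0 and 1.2
```

The second fusion (step S504) follows the same pattern, with the first fusion result as the input control and the GPS position as the update control.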
  • Step S504 if the GPS positioning data meets the quality condition, the first fusion result and the GPS positioning data are merged to obtain a second fusion result after the fusion, and the second fusion result is determined as The position of the autonomous walking device.
  • processing module is further configured to perform the following steps:
  • the first fusion result is determined as the location of the autonomous walking device if the GPS positioning data does not satisfy the quality condition.
  • if the GPS module 13 does not receive a GPS signal, or the GPS signal is weak, it can be considered that the GPS positioning data does not satisfy the quality condition.
  • in this case, the first fusion result of the fusion of the visual and inertial positioning may be taken as the output result and determined as the position of the autonomous walking device at the current time Tn.
  • when the GPS positioning data satisfies the quality condition again, the visual positioning of the visual module 11 can be re-initialized using the GPS positioning data.
  • step 504 includes:
  • if the GPS module 13 normally receives the GPS signal, it can be considered that the GPS positioning data satisfies the quality condition; the first fusion result and the GPS positioning data can then be fused to obtain the second fusion result, and the second fusion result is determined as the position of the autonomous walking device at the current time Tn.
  • the first fusion result and the GPS positioning data may be fused by means of extended Kalman filtering.
  • for example, the first fusion result (e.g., the fused 3D position of the autonomous walking device at the current time Tn) may be taken as the input control amount (e.g., u_t in equation (1)), and the GPS positioning data (e.g., the 2D GPS position at the current time Tn) as the update control amount (e.g., Δ_t in equation (1)); using equation (1), the state x_t at the current moment (i.e., the second fusion result at the current moment) is obtained from the state x_{t-1} at the previous moment (i.e., the second fusion result at the previous moment, for example the GPS positioning position (3D position) of the autonomous device at the previous time Tn-1). The state variable at the current time can be used as the second fusion result of the fusion of vision, inertia, and GPS positioning (for example, the GPS positioning position at the current time Tn).
  • the second fusion result can be determined as the position of the autonomous walking device at the current time Tn.
  • during the first extended Kalman filtering of the second fusion, the first obtained fusion result may be used as the initial state of the extended Kalman filter, that is, the initial x_{t-1}.
  • in the second fusion, the cumulative error of the inertial positioning may be corrected by, for example, a least-squares method, thereby further improving the positioning accuracy.
  • in this way, the positioning fusion of vision, inertia, and GPS can be realized, the position of the automatic walking device at the current time Tn determined, and the positioning accuracy further improved.
  • FIG. 20 shows a flow diagram of one application example of a processing procedure of the processing module 14 in accordance with an embodiment of the present disclosure. It should be understood by those skilled in the art that the application examples are merely for ease of understanding and are not intended to limit the disclosure.
  • the first fusion of the visual positioning result and the inertial positioning result can be performed by extending the Kalman filtering method.
  • then, the first fusion result and the GPS positioning data may be fused by the extended Kalman filtering method to obtain the second fusion result, and the second fusion result is determined as the position of the automatic walking device.
  • in this way, the visual data, the inertial data, and the GPS positioning data can be acquired by the visual module 11, the IMU module 12, and the GPS module 13 respectively, and the positioning results can be fused twice to determine the position of the automatic walking device, enabling the automatic walking device to achieve precise autonomous positioning.
  • a positioning method according to another embodiment of the present disclosure includes:
  • Step S501 performing visual positioning according to the visual data to obtain a visual positioning result
  • Step S502 performing inertial positioning according to the inertial data to obtain an inertial positioning result
  • Step S503 combining the visual positioning result and the inertial positioning result to obtain a first fusion result after the fusion;
• Step S504 if the GPS positioning data meets the quality condition, the first fusion result and the GPS positioning data are fused to obtain the second fusion result after the fusion, and the second fusion result is determined as the position of the autonomous walking device.
  • the method further includes:
  • the first fusion result is determined as the location of the autonomous walking device if the GPS positioning data does not satisfy the quality condition.
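• The decision flow of steps S501-S504 together with this fallback can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the function names are hypothetical, and the extended-Kalman-filter fusions are replaced by a simple weighted average for brevity.

```python
def fuse(a, b, w=0.5):
    """Placeholder fusion: weighted average of two position estimates."""
    return tuple(w * ai + (1 - w) * bi for ai, bi in zip(a, b))

def locate(visual_pos, inertial_pos, gps_pos, gps_ok):
    """Return the device position per steps S501-S504.

    visual_pos / inertial_pos / gps_pos: (x, y) position estimates.
    gps_ok: whether the GPS positioning data meets the quality condition.
    """
    first = fuse(visual_pos, inertial_pos)   # S503: first fusion
    if gps_ok:                               # S504: quality condition met
        return fuse(first, gps_pos)          # second fusion -> device position
    return first                             # fallback: first fusion only

# Usage: with GPS available, both fusion stages run.
pos = locate((1.0, 2.0), (1.2, 2.2), (1.4, 2.4), gps_ok=True)
```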
  • step 503 includes:
• if the visual positioning result is not effective, the inertial positioning result is determined as the first fusion result.
  • step S504 includes:
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • step S503 includes:
  • the visual positioning result and the inertial positioning result are merged by means of extended Kalman filtering.
  • step S504 includes:
  • the first fusion result and the GPS positioning data are fused by means of extended Kalman filtering.
  • FIG. 22 is a block diagram showing a positioning device according to an embodiment of the present disclosure.
  • the device can be implemented by the processing module 14 above.
  • a positioning device according to another embodiment of the present disclosure includes:
  • the visual positioning module 601 is configured to perform visual positioning according to the visual data to obtain a visual positioning result
  • the inertia positioning module 602 is configured to perform inertial positioning according to the inertial data to obtain an inertial positioning result;
  • a first fusion module 603, configured to fuse the visual positioning result and the inertial positioning result to obtain a first fusion result after the fusion;
  • a second fusion module 604 configured to fuse the first fusion result and the GPS positioning data to obtain a second fusion result after the fusion, where the GPS positioning data meets a quality condition,
  • the second fusion result is determined as the position of the autonomous walking device.
• visual data, inertial data, and GPS positioning data can be respectively acquired by the visual module 11, the IMU module 12, and the GPS module 13, and the positioning results are merged twice to determine the position of the automatic walking device, so that the automatic walking device can achieve precise autonomous positioning.
  • the device can also be used to:
  • the first fusion result is determined as the location of the autonomous walking device if the GPS positioning data does not satisfy the quality condition.
  • the first fusion module 603 can be specifically configured to:
• if the visual positioning result is not effective, determine the inertial positioning result as the first fusion result.
  • the second fusion module 604 can be specifically configured to:
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • the first fusion module 603 can be specifically configured to:
  • the visual positioning result and the inertial positioning result are merged by means of extended Kalman filtering.
  • the second fusion module 604 can be specifically configured to:
  • the first fusion result and the GPS positioning data are fused by means of extended Kalman filtering.
  • FIG. 23 is a schematic diagram showing an exemplary application environment of an automatic walking device according to an embodiment of the present disclosure.
• the automatic walking device 10 may be, for example, an automatic lawn mower; the automatic walking device 10 can automatically walk in the work area 30 within the boundary 50 and cut the vegetation on the work surface.
• when the automatic walking device 10 automatically walks in the work area 30, it can autonomously locate its position, compare its own position with the position of the boundary 50 to determine whether it is located in the work area 30 or to determine its distance from the boundary, and adjust its movement mode according to the result of this judgment so as to keep itself in the work area 30.
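• As one hypothetical illustration of this boundary comparison (the disclosure does not specify an algorithm), a polygonal boundary 50 can be tested with a standard ray-casting point-in-polygon check, and the distance to the boundary computed as the distance to the nearest boundary segment:

```python
import math

def inside(point, boundary):
    """Ray-casting test: is (x, y) inside the closed polygon `boundary`?"""
    x, y = point
    n = len(boundary)
    hit = False
    for i in range(n):
        (x1, y1), (x2, y2) = boundary[i], boundary[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses the horizontal ray at y
            xc = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xc:
                hit = not hit
    return hit

def distance_to_boundary(point, boundary):
    """Distance from the point to the nearest boundary segment."""
    px, py = point
    best = math.inf
    n = len(boundary)
    for i in range(n):
        (x1, y1), (x2, y2) = boundary[i], boundary[(i + 1) % n]
        dx, dy = x2 - x1, y2 - y1
        # Project the point onto the segment, clamped to [0, 1]
        t = 0.0 if dx == dy == 0 else max(0.0, min(1.0,
            ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
        best = min(best, math.hypot(px - (x1 + t * dx), py - (y1 + t * dy)))
    return best

# A 10 m x 10 m square work area as an example boundary.
square = [(0, 0), (10, 0), (10, 10), (0, 10)]
```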
  • FIG. 24 shows a block diagram of an automatic walking device in accordance with an embodiment of the present disclosure.
  • the autonomous walking device includes:
  • a laser module 15 for acquiring laser data of a surrounding environment of the automatic walking device
  • the inertial navigation module is specifically an IMU module 12 for acquiring the inertial data of the automatic walking device.
  • the satellite navigation module is specifically configured as a GPS module 13 for acquiring GPS positioning data of the automatic walking device.
  • the processing module 14 is electrically connected to the laser module 15, the IMU module 12, and the GPS module 13.
  • the processing module 14 is configured to:
• the first fusion result and the GPS positioning data are fused to obtain the second fusion result after the fusion, and the second fusion result is determined as the position of the automatic walking device.
• laser data, inertial data, and GPS positioning data can be respectively acquired by the laser module 15, the IMU module 12, and the GPS module 13, and the positioning results are fused to determine the position of the automatic walking device, so that the automatic walking device can achieve precise autonomous positioning.
  • FIG. 25 shows a schematic diagram of an automatic walking apparatus according to an embodiment of the present disclosure.
  • a lawn mower is taken as an example of an automatic walking device.
  • the autonomous walking apparatus may include a laser module 15, an IMU module 12, and a GPS module 13.
  • the laser module 15 may include a laser radar.
• the laser radar may be a circular (360-degree) laser radar; a laser radar capable of acquiring laser data of objects within a 360-degree range of the surrounding environment can improve the accuracy of laser positioning.
  • the present disclosure does not limit the specific type of laser radar of the laser module 15.
• the IMU module 12 may include an inertial sensor, which may include a gyroscope and an accelerometer, and may further include one or both of a geomagnetic sensor and a code wheel in addition to the gyroscope and the accelerometer.
  • the IMU module 12 may include a 6-axis inertial sensor composed of a gyroscope and an accelerometer; and may also include a 9-axis inertial sensor composed of a gyroscope, an accelerometer, and a geomagnetic sensor.
• the IMU module 12 can obtain the angular velocity data of the automatic walking device through the gyroscope; the acceleration data of the automatic walking device can be obtained by the accelerometer; the latitude and longitude data of the position where the automatic walking device is located can be obtained by the geomagnetic sensor; and the speed data of the automatic walking device can be obtained by the code wheel.
  • the present disclosure does not limit the specific components of the inertial sensor of the IMU module 12.
  • the inertial data acquired by the IMU module 12 may include one or more of velocity, acceleration, angular velocity, and orientation angle.
  • the inertial data may be determined according to the condition of the inertial sensor constituting the IMU module 12, which is not limited in the present disclosure.
• the axis of the laser module 15, an axis of the IMU module 12, and the center of the GPS module 13 are on the central axis of the automatic walking device, and the axis of the laser module 15 is collinear with one axis of the IMU module 12.
• the laser radar of the laser module 15 can be installed back-to-back with the IMU module 12, ensuring that the axis of the laser radar is collinear with an axis of the IMU module 12; the axis of the laser radar, that axis of the IMU module 12, and the center of the GPS module 13 can all be on the central axis of the automatic walking device. In this way, the degrees of freedom of the obtained data can be reduced and the algorithm simplified, thereby speeding up the processing flow.
  • the GPS module 13 can be any module capable of implementing GPS based positioning, such as a GPS receiver capable of receiving GPS signals for positioning.
• the GPS module 13 can be implemented based on the prior art. The standard for whether the GPS positioning data meets the quality condition can be set arbitrarily according to requirements; for example, whether the GPS positioning data satisfies the quality condition can be determined according to the strength of the GPS signal, the number of received satellites, the positioning state, etc. The present disclosure does not limit this.
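• A hedged sketch of one possible quality condition of the kind described here (signal strength, satellite count, positioning state); the function, its parameters, and the thresholds are illustrative assumptions, not taken from the disclosure:

```python
def gps_quality_ok(snr_db, num_satellites, fix_type,
                   min_snr=30.0, min_sats=5):
    """Return True if the GPS positioning data is considered usable.

    snr_db: average carrier-to-noise ratio of the tracked satellites.
    num_satellites: number of satellites used in the fix.
    fix_type: positioning state, e.g. "3d", "2d", or "none".
    min_snr / min_sats: assumed thresholds; set per application.
    """
    return fix_type == "3d" and num_satellites >= min_sats and snr_db >= min_snr
```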
• the center of the GPS module 13 can be above the center point between the two drive wheels of the automatic walking device. In this way, the positioning position of the GPS module 13 can remain unchanged, or change only slightly, when the automatic walking device rotates, thereby simplifying the processing of the data.
• the processing module 14 may be any processing component capable of performing data processing, such as a single-chip microcomputer, a CPU, an MPU, an FPGA, etc.; the processing module 14 may be implemented by a dedicated hardware circuit, or by a general-purpose processing component combined with logic instructions that implement the processing of the processing module 14.
  • the autonomous device may further include a storage module (not shown) to store data generated by the processing module 14, such as laser data, inertial data, GPS positioning data, and the like.
  • processing module 14 can be configured to:
  • Step S501 performing laser positioning according to laser data to obtain a laser positioning result
  • Step S502 performing inertial positioning according to the inertial data to obtain an inertial positioning result
  • Step S503 combining the laser positioning result and the inertial positioning result to obtain a first fusion result after fusion;
  • Step S504 if the GPS positioning data meets the quality condition, the first fusion result and the GPS positioning data are fused to obtain the fused second fusion result, and the second fusion result is determined as the The location of the automatic walking device.
• the laser module 15 can measure the distance data between the automatic walking device and objects in the surrounding environment as laser data at time intervals T, where T can be set by the system according to, for example, the nature of the laser radar.
  • the laser radar can measure the distance data between the automatic walking device and the object within 360 degrees of the surrounding environment 30 times per minute, and the time interval T can be 2 seconds.
  • the present disclosure does not limit the specific value of the time interval T.
• point cloud data (a plurality of distance points/feature points) can be extracted from the laser data (e.g., the measured distance data between the automatic walking device and objects within 360 degrees of the surrounding environment), covering as many areas of the environment around the automatic walking device as possible.
  • the point cloud data can be extracted from the laser data by a point cloud data extraction algorithm well known in the art, which is not limited in the disclosure.
  • the laser module 15, the IMU module 12, and the GPS module 13 may be initialized first, and the initial positioning positions of the laser module 15, the IMU module 12, and the GPS module 13 are set to zero.
• the laser module 15 can be initialized based on the laser data and the inertial data. For example, based on the laser data at the initial time T1 (e.g., a set of distance data measured at the initial time T1), the point cloud data at the initial time T1 can be extracted. The point cloud data of the initial time T1 can then be matched with the point cloud data of the previous time T0 to determine the correspondence between the two; based on this correspondence, the basic matrix F between the point cloud data at the two adjacent times T0 and T1 can be determined.
• by integrating the inertial data of the IMU module 12 (for example, the velocity, acceleration, and angular velocity during the period from the previous time T0 to the initial time T1), the baseline distance B between the positions of the automatic walking device at the previous time T0 and the initial time T1 can be obtained.
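• The integration of inertial data into a baseline distance can be sketched as follows (a one-dimensional, constant-heading motion is assumed for simplicity, and the function name is hypothetical; a real implementation would also use the angular-velocity data to rotate body-frame accelerations into the world frame):

```python
def baseline_distance(accels, dt, v0=0.0):
    """Estimate displacement by double-integrating sampled acceleration.

    accels: acceleration samples (m/s^2) between T0 and T1, spaced dt apart.
    v0: velocity at T0.
    """
    v, s = v0, 0.0
    for a in accels:
        s += v * dt + 0.5 * a * dt * dt   # displacement over this step
        v += a * dt                        # velocity update
    return s

# Usage: 1 m/s^2 constant acceleration for 1 s from rest -> B = 0.5 m.
B = baseline_distance([1.0] * 10, 0.1)
```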
  • the relative positions between the two sets of point cloud data at two adjacent moments of the initial time T1 and the previous time T0 can be obtained.
  • the depth information of the point cloud data (each distance point/feature point) at the initial time T1 can be determined, thereby determining the initial positioning position (the position of the initial time T1, that is, the coordinate zero point) of each distance point with respect to the automatic traveling device.
  • a three-dimensional initial position which may include three-dimensional coordinates of the feature points (eg, the position and depth of the distance point).
• in this way, the initialization of the laser positioning can be achieved, and the three-dimensional initial positions of the feature points at the initial time T1 can be determined.
• in step S501, laser positioning is performed based on the laser data to obtain a laser positioning result.
  • the laser positioning can be achieved by a suitable method in the related art, and the laser positioning result can be any suitable result produced by the laser positioning process employed, which is not limited in the present disclosure.
• the point cloud data (a plurality of distance points/feature points and corresponding coordinates) of the current time Tn can be extracted, where n indicates the number of laser data measurements from the initial time to the current time, n > 1.
  • the point cloud data of the current time Tn can be matched with the point cloud data of the previous time Tn-1, and the correspondence relationship between the point cloud data of the current time Tn and the point cloud data of the previous time Tn-1 can be determined.
• the coordinate position (2D position) of the point cloud data of the current time Tn may be calculated by, for example, a PNP algorithm to obtain the 3D position of the point cloud data of the current time Tn relative to the point cloud data of the previous time Tn-1; based on this 3D position, a calculation can be performed, for example by a triangulation algorithm, to determine the 3D position of the measurement position (the position of the automatic walking device) at the current time Tn.
  • the 3D position of the point cloud data of the current time Tn and the 3D position of the automatic walking device at the current time Tn can also be determined by other well-known algorithms, which is not limited in the present disclosure.
  • the 3D position of the automatic walking device at the current time Tn can be used as a laser positioning result, and the laser positioning process can be realized.
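• Once point correspondences between the clouds at Tn-1 and Tn are known, the rigid motion of the device can be recovered in closed form. The following 2-D Procrustes/Kabsch-style sketch is an illustrative stand-in for the matching-based positioning described above (the correspondence search itself is elided, and the function name is hypothetical):

```python
import math

def rigid_motion_2d(prev_pts, curr_pts):
    """Return (theta, tx, ty): the rotation and translation mapping
    prev_pts onto curr_pts, given matched point pairs."""
    n = len(prev_pts)
    # Centroids of both clouds
    cx1 = sum(p[0] for p in prev_pts) / n
    cy1 = sum(p[1] for p in prev_pts) / n
    cx2 = sum(p[0] for p in curr_pts) / n
    cy2 = sum(p[1] for p in curr_pts) / n
    # Accumulate the 2-D cross-covariance terms
    sxx = sxy = 0.0
    for (x1, y1), (x2, y2) in zip(prev_pts, curr_pts):
        ax, ay = x1 - cx1, y1 - cy1
        bx, by = x2 - cx2, y2 - cy2
        sxx += ax * bx + ay * by        # cosine component
        sxy += ax * by - ay * bx        # sine component
    theta = math.atan2(sxy, sxx)
    c, s = math.cos(theta), math.sin(theta)
    # Translation that maps the rotated prev centroid onto the curr centroid
    tx = cx2 - (c * cx1 - s * cy1)
    ty = cy2 - (s * cx1 + c * cy1)
    return theta, tx, ty
```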
  • Step S502 performing inertial positioning according to the inertial data to obtain an inertial positioning result.
  • the inertial positioning can be achieved by a suitable method in the related art, and the inertial positioning result can be any suitable result produced by the inertial positioning process employed, which is not limited in the present disclosure.
• the IMU module 12 may determine the inertial data according to sensor parameters and calibration information; the inertial data may include, for example, the velocity, acceleration, and angular velocity during the period from the previous time Tn-1 to the current time Tn, and the orientation angle (the attitude of the automatic walking device) at the current time Tn.
• the displacement and the amount of change in attitude of the automatic walking device during the period from the previous time Tn-1 to the current time Tn can be acquired; thereby, the inertial positioning position of the automatic walking device at the current time Tn relative to the previous time Tn-1 can be predicted.
  • the inertial positioning position of the autonomous walking device at the current time Tn can be used as the inertial positioning result, and the process of inertial positioning is realized.
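• The inertial prediction can be illustrated with a simple dead-reckoning pose update. A unicycle motion model is assumed here for the sketch; the disclosure only states that displacement and attitude change are integrated:

```python
import math

def predict_pose(x, y, heading, speed, angular_velocity, dt):
    """Dead-reckon one step from Tn-1 to Tn; returns (x, y, heading).

    speed and angular_velocity come from the inertial data;
    heading is the orientation angle in radians.
    """
    x += speed * math.cos(heading) * dt   # displacement along heading
    y += speed * math.sin(heading) * dt
    heading += angular_velocity * dt      # attitude change
    return x, y, heading

# Usage: driving straight east at 1 m/s for 2 s.
pose = predict_pose(0.0, 0.0, 0.0, 1.0, 0.0, 2.0)
```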
  • Step S503 combining the laser positioning result and the inertial positioning result to obtain a first fusion result after the fusion.
  • step 503 includes:
• if the laser positioning result is effective, the laser positioning result and the inertial positioning result are fused to obtain the first fusion result after fusion;
• if the laser positioning result is not effective, the inertial positioning result is determined as the first fusion result.
• the extended Kalman filter is a high-efficiency recursive filter (autoregressive filter) that can realize recursive filtering of nonlinear systems.
• the current state can be determined from the previous state and the current control amounts (e.g., an input control amount and an update control amount); the extended Kalman filter can be expressed schematically as equation (1):
• x_t = f(x_{t-1}, u_t, ω_t)    (1)
• where x_t may represent the state at the current time t, x_{t-1} may represent the state at the previous time t-1, u_t may, for example, represent the input control amount at the current time t, and ω_t may, for example, represent the update control amount.
• the inertial positioning result (for example, the inertial positioning position of the automatic walking device at the current time Tn) can be used as the input control amount (for example, u_t in equation (1)), and the laser positioning result (for example, the 3D position of the point cloud data of the current time Tn relative to the point cloud data of the previous time Tn-1) can be used as the update control amount (for example, ω_t in equation (1)); thereby, the state at the current moment (i.e., the first fusion result at the current moment, such as x_t in equation (1)) is calculated from the state at the previous moment (i.e., the first fusion result at the previous moment, such as x_{t-1} in equation (1)).
• the "state" (first fusion result) may include any state of interest, such as the 3D position of the automatic walking device, the 3D position of the point cloud data relative to the three-dimensional initial position, the relative position between the IMU module 12 and the laser module 15, etc., and the state at the current moment can be used as the first fusion result after the fusion of the laser and inertial positioning.
• the initial position obtained by the above initialization process may be used as the initial state of the extended Kalman filter, that is, the initial x_{t-1}.
• the first fusion result may include the fused 3D position of the automatic walking device at the current time Tn, the error of the IMU module 12, the relative position between the IMU module 12 and the laser module 15, and other information.
• in this way, the positioning fusion of the laser and the inertia can be realized, the fused 3D position of the automatic walking device at the current time Tn is determined, and the positioning accuracy is improved.
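• The predict/update structure of this first fusion can be illustrated on a scalar state: the inertial displacement plays the role of the input control amount u_t, and the laser position plays the role of the update amount. The noise values are invented for the sketch; the disclosed filter is the multivariate extended Kalman filter:

```python
def kalman_step(x_prev, p_prev, u_inertial, z_laser, q=0.1, r=0.2):
    """One predict/update cycle on a scalar position state.

    x_prev, p_prev: previous state estimate and its variance.
    u_inertial: predicted displacement from inertial positioning (control).
    z_laser: position measurement from laser positioning.
    q, r: assumed process and measurement noise variances.
    """
    # Predict with the inertial displacement
    x_pred = x_prev + u_inertial
    p_pred = p_prev + q
    # Update with the laser measurement
    k = p_pred / (p_pred + r)            # Kalman gain
    x_new = x_pred + k * (z_laser - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new
```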
• Step S504 if the GPS positioning data meets the quality condition, the first fusion result and the GPS positioning data are fused to obtain the second fusion result after the fusion, and the second fusion result is determined as the position of the automatic walking device.
  • processing module is further configured to perform the following steps:
  • the first fusion result is determined as the location of the autonomous walking device if the GPS positioning data does not satisfy the quality condition.
• when the GPS module 13 does not receive the GPS signal, or the GPS signal is weak, it can be considered that the GPS positioning data does not satisfy the quality condition.
  • the first fusion result of the fusion of the laser and the inertial positioning may be taken as an output result and determined as the position of the autonomous walking device at the current time Tn.
  • the initialization of the laser positioning of the laser module 15 can be re-established by the GPS positioning data.
  • step 504 includes:
• when the GPS module 13 normally receives the GPS signal, it can be considered that the GPS positioning data satisfies the quality condition; the first fusion result and the GPS positioning data can then be fused to obtain the second fusion result after the fusion, and the second fusion result is determined as the position of the automatic walking device at the current time Tn.
  • the first fusion result and the GPS positioning data may be fused by means of extended Kalman filtering.
• the first fusion result (e.g., the fused 3D position of the automatic walking device at the current time Tn) may be used as the input control amount (e.g., u_t in equation (1)), and the GPS positioning data (e.g., the 2D position of the GPS fix at the current time Tn) as the update control amount (e.g., ω_t in equation (1)); using equation (1), the state x_t at the current time (i.e., the second fusion result at the current time) is obtained from the state x_{t-1} at the previous time (i.e., the second fusion result at the previous time). The "state" (second fusion result) may include any state of interest, such as the GPS positioning position (3D position) of the automatic walking device: from the state variable at the previous time Tn-1 (e.g., x_{t-1} in equation (1)), the state variable at the current time Tn (e.g., x_t in equation (1)) is obtained, and the state variable at the current time can be used as the second fusion result of the fusion of laser, inertial, and GPS positioning (e.g., the GPS positioning position at the current time Tn).
  • the second fusion result can be determined as the position of the autonomous walking device at the current time Tn.
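• Because the GPS fix is two-dimensional while the first fusion result is a 3D position, the second fusion can be pictured as correcting only the horizontal components. The fixed gain below is an illustrative stand-in for the Kalman gain, and the function name is hypothetical:

```python
def fuse_with_gps(first_fusion_xyz, gps_xy, gain=0.5):
    """Blend the (x, y) of a 3-D first-fusion position with a 2-D GPS fix.

    gain: assumed relative confidence in the GPS fix (0 = ignore GPS,
    1 = trust GPS fully); the z component is left unchanged.
    """
    x, y, z = first_fusion_xyz
    gx, gy = gps_xy
    return (x + gain * (gx - x), y + gain * (gy - y), z)

# Usage: nudge the horizontal estimate halfway toward the GPS fix.
second = fuse_with_gps((1.0, 2.0, 0.5), (2.0, 3.0))
```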
• the first obtained fusion result may be used as the initial state of the extended Kalman filter, that is, the initial x_{t-1}.
  • the cumulative error of the inertial positioning may be corrected by, for example, a least squares method, thereby further improving the positioning accuracy.
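• One hypothetical form of such a least-squares correction: fit a linear-in-time drift model to the residuals between the inertial track and trusted (e.g., GPS-anchored) positions, then subtract the fitted drift. The linear drift model and the function names are assumptions of this sketch:

```python
def fit_linear_drift(times, residuals):
    """Ordinary least squares fit of residual = a * t + b; returns (a, b)."""
    n = len(times)
    st = sum(times)
    sr = sum(residuals)
    stt = sum(t * t for t in times)
    str_ = sum(t * r for t, r in zip(times, residuals))
    a = (n * str_ - st * sr) / (n * stt - st * st)
    b = (sr - a * st) / n
    return a, b

def correct(positions, times, residuals):
    """Remove the fitted drift from each inertial position sample."""
    a, b = fit_linear_drift(times, residuals)
    return [p - (a * t + b) for p, t in zip(positions, times)]
```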
  • the positioning fusion of laser, inertia and GPS positioning can be realized, the position of the automatic walking device at the current time Tn is determined, and the positioning accuracy is further improved.
  • FIG. 26 shows a flow diagram of one application example of a processing procedure of the processing module 14 in accordance with an embodiment of the present disclosure. It should be understood by those skilled in the art that the application examples are merely for ease of understanding and are not intended to limit the disclosure.
• the laser positioning result and the inertial positioning result can first be fused by the extended Kalman filtering method to obtain the first fusion result.
  • the first fusion result and the GPS positioning data may be fused by the extended Kalman filtering method to obtain the second fusion result after the fusion, and the second fusion result is determined.
• the laser data, the inertial data, and the GPS positioning data can be respectively acquired by the laser module 15, the IMU module 12, and the GPS module 13, and the positioning results are merged twice to determine the position of the automatic walking device, enabling the automatic walking device to achieve precise autonomous positioning.
  • FIG. 27 shows a flow chart of a positioning method according to an embodiment of the present disclosure.
  • the method can be implemented by a processor, such as by processing module 14 above.
  • a positioning method according to another embodiment of the present disclosure includes:
  • Step S501 performing laser positioning according to laser data to obtain a laser positioning result
  • Step S502 performing inertial positioning according to the inertial data to obtain an inertial positioning result
  • Step S503 combining the laser positioning result and the inertial positioning result to obtain a first fusion result after fusion;
• Step S504 if the GPS positioning data meets the quality condition, the first fusion result and the GPS positioning data are fused to obtain the second fusion result after the fusion, and the second fusion result is determined as the position of the automatic walking device.
  • the method further includes:
  • the first fusion result is determined as the location of the autonomous walking device if the GPS positioning data does not satisfy the quality condition.
  • step 503 includes:
• if the laser positioning result is effective, the laser positioning result and the inertial positioning result are fused to obtain the first fusion result after fusion;
• otherwise, the inertial positioning result is determined as the first fusion result.
  • step S504 includes:
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • step S503 includes:
  • the laser positioning result and the inertial positioning result are fused by means of extended Kalman filtering.
  • step S504 includes:
  • the first fusion result and the GPS positioning data are fused by means of extended Kalman filtering.
  • FIG. 28 shows a block diagram of a positioning device in accordance with an embodiment of the present disclosure.
  • the device can be implemented by the processing module 14 above.
  • a positioning device according to another embodiment of the present disclosure includes:
  • the laser positioning module 600 is configured to perform laser positioning according to the laser data to obtain a laser positioning result
  • the inertia positioning module 602 is configured to perform inertial positioning according to the inertial data to obtain an inertial positioning result;
  • a first fusion module 603, configured to fuse the laser positioning result and the inertial positioning result to obtain a first fusion result after the fusion;
• a second fusion module 604, configured to fuse the first fusion result and the GPS positioning data to obtain a second fusion result after the fusion, where the second fusion result is determined as the position of the automatic walking device.
• the laser data, the inertial data, and the GPS positioning data can be respectively acquired by the laser module 15, the IMU module 12, and the GPS module 13, and the positioning results are merged twice to determine the position of the automatic walking device, so that the automatic walking device can achieve precise autonomous positioning.
  • the device can also be used to:
  • the first fusion result is determined as the location of the autonomous walking device if the GPS positioning data does not satisfy the quality condition.
  • the first fusion module 603 can be specifically configured to:
• if the laser positioning result is effective, fuse the laser positioning result and the inertial positioning result to obtain the first fusion result after fusion;
• otherwise, determine the inertial positioning result as the first fusion result.
  • the second fusion module 604 can be specifically configured to:
  • the inertial data includes one or more of velocity, acceleration, angular velocity, and orientation angle.
  • the first fusion module 603 can be specifically configured to:
  • the laser positioning result and the inertial positioning result are fused by means of extended Kalman filtering.
  • the second fusion module 604 can be specifically configured to:
  • the first fusion result and the GPS positioning data are fused by means of extended Kalman filtering.
  • the present disclosure can be a system, method, and/or computer program product.
  • the computer program product can comprise a computer readable storage medium having computer readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
  • the computer readable storage medium can be a tangible device that can hold and store the instructions used by the instruction execution device.
  • the computer readable storage medium can be, for example, but not limited to, an electrical storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
• A non-exhaustive list of computer readable storage media includes: portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), static random access memory (SRAM), portable compact disk read-only memory (CD-ROM), digital versatile disks (DVD), memory sticks, floppy disks, mechanical encoding devices such as punch cards or raised structures in grooves with instructions stored thereon, and any suitable combination of the above.
• a computer readable storage medium, as used herein, is not to be interpreted as a transient signal itself, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., a light pulse through a fiber optic cable), or an electrical signal transmitted through a wire.
  • the computer readable program instructions described herein can be downloaded to a respective computing/processing device from a computer readable storage medium or downloaded to an external computer or external storage device over a network, such as the Internet, a local area network, a wide area network, and/or a wireless network.
  • the network may include copper transmission cables, fiber optic transmissions, wireless transmissions, routers, firewalls, switches, gateway computers, and/or edge servers.
• a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium in each computing/processing device.
• Computer program instructions for performing the operations of the present disclosure may be assembly instructions, instruction set architecture (ISA) instructions, machine instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk, C++, etc., as well as conventional procedural programming languages such as the "C" language or similar programming languages.
  • The computer readable program instructions can execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
  • The remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or wide area network (WAN), or can be connected to an external computer (e.g., through the Internet using an Internet service provider).
  • In some embodiments, customized electronic circuits, such as programmable logic circuits, field programmable gate arrays (FPGA), or programmable logic arrays (PLA), can be personalized by utilizing state information of the computer readable program instructions, and these electronic circuits can execute the computer readable program instructions to implement various aspects of the present disclosure.
  • The computer readable program instructions can be provided to a processor of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, when executed by the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer readable program instructions can also be stored in a computer readable storage medium that directs a computer, programmable data processing apparatus, and/or other devices to operate in a particular manner, such that the computer readable medium storing the instructions constitutes an article of manufacture including instructions that implement aspects of the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • The computer readable program instructions can also be loaded onto a computer, other programmable data processing apparatus, or other device, causing a series of operational steps to be performed on the computer, other programmable apparatus, or other device to produce a computer-implemented process, such that the instructions executed on the computer, other programmable apparatus, or other device implement the functions/acts specified in one or more blocks of the flowcharts and/or block diagrams.
  • Each block in the flowcharts or block diagrams can represent a module, a program segment, or a portion of instructions that contains one or more executable instructions for implementing the specified logical functions.
  • In some alternative implementations, the functions noted in the blocks can also occur in an order different from that illustrated in the drawings. For example, two consecutive blocks may in fact be executed substantially in parallel, or they may sometimes be executed in the reverse order, depending upon the functionality involved.
  • Each block of the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or acts, or by a combination of dedicated hardware and computer instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

一种定位装置及方法以及自动行走设备,其中定位装置包括第一定位模块(101)、传感器模块(102)和处理模块(103)。根据第一定位模块(101)的定位结果和使用传感器模块(102)测量的加速度和角度参数的行人航位推算算法确定的定位结果来确定定位装置的位置,并由此确定自动行走设备的边界。在针对边界的定位过程中引入了不依赖于外部环境的行人航位推算技术,使得行人航位推算技术与其他定位技术相融合来构建虚拟边界,定位精度高,构建的边界精准,且无需布设物理边界,降低用户操作的复杂度。

Description

定位装置及方法以及自动行走设备 技术领域
本公开涉及定位技术领域,尤其涉及一种定位装置及方法以及自动行走设备。
背景技术
随着科学技术的发展,智能的自动行走设备为人们所熟知,由于自动行走设备可以基于自动预先设置的程序执行预先设置的相关任务,无须人为的操作与干预,因此在工业应用及家居产品上的应用非常广泛。工业上的应用如执行各种功能的机器人,家居产品上的应用如割草机、吸尘器等,这些智能的自动行走设备极大地节省了人们的时间,给工业生产及家居生活都带来了极大的便利。
在实际应用中,通常需要这种自动行走设备自动地在一个预设的工作区域内移动,而不离开预设的工作区域,因此就需要确定工作区域的边界和/或确定自身的位置。
相关技术中,可通过布设边界线的方式设置物理边界,然而这种方式给自动行走设备的使用者增加了麻烦。
相关技术中,也可通过GPS(Global Positioning System,全球定位系统)定位技术等卫星定位技术,或UWB(Ultra Wideband,超宽带)定位技术等无线定位技术构建虚拟边界/确定设备的位置,然而卫星定位或无线定位易受环境干扰,在发生信号遮挡等情况时无法保证边界/设备的定位精度。
如何能够简便、精确地确定自动行走设备的工作区域的边界和/或使得自动行走设备能够精确地自主定位,是有待解决的问题。
发明内容
有鉴于此,本公开提出了一种定位装置及方法以及自动行走设备,以简便、精确地确定自动行走设备的工作区域的边界。
本公开的一方面,提出了一种定位装置,其特征在于,所述定位装置能够由携带者携带行走,所述定位装置包括:第一定位模块,用于获取定位装置的携带者的第一定位结果;传感器模块,用于测量定位装置的携带者行走的加速度和角度参数;以及处理模块,被配 置为:在用于确定自动行走设备的工作范围的边界的第一模式下,如果所述第一定位结果满足质量条件,则根据所述第一定位结果确定所述定位装置的携带者的位置;如果所述第一定位结果不满足质量条件,则根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置;以及根据所述定位装置的携带者的位置,确定所述边界。
本公开的另一方面,提出了一种定位装置,其特征在于,所述定位装置能够由携带者携带行走,所述定位装置包括:第一定位模块,用于获取定位装置的携带者的第一定位结果;传感器模块,用于测量定位装置的携带者行走的加速度和角度参数;以及处理模块,被配置为:在用于确定自动行走设备的工作范围的边界的第一模式下,根据所述第一定位结果确定所述定位装置的携带者的第三位置;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置;根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置;以及根据所述定位装置的携带者的位置,确定所述边界。
本公开的另一方面,提出了一种自动行走设备,其特征在于,所述自动行走设备包括设备主体和上述定位装置,其中所述定位装置能够以可拆卸的方式安装于所述设备主体。
本公开的另一方面,提出了一种定位方法,所述方法包括:获取定位装置的携带者的第一定位结果、携带者行走的加速度和角度参数;在用于确定自动行走设备的工作范围的边界的第一模式下,如果所述第一定位结果满足质量条件,则根据所述第一定位结果确定所述定位装置的携带者的位置;如果所述第一定位结果不满足质量条件,则根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置;以及根据所述定位装置的携带者的位置,确定所述边界。
在一种可能的实施方式中,所述方法还可包括:获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;采用所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,可包括:根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置。
在一种可能的实施方式中,采用所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,可包括:当所述第一定位结果不满足质量条件时,获取最近的满足质量条件的第一定位结果作为行人航位推算算法的起始位置;根据所述角度参数确定所述携带者的实时航向;根据所述加速度确定所述携带者的实时步频;根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长;以及根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的位置。
在一种可能的实施方式中,获取所述定位装置的携带者的步长步频模型的模型参数,可包括:根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,所述方法还可包括:在所述第一定位结果从不满足质量条件变为满足质量条件时,获取基于行人航位推算算法确定的所述定位装置的携带者的第一位置和根据所述第一定位模块的第一定位结果确定的所述定位装置的携带者的第二位置;根据所述第一位置和所述第二位置,对在所述第一定位结果不满足质量条件期间基于行人航位推算算法确定的所述定位装置的携带者的位置进行校正。
在一种可能的实施方式中,根据所述定位装置的携带者的位置,确定所述边界,可包括:对基于行人航位推算算法确定的所述定位装置的携带者的位置进行插值处理,得到插值处理后的所述定位装置的携带者的位置;对插值处理后的所述定位装置的携带者的位置和根据所述第一定位结果确定的所述定位装置的携带者的位置进行平滑滤波,以确定所述边界。
在一种可能的实施方式中,所述第一定位模块可为卫星定位模块,所述第一定位结果为卫星定位结果。
在一种可能的实施方式中,所述方法还可包括:根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果是否满足 质量条件。
在一种可能的实施方式中,根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果是否满足质量条件,可包括:在定位状态为指定状态,且卫星数不小于阈值的情况下,判断所述卫星定位结果满足质量条件。
在一种可能的实施方式中,所述方法还可包括:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角度参数、基于惯性导航推算算法确定的。
在一种可能的实施方式中,所述惯性导航推算算法包括INS算法。
在一种可能的实施方式中,所述行人航位推算算法包括PDR算法。
在一种可能的实施方式中,所述第一定位模块包括UWB定位模块。
本公开的另一方面,提出了一种定位方法,所述方法包括:获取定位装置的携带者的第一定位结果、携带者行走的加速度和角度参数;在用于确定自动行走设备的工作范围的边界的第一模式下,根据所述第一定位结果确定所述定位装置的携带者的第三位置;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置;根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置;以及根据所述定位装置的携带者的位置,确定所述边界。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置。
在一种可能的实施方式中,根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置,可包括:根据所述第三位置和所述第四位置的加权和,确定所述定位装置的携带者的位置。
在一种可能的实施方式中,所述方法还包括:根据第一定位结果的质量,确定所述第 三位置和所述第四位置各自在所述加权和中所占的权重。
在一种可能的实施方式中,根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重,可包括:随着所述第一定位结果的质量的提高,增大所述第三位置在所述加权和中所占的权重,并减小所述第四位置在所述加权和中所占的权重;随着所述第一定位结果的质量的降低,减小所述第三位置在所述加权和中所占的权重,并增大所述第四位置在所述加权和中所占的权重。
在一种可能的实施方式中,所述第一定位模块为卫星定位模块,所述第一定位结果为卫星定位结果。
在一种可能的实施方式中,所述方法还可包括:根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果的质量。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:根据跨步点处的所述第三位置和跨步点处的所述第四位置,确定跨步点处所述定位装置的携带者的位置;对跨步点处所述定位装置的携带者的位置进行插值处理,得到所述定位装置的携带者的位置;其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:对跨步点处的所述第四位置进行插值处理,得到插值后的第四位置;根据所述第三位置和所述插值后的第四位置,确定所述定位装置的携带者的位置;其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,所述方法还可包括:获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置,包括:根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置。
在一种可能的实施方式中,根据所述步长步频模型的模型参数、以及所述加速度和所 述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置,可包括:获取行人航位推算算法的起始位置;根据所述角度参数确定所述携带者的实时航向;根据所述加速度确定所述携带者的实时步频;根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长;以及根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的第四位置。
在一种可能的实施方式中,获取所述定位装置的携带者的步长步频模型的模型参数,可包括:根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,所述方法还包括:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角度参数、基于惯性导航推算算法确定的。
在一种可能的实施方式中,所述惯性导航推算算法包括INS算法。
在一种可能的实施方式中,所述第一定位模块包括UWB定位模块。
在一种可能的实施方式中,所述行人航位推算算法包括PDR算法。
本公开的各方面的定位装置、定位方法和自动行走设备在针对边界的定位过程中引入了不依赖于外部环境的行人航位推算技术,使得行人航位推算技术与其他定位技术相融合来构建虚拟边界,定位精度高,构建的边界精准,且无需布设物理边界,降低用户操作的复杂度。
本发明还提供一种自动行走设备,包括:视觉模块,用于获取所述自动行走设备的周围环境的视觉数据;惯性导航模块,用于获取所述自动行走设备的惯性数据;卫星导航模块,用于获取所述自动行走设备的卫星定位数据;处理模块,与所述视觉模块、所述惯性导航模块以及所述卫星导航模块电连接,所述处理模块被配置为:根据所述视觉数据进行视觉定位,获得视觉定位结果;根据所述惯性数据进行惯性定位,获得惯性定位结果;对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;在所述卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,所述处理模块还被配置为:在所述卫星定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果,包括:
在所述视觉定位结果有效的情况下,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述视觉定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,在所述卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,包括:
在所述卫星定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的卫星定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述视觉模块包括单色CMOS视觉传感器。
在一种可能的实施方式中,所述惯性导航模块包括惯性传感器,所述惯性传感器包括陀螺仪和加速度计,或者所述惯性传感器包括陀螺仪和加速度计,以及地磁传感器和码盘中的一者或两者。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,所述视觉模块的光轴、所述惯性导航模块的一轴以及所述卫星导航模块的中心处于所述自动行走设备的中轴线上,所述视觉模块的光轴与所述惯性导航模块的一轴共线。
在一种可能的实施方式中,所述卫星导航模块的中心处于所述自动行走设备的两个驱 动轮的中心点的上方。
在一种可能的实施方式中,对所述视觉定位结果和所述惯性定位结果进行融合包括:
通过扩展卡尔曼滤波的方式对所述视觉定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,对所述第一融合结果和所述卫星定位数据进行融合包括:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述卫星定位数据进行融合。
本发明还提供一种定位方法,包括:根据视觉数据进行视觉定位,获得视觉定位结果;根据惯性数据进行惯性定位,获得惯性定位结果;对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;在卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
在一种可能的实施方式中,所述方法还包括:在所述卫星定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果,包括:
在所述视觉定位结果有效的情况下,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述视觉定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,在所述卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,包括:
在所述卫星定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的卫星定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,对所述视觉定位结果和所述惯性定位结果进行融合包括:
通过扩展卡尔曼滤波的方式对所述视觉定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,对所述第一融合结果和所述卫星定位数据进行融合包括:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述卫星定位数据进行融合。
本发明还提供一种定位装置,包括:视觉定位模块,用于根据视觉数据进行视觉定位,获得视觉定位结果;惯性定位模块,用于根据惯性数据进行惯性定位,获得惯性定位结果;第一融合模块,用于对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;第二融合模块,用于在卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
本发明还提供一种自动行走设备,包括:激光模块,用于获取所述自动行走设备的周围环境的激光数据;惯性导航模块,用于获取所述自动行走设备的惯性数据;卫星导航模块,用于获取所述自动行走设备的卫星定位数据;处理模块,与所述激光模块、所述惯性导航模块以及所述卫星导航模块电连接,所述处理模块被配置为:根据所述激光数据进行激光定位,获得激光定位结果;根据所述惯性数据进行惯性定位,获得惯性定位结果;对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;在所述卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,所述处理模块还被配置为:
在所述卫星定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果,包括:
在所述激光定位结果有效的情况下,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述激光定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,在所述卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,包括:
在所述卫星定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的卫星定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述激光模块包括圆扫激光雷达。
在一种可能的实施方式中,所述惯性导航模块包括惯性传感器,所述惯性传感器包括陀螺仪和加速度计,或者所述惯性传感器包括陀螺仪和加速度计,以及地磁传感器和码盘中的一者或两者。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,所述激光模块的轴心、所述惯性导航模块的一轴以及所述卫星导航模块的中心处于所述自动行走设备的中轴线上,所述激光模块的轴心与所述惯性导航模块的一轴共线。
在一种可能的实施方式中,所述卫星导航模块的中心处于所述自动行走设备的两个驱动轮的中心点的上方。
在一种可能的实施方式中,对所述激光定位结果和所述惯性定位结果进行融合包括:
通过扩展卡尔曼滤波的方式对所述激光定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,对所述第一融合结果和所述卫星定位数据进行融合包括:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述卫星定位数据进行融合。
本发明还提供一种定位方法,包括:根据激光数据进行激光定位,获得激光定位结果;根据惯性数据进行惯性定位,获得惯性定位结果;对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;在卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
在一种可能的实施方式中,所述方法还包括:
在所述卫星定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果,包括:
在所述激光定位结果有效的情况下,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述激光定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,在所述卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,包括:
在所述卫星定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的卫星定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,对所述激光定位结果和所述惯性定位结果进行融合包括:
通过扩展卡尔曼滤波的方式对所述激光定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,对所述第一融合结果和所述卫星定位数据进行融合包括:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述卫星定位数据进行融合。
本发明还提供一种定位装置,包括:激光定位模块,用于根据激光数据进行激光定位,获得激光定位结果;惯性定位模块,用于根据惯性数据进行惯性定位,获得惯性定位结果;第一融合模块,用于对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;第二融合模块,用于在卫星定位数据满足质量条件的情况下,对所述第一融合结果和所述卫星定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
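上述两级融合的递推结构(以前一时刻的第二融合结果为先验,依次融合当前时刻的第一融合结果与卫星定位数据)可以用一个极简的一维卡尔曼式更新草图示意。实际系统采用的是多维扩展卡尔曼滤波,此处的标量状态、噪声方差与函数名均为示意性假设:

```python
def fuse_step(prev_est, prev_var, fused1, fused1_var, gnss, gnss_var):
    """一维递推融合示意:以前一时刻的第二融合结果prev_est为先验,
    依次用当前时刻的第一融合结果fused1与卫星定位数据gnss做两次量测更新,
    返回当前时刻的估计值及其方差。"""
    est, var = prev_est, prev_var
    for z, r in ((fused1, fused1_var), (gnss, gnss_var)):
        k = var / (var + r)          # 卡尔曼增益
        est = est + k * (z - est)    # 量测更新:向观测值修正
        var = (1 - k) * var          # 更新估计方差
    return est, var
```

该草图仅体现“先验—量测更新”的递推思想;真实实现中状态还包含位置、速度、姿态等多个分量,并需引入过程噪声模型。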
根据下面参考附图对示例性实施例的详细说明,本公开的其它特征及方面将变得清楚。
附图说明
包含在说明书中并且构成说明书的一部分的附图与说明书一起示出了本公开的示例性实施例、特征和方面,并且用于解释本公开的原理。
图1示出本公开实施例的一种示例性的应用环境的示意图。
图2示出根据本公开一实施例的定位装置的结构图。
图3示出了处理模块的处理过程的一个示例的流程图。
图4示出了处理模块的处理过程的一个示例的流程图。
图5示出了根据加速度确定跨步点的过程的一个示例的示意图。
图6示出了处理模块的处理过程的一个示例的流程图。
图7示出了通过线性拟合确定初始航向的一个示例的示意图。
图8示出了处理模块的处理过程的一个示例的流程图。
图9a、图9b示出了对行人航位推算算法的定位结果进行校正的示例的示意图。
图10示出了处理模块的处理过程的一个示例的流程图。
图11示出根据本公开一实施例的定位装置的结构图。
图12示出了处理模块的处理过程的一个示例的流程图。
图13示出根据本公开实施例的一种自动行走设备的结构图。
图14示出了根据本公开实施例的一种定位方法的流程图。
图15示出了根据本公开实施例的一种定位方法的流程图。
图16a-图16c,示出根据本公开实施例的应用示例的示意图。
图17示出了根据本公开一实施例的自动行走设备的一种示例性应用环境的示意图。
图18示出了根据本公开一实施例的一种自动行走设备的框图。
图19示出了根据本公开一实施例的一种自动行走设备的示意图。
图20示出了根据本公开一实施例的处理模块的处理过程的一个应用示例的流程图。
图21示出了根据本公开一实施例的一种定位方法的流程图。
图22示出了根据本公开一实施例的一种定位装置的框图。
图23示出了本公开一实施例的一种自动行走设备的一种示例性应用环境的示意图。
图24示出了根据本公开一实施例的一种自动行走设备的框图。
图25示出了根据本公开一实施例的一种自动行走设备的示意图。
图26示出了根据本公开一实施例的处理模块的处理过程的一个应用示例的流程图。
图27示出了根据本公开一实施例的一种定位方法的流程图。
图28示出了根据本公开一实施例的一种定位装置的框图。
具体实施方式
以下将参考附图详细说明本公开的各种示例性实施例、特征和方面。附图中相同的附图标记表示功能相同或相似的元件。尽管在附图中示出了实施例的各种方面,但是除非特别指出,不必按比例绘制附图。
在这里专用的词“示例性”意为“用作例子、实施例或说明性”。这里作为“示例性”所说明的任何实施例不必解释为优于或好于其它实施例。
另外,为了更好的说明本公开,在下文的具体实施方式中给出了众多的具体细节。本领域技术人员应当理解,没有某些具体细节,本公开同样可以实施。在一些实例中,对于本领域技术人员熟知的方法、手段、元件和电路未作详细描述,以便于凸显本公开的主旨。
图1示出本公开实施例的一种示例性的应用环境的示意图。
如图1所示,以自动割草机10’为例,在一种示例性的应用环境中,可利用本公开实施例的定位方法或定位装置确定自动割草机10’的工作区域的虚拟边界50’,边界50’规划出由边界50’围绕而成的工作区域30’和位于边界50’圈外的非工作区域70’。
自动割草机10’在工作区域30’中自动行走时,可将自身位置与边界50’的位置进行比较,以判断自身是否位于工作区域30’内,或判断自身距离边界50’的距离,并根据判断的结果调整移动方式,使自身保持在工作区域30’内。
图2示出根据本公开一实施例的定位装置的结构图。所述定位装置能够由携带者携带 行走。如图2所示,该定位装置100包括:
第一定位模块101,用于获取定位装置100的携带者的第一定位结果;
传感器模块102,用于测量定位装置100的携带者行走的加速度和角度参数;以及
处理模块103,被配置为:
在用于确定自动行走设备的工作范围的边界的第一模式下,如果所述第一定位结果满足质量条件,则根据所述第一定位结果确定所述定位装置的携带者的位置;如果所述第一定位结果不满足质量条件,则根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置;以及
根据所述定位装置的携带者的位置,确定所述边界。
通过根据第一定位结果的质量,基于第一定位结果或基于行人航位推算算法确定定位装置的携带者的位置,根据本公开实施例的定位装置使行人航位推算技术与其他定位技术相融合来构建虚拟边界,由于行人航位推算技术不易受外部环境的影响,能够弥补其他定位技术受到环境影响时的精度不足,使得定位精度高,构建的边界精准,且无需布设物理边界,降低用户操作的复杂度。
行人航位推算算法可包括基于步长(每跨步的长度)与航向进行定位的定位技术。行人行走的步频(单位时间的步数)与步长之间存在关联关系,行人行走的步态具有周期性,在一个步态周期(一个跨步)中,包含脚离地、摆动、后脚跟着地和站立等步态,各种步态对应的行人的加速度不同。因此,通过分析行人加速度的特征,可以获得步态、步频,根据步频可估算步长,根据步长和航向,即可对行人进行定位。在携带者(行人)携带定位装置100沿边界行走的场景下,通过定位装置100中的传感器模块102测量定位装置的携带者行走的加速度和角度参数,结合上述行人航位推算算法的原理,可进行基于行人航位推算算法的定位。行人航位推算算法例如可以是PDR(Pedestrian Dead Reckoning,行人航位推算)算法,或任意其他行人航位推算算法。
在一种可能的实施方式中,定位装置100可工作于包括第一模式、第二模式等多种工作模式。第一模式用于确定自动行走设备的工作范围的边界。在第一模式下,定位装置100 可以从自动行走设备上拆卸下来,由携带者(例如用户)携带(例如手持),携带者沿期望的边界(例如边界50’)行走,在行走的过程中通过定位装置100的定位,即可确定边界的坐标(边界地图)并进行存储。第二模式用于定位自动行走设备的位置。在第二模式下,定位装置可安装在自动行走设备上,定位自动行走设备的位置,判断自动行走设备是否超出边界。各种工作模式可以通过手动或自动设定和切换,并可在定位装置上予以提示。本领域技术人员应理解,定位装置的工作模式不限于以上举例。定位装置也可以仅具有第一模式,即仅用于确定边界的位置。
第一定位模块101可以是不基于行人航位推算算法的定位模块,例如任何会受外部环境影响的定位模块。例如,第一定位模块101可以是GPS模块、北斗定位模块、伽利略定位模块等卫星定位模块,再例如,第一定位模块101可以是UWB定位模块等无线定位模块。当第一定位模块101是GPS模块时,其可以是任何能够实现基于GPS定位的模块,例如能够接收GPS信号进行定位的GPS接收机。第一定位模块101可以基于现有技术实现。
传感器模块102可以包括能够测量加速度的部件,例如加速度计,和能够测量角度参数的部件,例如陀螺仪、电子罗盘等。其中角度参数例如可以是角速度、航向角等。举例来说,传感器模块102可以包括陀螺仪和加速度计组成的6轴惯性传感器,也可以包括陀螺仪、加速度计和电子罗盘,也可以包括加速度计和电子罗盘。
处理模块103可以是单片机、CPU、MPU、FPGA等任何能进行数据处理的处理部件,处理模块103可以通过专用硬件电路实现,也可以通过通用处理部件结合可执行逻辑指令实现,以执行处理模块103的处理过程。
在一种可能的实施方式中,在定位装置100处于开机状态的情况下,第一定位模块101和传感器模块102可保持工作状态,以实时获得第一定位结果、加速度和角度参数,实时获取的第一定位结果、加速度和角度参数可传送至处理模块103以进行处理。加速度和角度参数等数据可以采用与第一定位模块101统一的采样时钟进行采样,即处理模块103可获得同一采样时间点的第一定位模块的第一定位结果、加速度测量结果和角度参数测量结果。
在一种可能的实施方式中,定位装置100还可包括存储模块(未示出),以存储处理模 块103生成的数据,例如边界的坐标数据等。
第一定位模块的质量可以采用任意适当方式来确定,例如根据第一定位模块的接收信号或输出信号来确定,本公开对此不做限制。
以第一定位模块为卫星定位模块,第一定位结果为卫星定位结果为例,处理模块103可以根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的卫星定位结果是否满足质量条件。卫星定位模块接收的卫星数反映了卫星信号强弱及信息量,对卫星定位结果的精度有影响,卫星定位模块的定位状态能够反应卫星定位结果的精度。
例如,以GPS模块作为卫星定位模块为例,根据常用的GPS模块参数设置,定位状态可包括:
定位状态0:未定位
定位状态1:GPS单点定位固定解
定位状态2:差分定位
定位状态3:PPS
定位状态4:RTK固定解
定位状态5:RTK浮点解
定位状态6:估计值
定位状态7:手工输入模式
定位状态8:模拟模式
其中部分定位状态(例如定位状态4)可代表GPS定位结果质量很好,部分定位状态(例如定位状态5)可代表GPS定位结果质量较好,部分定位状态(例如定位状态1、2、3、6、7、8)可代表GPS定位结果质量较差。
在一种可能的实施方式中,可设定质量条件,满足该质量条件,则认为第一定位结果质量较好,利用第一定位结果进行定位,不满足该质量条件,则认为第一定位结果质量较差,切换为利用行人航位推算算法进行定位。
例如,处理模块103可以在定位状态为指定状态,且卫星数不小于阈值的情况下,判断所述卫星定位结果满足质量条件。
仍以第一定位模块为GPS模块进行举例来说,可以以例如定位状态4、5作为指定状态,并设定卫星数的阈值,例如6个。处理模块103可以根据GPS模块的输出信号确定卫星数和定位状态。处理模块103可被配置为在定位状态为指定状态(例如定位状态4或5),且卫星数不小于阈值(例如不小于6个)的情况下,判断GPS定位结果(第一定位结果)满足质量条件;否则,在定位状态不为指定状态(例如不为定位状态4,也不为定位状态5),和/或卫星数小于阈值(例如小于6个),可判断GPS定位结果(第一定位结果)不满足质量条件。
以上质量条件仅为示例,本领域技术人员可根据需要设置其他质量条件来判断第一定位结果的质量。
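上述判断逻辑可用如下示意性代码表达(指定状态取定位状态4、5,卫星数阈值取6,均沿用上文示例;函数名与参数名为说明而设):

```python
def gps_quality_ok(fix_status, num_satellites,
                   good_states=(4, 5), min_satellites=6):
    """判断卫星定位结果是否满足质量条件:
    定位状态为指定状态(例如RTK固定解4或RTK浮点解5),
    且接收的卫星数不小于阈值时,返回True。"""
    return fix_status in good_states and num_satellites >= min_satellites
```

例如,定位状态为4且卫星数为8时满足质量条件;定位状态为2时无论卫星数多少均不满足。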
图3示出了处理模块的处理过程的一个示例的流程图。如图3所示,在一种可能的实施方式中,处理模块103可被配置为:判断定位装置的工作模式(S301);如果定位装置处于第一模式下,则可判断第一定位模块101的第一定位结果是否满足质量条件(S302),如果第一定位结果满足质量条件(例如GPS模块的定位状态4或5,且卫星数不小于6个),则根据所述第一定位结果确定所述定位装置100的携带者的位置(S303),并可保存该位置,如果第一定位结果不满足质量条件(例如定位状态不为4,也不为5,和/或卫星数小于6个,也即不满足“定位状态4或5,且卫星数不小于6个”),则根据传感器模块102测量的加速度和角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置(S304),并可保存该位置,可根据所述定位装置的携带者的位置,确定边界。
图4示出了处理模块的处理过程的一个示例的流程图。图4示出了在第一定位结果不满足质量条件的情况下,基于行人航位推算算法进行定位的一个示例性过程。其中与图3相同的附图标记代表相似的步骤。
在一种可能的实施方式中,如图4所示,处理模块103可被配置为:
S401,获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;
其中,如果所述第一定位结果不满足质量条件,则根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,包括:
S3041,如果所述第一定位结果不满足质量条件,则根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置。
步长步频模型的模型参数可以通过各种适当的方式来获得,本公开对此不做限制。
举例来说,可以根据加速度进行步态检测,根据步态检测的结果确定模型参数。如前文所述,行人的步态与加速度之间有着一定的对应关系,跨步的周期性会引起加速度的周期性变化,可以根据所述传感器模块测量的加速度确定携带者的步频和跨步点。图5示出了根据加速度确定跨步点的过程的一个示例的示意图。如图5所示,可设定阈值Ta,当加速度变化周期(跨步周期)内某一加速度采样点a_i为极大值(a_i>a_{i-1}且a_i>a_{i+1}),且该极大值大于阈值Ta(a_i>Ta),则认为该采样点为跨步点,并认为当前周期为有效的步伐周期,实现步态检测。本领域技术人员应理解,跨步点作为所述携带者每一跨步的特征点,也可对应于加速度的其他特征点,而不限于极大值点。可以以跨步点出现的频率作为步频,或者以加速度的变化周期作为步频。可以根据跨步点对应的(同一采样时间点的)第一定位结果确定所述携带者的步长l。可以根据步频f和步长l确定步长步频模型的模型参数。
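上述跨步点检测可概括为如下草图(加速度采样序列与阈值Ta均为假设的示例输入):

```python
def detect_stride_points(acc, ta):
    """在加速度采样序列acc中检测跨步点:
    某采样点为局部极大值(大于前后相邻采样点)且大于阈值ta时,
    记为一个跨步点,返回全部跨步点的下标列表。"""
    strides = []
    for i in range(1, len(acc) - 1):
        if acc[i] > acc[i - 1] and acc[i] > acc[i + 1] and acc[i] > ta:
            strides.append(i)
    return strides
```

相邻跨步点下标之差对应一个跨步周期,据此即可得到步频。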
步长步频模型可以根据需要进行选择,可以是线性模型,也可以是非线性模型,本公开对步长步频模型的具体形式不做限制。
举例来说,步长步频模型可以是如下线性模型:
l=k×f+b          公式1
其中f为步频,l为步长,k,b为模型参数。
可以利用多组步长l和步频f数据,根据最小二乘法计算参数k和b。
举例来说,在确定当前采样时间点对应跨步点时,可获取当前采样时间点对应的第一定位结果,根据相邻跨步点对应的第一定位结果确定步长l,根据相邻跨步点的时间间隔确定步频f,可存储多组步长和步频数据,并可利用存储的多组步长和步频数据根据最小二乘 法计算参数k和b。
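参数k、b的最小二乘拟合可按如下草图实现(多组步频、步长数据为虚构示例):

```python
def fit_step_model(freqs, lengths):
    """用多组(步频f, 步长l)数据按最小二乘法拟合线性模型 l = k*f + b,
    返回模型参数(k, b)。"""
    n = len(freqs)
    mean_f = sum(freqs) / n
    mean_l = sum(lengths) / n
    # 最小二乘解:k为协方差与方差之比,b由均值点确定
    num = sum((f - mean_f) * (l - mean_l) for f, l in zip(freqs, lengths))
    den = sum((f - mean_f) ** 2 for f in freqs)
    k = num / den
    b = mean_l - k * mean_f
    return k, b
```

拟合得到的(k, b)即公式1中的模型参数,可随新采集的步长、步频数据不断更新。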
再举例来说,步长步频模型也可以是如下非线性模型:
l=k×(a_max−a_min)^(1/4)+b          公式2
其中l为步长,k,b为模型参数,a max和a min分别是跨步周期内加速度的最大值和最小值。模型参数k和b可参考上例利用最小二乘法计算。
图6示出了处理模块的处理过程的一个示例的流程图。图6示出了在第一定位结果不满足质量条件的情况下,根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置的过程的一个示例,包括:
S601,当所述第一定位结果不满足质量条件时,获取最近的满足质量条件的第一定位结果作为行人航位推算算法的起始位置。
举例来说,如果在采样时间点t_i判断第一定位结果不满足质量条件,而前一采样时间点t_{i-1}的第一定位结果满足质量条件,则可将采样时间点t_{i-1}的第一定位结果(最近的满足质量条件的第一定位结果)作为行人航位推算算法的起始位置。
S602,根据所述角度参数确定所述携带者的实时航向。
举例来说,在角度参数包括角速度的情况下,实时航向可以是定位装置的携带者的初始航向与从初始航向确定的时刻开始的角速度实时测量结果的积分值之和。
在一种可能的实施方式中,处理模块可被配置为获取定位装置的携带者的初始航向。初始航向可以是定位装置进入第一模式时,定位装置的携带者的初始航向,换言之,可以是携带者对边界进行定位的过程开始时的初始航向。在一个示例中,初始航向可以与上述模型参数同时获得。
举例来说,可以根据第一定位模块的第一定位结果确定所述初始航向,例如,可以根据一段时间(例如10步)内跨步点对应的第一定位结果进行线性拟合,将拟合后的直线斜率的反正切值作为初始航向。图7示出了通过线性拟合确定初始航向的一个示例的示意图。其中圆点代表每一跨步点对应的第一定位结果,θ_0角作为初始航向。
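根据跨步点处第一定位结果作线性拟合并取斜率反正切的初始航向计算,可示意如下(坐标序列为假设示例):

```python
import math

def initial_heading(xs, ys):
    """对一段轨迹点(x, y)作最小二乘直线拟合,
    以拟合直线斜率的反正切作为初始航向(弧度)。"""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    den = sum((x - mean_x) ** 2 for x in xs)
    # atan2等价于atan(斜率num/den),且能处理den趋于0的近竖直方向
    return math.atan2(num, den)
```

例如沿对角线行走的一串坐标点,拟合出的初始航向约为π/4。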
以上确定初始航向或步长步频模型的模型参数的方式依赖于第一定位结果。在一种可 能的实施方式中,定位装置100可包括提醒模块,在第一定位结果满足质量条件时,提醒用户可以开始边界定位,不满足质量条件时,提醒用户不能进行边界定位,并在开始进行边界定位(例如进入第一模式)时获取第一定位结果和加速度来确定初始航向和/或模型参数,以确保获得准确的初始航向和/或模型参数。
然而,本领域技术人员应理解,确定初始航向或步频步长模型的模型参数的方式不限于此。例如在进入所述第一模式时,如果所述第一定位结果不满足质量条件,则可确定所述初始航向为0,步长步频模型的模型参数可采用经验值,等到第一定位结果质量满足条件时可对初始航向和模型参数进行校正。再例如,定位装置还可包括罗盘等用于确定航向的部件,在进入所述第一模式时,如果所述第一定位结果不满足质量条件,可利用罗盘确定所述初始航向,步长步频模型的模型参数可使用经验值,等到第一定位结果质量满足条件时可对初始航向和模型参数进行校正。通过这种方式,用户可以在任意条件下开始使用定位装置100进行边界定位,无需寻找或等到第一定位结果质量较好的起始点。
再举例来说,在角度参数包括航向角的情况下,实时航向可以通过航向角直接确定。例如,可将电子罗盘测量的航向角进行坐标系转换,作为实时航向。
S603,根据所述加速度确定所述携带者的实时步频。举例来说,可以通过上文所举例的步态检测的方式,根据加速度确定实时步频。
其中S601、S602和S603的执行顺序不做限制。
S604,根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长。举例来说,可以将实时步频带入根据上文中的例子确定的步长步频模型(例如公式1或公式2)确定实时步长。
S605,根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的位置。举例来说,可以根据行人航位推算算法,根据实时航向、实时步长和起始位置,确定每一跨步后携带者与起始位置之间的相对位置,从而实现定位。
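S601至S605的推算过程可用如下草图表示:从起始位置出发,每一跨步沿实时航向前进实时步长(航向与步长序列为假设输入):

```python
import math

def pdr_track(start, headings, step_lengths):
    """行人航位推算:从起始位置start=(x0, y0)出发,
    每一跨步按对应的实时航向theta与实时步长l推算下一位置,
    返回包含起始位置在内的轨迹列表。"""
    x, y = start
    track = [(x, y)]
    for theta, l in zip(headings, step_lengths):
        x += l * math.cos(theta)  # 沿航向的x方向位移
        y += l * math.sin(theta)  # 沿航向的y方向位移
        track.append((x, y))
    return track
```

每一跨步的定位结果只依赖上一位置、航向与步长,因此不受卫星信号遮挡影响,但误差会随步数累积。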
通过以上过程,在携带者携带定位装置100沿期望的边界行走一圈的过程中,可获得 第一定位结果质量较高时的第一定位结果和第一定位结果质量较低时的行人航位推算算法的定位结果,由此得到边界的位置。
由于加速度和角度参数的固有零偏,在运行行人航位推算算法过程中可能会出现累积误差。因此,在一种可能的实施方式中,处理模块103还可根据第一定位结果对行人航位推算算法获得的定位结果进行校正,以进一步提高对边界的定位精度。
图8示出了处理模块的处理过程的一个示例的流程图。图8示出的是处理模块进行校正处理的一个示例。如图8所示,所述处理模块103还被配置为:
S801,在所述第一定位结果从不满足质量条件变为满足质量条件时,获取基于行人航位推算算法确定的所述定位装置的携带者的第一位置和根据所述第一定位模块的第一定位结果确定的所述定位装置的携带者的第二位置;
举例来说,如果采样时间点t_1至t_i的第一定位结果不满足质量条件,定位装置100在t_1至t_i基于行人航位推算算法进行定位得到行人航位推算算法的定位结果(例如定位装置100的携带者所在的坐标)P_1-P_i,在采样时间点t_{i+1},处理模块103判断为第一定位结果从不满足质量条件变为了满足质量条件,则可以获取采样时间点t_{i+1}的行人航位推算算法的定位结果P_{i+1}(第一位置)和第一定位结果P'_{i+1}(第二位置)。
S802,根据所述第一位置和所述第二位置,对在所述第一定位结果不满足质量条件期间基于行人航位推算算法确定的所述定位装置的携带者的位置进行校正。
举例来说,可以根据上述第一位置P_{i+1}和第二位置P'_{i+1},对采样时间点t_1至t_i的行人航位推算算法的定位结果P_1-P_i进行校正。校正的具体方式可以根据需要进行选择,本公开对此不做限制。
图9a示出了对行人航位推算算法的定位结果进行校正的一个示例的示意图。如图9a所示,在一个示例中,可以计算第一位置P_{i+1}和第二位置P'_{i+1}之间的偏差D:
D=P'_{i+1}−P_{i+1}           公式3
举例来说,如果第一位置P_{i+1}以坐标(x_{i+1},y_{i+1})表示,第二位置P'_{i+1}以坐标(x'_{i+1},y'_{i+1})表示,则偏差D可表示为(Δx,Δy),其中:
Δx=x'_{i+1}−x_{i+1}
Δy=y'_{i+1}−y_{i+1}           公式4
可将该偏差D平分到在所述第一定位结果不满足质量条件期间基于行人航位推算算法确定的所述定位装置的携带者的位置(行人航位推算算法的定位结果)P_1-P_i中,例如可将每个行人航位推算算法的定位结果P_j(j为1到i中的一个)累加上上述偏差D与行人航位推算算法的定位结果总数i的商,得到校正后的行人航位推算算法的定位结果P'_j(x'_j,y'_j):
x'_j=x_j+j·Δx/i
y'_j=y_j+j·Δy/i          公式5
可保存校正后的行人航位推算算法的定位结果以确定边界。
本领域技术人员应理解,对行人航位推算算法的定位结果的校正方式不限于此,例如也可以是基于非线性均摊算法。图9b示出了对行人航位推算算法的定位结果进行校正的另一个示例的示意图,如图9b所示,可确定行人航位推算算法的定位起始位置(即行人航位推算算法的定位开始前最后一个满足质量条件的第一定位结果)P_0和上述第一位置P_{i+1}之间的第一向量V_1;并确定上述起始位置P_0和上述第二位置P'_{i+1}之间的第二向量V_2,将第一向量、第二向量两个向量之间的角偏差(角β的大小和方向)平分到行人航位推算算法的定位过程中每一个定位结果P_1-P_i中,例如,使得P_0与P_1-P_i之间的i个向量均向着朝向V_2的方向(箭头所示方向A)旋转角度β/i,完成对行人航位推算算法的定位结果的修正。
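图9a所示的线性均摊校正思路可概括为如下草图:将恢复高质量定位时观测到的偏差D按点序线性分摊到此前各PDR定位点上(此处以最后一个PDR定位点近似公式3中的第一位置,轨迹数据为假设示例):

```python
def correct_pdr(points, gnss_fix):
    """用恢复质量后的第一定位结果gnss_fix与最后一个PDR定位点之间的
    偏差D=(dx, dy),按点序线性均摊到全部PDR定位点上:
    第j个点累加偏差的j/n份额(线性均摊仅为一种示意做法)。"""
    n = len(points)
    dx = gnss_fix[0] - points[-1][0]
    dy = gnss_fix[1] - points[-1][1]
    return [(x + dx * (j + 1) / n, y + dy * (j + 1) / n)
            for j, (x, y) in enumerate(points)]
```

校正后,最后一个PDR定位点与高质量的第一定位结果重合,此前各点按比例被逐渐拉向真实轨迹。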
图10示出了处理模块的处理过程的一个示例的流程图。图10示出的是处理模块根据所述定位装置的携带者的位置,确定所述边界的一个示例性过程。如图10所示,所述处理模块103还被配置为:
S1001,对基于行人航位推算算法确定的所述定位装置的携带者的位置进行插值处理,得到插值处理后的所述定位装置的携带者的位置;
S1002,对插值处理后的所述定位装置的携带者的位置和根据所述第一定位结果确定的所述定位装置的携带者的位置进行平滑滤波,以确定所述边界。
由于行人航位推算算法的定位结果以跨步为周期,也就是每一步定位一次,频率约为2Hz,间隔约为0.6m,因此,可进一步对行人航位推算算法的定位结果进行插值处理,得到插值处理后的定位结果,使得到的位置平滑连续。相邻两个行人航位推算算法的定位结果之间的插值个数不做限制,例如可以等于相邻两行人航位推算算法的定位结果之间的时间间隔对应的第一定位结果的个数(或称为原有第一定位结果的个数)。例如,在行人航位推算算法的定位结果P_i和P_{i+1}之间,可依据如下公式进行插值,得到插值处理后的位置P_m:
p_m=p_i+m·Δp            公式6
其中1≤m≤N,Δp=(p_{i+1}−p_i)/(N+1),N为原有第一定位结果的个数。
本领域技术人员也可采用其他适当方式进行插值处理,本公开对插值处理的具体方式不做限制。
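公式6的线性插值可按如下草图实现(其中Δp按使N个插值点均匀落在相邻两点之间取为两点之差除以N+1,这一取法为示意性假设):

```python
def interpolate(p_i, p_next, n):
    """在相邻两个定位结果p_i与p_next之间线性插入n个点:
    p_m = p_i + m*Δp,Δp = (p_next - p_i)/(n+1),1 ≤ m ≤ n,
    返回插值点列表(不含端点)。"""
    dx = (p_next[0] - p_i[0]) / (n + 1)
    dy = (p_next[1] - p_i[1]) / (n + 1)
    return [(p_i[0] + m * dx, p_i[1] + m * dy) for m in range(1, n + 1)]
```

例如在(0, 0)与(3, 0)之间插入2个点,得到(1, 0)与(2, 0),使轨迹在跨步点之间平滑连续。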
通过第一定位结果结合行人航位推算算法的定位结果获得的位置可能存在跳动,因此,可进一步对位置进行平滑滤波,减小数据跳动,得到平滑的边界。
举例来说,可采用动态滑动窗口滤波方法进行平滑滤波。对于边界的不同区域,可采用不同大小的滤波窗口,例如对于边界拐角处的位置(拐角点)可使用较小的窗口(例如窗口大小为2)进行滤波,对于非拐角处的位置(非拐角点)可使用较大的窗口(例如窗口大小为5)进行滤波。平滑滤波的一个示例如以下公式所示,其中P″为平滑滤波后的位置,n为滤波窗口大小:
P″_i=(P_{i−n+1}+P_{i−n+2}+…+P_i)/n          公式7
本领域技术人员也可采用其他适当方式进行平滑滤波,本公开对平滑滤波的具体方式不做限制。
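动态滑动窗口平滑的一种示意实现如下(拐角点用较小窗口、非拐角点用较大窗口,窗口内取均值仅为一种假设的平滑方式,拐角点标记corner_flags由外部判断给出):

```python
def smooth(points, corner_flags, small_win=2, large_win=5):
    """对位置序列做滑动窗口均值滤波:
    拐角点使用较小窗口以保留边界拐角形状,
    非拐角点使用较大窗口以获得更平滑的边界。"""
    smoothed = []
    for i, _ in enumerate(points):
        w = small_win if corner_flags[i] else large_win
        lo = max(0, i - w // 2)          # 窗口起点,防止越过序列开头
        hi = min(len(points), lo + w)    # 窗口终点,防止越过序列结尾
        xs = [p[0] for p in points[lo:hi]]
        ys = [p[1] for p in points[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

窗口大小的取舍体现了平滑程度与边界细节保留之间的折中。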
本领域技术人员可根据使用需要,采取S1001、S1002中的一者或两者进行处理,或者也可不进行S1001、S1002的处理。
图11示出根据本公开一实施例的定位装置的结构图。所述定位装置能够由携带者携带行走。如图11所示,该定位装置1100包括:
第一定位模块1101,用于获取定位装置1100的携带者的第一定位结果;
传感器模块1102,用于测量定位装置1100的携带者行走的加速度和角度参数;以及 处理模块1103,被配置为:
在用于确定自动行走设备的工作范围的边界的第一模式下,根据所述第一定位结果确定所述定位装置的携带者的第三位置;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置;根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置;以及
根据所述定位装置的携带者的位置,确定所述边界。
通过根据第一定位结果确定的位置以及基于行人航位推算算法确定的位置来确定定位装置的携带者的位置,根据本公开实施例的定位装置使行人航位推算技术与其他定位技术相融合来构建虚拟边界,由于行人航位推算技术不易受外部环境的影响,能够弥补其他定位技术受到环境影响时的精度不足,使得定位精度高,构建的边界精准,且无需布设物理边界,降低用户操作的复杂度。
第一定位模块1101、传感器模块1102和处理模块1103的示例,可参见上文针对第一定位模块101、传感器模块102和处理模块103的举例说明。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置。
其中,可以采用任何适当的方式将第三位置和第四位置进行融合,可以是紧密的融合,也可以是相对松散的融合,本公开对此不做限制。
举例来说,根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置,可包括:根据所述第三位置和所述第四位置的加权和,确定所述定位装置的携带者的位置。
举例来说,在采样时间点t_j,根据所述第一定位结果确定所述定位装置的携带者的第三位置P_gj,并根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置P_rj,可通过如下公式8来确定采样时间点t_j定位装置的携带者的位置P_j:
P_j=w_1×P_gj+w_2×P_rj           公式8
其中w_1是第三位置的权重值,w_2是第四位置的权重值,0<w_1<1,0<w_2<1,w_1+w_2=1。
w_1、w_2可以根据需要进行取值,本公开对此不作限制。
在一种可能的实施方式中,w_1、w_2可以是固定值。
在另一种可能的实施方式中,处理模块1103还可被配置为:根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重。
通过这种方式,使得能够针对不同的第一定位数据的质量适当地调整不同定位技术获得的位置在最终确定的位置中所占的比例,更进一步地保证了定位精度。
确定第一定位结果质量的示例可以参见上文。在一种可能的实施方式中,以第一定位模块为卫星定位模块,第一定位结果为卫星定位结果为例,可以根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果的质量。
在一种可能的实施方式中,根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重,可包括:随着所述第一定位结果的质量的提高,增大所述第三位置在所述加权和中所占的权重,并减小所述第四位置在所述加权和中所占的权重;随着所述第一定位结果的质量的降低,减小所述第三位置在所述加权和中所占的权重,并增大所述第四位置在所述加权和中所占的权重。
例如,可以随着第一定位结果质量变好,增大w_1,减小w_2,以增大第三位置在所确定位置中所占的比重,随着第一定位结果质量变差,减小w_1,增大w_2,以增大第四位置在所确定位置中所占的比重。
通过这种方式,可进一步提高定位精度。
在一种可能的实施方式中,如果第一定位结果处于非常好的状态,可使w_1=1,w_2=0,此时可视为完全基于第一定位结果确定携带者的位置。在一个示例中,这种情况下也可停止确定第四位置。如果第一定位结果处于非常差的状态,可使w_1=0,w_2=1,此时可视为完全基于行人航位推算算法确定携带者的位置。
例如,可将第一定位结果的质量分为优(第一质量状态)、一般(第三质量状态)和差(第二质量状态)等多个状态,不同的状态下,可设置w_1、w_2为不同的值。例如可设定两个质量条件,如果满足第一质量条件,则判断第一定位结果处于第一质量状态,使w_1=1,w_2=0,如果不满足第二质量条件,则判断第一定位结果处于第二质量状态,使w_1=0,w_2=1,如果不是以上两种情况(例如不满足第一质量条件,但满足第二质量条件),则判断第一定位数据处于第三质量状态,可使w_1、w_2为大于0、小于1的中间值。以GPS模块作为第一定位模块为例,可以设定卫星数的阈值,例如6个。处理模块103可以根据GPS模块的输出信号确定卫星数和定位状态,如果定位状态为4,认为满足第一质量条件,处理模块103可判断GPS定位结果处于第一质量状态,如果定位状态不为4,也不为5,和/或卫星数小于6个,则认为不满足第二质量条件,处理模块103可判断GPS定位结果处于第二质量状态;如果不是以上两种情况,例如定位状态为5,或者卫星数大于等于6,处理模块103可判断GPS定位结果处于第三质量状态。
以上区分质量状态的条件仅为示例,本领域技术人员可根据需要设置其他条件来区分不同的质量状态,也可设置一个或任意多个质量状态。或者,也可以不设置质量状态,而根据第一定位结果的质量实时计算权重w_1、w_2,本公开对此不做限制。
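公式8的加权融合及按质量状态调整权重,可用如下草图表示(三档质量状态及其对应的权重取值沿用上文示例,属于假设值):

```python
def fuse(p3, p4, quality):
    """按第一定位结果的质量状态为第三位置p3与第四位置p4分配权重
    w1、w2(w1+w2=1),返回加权和作为携带者的位置。权重取值仅为示例。"""
    weights = {"good": (1.0, 0.0),    # 第一质量状态:完全采用第一定位结果
               "medium": (0.5, 0.5),  # 第三质量状态:两者折中
               "bad": (0.0, 1.0)}     # 第二质量状态:完全采用行人航位推算结果
    w1, w2 = weights[quality]
    return (w1 * p3[0] + w2 * p4[0], w1 * p3[1] + w2 * p4[1])
```

质量越好,第三位置的权重越大;质量越差,第四位置的权重越大,与上文的权重调整规则一致。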
如上文所述,行人航位推算算法的定位结果(第四位置)可能仅存在于跨步点处,而第一定位结果(第三位置)可能存在于各个采样时间点处,因此,在基于第三位置和第四位置确定定位装置的携带者的位置的过程中,可进行插值处理。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:根据跨步点处的所述第三位置和跨步点处的所述第四位置,确定跨步点处所述定位装置的携带者的位置;对跨步点处所述定位装置的携带者的位置进行插值处理,得到所述定位装置的携带者的位置;其中所述跨步点为所述携带者每一跨步的特征点。
举例来说,可以根据跨步点处的第四位置以及跨步点处(即与跨步点对应的采样时间点处)的第三位置,根据公式8得到跨步点处的携带者的位置,并例如利用公式6所示的方式或任意适当方式,对各跨步点处的携带者位置进行插值,插值数量可以是跨步点之间的 时间间隔对应的第一定位结果的个数(或称为原有第一定位结果的个数),从而得到连续的定位装置的携带者的位置,保证了结果的连续性。
在另一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:对跨步点处的所述第四位置进行插值处理,得到插值后的第四位置;根据所述第三位置和所述插值后的第四位置,确定所述定位装置的携带者的位置;其中所述跨步点为所述携带者每一跨步的特征点。
举例来说,可以例如利用公式6所示的方式或任意适当方式,对各跨步点处的第四位置进行插值,得到插值后的第四位置,插值数量可以是跨步点之间的时间间隔对应的第一定位结果的个数(或称为原有第一定位结果的个数),可以根据插值后的第四位置以及各采样时间点处的第三位置,根据公式8得到定位装置的携带者的位置,保证了结果的连续性。
在一种可能的实施方式中,可以以与上文获得基于行人航位推算算法的定位结果类似的方式,基于行人航位推算算法获得第四位置。
例如,处理模块1103可以与上文类似地获取所述定位装置的携带者的步长步频模型的模型参数,举例来说,可以根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,示例性说明可参见上文。
可以与上文类似地,根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置。
图12示出了处理模块的处理过程的一个示例的流程图。图12示出了根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置的过程的一个示例,包括:
S1201,获取行人航位推算算法的起始位置。
举例来说,可以以定位装置进入第一模式时的第一定位结果,作为该起始位置,这种情况下可与上文类似地,提醒携带者在第一定位结果满足一定质量条件的情况下再开始对 边界的定位。也可以以定位装置进入第一模式后,最近的满足一定质量条件的第一定位结果作为该起始位置。本公开对起始位置的获取方式不做限制。
S1202,根据所述角度参数确定所述携带者的实时航向。可参见关于S602的描述。
S1203,根据所述加速度确定所述携带者的实时步频。可参见关于S603的描述。
其中S1201、S1202和S1203的执行顺序不做限制。
S1204,根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长。可参见关于S604的描述。
S1205,根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的第四位置。可参见关于S605的描述。
在一种可能的实施方式中,上述各实施例中的定位装置能够安装于自动行走设备,定位装置中的处理模块(例如处理模块103或1103)可被配置为:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角度参数、基于惯性导航推算算法确定的。其中关于第一模式、第二模式的示例性说明可参见上文。
对于第二模式,可根据第一定位结果对自动行走设备本身进行定位,也可以根据惯性定位结果对自动行走设备本身进行定位,也可基于第一定位结果和惯性定位结果两者的融合对自动行走设备本身进行定位。
第一定位结果和惯性定位结果的融合定位例如可以是利用卡尔曼滤波将第一定位结果和加速度、角度参数等惯性数据进行实时融合,将传感器模块采集的加速度和角度参数以及第一定位模块得到的第一定位结果送到卡尔曼滤波器中进行融合,对第一定位结果进行闭环修正,得到修正后的位置、速度和姿态等,以得到精度更高的定位结果,即实现针对第二模式的融合定位系统,以对抗第一定位模块信号遮挡情况下定位精度差的问题。融合定位可基于相关技术实现,此处不再详述。
在一种可能的实施方式中,惯性导航推算算法可包括INS(Inertial Navigation System, 惯性导航系统)算法。
在一种可能的实施方式中,在第一定位模块为GPS模块的情况下,还可利用DGPS(Differential Global Positioning System,差分全球定位系统)、CORS(Continuously Operating Reference Stations,连续运营参考站)等技术进一步提高定位精度。其中,根据DGPS技术,移动站在实际定位时会同时收到GPS卫星信号和来自基准站(位置固定且已知)的载波观测量和基准站的位置,并与自身接收的载波相位观测量形成差分观测量(利用载波相位进行差分的技术也称RTK(Real-time kinematic,载波相位差分)),进而修正载波相位,得到高精度的定位结果。CORS技术是利用多基站网络RTK技术建立的CORS,CORS将若干个固定的、连续运行的GPS参考站,利用现代计算机、数据通信和互联网技术组成差分网络,对移动目标进行修正,大大提高了移动站的定位精度。
图13示出根据本公开实施例的一种自动行走设备的结构图,所述自动行走设备1300包括设备主体200和上文所述的定位装置,例如定位装置100或定位装置1100,其中所述定位装置能够以可拆卸的方式安装于所述设备主体200。
本实施例的自动行走设备可工作在例如图1所示的应用环境中。该自动行走设备的工作区域的边界由定位装置定位(例如,图1中的边界线50’)。
本公开实施例的自动行走设备可以为割草机、吸尘器、工业机器人等多种形式。自动行走设备为割草机时,还可进一步包括切割机构,切割机构可包括切割电机和切割刀片,割草机在边界线50’界定的工作区域30’内工作时,切割电机驱动切割刀片旋转,切割草坪。
本领域技术人员应理解,图13仅示意性地示出了自动行走设备的示意图,自动行走设备的外形、定位装置在自动行走设备上的安装位置、定位装置的外形等在本公开中不做限制。
定位装置与设备主体200之间的安装方式不做限制,例如定位装置可卡接在设备主体的卡槽中,或容纳于设备主体上设置的安装盒中等。定位装置安装在设备主体上时,可与设备主体中的其他电路电连接,例如可与其他电路进行数据或电力通信。
图14示出了根据本公开实施例的一种定位方法的流程图。该方法可应用于上文中的定 位装置100中。所述方法包括:
S1401,获取定位装置的携带者的第一定位结果、携带者行走的加速度和角度参数;
S1402,在用于确定自动行走设备的工作范围的边界的第一模式下,如果所述第一定位结果满足质量条件,则根据所述第一定位结果确定所述定位装置的携带者的位置;如果所述第一定位结果不满足质量条件,则根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置;以及
S1403,根据所述定位装置的携带者的位置,确定所述边界。
通过根据第一定位结果的质量,基于第一定位结果或基于行人航位推算算法确定定位装置的携带者的位置,根据本公开实施例的定位方法使行人航位推算技术与其他定位技术相融合来构建虚拟边界,由于行人航位推算技术不易受外部环境的影响,能够弥补其他定位技术受到环境影响时的精度不足,使得定位精度高,构建的边界精准,且无需布设物理边界,降低用户操作的复杂度。
在一种可能的实施方式中,所述方法还可包括:获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;采用所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,可包括:根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置。
在一种可能的实施方式中,采用所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,可包括:当所述第一定位结果不满足质量条件时,获取最近的满足质量条件的第一定位结果作为行人航位推算算法的起始位置;根据所述角度参数确定所述携带者的实时航向;根据所述加速度确定所述携带者的实时步频;根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长;以及根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的位置。
在一种可能的实施方式中,获取所述定位装置的携带者的步长步频模型的模型参数, 可包括:根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,所述方法还可包括:在所述第一定位结果从不满足质量条件变为满足质量条件时,获取基于行人航位推算算法确定的所述定位装置的携带者的第一位置和根据所述第一定位模块的第一定位结果确定的所述定位装置的携带者的第二位置;根据所述第一位置和所述第二位置,对在所述第一定位结果不满足质量条件期间基于行人航位推算算法确定的所述定位装置的携带者的位置进行校正。
在一种可能的实施方式中,根据所述定位装置的携带者的位置,确定所述边界,可包括:对基于行人航位推算算法确定的所述定位装置的携带者的位置进行插值处理,得到插值处理后的所述定位装置的携带者的位置;对插值处理后的所述定位装置的携带者的位置和根据所述第一定位结果确定的所述定位装置的携带者的位置进行平滑滤波,以确定所述边界。
在一种可能的实施方式中,所述第一定位模块可为卫星定位模块,所述第一定位结果为卫星定位结果。
在一种可能的实施方式中,所述方法还可包括:根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果是否满足质量条件。
在一种可能的实施方式中,根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果是否满足质量条件,可包括:在定位状态为指定状态,且卫星数不小于阈值的情况下,判断所述卫星定位结果满足质量条件。
在一种可能的实施方式中,所述方法还可包括:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角 度参数、基于惯性导航推算算法确定的。
在一种可能的实施方式中,所述惯性导航推算算法包括INS算法。
在一种可能的实施方式中,所述行人航位推算算法包括PDR算法。
在一种可能的实施方式中,所述第一定位模块包括UWB定位模块。
以上方法可通过定位装置100执行,例如通过定位装置中的处理模块103执行。处理模块103可配置为执行以上方法的专用硬件电路,也可以通过执行逻辑指令来执行上述方法。上述方法的示例性说明可参照上文针对定位装置100的说明,在此不再重复。
图15示出了根据本公开实施例的一种定位方法的流程图。该方法可应用于上文中的定位装置1100中。所述方法包括:
S1501,获取定位装置的携带者的第一定位结果、携带者行走的加速度和角度参数;
S1502,在用于确定自动行走设备的工作范围的边界的第一模式下,根据所述第一定位结果确定所述定位装置的携带者的第三位置;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置;根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置;以及
S1503,根据所述定位装置的携带者的位置,确定所述边界。
通过根据第一定位结果确定的位置以及基于行人航位推算算法确定的位置来确定定位装置的携带者的位置,根据本公开实施例的定位方法使行人航位推算技术与其他定位技术相融合来构建虚拟边界,由于行人航位推算技术不易受外部环境的影响,能够弥补其他定位技术受到环境影响时的精度不足,使得定位精度高,构建的边界精准,且无需布设物理边界,降低用户操作的复杂度。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置。
在一种可能的实施方式中,根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置,可包括:根据所述第三位置和所述第四位置的加权和,确定所 述定位装置的携带者的位置。
在一种可能的实施方式中,所述方法还包括:根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重。
在一种可能的实施方式中,根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重,可包括:随着所述第一定位结果的质量的提高,增大所述第三位置在所述加权和中所占的权重,并减小所述第四位置在所述加权和中所占的权重;随着所述第一定位结果的质量的降低,减小所述第三位置在所述加权和中所占的权重,并增大所述第四位置在所述加权和中所占的权重。
在一种可能的实施方式中,所述第一定位模块为卫星定位模块,所述第一定位结果为卫星定位结果。
在一种可能的实施方式中,所述方法还可包括:根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果的质量。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:根据跨步点处的所述第三位置和跨步点处的所述第四位置,确定跨步点处所述定位装置的携带者的位置;对跨步点处所述定位装置的携带者的位置进行插值处理,得到所述定位装置的携带者的位置;其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,可包括:对跨步点处的所述第四位置进行插值处理,得到插值后的第四位置;根据所述第三位置和所述插值后的第四位置,确定所述定位装置的携带者的位置;其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,所述方法还可包括:获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置,包括:根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航 位推算算法确定所述定位装置的携带者的第四位置。
在一种可能的实施方式中,根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置,可包括:获取行人航位推算算法的起始位置;根据所述角度参数确定所述携带者的实时航向;根据所述加速度确定所述携带者的实时步频;根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长;以及根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的第四位置。
在一种可能的实施方式中,获取所述定位装置的携带者的步长步频模型的模型参数,可包括:根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,其中所述跨步点为所述携带者每一跨步的特征点。
在一种可能的实施方式中,所述方法还包括:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角度参数、基于惯性导航推算算法确定的。
在一种可能的实施方式中,所述惯性导航推算算法包括INS算法。
在一种可能的实施方式中,所述第一定位模块包括UWB定位模块。
在一种可能的实施方式中,所述行人航位推算算法包括PDR算法。
以上方法可通过定位装置1100执行,例如通过定位装置中的处理模块1103执行。处理模块1103可配置为执行以上方法的专用硬件电路,也可以通过执行逻辑指令来执行上述方法。上述方法的示例性说明可参照上文针对定位装置1100的说明,在此不再重复。
应用示例1
以下结合图16a和图16b,示出根据本公开实施例的一个应用示例。本领域技术人员应理解,该应用示例仅为了便于理解,不以任何目的限制本公开。
该应用实例示意了第一定位模块为GPS模块,角度参数为角速度,随着GPS定位结果质量的变化,使用GPS和PDR切换进行定位的示例性场景。
在需要利用定位装置确定虚拟边界位置时,用户携带定位装置来到边界50’上,用户可在边界50’上行走,直到找到GPS定位结果满足质量条件的起始点S0,此时定位装置可提示用户GPS定位质量良好(或称强GPS信号区,或GPS定位精度高),可以开始对边界的定位。用户可通过触发定位装置上的按钮等方式,命令定位装置开始进行边界定位,定位装置可以以收到命令时的GPS定位结果和传感器模块的测量结果作为测量的初始值,以S0点作为测量的起始点。
随着用户沿边界50’顺时针行走，定位装置可实时获得并记录GPS定位结果(x_t, y_t)作为用户的位置进行存储，与此同时，定位装置可通过GPS模块的输出实时地判断GPS定位结果的质量状态，并通过传感器模块测量加速度Acc和角速度Gyr，以确定初始航向θ_0和步频步长模型的模型参数k、b。
当用户行走至位置S1，工作区域被房屋或树木遮挡，定位装置判断GPS定位结果不满足质量条件(或称弱GPS信号区，或GPS定位精度低)，定位装置将最后一个满足质量条件的GPS定位结果作为PDR的起始坐标，根据初始航向θ_0和角速度数据Δθ得到实时航向θ_t，利用步频步长模型根据实时步频确定实时步长l_t，根据实时航向和实时步长计算实时坐标点(x'_t, y'_t)，作为用户的位置进行存储。
当用户行走至位置S2,GPS定位结果从不满足质量条件变为满足质量条件,定位装置可获得此时的PDR定位结果和GPS定位结果,并根据两者之间的偏差对此前的不满足质量条件的PDR定位结果进行校正。
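上述校正的一种示意性实现如下：以恢复点处GPS定位结果与PDR定位结果之间的偏差为总校正量，按时间顺序线性分摊到此前存储的各PDR轨迹点上。线性分摊仅是为说明而假设的一种方案，实际可采用其他校正方法：

```python
def correct_pdr_track(track, offset):
    """track: 质量不满足条件期间按序存储的PDR位置列表;
    offset: 恢复点处(GPS位置 - PDR位置)的偏差向量。
    越靠近恢复点的轨迹点, 累积误差越大, 分摊的校正量也越大。"""
    n = len(track)
    corrected = []
    for i, (x, y) in enumerate(track, start=1):
        w = i / n
        corrected.append((x + w * offset[0], y + w * offset[1]))
    return corrected
```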
随着用户沿边界50’行走，上述过程根据GPS定位结果的质量交替进行，直到用户回到起始点S0。定位装置可自动判断用户回到起始点，或者由用户指示定位装置回到起始点，例如用户可对定位装置发出定位结束的命令。定位装置可对所存储的PDR定位结果进行插值处理，对GPS定位结果和插值后的PDR定位结果进行平滑滤波，得到最终的边界位置数据(地图边界点)并存储在定位装置中。
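上述插值与平滑滤波步骤可示意如下，其中以相邻点取中点的线性插值加密轨迹、以滑动平均作为平滑滤波。具体的插值与滤波方法均为便于理解而假设：

```python
def interpolate_and_smooth(points, window=3):
    """对按序存储的位置点先做线性插值加密, 再做滑动平均平滑,
    得到边界位置数据(地图边界点)的一种示意实现。"""
    # 线性插值: 在每对相邻点之间插入中点
    dense = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dense.append((x0, y0))
        dense.append(((x0 + x1) / 2, (y0 + y1) / 2))
    dense.append(points[-1])
    # 滑动平均平滑滤波
    half = window // 2
    smooth = []
    for i in range(len(dense)):
        seg = dense[max(0, i - half): i + half + 1]
        smooth.append((sum(p[0] for p in seg) / len(seg),
                       sum(p[1] for p in seg) / len(seg)))
    return smooth
```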
在需要使用自动行走设备时,用户可将定位装置安装到自动行走设备上,定位装置可利用GPS/INS融合定位技术定位自动行走设备的位置,并与所存储的边界位置进行比对,以判断自动行走设备是否在工作区域中,或者判断自动行走设备与边界之间的距离等,由此可对自动行走设备的运动方式进行控制。
应用示例2
以下结合图16a和图16c,示出根据本公开实施例的一个应用示例。本领域技术人员应理解,该应用示例仅为了便于理解,不以任何目的限制本公开。
该应用实例示意了第一定位模块为GPS模块，角度参数为角速度，随着GPS定位结果质量的变化，调整GPS定位结果和PDR定位结果在加权和中的权重值的示例性场景。
与应用示例1类似地,在需要利用定位装置确定虚拟边界位置时,用户携带定位装置来到边界50’上,用户可在边界50’上行走,直到找到GPS定位结果满足质量条件的起始点S0,此时定位装置可提示用户GPS定位质量良好(或称强GPS信号区,或GPS定位精度高),可以开始对边界的定位。用户可通过触发定位装置上的按钮等方式,命令定位装置开始进行边界定位,定位装置可以以收到命令时的GPS定位结果和传感器模块的测量结果作为测量的初始值,其中GPS定位结果作为PDR的初始位置,以S0点作为测量的起始点。
随着用户沿边界50’顺时针行走，定位装置可实时获得并记录GPS定位结果作为第三位置，与此同时，定位装置可通过传感器模块测量加速度Acc和角速度Gyr，以确定初始航向θ_0和步频步长模型的模型参数k、b，根据PDR的初始位置、初始航向θ_0和角速度数据Δθ得到实时航向θ_t，利用步频步长模型根据实时步频确定实时步长l_t，根据实时航向和实时步长计算实时坐标点作为第四位置。定位装置可计算第三位置和第四位置的加权和作为携带者的位置。
定位装置可通过GPS模块的输出实时地判断GPS定位结果的质量状态，并根据质量状态调整第三位置和第四位置在加权和中的权重。当用户行走至位置S1’，工作区域被房屋或树木遮挡，定位装置判断GPS定位结果质量变差，定位装置将降低第三位置的权重，增大第四位置的权重。
当用户行走至位置S1，GPS定位结果质量变得更差，此时可使第三位置的权重值为0，或忽略GPS定位结果，以第四位置作为携带者的位置进行存储。
当用户行走至位置S2,GPS定位结果质量变好,定位装置可增大第三位置的权重,降低第四位置的权重。如果GPS定位结果质量足够好,定位装置甚至可以停止PDR定位过程,或使第四位置权重为0,利用第三位置作为携带者的位置进行存储。
随着用户沿边界50’行走，上述过程根据GPS定位结果的质量交替进行，直到用户回到起始点S0。定位装置可自动判断用户回到起始点，或者由用户指示定位装置回到起始点，例如用户可对定位装置发出定位结束的命令。定位装置可得到最终的边界位置数据(地图边界点)并存储在定位装置中。
在需要使用自动行走设备时,用户可将定位装置安装到自动行走设备上,定位装置可利用GPS/INS融合定位技术定位自动行走设备的位置,并与所存储的边界位置进行比对,以判断自动行走设备是否在工作区域中,或者判断自动行走设备与边界之间的距离等,由此可对自动行走设备的运动方式进行控制。
需要说明的是,尽管以上以二维坐标(x,y)为例作为表示位置的坐标数据,本领域技术人员应理解,也可以以三维坐标或其他形式的坐标数据来表示位置,并且坐标系可以根据实际需要进行选取,本公开对此不做限制。
图17示出了根据本公开一实施例的自动行走设备的一种示例性应用环境的示意图。
如图17所示,在一种示例性的应用环境中,根据本公开实施例的自动行走设备10’可以例如为自动割草机,自动行走设备10’可以在边界50’范围内的工作区域30’中自动行走,切割位于工作表面上的植被。
当自动行走设备10’在工作区域30’中自动行走时,可以自主定位自身的位置,将自身位置与边界50’的位置进行比较,以判断自身是否位于工作区域30’内,或判断自身距离边界的距离,并根据判断的结果调整移动方式,使自身保持在工作区域30’内。
图18示出了根据本公开一实施例的一种自动行走设备的框图。如图18所示，该自动行走设备包括：
视觉模块11,用于获取所述自动行走设备的周围环境的视觉数据;
惯性导航模块,本实施例中具体为IMU模块12,用于获取所述自动行走设备的惯性数据;
卫星导航模块,本实施例中具体为GPS模块13,用于获取所述自动行走设备的GPS定位数据;
处理模块14,与所述视觉模块11、所述IMU模块12以及所述GPS模块13电连接,所述处理模块14被配置为:
根据所述视觉数据进行视觉定位,获得视觉定位结果;
根据所述惯性数据进行惯性定位,获得惯性定位结果;
对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
根据本公开实施例的自动行走设备,能够通过视觉模块11、IMU模块12以及GPS模块13分别获取视觉数据、惯性数据以及GPS定位数据,并将定位结果进行融合,确定自动行走设备的位置,使得自动行走设备能够实现精确的自主定位。
图19示出了本公开一实施例的一种自动行走设备的示意图。图19中以割草机作为自动行走设备的示例。如图19所示,在该应用示例中,自动行走设备可以包括视觉模块11、IMU模块12以及GPS模块13。
在一种可能的实施方式中,视觉模块11可以包括视觉传感器,例如,该视觉传感器可以是全局快门的单色CMOS视觉传感器,采用该种类的视觉传感器能够避免视觉干扰,提高精度。视觉传感器的镜头可以是鱼眼镜头,从而增大视觉传感器的视角,采集更多的视觉信息。本公开对视觉模块11的视觉传感器的具体类型不做限定。
在一种可能的实施方式中，IMU模块12可以包括惯性传感器，所述惯性传感器可以包括陀螺仪、加速度计，或者，在陀螺仪和加速度计的基础上还可包括地磁传感器和码盘中的一者或两者。举例来说，IMU模块12可以包括陀螺仪和加速度计组成的6轴惯性传感器；也可以包括陀螺仪、加速度计及地磁传感器组成的9轴惯性传感器。例如，IMU模块12可以通过陀螺仪获得自动行走设备的角速度数据；可以通过加速度计获得自动行走设备的加速度数据；可以通过地磁传感器获得自动行走设备所处位置的经纬度数据；可以通过码盘获得自动行走设备的速度数据。本公开对IMU模块12的惯性传感器的具体部件情况不做限定。
在一种可能的实施方式中,IMU模块12获取的惯性数据可以包括速度、加速度、角速度以及朝向角中的一个或多个。惯性数据可以根据构成IMU模块12的惯性传感器的情况确定,本公开对此不做限制。
在一种可能的实施方式中,所述视觉模块11的光轴、所述IMU模块12的一轴以及所述GPS模块13的中心处于所述自动行走设备的中轴线上,所述视觉模块11的光轴与IMU模块12的一轴共线。举例来说,视觉模块11的视觉传感器的CMOS芯片可以与IMU模块12背靠背安装,并保证CMOS芯片的光轴与IMU模块12的某一轴共线;并且,CMOS芯片的光轴、IMU模块12的某一轴以及GPS模块13的中心可以处于自动行走设备的中轴线上。采用这种方式,可以减少所获得数据的自由度,使得算法简化,从而加快处理流程。
在一种可能的实施方式中,GPS模块13可以是任何能够实现基于GPS定位的模块,例如能够接收GPS信号进行定位的GPS接收机。GPS模块13可以基于现有技术实现。GPS定位数据是否满足质量条件的标准,可以根据需要任意设置,例如,可以根据GPS信号的强弱、接收卫星数量、定位状态等判断GPS定位数据是否满足质量条件,本公开对此不做限制。
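与权利要求9所述的判断方式相对应，质量条件判断可以示意为如下代码草图。其中指定的定位状态与卫星数阈值均为说明而设的假设取值，实际取值可根据需要设置：

```python
def gps_quality_ok(fix_status, num_satellites,
                   required_status="FIXED", min_satellites=6):
    """在定位状态为指定状态且接收卫星数不小于阈值时,
    判断GPS定位数据满足质量条件(示意)。"""
    return fix_status == required_status and num_satellites >= min_satellites
```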
在一种可能的实施方式中,GPS模块13的中心可以处于自动行走设备的两个驱动轮的中心点的上方。采用这种方式,在自动行走设备旋转时,GPS模块13的定位位置可以不变或仅有很小的变化,从而简化数据的处理。
在一种可能的实施方式中,处理模块14可以是单片机、CPU、MPU、FPGA等任何能进行数据处理的处理部件,处理模块14可以通过专用硬件电路实现,也可以通过通用处理部件结合可执行逻辑指令实现,以执行处理模块14的处理过程。
在一种可能的实施方式中,自动行走设备还可包括存储模块(未示出),以存储处理模块14生成的数据,例如视觉数据、惯性数据以及GPS定位数据等。
在一种可能的实施方式中,处理模块14可被配置为:
步骤S501,根据所述视觉数据进行视觉定位,获得视觉定位结果;
步骤S502,根据所述惯性数据进行惯性定位,获得惯性定位结果;
步骤S503,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
步骤S504,在所述GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
举例来说,视觉模块11可以每隔时间间隔T拍摄一帧视觉图像作为视觉数据,该时间间隔T可以根据视觉传感器的性质由系统进行默认设定,例如,视觉传感器可以每分钟拍摄30帧视觉图像,则时间间隔T可以为2秒。本公开对时间间隔T的具体取值不做限定。
在一种可能的实施方式中,基于视觉数据(例如拍摄的一帧视觉图像),可以提取图像特征信息和对应的描述信息,其中,图像特征信息可以是视觉图像中的多个特征点,尽可能地覆盖整个视觉图像中的各个区域;描述信息可以是对应于每个特征点的描述子,通过描述子对特征点进行描述。本领域技术人员应当理解,可以通过本领域公知的特征提取算法(例如ORB、FAST、SURF、SIFT算法等)从视觉数据中提取图像特征信息和描述信息,本公开对此不做限定。
在一种可能的实施方式中,可以首先对视觉模块11、IMU模块12以及GPS模块13进行初始化,将视觉模块11、IMU模块12以及GPS模块13的初始定位位置设定为零点。
在一种可能的实施方式中，根据视觉数据以及惯性数据，可以对视觉模块11进行初始化。举例来说，基于初始时刻T1的视觉数据(例如初始时刻T1拍摄的一帧视觉图像)，可以提取初始时刻T1的图像特征信息和对应的描述信息。通过该描述信息，可以对初始时刻T1的图像特征信息与前一时刻T0的图像特征信息进行匹配，确定T1的图像特征信息与前一时刻T0的图像特征信息之间的对应关系；基于该对应关系，可以确定初始时刻T1与前一时刻T0两个相邻时刻的视觉数据之间的基本矩阵F。对IMU模块12的惯性数据(例如前一时刻T0与初始时刻T1期间的速度、加速度以及角速度)进行积分，并结合惯性数据中的初始时刻T1时的朝向角(自动行走设备的姿态)，可以获得自动行走设备前一时刻T0与初始时刻T1的位置之间的基线距离B。根据基本矩阵F和基线距离B，可以获得初始时刻T1与前一时刻T0两个相邻时刻的两帧图像之间的相对位置。进而，可以确定初始时刻T1的图像特征信息(各个特征点)的深度信息，从而确定各个特征点相对于自动行走设备的初始定位位置(初始时刻T1的位置，也即坐标零点)的三维初始位置，该三维初始位置可以包括特征点的三维坐标(例如，特征点的位置和深度)。
以上初始化过程仅为举例说明,本领域技术人员应当理解,可以采用本领域公知的算法获取图像特征信息的三维初始位置,本公开对此不做限定。
通过这种方式,可以实现视觉定位的初始化,确定初始时刻T1的图像特征信息的三维初始位置。
在步骤S501中,根据视觉数据进行视觉定位,获得视觉定位结果。
视觉定位可以通过相关技术中的适当方式实现,视觉定位结果可以是所采用的视觉定位过程产生的任意适当结果,本公开对此不做限制。
举例来说,基于当前时刻Tn的视觉数据(当前时刻Tn拍摄的一帧视觉图像),可以提取当前时刻Tn的图像特征信息(多个特征点以及相对应的坐标)和对应的描述信息(描述子),其中,n表示初始时刻至当前时刻拍摄的图像帧的数量,n>1。通过该描述信息,可以对当前时刻Tn的图像特征信息与前一时刻Tn-1的图像特征信息进行匹配,确定当前时刻Tn的图像特征信息与前一时刻Tn-1的图像特征信息之间的对应关系。
在一种可能的实施方式中，根据前一时刻Tn-1的图像特征信息的三维(3D)位置，当前时刻Tn的图像特征信息的坐标位置(2D位置)，以及上述的对应关系，可以通过例如PNP算法进行计算，从而获得当前时刻Tn的图像特征信息相对于前一时刻Tn-1的图像特征信息的3D位置；基于该3D位置，可以例如通过三角定位算法进行计算，确定拍摄位置(自动行走设备的位置)在当前时刻Tn的3D位置。也可以采用其他公知的算法确定当前时刻Tn的图像特征信息的3D位置，以及自动行走设备在当前时刻Tn的3D位置，本公开对此不作限定。
通过这种方式,可以将自动行走设备在当前时刻Tn的3D位置作为视觉定位结果,实现了视觉定位的过程。
步骤S502,根据所述惯性数据进行惯性定位,获得惯性定位结果。
惯性定位可以通过相关技术中的适当方式实现,惯性定位结果可以是所采用的惯性定位过程产生的任意适当结果,本公开对此不做限制。
举例来说,IMU模块12可以根据传感器参数和标定信息确定惯性数据,该惯性数据可以包括例如前一时刻Tn-1至当前时刻Tn期间的速度、加速度以及角速度,当前时刻Tn时的朝向角(自动行走设备的姿态)等。对该惯性数据中的速度、加速度、角速度等数据进行积分,可以获取前一时刻Tn-1至当前时刻Tn期间自动行走设备的位移和姿态变化量。从而,可以预测出自动行走设备在当前时刻Tn相对于前一时刻Tn-1的惯性定位位置。
通过这种方式,可以将自动行走设备在当前时刻Tn的惯性定位位置作为惯性定位结果,实现了惯性定位的过程。
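上述积分预测过程的一个极简示意如下：由角速度积分更新航向，由加速度积分更新速度，再由速度推算位移增量。此处采用欧拉积分并把运动简化为沿航向的平面运动，仅为说明原理，并非实际的惯性导航实现：

```python
import math

def integrate_imu(pos, vel, heading, acc, gyro, dt):
    """对一个采样周期dt内的惯性数据做一步积分(示意):
    acc: 沿航向的加速度; gyro: 角速度; 返回预测的位置、速度和航向。"""
    heading = heading + gyro * dt            # 角速度积分得到航向
    vel = vel + acc * dt                     # 加速度积分得到速度
    dx = vel * dt * math.cos(heading)        # 速度积分得到位移增量
    dy = vel * dt * math.sin(heading)
    return (pos[0] + dx, pos[1] + dy), vel, heading
```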
步骤S503,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果。
在一种可能的实施方式中，步骤S503包括：
在所述视觉定位结果有效的情况下,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述视觉定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
举例来说,如果视觉定位失败(可能的原因例如包括当前时刻Tn的图像特征信息与前一时刻Tn-1的图像特征信息匹配失败;PNP算法无解;三角定位无解等),则可以直接将惯性定位结果确定为第一融合结果。
在一种可能的实施方式中，如果视觉定位成功，则可以通过例如扩展卡尔曼滤波的方式对视觉定位结果和惯性定位结果进行融合。其中，扩展卡尔曼滤波是一种高效率的递归滤波器(自回归滤波器)，可以实现非线性系统的递归滤波。在扩展卡尔曼滤波中，当前状态可以由上一个状态和当前的控制量(例如输入控制量、更新控制量等)决定，扩展卡尔曼滤波的表达式可以示意性地表示为公式(1)：
x_t = g(u_t, x_{t-1}, ε_t)        (1)
在公式(1)中，x_t可以表示当前时刻t的状态；x_{t-1}可以表示上一个时刻t-1的状态；u_t可以例如表示当前时刻t的输入控制量；ε_t可以例如表示更新控制量。
在一种可能的实施方式中，通过扩展卡尔曼滤波的方式，可以将惯性定位结果(例如自动行走设备在当前时刻Tn的惯性定位位置)作为输入控制量(例如公式(1)中的u_t)；视觉定位结果(例如当前时刻Tn的图像特征信息相对于前一时刻Tn-1的图像特征信息的3D位置)作为更新控制量(例如公式(1)中的ε_t)，从而根据前一时刻的状态(即前一时刻的第一融合结果，例如公式(1)中的x_{t-1})计算出当前时刻的状态(即当前时刻的第一融合结果，例如公式(1)中的x_t)。其中“状态”(第一融合结果)可以包括任何所关心的状态，例如自动行走设备3D位置、图像特征信息相对于三维初始位置的3D位置、以及IMU模块12与视觉模块11之间的相对位置等，并且可以将当前时刻的状态作为视觉和惯性定位融合后的第一融合结果。
在一种可能的实施方式中，可将上述初始化过程获得的初始位置作为扩展卡尔曼滤波的初始状态，即初始的x_{t-1}。
在一种可能的实施方式中,在第一融合结果中,可以包括自动行走设备在当前时刻Tn的融合后的3D位置;IMU模块12的误差;以及IMU模块12与视觉模块11之间的相对位置等信息。
通过这种方式,可以实现视觉和惯性的定位融合,确定自动行走设备在当前时刻Tn的融合后的3D位置,提高了定位精度。
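公式(1)所示的预测-更新结构可以用如下极简的一维卡尔曼滤波草图示意：以惯性定位结果作为输入控制量u_t做预测，以视觉定位结果作为更新控制量ε_t做校正。实际的扩展卡尔曼滤波为多维非线性形式，此处的噪声方差Q、R亦为假设取值：

```python
def ekf_fuse(x_prev, P_prev, u_t, eps_t, Q=0.1, R=0.05):
    """一维线性卡尔曼滤波示意: x_prev、P_prev为前一时刻状态及其方差,
    u_t为惯性位移增量(输入控制量), eps_t为视觉观测位置(更新控制量)。"""
    # 预测步: 用惯性定位结果推进状态
    x_pred = x_prev + u_t
    P_pred = P_prev + Q
    # 更新步: 用视觉定位结果校正
    K = P_pred / (P_pred + R)        # 卡尔曼增益
    x_t = x_pred + K * (eps_t - x_pred)
    P_t = (1.0 - K) * P_pred
    return x_t, P_t
```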
步骤S504,在所述GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,所述处理模块还被配置为执行如下步骤:
在所述GPS定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
举例来说,如果GPS模块13没有收到GPS信号或GPS信号较弱,则可以认为GPS定位数据不满足质量条件。此时,可以将视觉和惯性定位融合的第一融合结果作为输出结果,确定为所述自动行走设备在当前时刻Tn的位置。
在一种可能的实施方式中,如果视觉定位失败,则可以通过GPS定位数据重新对视觉模块11进行视觉定位的初始化。
在一种可能的实施方式中，步骤S504包括：
在所述GPS定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的GPS定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
举例来说,如果GPS模块13正常接收到GPS信号,可以认为GPS定位数据满足质量条件,可以对第一融合结果和GPS定位数据进行融合,获得融合后的第二融合结果,将第二融合结果确定为自动行走设备在当前时刻Tn的位置。
在一种可能的实施方式中，可以通过扩展卡尔曼滤波的方式对第一融合结果和GPS定位数据进行融合。可以将第一融合结果(例如自动行走设备在当前时刻Tn的第一融合3D位置)作为输入控制量(例如公式(1)中的u_t)；GPS定位数据(例如当前时刻Tn的GPS定位的2D位置)作为更新控制量(例如公式(1)中的ε_t)，利用公式(1)，根据前一时刻的状态x_{t-1}(即前一时刻的第二融合结果)，获得当前时刻的状态x_t(即当前时刻的第二融合结果)；其中“状态”(第二融合结果)可以包括任何所关心的状态，例如可以将自动行走设备在前一时刻Tn-1的GPS定位位置(3D位置)作为前一时刻Tn-1的状态变量(例如公式(1)中的x_{t-1})，从而获得当前时刻Tn的状态变量(例如公式(1)中的x_t)。可以将当前时刻的状态变量作为视觉、惯性及GPS定位融合后的第二融合结果(例如当前时刻Tn的GPS定位位置)。从而，可以将第二融合结果确定为自动行走设备在当前时刻Tn的位置。
在一种可能的实施方式中，在GPS定位数据从不满足质量条件变为满足质量条件时，可将最近一次获得的第一融合结果作为扩展卡尔曼滤波的初始状态，即初始的x_{t-1}。
惯性定位可能存在累积误差，因此，在GPS定位数据不满足质量条件的情况下，以第一融合结果确定的自动行走设备的位置的精度可能受到累积误差的影响。在一种可能的实施方式中，可以通过例如最小二乘的方式，对GPS定位数据不满足质量条件期间第一融合结果的累积误差进行矫正，从而进一步提高定位精度。
通过这种方式,可以实现视觉、惯性以及GPS定位三者的定位融合,确定自动行走设备在当前时刻Tn的位置,进一步提高了定位精度。
应用示例
图20示出了根据本公开一实施例的处理模块14的处理过程的一个应用示例的流程图。本领域技术人员应理解,该应用示例仅为了便于理解,不以任何目的限制本公开。
如图20所示,在该应用示例中可以看到,如果视觉定位结果有效(判断结果为是),则可以通过扩展卡尔曼滤波的方式对视觉定位结果和惯性定位结果进行第一融合,获得融合后的第一融合结果;如果视觉定位结果无效(判断结果为否),则可以将惯性定位结果直接确定为第一融合结果。
如果GPS定位数据满足质量条件(判断结果为是),则可以通过扩展卡尔曼滤波的方式对第一融合结果和GPS定位数据进行融合,获得融合后的第二融合结果,将第二融合结果确定为自动行走设备的位置;如果GPS定位数据不满足质量条件(判断结果为否),则可以将第一融合结果确定为自动行走设备的位置。
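图20所示的两级融合判断流程可概括为如下示意性草图，其中融合函数fuse(例如扩展卡尔曼滤波)作为参数传入，此处仅用于说明分支逻辑：

```python
def locate(visual_ok, gps_ok, visual_result, inertial_result, gps_data, fuse):
    """视觉定位结果有效则与惯性定位结果融合得第一融合结果,
    否则直接以惯性定位结果为第一融合结果;
    GPS满足质量条件则再融合得第二融合结果作为位置,
    否则以第一融合结果作为位置。"""
    first = fuse(visual_result, inertial_result) if visual_ok else inertial_result
    return fuse(first, gps_data) if gps_ok else first
```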
综上所述,根据本公开的自动行走设备,能够通过视觉模块11、IMU模块12以及GPS模块13分别获取视觉数据、惯性数据以及GPS定位数据,并将定位结果进行两次融合,确定自动行走设备的位置,使得自动行走设备能够实现精确的自主定位。
图21示出了根据本公开一实施例的一种定位方法的流程图。该方法可通过处理器实现,例如通过上文中的处理模块14执行。如图21所示,根据本公开的另一实施例的定位方法包括:
步骤S501,根据所述视觉数据进行视觉定位,获得视觉定位结果;
步骤S502,根据所述惯性数据进行惯性定位,获得惯性定位结果;
步骤S503,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
步骤S504,在所述GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,所述方法还包括:
在所述GPS定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中，步骤S503包括：
在所述视觉定位结果有效的情况下,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述视觉定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,步骤S504包括:
在所述GPS定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的GPS定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,步骤S503包括:
通过扩展卡尔曼滤波的方式对所述视觉定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,步骤S504包括:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述GPS定位数据进行融合。
图22示出了根据本公开一实施例的一种定位装置的框图。该装置可通过上文中的处理模块14实现。如图22所示，根据本公开的另一实施例的定位装置包括：
视觉定位模块601,用于根据所述视觉数据进行视觉定位,获得视觉定位结果;
惯性定位模块602,用于根据所述惯性数据进行惯性定位,获得惯性定位结果;
第一融合模块603,用于对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
第二融合模块604,用于在所述GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
根据本公开的实施例,能够通过视觉模块11、IMU模块12以及GPS模块13分别获取视觉数据、惯性数据以及GPS定位数据,并将定位结果进行两次融合,确定自动行走设备的位置,使得自动行走设备能够实现精确的自主定位。
在一种可能的实施方式中,所述装置还可用于:
在所述GPS定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,第一融合模块603可具体用于:
在所述视觉定位结果有效的情况下,对所述视觉定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述视觉定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,第二融合模块604可具体用于:
在所述GPS定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的GPS定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,第一融合模块603可具体用于:
通过扩展卡尔曼滤波的方式对所述视觉定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,第二融合模块604可具体用于:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述GPS定位数据进行融合。
图23示出了本公开一实施例的一种自动行走设备的一种示例性应用环境的示意图。
如图23所示,在一种示例性的应用环境中,根据本公开实施例的自动行走设备10可以例如为自动割草机,自动行走设备10可以在边界50范围内的工作区域30中自动行走,切割位于工作表面上的植被。
当自动行走设备10在工作区域30中自动行走时,可以自主定位自身的位置,将自身位置与边界50的位置进行比较,以判断自身是否位于工作区域30内,或判断自身距离边界的距离,并根据判断的结果调整移动方式,使自身保持在工作区域30内。
图24示出了根据本公开一实施例的一种自动行走设备的框图。如图24所示,该自动行走设备包括:
激光模块15,用于获取所述自动行走设备的周围环境的激光数据;
惯性导航模块,本实施例中具体为IMU模块12,用于获取所述自动行走设备的惯性数据;
卫星导航模块,本实施例中具体为GPS模块13,用于获取所述自动行走设备的GPS定位数据;
处理模块14,与所述激光模块15、所述IMU模块12以及所述GPS模块13电连接,所述处理模块14被配置为:
根据所述激光数据进行激光定位,获得激光定位结果;
根据所述惯性数据进行惯性定位,获得惯性定位结果;
对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
根据本公开实施例的自动行走设备,能够通过激光模块15、IMU模块12以及GPS模块13分别获取激光数据、惯性数据以及GPS定位数据,并将定位结果进行融合,确定自动行走设备的位置,使得自动行走设备能够实现精确的自主定位。
图25示出了本公开一实施例的一种自动行走设备的示意图。图25中以割草机作为自动行走设备的示例。如图25所示,在该应用示例中,自动行走设备可以包括激光模块15、IMU模块12以及GPS模块13。
在一种可能的实施方式中,激光模块15可以包括激光雷达,例如,该激光雷达可以是圆扫激光雷达,采用该种类的激光雷达能够获取周围环境360度范围内的物体的激光数据,提高激光定位的精度。本公开对激光模块15的激光雷达的具体类型不做限定。
在一种可能的实施方式中,IMU模块12可以包括惯性传感器,所述惯性传感器可以包括陀螺仪、加速度计,或者在陀螺仪和加速度计的基础上还可包括地磁传感器和码盘中的一者或两者。举例来说,IMU模块12可以包括陀螺仪和加速度计组成的6轴惯性传感器;也可以包括陀螺仪、加速度计及地磁传感器组成的9轴惯性传感器。例如,IMU模块12可以通过陀螺仪获得自动行走设备的角速度数据;可以通过加速度计获得自动行走设备的加速度数据;可以通过地磁传感器获得自动行走设备所处位置的经纬度数据;可以通过码盘获得自动行走设备的速度数据。本公开对IMU模块12的惯性传感器的具体部件情况不做限定。
在一种可能的实施方式中,IMU模块12获取的惯性数据可以包括速度、加速度、角速度以及朝向角中的一个或多个。惯性数据可以根据构成IMU模块12的惯性传感器的情况确定,本公开对此不做限制。
在一种可能的实施方式中,所述激光模块15的轴心、所述IMU模块12的一轴以及所述GPS模块13的中心处于所述自动行走设备的中轴线上,所述激光模块15的轴心与IMU模块12的一轴共线。举例来说,激光模块15的激光雷达可以与IMU模块12背靠背安装,并保证激光雷达的轴心与IMU模块12的某一轴共线;并且,激光雷达的轴心、IMU模块12的某一轴以及GPS模块13的中心可以处于自动行走设备的中轴线上。采用这种方式,可以减少所获得数据的自由度,使得算法简化,从而加快处理流程。
在一种可能的实施方式中,GPS模块13可以是任何能够实现基于GPS定位的模块,例如能够接收GPS信号进行定位的GPS接收机。GPS模块13可以基于现有技术实现。GPS定位数据是否满足质量条件的标准,可以根据需要任意设置,例如,可以根据GPS信号的强弱、接收卫星数量、定位状态等判断GPS定位数据是否满足质量条件,本公开对此不做限制。
在一种可能的实施方式中,GPS模块13的中心可以处于自动行走设备的两个驱动轮的中心点的上方。采用这种方式,在自动行走设备旋转时,GPS模块13的定位位置可以不变或仅有很小的变化,从而简化数据的处理。
在一种可能的实施方式中,处理模块14可以是单片机、CPU、MPU、FPGA等任何能进行数据处理的处理部件,处理模块14可以通过专用硬件电路实现,也可以通过通用处理部件结合可执行逻辑指令实现,以执行处理模块14的处理过程。
在一种可能的实施方式中,自动行走设备还可包括存储模块(未示出),以存储处理模块14生成的数据,例如激光数据、惯性数据以及GPS定位数据等。
在一种可能的实施方式中,处理模块14可被配置为:
步骤S501,根据激光数据进行激光定位,获得激光定位结果;
步骤S502,根据惯性数据进行惯性定位,获得惯性定位结果;
步骤S503,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
步骤S504,在GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为所述自动行走设备的位置。
举例来说,激光模块15可以每隔时间间隔T测量自动行走设备与周围环境中的物体之间的距离数据作为激光数据,该时间间隔T可以根据激光雷达的性质由系统进行默认设定,例如,激光雷达可以每分钟测量30次自动行走设备与周围环境360度范围内的物体之间的距离数据,则时间间隔T可以为2秒。本公开对时间间隔T的具体取值不做限定。
在一种可能的实施方式中，基于激光数据(例如，测量的自动行走设备与周围环境360度范围内的物体之间的距离数据)，可以提取激光数据中的点云数据(多个距离点/特征点)，尽可能地覆盖自动行走设备周围环境的各个区域。本领域技术人员应当理解，可以通过本领域公知的点云数据提取算法，从激光数据中提取点云数据，本公开对此不做限定。
在一种可能的实施方式中,可以首先对激光模块15、IMU模块12以及GPS模块13进行初始化,将激光模块15、IMU模块12以及GPS模块13的初始定位位置设定为零点。
在一种可能的实施方式中,根据激光数据以及惯性数据,可以对激光模块15进行初始化。举例来说,基于初始时刻T1的激光数据(例如初始时刻T1测量的一组距离数据),可以提取初始时刻T1的点云数据。通过该点云数据,可以对初始时刻T1的点云数据与前一时刻T0的点云数据进行匹配,确定T1的点云数据与前一时刻T0的点云数据之间的对应关系;基于该对应关系,可以确定初始时刻T1与前一时刻T0两个相邻时刻的点云数据之间的基本矩阵F。对IMU模块12的惯性数据(例如前一时刻T0与初始时刻T1期间的速度、加速度以及角速度)进行积分,并结合惯性数据中的初始时刻T1时的朝向角(自动行走设备的姿态),可以获得自动行走设备前一时刻T0与初始时刻T1的位置之间的基线距离B。根据基本矩阵F和基线距离B,可以获得初始时刻T1与前一时刻T0两个相邻时刻的两组点云数据之间的相对位置。进而,可以确定初始时刻T1的点云数据(各个距离点/特征点)的深度信息,从而确定各个距离点相对于自动行走设备的初始定位位置(初始时刻T1的位置,也即坐标零点)的三维初始位置,该三维初始位置可以包括特征点的三维坐标(例如,距离点的位置和深度)。
以上初始化过程仅为举例说明,本领域技术人员应当理解,可以采用本领域公知的算法获取点云数据的三维初始位置,本公开对此不做限定。
通过这种方式，可以实现激光定位的初始化，确定初始时刻T1的点云数据的三维初始位置。
在步骤S501中,根据激光数据进行激光定位,获得激光定位结果。
激光定位可以通过相关技术中的适当方式实现,激光定位结果可以是所采用的激光定位过程产生的任意适当结果,本公开对此不做限制。
举例来说,基于当前时刻Tn的激光数据(当前时刻Tn测量的一组激光数据),可以提取当前时刻Tn的点云数据(多个距离点/特征点以及相对应的坐标),其中,n表示初始时刻至当前时刻测量的激光数据的数量,n>1。可以对当前时刻Tn的点云数据与前一时刻Tn-1的点云数据进行匹配,确定当前时刻Tn的点云数据与前一时刻Tn-1的点云数据之间的对应关系。
在一种可能的实施方式中,根据前一时刻Tn-1的点云数据(多个距离点/特征点)的三维(3D)位置,当前时刻Tn的点云数据的坐标位置(2D位置),以及上述的对应关系,可以通过例如PNP算法进行计算,从而获得当前时刻Tn的点云数据相对于前一时刻Tn-1的点云数据的3D位置;基于该3D位置,可以例如通过三角定位算法进行计算,确定测量位置(自动行走设备的位置)在当前时刻Tn的3D位置。也可以采用其他公知的算法确定当前时刻Tn的点云数据的3D位置,以及自动行走设备在当前时刻Tn的3D位置,本公开对此不作限定。
通过这种方式,可以将自动行走设备在当前时刻Tn的3D位置作为激光定位结果,实现了激光定位的过程。
步骤S502,根据惯性数据进行惯性定位,获得惯性定位结果。
惯性定位可以通过相关技术中的适当方式实现,惯性定位结果可以是所采用的惯性定位过程产生的任意适当结果,本公开对此不做限制。
举例来说,IMU模块12可以根据传感器参数和标定信息确定惯性数据,该惯性数据可以包括例如前一时刻Tn-1至当前时刻Tn期间的速度、加速度以及角速度,当前时刻Tn时的朝向角(自动行走设备的姿态)等。对该惯性数据中的速度、加速度、角速度等数据进行积分,可以获取前一时刻Tn-1至当前时刻Tn期间自动行走设备的位移和姿态变化量。从而,可以预测出自动行走设备在当前时刻Tn相对于前一时刻Tn-1的惯性定位位置。
通过这种方式,可以将自动行走设备在当前时刻Tn的惯性定位位置作为惯性定位结果,实现了惯性定位的过程。
步骤S503,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果。
在一种可能的实施方式中，步骤S503包括：
在所述激光定位结果有效的情况下,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述激光定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
举例来说,如果激光定位失败(可能的原因例如包括当前时刻Tn的点云数据与前一时刻Tn-1的点云数据匹配失败;PNP算法无解;三角定位无解等),则可以直接将惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,如果激光定位成功,则可以通过例如扩展卡尔曼滤波的方式对激光定位结果和惯性定位结果进行融合。其中,扩展卡尔曼滤波是一种高效率的递归滤波器(自回归滤波器),可以实现非线性系统的递归滤波。在扩展卡尔曼滤波中,当前状态可以由上一个状态和当前的控制量(例如输入控制量、更新控制量等)决定,扩展卡尔曼滤波的表达式可以示意性地表示为公式(1):
x_t = g(u_t, x_{t-1}, ε_t)        (1)
在公式(1)中，x_t可以表示当前时刻t的状态；x_{t-1}可以表示上一个时刻t-1的状态；u_t可以例如表示当前时刻t的输入控制量；ε_t可以例如表示更新控制量。
在一种可能的实施方式中，通过扩展卡尔曼滤波的方式，可以将惯性定位结果(例如自动行走设备在当前时刻Tn的惯性定位位置)作为输入控制量(例如公式(1)中的u_t)；激光定位结果(例如当前时刻Tn的点云数据相对于前一时刻Tn-1的点云数据的3D位置)作为更新控制量(例如公式(1)中的ε_t)，从而根据前一时刻的状态(即前一时刻的第一融合结果，例如公式(1)中的x_{t-1})计算出当前时刻的状态(即当前时刻的第一融合结果，例如公式(1)中的x_t)。其中“状态”(第一融合结果)可以包括任何所关心的状态，例如自动行走设备3D位置、点云数据相对于三维初始位置的3D位置、以及IMU模块12与激光模块15之间的相对位置等，并且可以将当前时刻的状态作为激光和惯性定位融合后的第一融合结果。
在一种可能的实施方式中，可将上述初始化过程获得的初始位置作为扩展卡尔曼滤波的初始状态，即初始的x_{t-1}。
在一种可能的实施方式中,在第一融合结果中,可以包括自动行走设备在当前时刻Tn的融合后的3D位置;IMU模块12的误差;以及IMU模块12与激光模块15之间的相对位置等信息。
通过这种方式,可以实现激光和惯性的定位融合,确定自动行走设备在当前时刻Tn的融合后的3D位置,提高了定位精度。
步骤S504,在GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
在一种可能的实施方式中,所述处理模块还被配置为执行如下步骤:
在所述GPS定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
举例来说,如果GPS模块13没有收到GPS信号或GPS信号较弱,则可以认为GPS定位数据不满足质量条件。此时,可以将激光和惯性定位融合的第一融合结果作为输出结果,确定为所述自动行走设备在当前时刻Tn的位置。
在一种可能的实施方式中,如果激光定位失败,则可以通过GPS定位数据重新对激光模块15进行激光定位的初始化。
在一种可能的实施方式中，步骤S504包括：
在所述GPS定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的GPS定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
举例来说,如果GPS模块13正常接收到GPS信号,可以认为GPS定位数据满足质量条件,可以对第一融合结果和GPS定位数据进行融合,获得融合后的第二融合结果,将第二融合结果确定为自动行走设备在当前时刻Tn的位置。
在一种可能的实施方式中，可以通过扩展卡尔曼滤波的方式对第一融合结果和GPS定位数据进行融合。可以将第一融合结果(例如自动行走设备在当前时刻Tn的第一融合3D位置)作为输入控制量(例如公式(1)中的u_t)；GPS定位数据(例如当前时刻Tn的GPS定位的2D位置)作为更新控制量(例如公式(1)中的ε_t)，利用公式(1)，根据前一时刻的状态x_{t-1}(即前一时刻的第二融合结果)，获得当前时刻的状态x_t(即当前时刻的第二融合结果)；其中“状态”(第二融合结果)可以包括任何所关心的状态，例如可以将自动行走设备在前一时刻Tn-1的GPS定位位置(3D位置)作为前一时刻Tn-1的状态变量(例如公式(1)中的x_{t-1})，从而获得当前时刻Tn的状态变量(例如公式(1)中的x_t)。可以将当前时刻的状态变量作为激光、惯性及GPS定位融合后的第二融合结果(例如当前时刻Tn的GPS定位位置)。从而，可以将第二融合结果确定为自动行走设备在当前时刻Tn的位置。
在一种可能的实施方式中，在GPS定位数据从不满足质量条件变为满足质量条件时，可将最近一次获得的第一融合结果作为扩展卡尔曼滤波的初始状态，即初始的x_{t-1}。
惯性定位可能存在累积误差，因此，在GPS定位数据不满足质量条件的情况下，以第一融合结果确定的自动行走设备的位置的精度可能受到累积误差的影响。在一种可能的实施方式中，可以通过例如最小二乘的方式，对GPS定位数据不满足质量条件期间第一融合结果的累积误差进行矫正，从而进一步提高定位精度。
通过这种方式,可以实现激光、惯性以及GPS定位三者的定位融合,确定自动行走设备在当前时刻Tn的位置,进一步提高了定位精度。
应用示例
图26示出了根据本公开一实施例的处理模块14的处理过程的一个应用示例的流程图。本领域技术人员应理解,该应用示例仅为了便于理解,不以任何目的限制本公开。
如图26所示,在该应用示例中可以看到,如果激光定位结果有效(判断结果为是),则可以通过扩展卡尔曼滤波的方式对激光定位结果和惯性定位结果进行第一融合,获得融合后的第一融合结果;如果激光定位结果无效(判断结果为否),则可以将惯性定位结果直接确定为第一融合结果。
如果GPS定位数据满足质量条件(判断结果为是)，则可以通过扩展卡尔曼滤波的方式对第一融合结果和GPS定位数据进行融合，获得融合后的第二融合结果，将第二融合结果确定为自动行走设备的位置；如果GPS定位数据不满足质量条件(判断结果为否)，则可以将第一融合结果确定为自动行走设备的位置。
综上所述,根据本公开的自动行走设备,能够通过激光模块15、IMU模块12以及GPS模块13分别获取激光数据、惯性数据以及GPS定位数据,并将定位结果进行两次融合,确定自动行走设备的位置,使得自动行走设备能够实现精确的自主定位。
图27示出了根据本公开一实施例的一种定位方法的流程图。该方法可通过处理器实现,例如通过上文中的处理模块14执行。如图27所示,根据本公开的另一实施例的定位方法包括:
步骤S501,根据激光数据进行激光定位,获得激光定位结果;
步骤S502,根据惯性数据进行惯性定位,获得惯性定位结果;
步骤S503,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
步骤S504,在GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
在一种可能的实施方式中,所述方法还包括:
在所述GPS定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中，步骤S503包括：
在所述激光定位结果有效的情况下,对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
在所述激光定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,步骤S504包括:
在所述GPS定位数据满足质量条件的情况下，根据当前时刻的第一融合结果、当前时刻的GPS定位数据以及前一时刻的第二融合结果，获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,步骤S503包括:
通过扩展卡尔曼滤波的方式对所述激光定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,步骤S504包括:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述GPS定位数据进行融合。
图28示出了根据本公开一实施例的一种定位装置的框图。该装置可通过上文中的处理模块14实现。如图28所示,根据本公开的另一实施例的定位装置包括:
激光定位模块600,用于根据激光数据进行激光定位,获得激光定位结果;
惯性定位模块602,用于根据惯性数据进行惯性定位,获得惯性定位结果;
第一融合模块603,用于对所述激光定位结果和所述惯性定位结果进行融合,获得融合后的第一融合结果;
第二融合模块604,用于在GPS定位数据满足质量条件的情况下,对所述第一融合结果和所述GPS定位数据进行融合,获得融合后的第二融合结果,将所述第二融合结果确定为自动行走设备的位置。
根据本公开的实施例,能够通过激光模块15、IMU模块12以及GPS模块13分别获取激光数据、惯性数据以及GPS定位数据,并将定位结果进行两次融合,确定自动行走设备的位置,使得自动行走设备能够实现精确的自主定位。
在一种可能的实施方式中,所述装置还可用于:
在所述GPS定位数据不满足质量条件的情况下,将所述第一融合结果确定为所述自动行走设备的位置。
在一种可能的实施方式中,第一融合模块603可具体用于:
在所述激光定位结果有效的情况下，对所述激光定位结果和所述惯性定位结果进行融合，获得融合后的第一融合结果；
在所述激光定位结果无效的情况下,将所述惯性定位结果确定为第一融合结果。
在一种可能的实施方式中,第二融合模块604可具体用于:
在所述GPS定位数据满足质量条件的情况下,根据当前时刻的第一融合结果、当前时刻的GPS定位数据以及前一时刻的第二融合结果,获取所述自动行走设备在当前时刻的第二融合结果。
在一种可能的实施方式中,所述惯性数据包括速度、加速度、角速度以及朝向角中的一个或多个。
在一种可能的实施方式中,第一融合模块603可具体用于:
通过扩展卡尔曼滤波的方式对所述激光定位结果和所述惯性定位结果进行融合。
在一种可能的实施方式中,第二融合模块604可具体用于:
通过扩展卡尔曼滤波的方式对所述第一融合结果和所述GPS定位数据进行融合。
本公开可以是系统、方法和/或计算机程序产品。计算机程序产品可以包括计算机可读存储介质,其上载有用于使处理器实现本公开的各个方面的计算机可读程序指令。
计算机可读存储介质可以是可以保持和存储由指令执行设备使用的指令的有形设备。计算机可读存储介质例如可以是――但不限于――电存储设备、磁存储设备、光存储设备、电磁存储设备、半导体存储设备或者上述的任意合适的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:便携式计算机盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储器(EPROM或闪存)、静态随机存取存储器(SRAM)、便携式压缩盘只读存储器(CD-ROM)、数字多功能盘(DVD)、记忆棒、软盘、机械编码设备、例如其上存储有指令的打孔卡或凹槽内凸起结构、以及上述的任意合适的组合。这里所使用的计算机可读存储介质不被解释为瞬时信号本身,诸如无线电波或者其他自由传播的电磁波、通过波导或其他传输媒介传播的电磁波(例如,通过光纤电缆的光脉冲)、或者通过电线传输的电信号。
这里所描述的计算机可读程序指令可以从计算机可读存储介质下载到各个计算/处理设备，或者通过网络、例如因特网、局域网、广域网和/或无线网下载到外部计算机或外部存储设备。网络可以包括铜传输电缆、光纤传输、无线传输、路由器、防火墙、交换机、网关计算机和/或边缘服务器。每个计算/处理设备中的网络适配卡或者网络接口从网络接收计算机可读程序指令，并转发该计算机可读程序指令，以供存储在各个计算/处理设备中的计算机可读存储介质中。
用于执行本公开操作的计算机程序指令可以是汇编指令、指令集架构(ISA)指令、机器指令、机器相关指令、微代码、固件指令、状态设置数据、或者以一种或多种编程语言的任意组合编写的源代码或目标代码,所述编程语言包括面向对象的编程语言—诸如Smalltalk、C++等,以及常规的过程式编程语言—诸如“C”语言或类似的编程语言。计算机可读程序指令可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计算机的情形中,远程计算机可以通过任意种类的网络—包括局域网(LAN)或广域网(WAN)—连接到用户计算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。在一些实施例中,通过利用计算机可读程序指令的状态信息来个性化定制电子电路,例如可编程逻辑电路、现场可编程门阵列(FPGA)或可编程逻辑阵列(PLA),该电子电路可以执行计算机可读程序指令,从而实现本公开的各个方面。
这里参照根据本公开实施例的方法、装置(系统)和计算机程序产品的流程图和/或框图描述了本公开的各个方面。应当理解,流程图和/或框图的每个方框以及流程图和/或框图中各方框的组合,都可以由计算机可读程序指令实现。
这些计算机可读程序指令可以提供给通用计算机、专用计算机或其它可编程数据处理装置的处理器，从而生产出一种机器，使得这些指令在通过计算机或其它可编程数据处理装置的处理器执行时，产生了实现流程图和/或框图中的一个或多个方框中规定的功能/动作的装置。也可以把这些计算机可读程序指令存储在计算机可读存储介质中，这些指令使得计算机、可编程数据处理装置和/或其他设备以特定方式工作，从而，存储有指令的计算机可读介质则包括一个制造品，其包括实现流程图和/或框图中的一个或多个方框中规定的功能/动作的各个方面的指令。
也可以把计算机可读程序指令加载到计算机、其它可编程数据处理装置、或其它设备上,使得在计算机、其它可编程数据处理装置或其它设备上执行一系列操作步骤,以产生计算机实现的过程,从而使得在计算机、其它可编程数据处理装置、或其它设备上执行的指令实现流程图和/或框图中的一个或多个方框中规定的功能/动作。
附图中的流程图和框图显示了根据本公开的多个实施例的系统、方法和计算机程序产品的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或指令的一部分,所述模块、程序段或指令的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。在有些作为替换的实现中,方框中所标注的功能也可以以不同于附图中所标注的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或动作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
以上已经描述了本公开的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实施例的原理、实际应用或对市场中的技术改进,或者使本技术领域的其它普通技术人员能理解本文披露的各实施例。

Claims (31)

  1. 一种定位装置,其特征在于,所述定位装置能够由携带者携带行走,所述定位装置包括:
    第一定位模块,用于获取定位装置的携带者的第一定位结果;
    传感器模块,用于测量定位装置的携带者行走的加速度和角度参数;以及
    处理模块,被配置为:
    在用于确定自动行走设备的工作范围的边界的第一模式下,如果所述第一定位结果满足质量条件,则根据所述第一定位结果确定所述定位装置的携带者的位置;如果所述第一定位结果不满足质量条件,则根据所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置;以及
    根据所述定位装置的携带者的位置,确定所述边界。
  2. 根据权利要求1所述的定位装置,其特征在于,所述处理模块还被配置为:
    获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;
    采用所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,包括:
    根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置。
  3. 根据权利要求2所述的定位装置,其特征在于,采用所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的位置,包括:
    当所述第一定位结果不满足质量条件时,获取最近的满足质量条件的第一定位结果作为行人航位推算算法的起始位置;
    根据所述角度参数确定所述携带者的实时航向;
    根据所述加速度确定所述携带者的实时步频;
    根据所述实时步频和所述步长步频模型的模型参数，利用所述步长步频模型确定所述携带者的实时步长；以及
    根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的位置。
  4. 根据权利要求2所述的定位装置,其特征在于,获取所述定位装置的携带者的步长步频模型的模型参数,包括:
    根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,
    其中所述跨步点为所述携带者每一跨步的特征点。
  5. 根据权利要求1所述的定位装置,其特征在于,所述处理模块还被配置为:
    在所述第一定位结果从不满足质量条件变为满足质量条件时,获取基于行人航位推算算法确定的所述定位装置的携带者的第一位置和根据所述第一定位模块的第一定位结果确定的所述定位装置的携带者的第二位置;
    根据所述第一位置和所述第二位置,对在所述第一定位结果不满足质量条件期间基于行人航位推算算法确定的所述定位装置的携带者的位置进行校正。
  6. 根据权利要求1所述的定位装置,其特征在于,根据所述定位装置的携带者的位置,确定所述边界,包括:
    对基于行人航位推算算法确定的所述定位装置的携带者的位置进行插值处理,得到插值处理后的所述定位装置的携带者的位置;
    对插值处理后的所述定位装置的携带者的位置和根据所述第一定位结果确定的所述定位装置的携带者的位置进行平滑滤波,以确定所述边界。
  7. 根据权利要求1所述的定位装置,其特征在于,所述第一定位模块为卫星定位模块,所述第一定位结果为卫星定位结果。
  8. 根据权利要求7所述的定位装置,其特征在于,所述处理模块还被配置为:
    根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者，判断卫星定位模块的所述卫星定位结果是否满足质量条件。
  9. 根据权利要求8所述的定位装置，其特征在于，根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者，判断卫星定位模块的所述卫星定位结果是否满足质量条件，包括：
    在定位状态为指定状态,且卫星数不小于阈值的情况下,判断所述卫星定位结果满足质量条件。
  10. 根据权利要求1所述的定位装置,其特征在于,所述定位装置能够安装于自动行走设备,
    所述处理模块被配置为:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,
    其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角度参数、基于惯性导航推算算法确定的。
  11. 根据权利要求10所述的定位装置,其特征在于,所述惯性导航推算算法包括INS算法。
  12. 根据权利要求1所述的定位装置,其特征在于,所述行人航位推算算法包括PDR算法。
  13. 根据权利要求1所述的定位装置,其特征在于,所述第一定位模块包括UWB定位模块。
  14. 一种自动行走设备,其特征在于,所述自动行走设备包括设备主体和根据权利要求1至13中任意一项所述的定位装置,其中所述定位装置能够以可拆卸的方式安装于所述设备主体。
  15. 一种定位装置,其特征在于,所述定位装置能够由携带者携带行走,所述定位装置包括:
    第一定位模块,用于获取定位装置的携带者的第一定位结果;
    传感器模块,用于测量定位装置的携带者行走的加速度和角度参数;以及
    处理模块,被配置为:
    在用于确定自动行走设备的工作范围的边界的第一模式下,根据所述第一定位结果确定所述定位装置的携带者的第三位置;根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置;根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置;以及
    根据所述定位装置的携带者的位置,确定所述边界。
  16. 根据权利要求15所述的定位装置,其特征在于,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,包括:
    根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置。
  17. 根据权利要求16所述的定位装置,其特征在于,根据所述第三位置和所述第四位置两者的融合,确定所述定位装置的携带者的位置,包括:
    根据所述第三位置和所述第四位置的加权和,确定所述定位装置的携带者的位置。
  18. 根据权利要求17所述的定位装置,其特征在于,所述处理模块还被配置为:
    根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重。
  19. 根据权利要求18所述的定位装置,其特征在于,根据第一定位结果的质量,确定所述第三位置和所述第四位置各自在所述加权和中所占的权重,包括:
    随着所述第一定位结果的质量的提高,增大所述第三位置在所述加权和中所占的权重,并减小所述第四位置在所述加权和中所占的权重;
    随着所述第一定位结果的质量的降低,减小所述第三位置在所述加权和中所占的权重,并增大所述第四位置在所述加权和中所占的权重。
  20. 根据权利要求15至19中任意一项所述的定位装置,其特征在于,所述第一定位模块为卫星定位模块,所述第一定位结果为卫星定位结果。
  21. 根据权利要求20所述的定位装置,其特征在于,所述处理模块还被配置为:
    根据卫星定位模块接收的卫星数和卫星定位模块的定位状态中的一者或两者,判断卫星定位模块的所述卫星定位结果的质量。
  22. 根据权利要求15所述的定位装置,其特征在于,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,包括:
    根据跨步点处的所述第三位置和跨步点处的所述第四位置,确定跨步点处所述定位装置的携带者的位置;
    对跨步点处所述定位装置的携带者的位置进行插值处理,得到所述定位装置的携带者的位置;
    其中所述跨步点为所述携带者每一跨步的特征点。
  23. 根据权利要求15所述的定位装置,其特征在于,根据所述第三位置和所述第四位置,确定所述定位装置的携带者的位置,包括:
    对跨步点处的所述第四位置进行插值处理,得到插值后的第四位置;
    根据所述第三位置和所述插值后的第四位置,确定所述定位装置的携带者的位置;
    其中所述跨步点为所述携带者每一跨步的特征点。
  24. 根据权利要求15所述的定位装置,其特征在于,所述处理模块还被配置为:
    获取所述定位装置的携带者的步长步频模型的模型参数,其中所述步长步频模型表示所述携带者的步频与步长之间的关系;
    根据所述加速度和角度参数、基于行人航位推算算法确定所述定位装置的携带者的第四位置,包括:
    根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置。
  25. 根据权利要求24所述的定位装置,其特征在于,根据所述步长步频模型的模型参数、以及所述加速度和所述角度参数,基于行人航位推算算法确定所述定位装置的携带者的第四位置,包括:
    获取行人航位推算算法的起始位置;
    根据所述角度参数确定所述携带者的实时航向;
    根据所述加速度确定所述携带者的实时步频;
    根据所述实时步频和所述步长步频模型的模型参数,利用所述步长步频模型确定所述携带者的实时步长;以及
    根据所述实时航向、所述实时步长以及所述起始位置,基于行人航位推算算法确定所述定位装置的携带者的第四位置。
  26. 根据权利要求24所述的定位装置,其特征在于,获取所述定位装置的携带者的步长步频模型的模型参数,包括:
    根据所述传感器模块测量的加速度确定所述携带者的步频和跨步点,根据所述跨步点对应的第一定位结果确定所述携带者的步长,根据所述步频和所述步长确定所述步长步频模型的模型参数,
    其中所述跨步点为所述携带者每一跨步的特征点。
  27. 根据权利要求15所述的定位装置,其特征在于,所述定位装置能够安装于自动行走设备,
    所述处理模块被配置为:在用于定位所述自动行走设备的位置的第二模式下,根据所述第一定位结果以及惯性定位结果的至少其中之一,确定所述自动行走设备的位置,
    其中,所述惯性定位结果是根据至少所述传感器模块输出的加速度和角度参数、基于惯性导航推算算法确定的。
  28. 根据权利要求27所述的定位装置,其特征在于,所述惯性导航推算算法包括INS算法。
  29. 根据权利要求15所述的定位装置,其特征在于,所述第一定位模块包括UWB定位模块。
  30. 根据权利要求15所述的定位装置,其特征在于,所述行人航位推算算法包括PDR算法。
  31. 一种自动行走设备，其特征在于，所述自动行走设备包括设备主体和根据权利要求15至30中任意一项所述的定位装置，其中所述定位装置能够以可拆卸的方式安装于所述设备主体。
PCT/CN2018/088519 2017-05-26 2018-05-25 定位装置及方法以及自动行走设备 WO2018214978A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US16/613,271 US11448775B2 (en) 2017-05-26 2018-05-25 Positioning apparatus and method and self-moving device
EP18806033.9A EP3633410A4 (en) 2017-05-26 2018-05-25 POSITIONING DEVICE AND METHOD AS WELL AS AUTOMATIC MOVING DEVICE

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
CN201710385778.2 2017-05-26
CN201710385778.2A CN108957512A (zh) 2017-05-26 2017-05-26 定位装置及方法以及自动行走设备
CN201710972337.2A CN109682371A (zh) 2017-10-18 2017-10-18 自动行走设备及其定位方法及装置
CN201710978239.X 2017-10-18
CN201710978239.XA CN109683604A (zh) 2017-10-18 2017-10-18 自动行走设备及其定位方法及装置
CN201710972337.2 2017-10-18

Publications (1)

Publication Number Publication Date
WO2018214978A1 true WO2018214978A1 (zh) 2018-11-29

Family

ID=64395304

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/088519 WO2018214978A1 (zh) 2017-05-26 2018-05-25 定位装置及方法以及自动行走设备

Country Status (3)

Country Link
US (1) US11448775B2 (zh)
EP (1) EP3633410A4 (zh)
WO (1) WO2018214978A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111486843A (zh) * 2020-04-03 2020-08-04 深圳市百慕大工业有限公司 一种复杂环境下的定位方法、装置及定位设备
SE2150454A1 (en) * 2021-04-13 2022-10-14 Husqvarna Ab System and method for determining operating boundaries of a robotic work tool
EP4121834A4 (en) * 2020-03-18 2024-04-03 Husqvarna Ab ROBOTIC WORK TOOL SYSTEM AND METHOD

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11448775B2 (en) 2017-05-26 2022-09-20 Positec Power Tools (Suzhou) Co., Ltd Positioning apparatus and method and self-moving device
US11348269B1 (en) * 2017-07-27 2022-05-31 AI Incorporated Method and apparatus for combining data to construct a floor plan
US20210276593A1 (en) * 2020-02-25 2021-09-09 Next Energy, LLC Automated and interchangeable functional device
SE544298C2 (en) * 2020-04-14 2022-03-29 Husqvarna Ab Robotic work tool system and method for defining a working area
CN113534227B (zh) * 2021-07-26 2022-07-01 中国电子科技集团公司第五十四研究所 一种适用于复杂非合作场景的多传感器融合绝对定位方法
WO2023158606A1 (en) * 2022-02-15 2023-08-24 Exmark Manufacturing Company Incorporated System and method for defining a work region boundary for use by an autonomous grounds care vehicle

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103299209A (zh) * 2011-01-07 2013-09-11 三星电子株式会社 用于使用导航算法检测位置信息的设备和方法
CN103324192A (zh) * 2012-03-23 2013-09-25 苏州宝时得电动工具有限公司 边界设置方法及边界设置系统
CN104035444A (zh) * 2014-06-27 2014-09-10 东南大学 机器人地图构建存储方法
US20160026185A1 (en) 2013-03-15 2016-01-28 Mtd Products Inc Autonomous mobile work system comprising a variable reflectivity base station
CN105607104A (zh) * 2016-01-28 2016-05-25 成都佰纳瑞信息技术有限公司 一种基于gnss与ins的自适应导航定位系统及方法
CN106462161A (zh) * 2014-03-31 2017-02-22 美国iRobot公司 自主型移动机器人
CN107462260A (zh) * 2017-08-22 2017-12-12 上海斐讯数据通信技术有限公司 一种运动轨迹生成方法、装置及可穿戴设备
EP3444694A1 (en) 2016-04-15 2019-02-20 Positec Power Tools (Suzhou) Co., Ltd Automatic working system, mobile device, and control method therefor
EP3557355A1 (en) 2016-12-15 2019-10-23 Positec Power Tools (Suzhou) Co., Ltd Autonomous moving device, method thereof for giving alarm on positioning fault, and automatic working system
EP3633410A1 (en) 2017-05-26 2020-04-08 Positec Power Tools (Suzhou) Co., Ltd Positioning device and method and automatically moving apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10037689B2 (en) * 2015-03-24 2018-07-31 Donald Warren Taylor Apparatus and system to manage monitored vehicular flow rate
GB201419883D0 (en) 2014-11-07 2014-12-24 F Robotics Acquisitions Ltd Domestic robotic system and method
EP3239659B1 (en) * 2016-04-26 2019-02-27 Volvo Car Corporation Method and system for in a timed manner enabling a user device on the move to utilize digital content associated with entities ahead

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3633410A4

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4121834A4 (en) * 2020-03-18 2024-04-03 Husqvarna Ab ROBOTIC WORK TOOL SYSTEM AND METHOD
CN111486843A (zh) * 2020-04-03 2020-08-04 深圳市百慕大工业有限公司 一种复杂环境下的定位方法、装置及定位设备
CN111486843B (zh) * 2020-04-03 2022-01-11 深圳市百慕大工业有限公司 一种复杂环境下的定位方法、装置及定位设备
SE2150454A1 (en) * 2021-04-13 2022-10-14 Husqvarna Ab System and method for determining operating boundaries of a robotic work tool
SE544856C2 (en) * 2021-04-13 2022-12-13 Husqvarna Ab System and method for determining operating boundaries of a robotic work tool

Also Published As

Publication number Publication date
EP3633410A4 (en) 2021-01-20
EP3633410A1 (en) 2020-04-08
US11448775B2 (en) 2022-09-20
US20210165109A1 (en) 2021-06-03

Similar Documents

Publication Publication Date Title
WO2018214978A1 (zh) Positioning device and method and automatically moving apparatus
US11649052B2 (en) System and method for providing autonomous photography and videography
CN111121767B (zh) Robot visual-inertial integrated positioning method fusing GPS
EP4290878A2 (en) Techniques for co-optimization of motion and sensory control
US11125563B2 (en) Systems and methods for autonomous machine tracking and localization of mobile objects
CN109885080B (zh) Autonomous control system and autonomous control method
US10322819B2 (en) Autonomous system for taking moving images from a drone, with target tracking and improved target location
CN108957512A (zh) Positioning device and method and automatically moving apparatus
KR20220028042A (ko) Pose determination method and apparatus, electronic device, storage medium, and program
US20210208608A1 (en) Control method, control apparatus, control terminal for unmanned aerial vehicle
CN113498498B (zh) Action control device, action control method, and program
WO2018133077A1 (zh) Environmental information collection and feedback system and method for an intelligent wheelchair
JP7077598B2 (ja) Method, program, and system for position determination and tracking
CN110730934A (zh) Trajectory switching method and apparatus
CN113965646B (zh) Positioning control method and apparatus, electronic device, and storage medium
CN109737957B (zh) INS/LiDAR integrated navigation method and system using cascaded FIR filtering
US11294379B2 (en) Systems and methods for controlling intelligent wheelchair
US11859980B2 (en) Method of movement tracking, processing module and lawn mower
US9619714B2 (en) Device and method for video generation
WO2018133076A1 (zh) Mechanical transmission control method and system for an intelligent wheelchair
US20230296793A1 (en) Motion-Based Calibration Of An Aerial Device
Li et al. A monocular odometer for a quadrotor using a homography model and inertial cues
Santos et al. Breadcrumb: An indoor simultaneous localization and mapping system for mobile devices
Gareau Visual-Inertial Odometry for 3D Pose Estimation and Scene Reconstruction using Unmanned Aerial Vehicles
Tsao Observability Analysis and Performance Evaluation for a Graph-Based GNSS–Visual–Inertial Odometry on Matrix Lie Groups

Legal Events

Date Code Title Description

121 EP: The EPO has been informed by WIPO that EP was designated in this application
    Ref document number: 18806033
    Country of ref document: EP
    Kind code of ref document: A1

NENP Non-entry into the national phase
    Ref country code: DE

WWE WIPO information: entry into national phase
    Ref document number: 2018806033
    Country of ref document: EP

ENP Entry into the national phase
    Ref document number: 2018806033
    Country of ref document: EP
    Effective date: 20200102