WO2023065810A1 - Image acquisition method and apparatus, mobile terminal and computer storage medium - Google Patents


Info

Publication number
WO2023065810A1
WO2023065810A1 · PCT/CN2022/114291 · CN2022114291W
Authority
WO
WIPO (PCT)
Prior art keywords
image acquisition
positioning
frequency
acquisition frequency
mobile terminal
Prior art date
Application number
PCT/CN2022/114291
Other languages
English (en)
Chinese (zh)
Inventor
史翔
黄国胜
叶帅
Original Assignee
华为技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司
Publication of WO2023065810A1

Links

Images

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13 Receivers
    • G01S19/34 Power consumption
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/48 Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Definitions

  • The invention relates to the field of positioning technology, and in particular to an image acquisition method, device, mobile terminal, and computer storage medium.
  • LBS: location based service
  • GNSS: global navigation satellite system
  • PPP: precise point positioning
  • RTK: real-time kinematic (carrier phase differential positioning)
  • RTD: real-time differential (pseudorange differential positioning)
  • The positioning accuracy depends heavily on the quality of satellite signals. In scenes with occluding objects such as urban high-rise buildings, elevated roads, tree-lined roads, and tunnels, satellite signals may not be received at all or may be received with poor quality, which can cause positioning failure or low positioning accuracy.
  • This embodiment provides an image acquisition method, device, and mobile terminal that can adjust the image acquisition frequency based on the positioning accuracy, so that the image acquisition frequency is adapted to the positioning accuracy. The mobile terminal therefore does not always collect environment images at a high frequency during positioning, which reduces its power consumption; in addition, when the positioning accuracy is low, navigating by means of environment images ensures the positioning accuracy.
  • This embodiment provides an image acquisition method applied to a mobile terminal, including: determining a first image acquisition frequency based on positioning accuracy information, where the positioning accuracy information reflects or indicates the positioning accuracy of the mobile terminal; and collecting a first environment image based on the first image acquisition frequency, the first environment image being an image of the current surrounding environment of the mobile terminal.
  • In this way, the image acquisition frequency can be adjusted based on the positioning accuracy so that the two are adapted to each other: the mobile terminal does not always collect environment images at a high frequency during positioning, which reduces its power consumption, while navigating by means of environment images ensures the positioning accuracy when it would otherwise be low.
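The adaptation described above can be sketched as a small control function. This is an illustrative sketch only: the function name, the doubling/halving policy, and all numeric defaults are assumptions, not taken from the patent text.

```python
def choose_acquisition_frequency(position_error_m: float,
                                 current_hz: float,
                                 low_hz: float = 1.0,
                                 high_hz: float = 30.0,
                                 error_threshold_m: float = 1.0) -> float:
    """Raise the image acquisition frequency when the positioning error is
    large (accuracy is low) and lower it when positioning is already
    accurate, clamping to preset low/high limits."""
    if position_error_m >= error_threshold_m:   # accuracy too low
        return min(current_hz * 2, high_hz)     # step up, capped at the limit
    return max(current_hz / 2, low_hz)          # step down, floored at the limit
```

For example, with the assumed defaults, a 2 m error at 10 Hz steps the frequency up to 20 Hz, while a 0.1 m error steps it down to 5 Hz.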
  • the collected first environment image is an image captured by a target camera
  • the target camera is a camera communicating with the mobile terminal
  • the target camera is a camera of the mobile terminal.
  • The environment images may be captured by multiple cameras, providing different shooting angles so that the images represent the surrounding environment of the mobile terminal more accurately.
  • The target camera may be the camera of a driving recorder (dashcam) communicating with the mobile terminal, which continuously records video of the surrounding environment; the surrounding environment video is then sampled based on the first image acquisition frequency to obtain the first environment image.
  • In other words, the video collected by the driving recorder is sampled at the image acquisition frequency.
  • On the one hand, the mobile terminal does not always collect environment images at a high frequency during positioning, which reduces its power consumption; on the other hand, this does not interfere with the normal use of the driving recorder.
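Sampling the recorder's continuous video at the acquisition frequency amounts to keeping every n-th frame. A minimal sketch, with an invented function name and signature:

```python
def sample_frames(video_fps: float, acquisition_hz: float, n_frames: int) -> list:
    """Return the indices of frames to keep so that the dashcam's
    continuous video (recorded at video_fps) is sampled at roughly
    acquisition_hz, without interrupting the recorder itself."""
    step = max(1, round(video_fps / acquisition_hz))
    return list(range(0, n_frames, step))
```

With 30 fps video and a 10 Hz acquisition frequency, every third frame is kept.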
  • the mobile terminal communicates with the driving recorder wirelessly or through wired communication.
  • the method further includes: before obtaining the first environmental image, acquiring a second environmental image based on a second image acquisition frequency, where the second environmental image is an image of the surrounding environment of the mobile terminal in a previous period.
  • the mobile terminal does not always acquire environmental images at high frequency.
  • the method further includes: determining a first positioning result based on the collected second environment image, where the first positioning result is used to represent a positioning result of the mobile terminal.
  • the positioning result determined based on the environment image can more accurately represent the actual situation.
  • The first positioning result may be the lane in which the mobile terminal is located, and based on it the user of the mobile terminal can be accurately guided to go straight, turn, change lanes, and so on.
  • the positioning accuracy information includes an error between the first positioning result and the second positioning result
  • the second positioning result is used to represent the result of the non-visual positioning of the mobile terminal
  • Determining the first image acquisition frequency includes: determining that the first image acquisition frequency is greater than the second image acquisition frequency when the error between the first positioning result and the second positioning result is greater than or equal to a first threshold.
  • Since the environment image can represent the actual situation more accurately, a large error between the second positioning result obtained by non-visual positioning and the first positioning result obtained by positioning based on the environment image indicates that the positioning accuracy is low; the error therefore reflects the level of positioning accuracy to a certain extent.
  • In other words, the positioning accuracy is represented by the error between the first positioning result and the second positioning result: when this error is greater than or equal to the first threshold, the positioning accuracy is low, and the image acquisition frequency must be increased so that environment images can be relied upon to ensure driving safety.
  • the positioning accuracy information includes an error between the first positioning result and the second positioning result
  • The second positioning result represents the result of non-visual positioning of the mobile terminal. Determining the first image acquisition frequency based on the positioning accuracy includes: when the error between the first positioning result and the second positioning result is greater than or equal to the first threshold and the second image acquisition frequency is not equal to the preset high-frequency limit, determining that the first image acquisition frequency is greater than the second image acquisition frequency; or, when the error is greater than or equal to the first threshold and the second image acquisition frequency is equal to the preset high-frequency limit, determining that the first image acquisition frequency is equal to the second image acquisition frequency.
  • the image acquisition frequency will not be higher than the high-frequency limit value, so as to ensure that the power consumption of the mobile terminal will not be too high.
  • the method further includes: determining a second positioning result based on satellite signals captured by the mobile terminal.
  • the method further includes: determining the second positioning result based on the satellite signal captured by the mobile terminal and in combination with the angular velocity and acceleration of the mobile terminal.
  • positioning is realized by combining satellite navigation and inertial navigation, so as to ensure the continuity, reliability and accuracy of positioning.
  • the satellite navigation method does not rely on visual information, so the error between the first positioning result determined based on the environmental image and the second positioning result obtained by non-visual positioning can be used to judge the positioning accuracy.
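When both results are expressed as latitude/longitude fixes, the error term can be taken as the great-circle distance between the visual fix and the satellite/inertial fix. A sketch using the standard haversine formula (the use of haversine here is an assumption; the patent does not specify the distance metric):

```python
import math

def fix_error_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine distance in metres between the visual-matching fix
    (lat1, lon1) and the non-visual fix (lat2, lon2); usable as the
    error term in the positioning accuracy information."""
    r = 6371000.0  # mean Earth radius, m
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

The result can then be compared against the first threshold to decide whether to raise the acquisition frequency.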
  • the first positioning result or the second positioning result is the lane where the mobile terminal was located in a previous period.
  • the first positioning result or the second positioning result is the location of the mobile terminal in a previous period.
  • The positioning accuracy information may include the signal quality of the satellite signals captured by the mobile terminal. Determining the first image acquisition frequency based on the positioning accuracy information includes: when the signal quality is less than or equal to a second threshold, determining that the first image acquisition frequency is greater than the second image acquisition frequency.
  • the signal quality of the satellite signals captured by the mobile terminal can reflect the level of positioning accuracy.
  • In other words, the positioning accuracy is characterized by the signal quality of the satellite signals: when the signal quality is poor, the positioning accuracy is low, and in this case the image acquisition frequency needs to be increased.
  • the positioning accuracy information includes the signal quality of satellite signals captured by the mobile terminal; determining the first image acquisition frequency based on the positioning accuracy information includes: when the signal quality is less than or equal to the second threshold, and the second When the image acquisition frequency is not equal to the preset high-frequency limit, determine that the first image acquisition frequency is greater than the second image acquisition frequency; or, when the signal quality is less than or equal to the second threshold, and the second image acquisition frequency is equal to the preset high-frequency limit , it is determined that the first image acquisition frequency is equal to the second image acquisition frequency.
  • the image acquisition frequency will not be higher than the high-frequency limit value, so as to ensure that the power consumption of the mobile terminal will not be too high.
  • The signal quality may be determined by one or more of the following factors: the signal strength of the satellite signals, the number of satellite signals whose carrier phase is tracked by a frequency-locked loop, and the number of satellite signals whose carrier phase is tracked by a phase-locked loop.
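These factors could be combined into a single score, for example as a weighted sum. The weights and the use of mean carrier-to-noise density as the "signal strength" term below are invented for illustration; the patent does not prescribe a formula.

```python
def signal_quality(cn0_dbhz: list, n_fll_tracked: int, n_pll_tracked: int) -> float:
    """Combine the three listed factors into one score: mean
    carrier-to-noise density (dB-Hz) plus a bonus per channel tracked
    by the frequency-locked and phase-locked loops."""
    mean_cn0 = sum(cn0_dbhz) / len(cn0_dbhz) if cn0_dbhz else 0.0
    return mean_cn0 + 0.5 * n_fll_tracked + 1.0 * n_pll_tracked
```

The score would then be compared against the second threshold when choosing the acquisition frequency.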
  • the positioning accuracy information includes a positioning accuracy value
  • Determining the first image acquisition frequency based on the positioning accuracy information includes: when the positioning accuracy value is greater than a third threshold, determining that the first image acquisition frequency is greater than the second image acquisition frequency; or, when the positioning accuracy value is less than a fourth threshold, determining that the first image acquisition frequency is less than the second image acquisition frequency.
  • That is, when the positioning accuracy is low, the image acquisition frequency is increased and visual information is relied upon to ensure driving safety; when the positioning accuracy is high, the image acquisition frequency is reduced to save power. Power consumption can thus be reduced while still positioning accurately to the lane.
  • the positioning accuracy information includes a positioning accuracy value
  • Determining the first image acquisition frequency based on the positioning accuracy information includes: when the positioning accuracy value is greater than the third threshold and the second image acquisition frequency is equal to the preset high-frequency limit, determining that the first image acquisition frequency is equal to the second image acquisition frequency; when the positioning accuracy value is greater than the third threshold and the second image acquisition frequency is not equal to the preset high-frequency limit, determining that the first image acquisition frequency is greater than the second image acquisition frequency; when the positioning accuracy value is less than the fourth threshold and the second image acquisition frequency is not equal to the preset low-frequency limit, determining that the first image acquisition frequency is less than the second image acquisition frequency; or, when the positioning accuracy value is less than the fourth threshold and the second image acquisition frequency is equal to the preset low-frequency limit, determining that the first image acquisition frequency is equal to the second image acquisition frequency.
  • the image acquisition frequency will not be higher than the high-frequency limit value, so as to ensure that the power consumption of the mobile terminal will not be too high.
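The four cases with both limits reduce to a clamped step up or down. A sketch, with the step size and all threshold names assumed for illustration (the patent only names a "third" and "fourth" threshold, not their values):

```python
def next_frequency(accuracy_value: float, current_hz: float,
                   high_hz: float, low_hz: float,
                   raise_threshold: float, lower_threshold: float,
                   step_hz: float = 5.0) -> float:
    """Raise the frequency toward (never past) the preset high-frequency
    limit when the accuracy value exceeds the 'third threshold' (low
    accuracy); lower it toward the low-frequency limit when the value is
    below the 'fourth threshold'; otherwise keep the current frequency."""
    if accuracy_value > raise_threshold:
        return min(current_hz + step_hz, high_hz)
    if accuracy_value < lower_threshold:
        return max(current_hz - step_hz, low_hz)
    return current_hz
```

When the current frequency already sits at a limit, the function simply returns it unchanged, matching the equal-frequency cases above.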
  • the image acquisition frequency is a preset high-frequency limit.
  • The image acquisition frequency is initially set to the highest value to ensure positioning accuracy, for example accurate positioning to the lane, thereby ensuring driving safety.
  • The method further includes: receiving a first operation indicating that the user agrees to navigation based on environment images; and, in response to the first operation, performing navigation based on the collected first environment image.
  • User experience and privacy are ensured by interacting with the user; in addition, navigation based on environment images ensures positioning accuracy, such as accurate positioning to the lane, so that the user of the mobile terminal can subsequently be accurately guided to go straight, turn, change lanes, and so on.
  • This embodiment provides an image acquisition device that corresponds one-to-one with the method of the first aspect and achieves the same beneficial effects, so the details are not repeated here.
  • the image acquisition device may be a mobile terminal.
  • the image acquisition device may be a part of the mobile terminal.
  • the image acquisition device communicates with the target camera involved in the first aspect, for example, wireless communication or wired communication.
  • the target camera involved in the first aspect is located outside the image acquisition device, and can communicate with the image acquisition device by wire or wirelessly.
  • the image acquisition device includes the target camera mentioned in the first aspect, and generally communicates with the image acquisition device by wire.
  • the image acquisition device communicates with the driving recorder involved in the first aspect, for example, wireless communication or wired communication.
  • the driving recorder involved in the first aspect is located outside the image acquisition device, and can also communicate with the image acquisition device by wire or wirelessly.
  • the image acquisition device includes the driving recorder mentioned in the first aspect, and generally communicates with the image acquisition device by wire.
  • the image acquisition device is a vehicle
  • a driving recorder is installed in the vehicle
  • the target camera may be a vehicle camera other than the driving recorder of the vehicle, or may be a camera of the driving recorder of the vehicle.
  • the image acquisition device is a mobile phone or a vehicle-mounted terminal placed in a vehicle, and a driving recorder is arranged in the vehicle, and the target camera can be a camera of a mobile phone, or a vehicle camera other than the vehicle driving recorder, or a The camera of the dashcam of the vehicle.
  • Since the driving recorder records video of the surrounding environment throughout the vehicle's journey, its normal operation must not be disturbed.
  • The driving recorder may include most of the cameras installed in the vehicle. A vehicle camera not included in the driving recorder may have a specific role, such as reversing, so it does not need to work in real time and can be idle. Of course, the driving recorder may also include all of the cameras installed in the vehicle.
  • The image acquisition device may control the camera of a mobile phone, or a vehicle camera other than the driving recorder, to collect several frames of environment images based on the first image acquisition frequency.
  • The image acquisition device may instruct the driving recorder to sample the surrounding environment video collected by its camera based on the first image acquisition frequency, obtaining several frames of environment images.
  • this embodiment provides a mobile terminal, including: at least one memory for storing programs; at least one processor for executing the programs stored in the memory, and when the programs stored in the memory are executed, the processor is used to Execute the method provided by the first aspect.
  • the mobile terminal may be a vehicle-mounted terminal.
  • the mobile terminal can be a mobile phone.
  • the mobile terminal may be a vehicle.
  • this embodiment provides a system, including: a driving recorder and a mobile terminal; wherein the mobile terminal is used to execute the method provided in the first aspect.
  • the mobile terminal may be a vehicle-mounted terminal or a mobile phone.
  • this embodiment provides a system, including: a vehicle and a mobile phone; wherein, the mobile phone is used to execute the method provided in the first aspect.
  • The vehicle includes a driving recorder and a vehicle camera other than the driving recorder.
  • the mobile phone can control the built-in camera or the vehicle camera other than the driving recorder to shoot according to the first image collection frequency, and can also sample the surrounding environment video collected by the camera of the driving recorder based on the first image collection frequency.
  • this embodiment provides a computer storage medium, in which instructions are stored, and when the instructions are run on a computer, the computer is made to execute the method provided in the first aspect.
  • the present embodiment provides a computer program product containing instructions, and when the instructions are run on a computer, the computer is made to execute the method provided in the first aspect.
  • this embodiment provides an electronic device, including a memory and a processor, where executable codes are stored in the memory, and when the processor executes the executable codes, each possible realization of the first aspect can be realized.
  • FIG. 1 is a schematic diagram of a scene of a mobile phone in a vehicle provided in this embodiment
  • FIG. 2 is a schematic structural diagram of a mobile terminal provided in this embodiment
  • FIG. 3 is a first schematic diagram of the architecture of the positioning system provided by this embodiment.
  • FIG. 4 is a schematic diagram of a scene of the positioning system provided by this embodiment.
  • FIG. 5 is a schematic display diagram of a user interaction interface provided by this embodiment.
  • FIG. 6 is a schematic display diagram of a navigation route provided by this embodiment.
  • FIG. 7 is a schematic flow chart of determining the type of road environment provided by this embodiment.
  • FIG. 8 is a schematic diagram of a frequency switching method provided in this embodiment.
  • FIG. 9 is a second schematic diagram of the architecture of the positioning system provided by this embodiment.
  • FIG. 10 is a schematic flowchart of an image acquisition solution provided in this embodiment.
  • FIG. 11 is a schematic flowchart of another image acquisition solution provided in this embodiment.
  • FIG. 12 is a schematic flowchart of the image acquisition method provided in this embodiment.
  • Words such as "exemplary", "for example", or "such as" are used to present examples or illustrations. Any embodiment or design described as "exemplary", "for example", or "such as" in this embodiment shall not be construed as more preferred or more advantageous than other embodiments or designs. Rather, such words are intended to present related concepts in a concrete manner.
  • The term "and/or" merely describes an association between objects, indicating that three relationships are possible; for example, "A and/or B" can mean: A exists alone, B exists alone, or both A and B exist.
  • the term "plurality" means two or more. For example, multiple systems refer to two or more systems, and multiple mobile terminals refer to two or more mobile terminals.
  • The terms "first" and "second" are used for descriptive purposes only and cannot be understood as indicating or implying relative importance or implicitly specifying the number of indicated technical features. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of these features.
  • the terms “including”, “comprising”, “having” and variations thereof mean “including but not limited to”, unless expressly stated otherwise.
  • Pseudo-range: the approximate distance obtained, during satellite positioning, by multiplying the propagation time of the signal between the receiver and the navigation satellite by the propagation speed.
  • Pseudo-range rate: the velocity measurement obtained by differencing the pseudo-range over the actual measurement time interval.
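The two definitions above are simple arithmetic and can be sketched directly; function names and the choice of the speed of light as the propagation speed are illustrative assumptions.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def pseudorange_m(propagation_time_s: float) -> float:
    """Pseudo-range: signal propagation time multiplied by the
    propagation speed."""
    return propagation_time_s * C

def pseudorange_rate_mps(pr_now_m: float, pr_prev_m: float, dt_s: float) -> float:
    """Pseudo-range rate: difference of consecutive pseudo-ranges over
    the measurement interval."""
    return (pr_now_m - pr_prev_m) / dt_s
```

A 70 ms propagation time corresponds to roughly 21,000 km, a typical GNSS satellite-to-receiver distance.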
  • Carrier phase: the phase difference between the carrier of the satellite signal, after the receiver has captured and tracked it, and the carrier generated by the receiver's local oscillator.
  • A carrier (carrier wave or carrier signal) is an electric wave generated by an oscillator and transmitted on a communication channel; it is modulated to carry voice or other information.
  • Doppler frequency shift: when there is relative motion between the signal source and the receiver, and the receiver moves in a certain direction at a constant rate, the received phase and frequency change because the propagation path changes.
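For a GNSS carrier the shift follows the classical narrowband approximation shift = f0 · v_r / c, where v_r is the radial velocity between satellite and receiver. A sketch (function name assumed):

```python
def doppler_shift_hz(carrier_hz: float, radial_velocity_mps: float) -> float:
    """Narrowband Doppler approximation: shift = f0 * v_r / c, positive
    when the satellite and receiver approach each other."""
    return carrier_hz * radial_velocity_mps / 299_792_458.0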
  • Phase-locked loop (PLL): used in a communication receiver to track the captured satellite signal and extract the phase of a clock from it. For a phase-locked loop, "locking" means that the frequency of the VCO is exactly consistent with the frequency of the synchronization signal, while a stable phase error is allowed.
  • Frequency-locked loop (FLL): tracks the captured satellite signal and extracts the frequency of a clock from it; it is essentially an automatic frequency fine-tuning circuit. For a frequency-locked loop, a small steady-state frequency error between the VCO and the synchronization signal is allowed when locked. It should be understood that both phase-locked loops and frequency-locked loops enable tracking of acquired satellite signals.
  • Cycle slip: a jump or interruption in the whole-cycle count in the carrier phase measurement of satellite positioning, caused by loss of lock on the captured satellite signal.
  • Reference station: a GNSS data observation station fixed on the ground that continuously observes, over the long term, the satellite signals emitted by navigation satellites and transmits the observation data to the data processing center in real time or periodically via communication facilities.
  • The coordinates of the reference station must be known accurately; they can be determined by high-precision satellite positioning methods (PPP, RTK, etc.).
  • the observation data may include pseudo-range, pseudo-range rate, carrier phase, Doppler frequency shift, etc. obtained after capturing and tracking satellite signals.
  • Rover station: an observation station set up with a receiver that moves within a certain range of the reference station.
  • the rover can be understood as a mobile terminal.
  • Data processing center: it has two working modes, a one-way mode and a two-way mode.
  • In the one-way mode, the user obtains relevant information from the data processing center, and the relevant information obtained by all rover stations is the same; in the two-way mode, the rover station also sends its own rough position (which can be generated by single point positioning) to the data processing center, which then generates relevant information in a targeted manner and transmits it to the specific rover station.
  • In the two-way mode, the relevant information obtained by different rover stations may differ.
  • the relevant information may be information used for positioning, such as information such as pseudorange, pseudorange correction amount, carrier phase correction amount, or carrier phase.
  • RTD: uses the pseudo-range observations or pseudo-range corrections provided by the reference station, combined with the pseudo-range observations obtained by the mobile terminal itself, and applies a parameter estimator such as least squares or a Kalman filter; by differencing, errors such as atmospheric delay and satellite ephemeris errors are eliminated, achieving meter-level positioning accuracy (positioning error on the order of 1 m) or even sub-meter accuracy (positioning error not greater than 100 cm).
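The core differencing step can be sketched in a few lines: because the reference station's position is precisely known, its measured pseudo-range minus the true geometric range yields a correction that largely cancels the errors common to both receivers. This is an illustrative single-satellite sketch, not the full least-squares/Kalman estimation.

```python
def rtd_corrected_pseudorange(rover_pr_m: float,
                              base_pr_m: float,
                              base_true_range_m: float) -> float:
    """Pseudo-range differential (RTD) sketch: subtract the reference
    station's pseudo-range correction (measured minus true geometric
    range) from the rover's pseudo-range, cancelling atmospheric and
    ephemeris errors common to both receivers."""
    correction = base_pr_m - base_true_range_m
    return rover_pr_m - correction
```

In a real estimator this correction is applied per satellite before the position solution is computed.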
  • RTK: uses the pseudo-range and carrier phase observations, or the pseudo-range corrections and carrier phase corrections, provided by the reference station, combined with the pseudo-range and carrier phase observations obtained by the mobile terminal itself; a parameter estimator such as a Kalman filter eliminates atmospheric and satellite ephemeris errors and resolves the carrier phase ambiguity by differencing, achieving decimeter-level positioning accuracy (positioning error not greater than 50 cm) or even centimeter-level accuracy (positioning error not greater than 10 cm).
  • PPP: using a single GNSS receiver together with the precise ephemeris and satellite clock corrections provided by the International GNSS Service (IGS), and based on carrier phase observations, positioning accuracy at the millimeter to decimeter level can be achieved. Its difference from RTK is that no additional reference station needs to be set up: data from existing reference stations is processed to separate out the satellite-related errors, which are then eliminated when the satellite signals are processed.
  • IGS International GNSS Service
  • Visual matching positioning: uses the images or videos captured by a camera to determine the position and attitude of the camera in the real world.
  • Feature extraction is performed on the image captured by the camera, and the features are matched against the feature database of a high-precision map of the mobile terminal's surroundings to determine the position of the mobile terminal on the high-precision map. Since this position is obtained from the environment image, i.e. from visual information, it is called the visual matching position for convenience of description and distinction.
  • The matching features mainly fall into two categories.
  • The first category is feature points, which are usually pixels describing the edges and corners of objects;
  • the second category is semantic features, usually lane lines, road edges, poles, signs, traffic lights, and so on.
  • feature matching is performed using the second type of semantic features.
  • The high-precision map can be understood as a map whose positioning accuracy reaches the lane level, containing multi-dimensional high-precision data such as road curvature, elevation, slope, and coordinates, as well as the height data of bridges and tunnels.
  • The high-precision map also stores data for visual matching positioning, such as the position and attitude, in the world coordinate system, of poles, signs, lane lines, traffic lights, and road edges, forming the feature database of the high-precision map.
• Lane level indicates at least sub-meter positioning accuracy (positioning error not greater than 100 cm), for example decimeter level (positioning error not greater than 50 cm) or centimeter level (positioning error not greater than 10 cm). The above positioning accuracies are only examples and do not constitute a specific limitation.
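The thresholds above can be expressed as a small helper. The function name and the returned labels are illustrative only; the patent states the thresholds as examples, not a fixed classification.

```python
# Illustrative mapping from positioning error (cm) to the accuracy
# levels named in the text; thresholds follow the text's examples.
def accuracy_level(error_cm):
    if error_cm <= 10:
        return "centimeter"
    if error_cm <= 50:
        return "decimeter"
    if error_cm <= 100:
        return "sub-meter"
    return "above lane level"
```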
  • the high-precision map of the surrounding environment of the mobile terminal can be understood as a map of all roads in front of the mobile terminal, for example, a map of all roads within 10 km in front of the vehicle.
• The mobile terminal obtains a high-precision map covering its surrounding environment by accessing the cloud server in real time. For example, when the mobile terminal is navigating, it can automatically query the cloud server for a high-precision map covering the starting point and the end point of the navigation and, if one exists, download it, so that the high-precision map stored in the mobile terminal can be used directly. It should be noted that, since this embodiment needs to achieve lane-level positioning, if a high-precision map cannot be obtained, the lane where the mobile terminal is located may not be accurately obtained, so the visual matching result is not determined.
• From the visual matching result it is possible to determine the lane where the visual matching position is located, the total number of lanes on the road, the positions of the lane lines of that lane, and the lane line types (dashed lines, solid lines, road edge lines, etc.). Since this information is obtained by matching the visual matching position against the high-precision map, it is called the visual matching result for convenience of distinction and description.
  • the visual matching position may be a coordinate in the latitude-longitude coordinate system, in other words, it may be characterized by latitude-longitude coordinates.
• Lane line recognition can be understood as identifying the positional relationship between the lane lines and the vehicle, whether a lane line is a solid or dashed line, the positional relationship between the lane lines and the road edge, and so on.
• Lane line recognition can determine the lane where the mobile terminal is located, the lane line types of that lane, the distance between those lane lines and the mobile terminal, the distance between the mobile terminal and the road edge, the total number of lanes of the road, etc. Because this information is obtained from the environment image, i.e. visual information, through lane line recognition, it can be called the visual recognition result for convenience of distinction and description.
• The mobile terminal recognizes lane lines on multiple frames of environment images and then performs lane line fitting to restore the real layout of the road, thereby determining the lane where the mobile terminal is located.
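The multi-frame lane line fitting mentioned above can be sketched as a least-squares fit over lane points accumulated from several frames. Real systems typically fit clothoids or higher-order polynomials; a straight line y = a·x + b is used here purely for illustration, and all names are assumptions.

```python
# Minimal lane line fitting sketch: merge points detected on several
# frames and fit y = a*x + b by closed-form least squares.
def fit_lane_line(points):
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# lane points from three consecutive frames, all on y = 0.5*x + 1
frames = [[(0, 1.0), (2, 2.0)], [(4, 3.0)], [(6, 4.0), (8, 5.0)]]
pts = [p for f in frames for p in f]
a, b = fit_lane_line(pts)
```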
• Inertial Navigation System (INS): a system that combines a dead reckoning (Dead Reckoning, DR for short) software algorithm with an inertial measurement unit (Inertial Measurement Unit, IMU for short) installed on a moving carrier to measure the carrier's position and motion.
• The INS thus includes a hardware unit, the IMU, and a DR software algorithm. A vehicle is taken as an example below to illustrate both.
  • the IMU is a device that measures the three-axis attitude angle (or angular rate) and acceleration of an object.
  • an IMU contains three single-axis accelerometers and three single-axis gyroscopes.
  • INS integrates the acceleration data collected by the accelerometer to determine the current speed of the vehicle; integrates the angular velocity data collected by the gyroscope to determine the current attitude of the vehicle.
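The two integrations above can be sketched in one dimension (forward acceleration to speed, yaw rate to heading). This is a simple Euler-integration illustration under invented sample values, not the patent's INS implementation.

```python
# Sketch of the INS integrations described above: accelerometer samples
# are integrated to velocity, gyroscope samples to heading (yaw).
def integrate_imu(accel, gyro, dt, v0=0.0, yaw0=0.0):
    v, yaw = v0, yaw0
    for a, w in zip(accel, gyro):
        v += a * dt       # m/s, from acceleration in m/s^2
        yaw += w * dt     # rad, from angular rate in rad/s
    return v, yaw

# three samples at dt = 0.5 s: constant 1 m/s^2 and 0.1 rad/s
v, yaw = integrate_imu(accel=[1.0, 1.0, 1.0], gyro=[0.1, 0.1, 0.1], dt=0.5)
```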
• The IMU can be a MEMS (Micro-Electro-Mechanical System) IMU, which is inexpensive and small, and is widely used in mobile phones, drones, VR (virtual reality), robotics, etc.
• The DR software algorithm is a commonly used navigation and positioning technique. Given the current position, it calculates the position at the next moment from the measured travel distance and azimuth, and can realize continuous autonomous positioning.
• Since the reckoning process is cumulative, the error accumulates with time.
• Reckoning can only determine the relative position and heading.
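A single dead-reckoning step, as described above, can be sketched as follows. The azimuth convention (measured clockwise from north, i.e. the y axis) is an assumption for illustration.

```python
import math

# Dead reckoning: knowing the current position, compute the next
# position from the measured travelled distance and azimuth.
def dr_step(x, y, distance, azimuth_rad):
    return (x + distance * math.sin(azimuth_rad),
            y + distance * math.cos(azimuth_rad))

# head north 10 m, then east 10 m, starting from the origin
x, y = 0.0, 0.0
for d, az in [(10.0, 0.0), (10.0, math.pi / 2)]:
    x, y = dr_step(x, y, d, az)
```

Because each step adds on top of the previous estimate, any error in distance or azimuth propagates into every later position, which is the error accumulation noted above.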
  • VDR Vehicle Dead Reckoning, vehicle dead reckoning
  • GNSS refers to all satellite navigation systems, including global, regional and enhanced.
• The GNSS may include the global positioning system (GPS), the global navigation satellite system GLONASS, the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS) and/or satellite-based augmentation systems (SBAS), etc.
  • Integrated navigation refers to the combination of two or more navigation systems, which can make full use of the information of each system to achieve information fusion and complementarity.
• The integrated navigation may be GNSS/INS or GPS (Global Positioning System)/INS.
  • the above combined navigation is only an example, and does not constitute a specific limitation, which needs to be determined in combination with actual requirements.
• In integrated navigation, GNSS can realize vehicle positioning through RTK, and INS can realize vehicle positioning through the VDR algorithm, so the integrated navigation can be RTK/VDR.
  • the above-mentioned navigation satellites may be GPS satellites, GNSS satellites, Beidou satellites or other satellites
  • the above-mentioned receivers may be GPS receivers, GNSS receivers, Beidou system receivers or other types of positioning system receivers.
  • the mobile terminal in this embodiment may be a device at least capable of positioning, such as a mobile phone, a vehicle-mounted terminal, a vehicle, and the like.
• When the mobile terminal is a mobile phone, the scenario considered is that the mobile phone is placed in the vehicle and used to locate the vehicle.
  • FIG. 2 is a schematic diagram of a hardware structure of a mobile terminal provided in this embodiment.
  • the mobile terminal 100 may include a processor 110 , an IMU 120 , a camera 130 , a touch screen 140 , a memory 150 and a communication module 160 .
  • the structure shown in this embodiment does not constitute a specific limitation on the mobile terminal 100 .
  • the mobile terminal 100 may include more or fewer components than shown, or combine some components, or separate some components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the mobile terminal 100 may also include an odometer.
  • the odometer is used to measure distance and speed.
  • the mobile terminal may also include a driving recorder connected to the processor 110 .
  • the driving recorder includes a camera 130 .
• The processor 110 may include one or more processors; for example, the processor 110 may include one or more of an application processor (AP), a modem, a graphics processing unit (GPU), an image signal processor (ISP), a video codec, a digital signal processor (DSP), a baseband processor, a satellite positioning chip, a neural-network processing unit (NPU), and the like. Different processors may be independent devices, or may be integrated in one or more processors. The controller can generate an operation control signal according to the instruction operation code and the timing signal, so as to control instruction fetching and instruction execution.
  • the satellite positioning chip is used to capture and track satellite signals emitted by navigation satellites to obtain satellite navigation information.
  • satellite navigation information is used to represent information obtained by tracking satellite signals, and may include pseudorange, pseudorange rate, carrier phase, Kepler frequency shift, ephemeris data, and the like.
• The satellite positioning chip can also obtain satellite signal tracking status information when capturing and tracking the satellite signals emitted by navigation satellites.
• The satellite signal tracking status information is used to indicate the status of tracking a satellite signal, and may include the signal strength of the satellite signal, whether its carrier is locked by the frequency-locked loop, whether its carrier phase is locked by the phase-locked loop, whether the carrier phase has a cycle slip, the Position Dilution of Precision (PDOP) of the satellite signal, and other indicators of the tracking situation.
  • the satellite signal tracking status information indicates the respective tracking status of multiple satellite signals, one satellite signal corresponds to one satellite, and the number of satellites is consistent with the number of signals.
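The per-satellite status fields listed above can be illustrated with a small container and a filter for usable satellites. The field names, the C/N0 threshold, and the "usable" criterion are all assumptions for illustration, not the patent's data format.

```python
from dataclasses import dataclass

# Illustrative per-satellite tracking status record.
@dataclass
class SatTrackStatus:
    signal_strength_dbhz: float   # carrier-to-noise density, dB-Hz
    fll_locked: bool              # frequency-locked loop lock
    pll_locked: bool              # phase-locked loop lock
    cycle_slip: bool              # cycle slip detected on carrier phase

def count_usable(statuses, min_cn0=30.0):
    """Count satellites with phase lock, no cycle slip, and
    sufficient signal strength (thresholds are examples)."""
    return sum(1 for s in statuses
               if s.pll_locked and not s.cycle_slip
               and s.signal_strength_dbhz >= min_cn0)

sats = [SatTrackStatus(42.0, True, True, False),
        SatTrackStatus(25.0, True, True, False),
        SatTrackStatus(40.0, True, False, False)]
```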
  • the satellite signals captured by the mobile terminal 100 represent satellite signals transmitted by multiple navigation satellites received by the mobile terminal 100 at the same time.
  • the satellite positioning chip can also perform positioning based on satellite navigation information and positioning technology to obtain satellite positioning data.
  • the positioning technology may be the above-mentioned PPP technology.
  • the satellite positioning chip can perform positioning based on satellite navigation information and related information sent by the data processing center to obtain the position of the mobile terminal 100 , and also obtain the speed and altitude of the mobile terminal 100 .
  • the positioning technology may be the above-mentioned RTK or RTD technology.
• The location of the mobile terminal 100 is usually represented in a latitude-longitude coordinate system, for example CGCS2000 or WGS84; WGS84 is usually selected.
  • the satellite navigation positioning chip may be a GPS chip or a GNSS chip.
  • the satellite signal can be GNSS (Global Navigation Satellite System, global satellite system, such as GPS, Beidou, etc.) signal or GPS signal.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is a cache memory.
• The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs the instructions or data again, they can be fetched directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 110, and improves system efficiency.
• The memory may cache the surrounding environment video most recently collected by the driving recorder, so that the processor 110 can process it.
  • the camera 130 is used to capture still images or videos, for example, to capture information of the surrounding environment and the like.
  • the object passes through the lens to generate an optical image that is projected onto the photosensitive element.
• The photosensitive element can be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
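The YUV-to-RGB step performed by the DSP can be sketched with the widely used BT.601 full-range conversion. This is one of several YUV conventions and is shown only as an illustration, not as the patent's actual pipeline.

```python
# BT.601 full-range YUV (YCbCr) -> RGB conversion, per 8-bit pixel.
def yuv_to_rgb(y, u, v):
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))  # keep within 8 bits
    return clamp(r), clamp(g), clamp(b)

# neutral grey: U = V = 128 leaves all channels equal to Y
rgb = yuv_to_rgb(100, 128, 128)
```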
  • the mobile terminal 100 may include one or more cameras.
• The camera 130 may collect information about the surrounding environment of the mobile terminal 100 and transmit it to the processor 110, so that the processor 110 performs lane line recognition and/or visual matching positioning.
  • the mobile terminal 100 can realize the shooting function through the ISP, the camera 130 , the video codec, the GPU and the application processor.
  • the ISP is used to process data fed back by the camera 130 . For example, when shooting, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 130 .
  • the touch screen 140 is used to display images, videos, and the like.
  • the touch screen 140 includes a display screen and a touch sensor, and the display screen includes a display panel.
• The display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), MiniLED, MicroLED, Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the mobile terminal 100 may include one or more display screens.
  • the display screen can be used to display an interface of an application program, display a display window of an application program, and the like.
  • the touch sensor is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • the mobile terminal 100 realizes the display function through the GPU, the touch screen 140 , and the AP.
  • the GPU is a microprocessor for image processing, and is connected to the touch screen 140 and the AP. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the memory 150 may be used to store computer-executable program code, which includes instructions.
  • the processor 110 executes various functional applications and data processing of the mobile terminal 100 by executing instructions stored in the internal memory.
  • the internal memory may include an area for storing programs and an area for storing data.
  • the stored program area can store an operating system, at least one application program required by a function (such as a sound playing function, an image playing function, etc.) and the like.
  • the data storage area can store data created during the use of the mobile terminal 100 (such as audio data, phonebook, etc.) and the like.
  • the internal memory may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the memory 150 of the driving recorder or the vehicle-mounted terminal can store the surrounding environment video collected by the driving recorder, so as to record the surrounding environment video of the whole driving process of the vehicle.
  • memory 150 also includes an external memory interface.
  • the external memory interface can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile terminal 100.
  • the external memory card communicates with the processor 110 through the external memory interface to realize the data storage function. Such as saving music, video and other files in the external memory card.
  • the communication module 160 is used to realize the wireless communication function of the mobile terminal 100, and includes an antenna 1, an antenna 2, a mobile communication module, and a wireless communication module. Specifically, the wireless communication function can be realized through the antenna 1, the antenna 2, a mobile communication module, a wireless communication module, a modem, and a baseband processor.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in mobile terminal 100 can be used to cover single or multiple communication frequency bands. Different antennas can also be multiplexed to improve the utilization of the antennas.
  • Antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module can provide wireless communication solutions including 2G/3G/4G/5G applied on the mobile terminal 100 .
  • the mobile communication module may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA) and the like.
  • the mobile communication module can receive electromagnetic waves by at least two antennas of the antenna 1, filter and amplify the received electromagnetic waves, and send them to the modem for demodulation.
  • the mobile communication module can also amplify the signal modulated by the modem, and convert it into electromagnetic wave and radiate it through the antenna 1 .
  • at least part of the functional modules of the mobile communication module may be disposed in the processor 110 .
  • at least part of the functional modules of the mobile communication module and at least part of the modules of the processor 110 may be disposed in the same device.
  • a modem may include both a modulator and a demodulator.
  • the modulator is used to modulate the low-frequency baseband signal to be transmitted into a medium-high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator sends the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is passed to the application processor after being processed by the baseband processor.
• The application processor outputs sound signals through audio equipment (not limited to speakers, receivers, etc.), or displays images or videos through the touch screen 140.
  • the modem may be a stand-alone device.
  • the modem may be independent of the processor 110, and be set in the same device as the mobile communication module or other functional modules.
  • the mobile communication module may be a module in a modem.
• The wireless communication module can provide wireless communication solutions applied on the mobile terminal 100, including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, and others.
  • the wireless communication module may be one or more devices integrating at least one communication processing module.
  • the wireless communication module receives electromagnetic waves via the antenna 2 , frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module can also receive the signal to be sent from the processor 110 , frequency-modulate it, amplify it, and convert it into electromagnetic wave and radiate it through the antenna 2 .
  • the antenna 1 of the mobile terminal 100 is coupled to the mobile communication module, and the antenna 2 is coupled to the wireless communication module, so that the mobile terminal 100 can communicate with the network and other devices through wireless communication technology.
• The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), fifth-generation new radio (NR), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
• The mobile terminal 100 may also include a universal serial bus (USB) interface, a charging management module, a power management module, a battery, an audio module, a speaker, a receiver, a microphone, an earphone jack, buttons, a motor, an indicator, a subscriber identification module (SIM) card interface, sensors other than the inertial measurement unit, etc.
• The mobile phone mainly connects to the driving recorder through Wi-Fi, or turns on its built-in camera, to obtain environment images in real time at high frequency; however, acquiring environment images in real time at high frequency increases the power consumption of the mobile phone.
  • this embodiment provides an image acquisition method, which can maintain high positioning accuracy and save power consumption of the mobile terminal.
  • FIG. 3 is a structural diagram of the positioning system provided by this embodiment. As shown in FIG. 3 , the positioning system includes a vehicle 200 and a mobile terminal 100 .
• The mobile terminal 100 is placed in the vehicle 200 and is used to position the vehicle 200, determine positioning accuracy information, determine the current image acquisition frequency based on the positioning accuracy information, and then collect environment images based on the current image acquisition frequency.
  • the mobile terminal 100 does not always collect images at high frequency, which can reduce the power consumption of the mobile terminal 100 to a certain extent.
  • the positioning accuracy information is used to reflect (represent) the positioning accuracy of the vehicle 200 , for example, the signal quality of the satellite signal, and the positioning accuracy value of the vehicle 200 .
• The positioning accuracy value describes the positioning accuracy of the vehicle 200 quantitatively. For example, a larger positioning accuracy value indicates a larger positioning error and lower positioning accuracy, and a smaller positioning accuracy value indicates a smaller positioning error and higher positioning accuracy.
• The positioning accuracy value can be the positioning error value mentioned below (indicating the difference between the position of the vehicle 200 obtained by positioning and its real position), the standard deviation of the positioning error value, or the Position Dilution of Precision (PDOP).
• The positioning accuracy value may also be the credible probability value of the position of the vehicle 200 obtained by positioning, or the closeness value between that position and the real position. It should be understood that the above positioning accuracy values are merely examples, and the positioning accuracy value in this embodiment may be any value determined according to actual needs.
• Acquiring environment images based on the current image acquisition frequency can be understood as follows: if the current image acquisition frequency is not equal to the pause acquisition frequency, several frames of environment images are acquired at the current image acquisition frequency; if the current image acquisition frequency is equal to the pause acquisition frequency, acquisition of environment images is suspended, in other words, no environment image is acquired. It should be understood that when the current image acquisition frequency equals the pause acquisition frequency, the positioning accuracy is high and there is no need to acquire environment images, so the power consumption of the mobile terminal 100 can be greatly reduced.
  • the pause collection frequency is 0 Hz, and the description below takes 0 Hz as the pause collection frequency as an example.
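The idea above can be sketched as a frequency-selection rule where 0 Hz means acquisition is paused. The error thresholds and the non-zero frequencies are invented for illustration; the patent does not fix concrete values here.

```python
# Hedged sketch: choose the current image acquisition frequency from the
# positioning accuracy; 0 Hz is the pause acquisition frequency.
PAUSE_HZ = 0

def select_acquisition_frequency(positioning_error_m):
    if positioning_error_m <= 0.1:   # high accuracy: pause acquisition
        return PAUSE_HZ
    if positioning_error_m <= 0.5:   # medium accuracy: low frame rate
        return 5
    return 30                        # low accuracy: high frame rate

def acquire(frequency_hz):
    """Return the number of frames collected in one unit time."""
    return frequency_hz              # 0 frames while paused
```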
  • the mobile terminal 100 continuously determines and updates the current image acquisition frequency based on a preset time interval.
  • the preset time interval may be 1 second or 2 seconds, which needs to be determined according to actual needs.
• In the following, the moment at which the image acquisition frequency is determined is used as the current moment; the current image acquisition frequency can then be understood as the image acquisition frequency for the period between the current moment and the next moment at which the image acquisition frequency is determined.
• The current image acquisition frequency is the most recently determined image acquisition frequency.
• The image acquisition frequencies determined at the 1st second, the 2nd second, ..., the (i-1)-th second, the i-th second, and the (i+1)-th second are respectively P1, P2, ..., Pi-1, Pi, Pi+1.
• The current image acquisition frequency, that is, the last determined image acquisition frequency, is Pi, which is used as the image acquisition frequency for the period between the i-th second and the (i+1)-th second.
  • Pi is not equal to
• The previous image acquisition frequency is the image acquisition frequency determined the penultimate time, that is, Pi-1.
• The previous image acquisition frequency can also be understood as the image acquisition frequency of the previous period, where the previous period is the period for which the previous image acquisition frequency was determined, for example the period between the (i-1)-th second and the i-th second above. The term "previous image acquisition frequency" is named relative to the current image acquisition frequency and has no special meaning; it denotes the latest image acquisition frequency determined before the current one.
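The per-interval bookkeeping described above can be sketched as follows: every interval a new frequency P is determined, and the most recent value is the "current" frequency for the coming period. The update rule passed in is a stand-in; only the bookkeeping reflects the text.

```python
# Sketch: determine P1, P2, ..., Pi at each preset interval; the last
# value is the current frequency, the one before it the previous one.
def update_frequencies(accuracy_per_second, select):
    history = []                     # P1, P2, ..., Pi
    for acc in accuracy_per_second:
        history.append(select(acc))
    current = history[-1]            # Pi: frequency for the next period
    previous = history[-2] if len(history) > 1 else None  # Pi-1
    return history, current, previous

# illustrative selection rule: pause (0 Hz) when the error is small
select = lambda err: 0 if err <= 0.1 else 10
hist, cur, prev = update_frequencies([0.05, 0.3, 0.2], select)
```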
  • the image collection frequency indicates the number of image frames of the environment image collected per unit time. For example, if the number of image frames is 5, the image acquisition frequency indicates that there are 5 environmental images acquired per unit time. For another example, if the number of image frames is 0, it means that the image acquisition frequency is 0Hz.
• The environment image shows the surrounding environment of the vehicle 200, for example the front, rear, left and/or right environment of the vehicle 200; the rear, left and/or right environments can be used as aids to gain a comprehensive understanding of the surroundings. It should be noted that the image acquisition frequency essentially indicates the number of image frames of the environment image collected per unit time for image processing.
  • the number of image frames indicated by the image acquisition frequency is 5.
• Environment images are collected according to the image acquisition frequency to obtain several frames of environment images;
• the images are transmitted periodically to the processor of the mobile terminal 100, and correspondingly the processor of the mobile terminal 100 receives the environment images periodically.
  • the preset time interval may be determined in combination with actual conditions, for example, may be a unit time, that is, 1 second.
• The multi-frame environment images collected from T0 to T0+1*t are packaged and transmitted to the processor of the mobile terminal 100; the multi-frame environment images collected from T0+1*t to T0+2*t are packaged and transmitted to the processor of the mobile terminal 100, and then the processor starts to process these environment images.
• The mobile terminal 100 performs image processing on the several frames of environment images acquired at the current image acquisition frequency to determine the visual positioning result. Since image processing is performed only on frames acquired at the image acquisition frequency, it is not always performed at high frequency, and the power consumption of the mobile terminal 100 can be reduced to a certain extent.
  • the image processing may be image preprocessing (for example, image enhancement, de-distortion, de-noising, etc.), lane line recognition and/or visual matching and positioning. It should be understood that this embodiment does not limit the frequency of image processing, as long as it can meet the requirements of real-time positioning.
  • the processor of the mobile terminal 100 will receive the environment image in real time, and will also perform image processing on the received environment image in real time.
  • the processor to which the environment image is transmitted and the processor for processing the image may be the same.
  • the processor to which the environment image is transmitted is different from the processor performing image processing.
  • the processor to which the environment image is transmitted is the above AP, and the processor performing image processing is the above NPU.
  • the AP obtains the environment image and performs image preprocessing and sends it to the NPU.
• The NPU performs the remaining image processing, such as lane line recognition and visual matching positioning, and sends the processing result to the AP.
  • each frame of environment image collected based on the current image collection frequency will be subjected to image processing.
  • the visual positioning result is the result of positioning based on several frames of environment images.
  • the positioning here can be understood as visual matching positioning, which determines the visual matching position and further matches it with the high-precision map to determine information such as the lane where the vehicle 200 is located, the total number of lanes, and the positions of the lane lines relative to the vehicle 200, that is, the above-mentioned visual matching result. It can also be understood as lane line recognition, which determines information such as the lane where the vehicle 200 is located, the total number of lanes, and the distance between the lane lines and the vehicle 200, that is, the above-mentioned visual recognition result. For convenience of description and distinction, localization based on several frames of environment images can be called visual localization.
  • the visual positioning result may include the visual matching position obtained by visual matching positioning, the lane where the vehicle 200 is located (which may be called the visual lane), the total number of lanes (which may be called the total number of visual lanes), and other information.
  • the visual positioning result may only include the visual lane and the total number of visual lanes.
  • in an example, the total number of visual lanes is 2, and the visual lane is the left or right lane; in another example, the total number of visual lanes is 3, and the visual lane is the middle, left, or right lane. The visual lane may also be a lane code for any one of, for example, 5 lanes, such as a number or a letter, with each of the 5 lanes coded differently.
  • the visual positioning result may also be referred to as the first positioning result.
  • the image processing is lane recognition
  • the visual lanes may be the lanes in the above visual recognition result
  • the total number of visual lanes may be the total number of lanes in the above visual recognition result.
  • image processing is visual matching positioning
  • the total number of visual lanes may be the total number of lanes in the above visual matching result.
  • the lanes based on the visual matching results are called visual matching lanes
  • the total number of lanes is called the total number of visual matching lanes
  • the lanes in the visual recognition results are called visual recognition lanes, and the total number of lanes is called the total number of visual recognition lanes.
  • the processor of the mobile terminal 100 usually performs lane line recognition on several frames of environmental images collected, at the current image acquisition frequency, in the latest period before the time at which the current image acquisition frequency was determined, so as to determine the visually recognized lane and the total number of visually recognized lanes.
  • the end time of the latest period is the time at which the current image acquisition frequency is determined, and the duration of the latest period is determined according to actual needs, for example, 5 seconds or 10 seconds, which is not specifically limited in this embodiment. The latest periods mentioned below can be understood in the same way; the difference is that their durations may differ.
  • the latest time period is [T-P,T).
  • the most recent period is usually longer than the previous period.
  • the end time of the latest period is the same as the end time of the previous period, because the previous period indicates the period between the time when the previous image acquisition frequency is determined and the time when the current image acquisition frequency is determined , so the most recent period includes the previous period.
  • the latest period and the previous period may be the same period.
  • the mobile terminal 100 can interact with the user, and when the user agrees to perform navigation based on the environment image, periodically determine the current image acquisition frequency, and when the current image acquisition frequency is greater than 0 Hz, based on the current image acquisition frequency
  • the mobile terminal 100 is navigated by several frames of environment images obtained by frequency collection.
  • the mobile terminal 100 may receive a user operation, the user operation indicates that the user agrees to perform navigation based on the environment image; in response to the user operation, periodically determine the current image acquisition frequency, and when the current image acquisition frequency is greater than 0 Hz, Several frames of environmental images are acquired based on the current image acquisition frequency, and the mobile terminal is navigated.
  • FIG. 5 shows a schematic diagram of the user interaction page of the mobile terminal 100. As shown in FIG. 5, the page includes a prompt (for example, using the mobile phone camera/driving recorder image to improve positioning accuracy), an agree button 502, and a disagree button 503.
  • performing navigation usually involves performing lane line recognition on several frames of environment images collected at the current image acquisition frequency to obtain a visual recognition result, and planning a navigation route based on the visual recognition result.
  • FIG. 6 is a schematic diagram of displaying a navigation route provided in this embodiment, in which a navigation route 601 is shown.
  • the touch screen 140 of the mobile terminal 100 displays the user interaction page shown in FIG. 5.
  • the mobile terminal 100 determines the image acquisition frequency in real time. If the image acquisition frequency is not equal to 0 Hz, the visual recognition result is determined based on several frames of environmental images collected at the image acquisition frequency, and the navigation route is then planned.
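The flow in the preceding paragraphs (determine the current image acquisition frequency, and acquire and process frames only when it is above 0 Hz) can be sketched as one period of a hypothetical pipeline. All names and the stub recognizer/planner below are illustrative assumptions, not the patent's implementation.

```python
def run_period(freq_hz, capture, recognize, plan):
    """One cycle of the flow above (names are illustrative, not from the patent):
    acquire frames only when the current image acquisition frequency is > 0 Hz."""
    if freq_hz <= 0:
        return None  # no acquisition, no image processing -> lower power draw
    frames = capture(freq_hz)
    return plan(recognize(frames))

# Stub pipeline: capture freq_hz frames, "recognize" a lane, plan trivially.
route = run_period(
    10,
    capture=lambda f: [f"frame{i}" for i in range(f)],
    recognize=lambda frames: {"lane": "middle", "total_lanes": 3, "frames": len(frames)},
    plan=lambda r: f"route via {r['lane']} lane from {r['frames']} frames",
)
print(route)  # -> route via middle lane from 10 frames
```

When the frequency is 0 Hz, the cycle returns immediately without touching the camera, which is the power-saving behavior the text describes.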
  • the mobile terminal 100 can control the touch screen 140 to display a high-precision map of the surrounding environment of the vehicle 200, or a high-precision map of the starting point and end point of the navigation; and display the high-precision map of the vehicle 200 on the displayed high-precision map. location.
  • the positioning system further includes a reference station terminal 300 .
  • the reference station terminal 300 includes a reference station network 301 and a data processing center 302.
  • the relevant content of the data processing center 302 please refer to the above, which will not be repeated here.
  • the mobile terminal 100 uses RTK or RTD to perform positioning.
  • the relevant information is differential data
  • the mobile terminal 100 realizes positioning by means of RTK or RTD based on the differential data sent by the data processing center 302 .
  • the differential data may be a pseudorange correction amount, or may be a pseudorange correction amount, a carrier phase correction amount, or the like.
  • the reference station network 301 is composed of GNSS reference stations near the mobile terminal 100 .
  • the reference station and the data processing center 302 may communicate with the network through wireless communication technology.
  • the communication network may be a wired network or a wireless network.
  • the wired network can be a cable network, an optical fiber network, a digital data network (DDN), etc.
  • the wireless network can be a telecommunication network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth network, a ZigBee network, a near field communication (NFC) network, a GSM network, a CDMA network, a GPRS network, etc., or any combination thereof.
  • the data processing center 302 and the mobile terminal 100 can communicate with the network through wireless communication technology.
  • the wireless communication technology and the network refer to the above, which will not be repeated here.
  • data processing center 302 may be a server.
  • the mobile terminal 100 determines the signal quality of acquired satellite signals.
  • non-visual positioning can be satellite navigation using RTK, RTD, PPP and other technologies, and can also be integrated navigation including satellite navigation.
  • satellite navigation is only used as an example and does not constitute a specific limitation.
  • other non-visual positioning can also be used, for example, Wi-Fi real-time positioning.
  • non-visual positioning is realized by solving positioning equations. It should be noted that the positioning equations are constructed based on different positioning technologies, such as the aforementioned RTK, RTD, and PPP.
  • the mobile terminal 100 performs periodic non-visual positioning on the vehicle 200, and the processing methods of different non-visual positioning are the same.
  • non-visual positioning, that is, solving the positioning equation, determines information such as whether the positioning is successful, the position of the vehicle 200, and the positioning error value. Since this information is obtained by solving the positioning equation, for convenience of description and distinction, it can be called the calculation result, and the position of the vehicle 200 in the calculation result is called the non-visual matching position.
  • whether the positioning is successful is determined by the state of solving the positioning equation. For example, if the calculation state is success, the positioning may be considered successful; if the calculation state is failure, the positioning may be considered failed. In this embodiment, if the positioning error is infinitely large or infinitely small, the positioning can be considered failed; otherwise, it can be considered successful.
  • the positioning error value indicates the degree to which the non-visual matching position determined in this positioning can be trusted.
  • the positioning error value may be determined based on the values of several influencing factors affecting the positioning accuracy.
  • non-visual positioning includes satellite navigation
  • several influencing factors can include PDOP, and can also include whether the positioning is successful, the quality of satellite signals, and whether the type of road environment is "underpass", “urban canyon”, “tree-lined road” or “tunnel” etc.
  • the above influencing factors are only examples and do not constitute specific limitations, and can be deleted or added according to actual conditions.
  • the non-visual matching position may be a coordinate in the latitude-longitude coordinate system, that is, a latitude-longitude coordinate, or a coordinate in the world coordinate system.
  • the lane-level positioning condition may be that the estimated positioning error value is not greater than a preset threshold. Failure to meet the lane-level positioning condition may mean that the positioning is successful but the estimated positioning error value is greater than the lane-level positioning error.
  • for example, at the meter level, the lane-level positioning error can be 100 cm; at the decimeter level, 50 cm; and at the centimeter level, 10 cm.
  • the mobile terminal 100 can estimate the level of positioning accuracy. When the estimated positioning accuracy is high, it can be considered that the lane-level positioning condition is met, and when the estimated positioning accuracy is low, it can be considered that the lane-level positioning condition is not satisfied.
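A minimal sketch of the success and lane-level checks described above, assuming the convention stated earlier that a non-finite positioning error value means the positioning failed; the 100 cm threshold is the meter-level example from the text, and the function names are illustrative.

```python
import math

LANE_LEVEL_ERROR_CM = 100  # meter-level example from the text; 50 or 10 for dm/cm level

def positioning_succeeded(error_cm):
    # An infinitely large (or otherwise non-finite) error value is treated as failure.
    return math.isfinite(error_cm)

def meets_lane_level_condition(error_cm, threshold_cm=LANE_LEVEL_ERROR_CM):
    # Lane-level condition: positioning succeeded AND error within the threshold.
    return positioning_succeeded(error_cm) and error_cm <= threshold_cm

print(meets_lane_level_condition(40))        # True
print(meets_lane_level_condition(150))       # False: success, but above threshold
print(meets_lane_level_condition(math.inf))  # False: positioning failed
```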
  • based on matching the non-visual matching position with the high-precision map, it is possible to determine the lane in which the non-visual matching position is located, the total number of lanes on the road, and so on, and further determine the positions and types of the lane lines of that lane, for example, dotted lines, solid lines, and road edge lines. Since this information is obtained by matching the non-visual matching position with the high-precision map, for convenience of distinction and description, the information so obtained is called the non-visual matching result; the lane in which the non-visual matching position is located is called the non-visual matching lane, and the total number of lanes of the road in which the non-visual matching position is located is called the total number of non-visual matching lanes.
  • non-visual positioning results include calculation results, whether lane-level positioning conditions are met, and non-visual matching results.
  • the non-visual positioning result may also be referred to as a second positioning result.
  • non-visual positioning results will be determined for different times of non-visual positioning.
  • non-visual matching results are not determined in the absence of a high-precision map. In other words, the calculation result, whether the lane-level positioning condition is satisfied, and the non-visual matching result are continuously determined based on a preset time interval.
  • the method used by the mobile terminal 100 for non-visual positioning of the vehicle 200 may be a single satellite navigation.
  • a single satellite navigation can realize positioning through RTK, PPP or RTD.
  • the non-visual positioning method adopted by the mobile terminal 100 for the vehicle 200 is integrated navigation.
  • integrated navigation can be a satellite navigation system combined with an INS; the satellite navigation system can be GNSS or GPS, and positioning can be realized through RTK, PPP, or RTD.
  • when the mobile terminal 100 adopts the positioning method of GNSS/INS integrated navigation, the mobile terminal 100 can realize positioning based on a switchable combination and a filtering model.
  • the switching combination refers to switching between different navigation systems in different modes.
  • the mobile terminal 100 includes a satellite navigation mode and a dead reckoning mode.
  • when satellite positioning is available, the satellite navigation mode is used for navigation and the initial estimated position of the dead reckoning mode is refreshed; otherwise, the dead reckoning mode is used for navigation.
  • Dead reckoning mode corrects the initial reckoning position based on the obtained offset position and heading angle change.
  • the offset position and heading angle change may be determined based on acceleration and angular velocity collected by IMU 120 , and/or based on mileage and velocity collected by an odometer.
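A toy dead-reckoning update following the description above, advancing the reckoned position from a travelled distance (e.g. from an odometer) and a heading-angle change (e.g. integrated from the IMU's angular velocity). Planar motion and the function signature are assumptions for illustration.

```python
import math

def dead_reckon(x, y, heading_rad, distance, heading_change_rad):
    """Advance the reckoned position by a travelled distance after applying
    the heading-angle change, on a 2-D plane."""
    heading_rad += heading_change_rad
    x += distance * math.cos(heading_rad)
    y += distance * math.sin(heading_rad)
    return x, y, heading_rad

# Start at the origin heading east; drive 10 m, then turn 90 degrees left and drive 5 m.
x, y, h = dead_reckon(0.0, 0.0, 0.0, 10.0, 0.0)
x, y, h = dead_reckon(x, y, h, 5.0, math.pi / 2)
print(round(x, 6), round(y, 6))  # -> 10.0 5.0
```

Each satellite fix (when available) would refresh `(x, y, heading_rad)` as the new initial estimate, as the switchable combination describes.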
  • the filtering model is used to optimally estimate the system state through the system input and output observation data (including the influence of noise and interference in the system), so as to determine the position and heading angle of the vehicle 200 on the two-dimensional plane.
  • the filtering model may be a linear system state equation, such as a Kalman filtering (Kalman filtering, KF) model.
  • the filtering model may be a state equation of a nonlinear system, for example, an Extended Kalman Filter (EKF) model. It should be noted that both the KF and the EKF are existing technologies, and details are not repeated here.
  • other filtering models can also be used, such as the adaptive Kalman filtering (AKF) model and the unscented Kalman filtering (UKF) model.
  • the filtering model is located in the filter, and the function of the filtering model is realized by the filter.
  • the filter model is constructed based on the principle of different positioning methods, such as RTK, RTD or PPP.
  • the way the filter model realizes positioning can be a loose combination or a tight combination.
  • the loose combination or tight combination is described below by taking the GNSS receiver and INS as an example.
  • in the loose combination, the difference between the position and velocity output by the GNSS receiver and the position and velocity output by the INS is used as the observation, and the position, velocity, and attitude of the INS and the error parameters of the IMU are estimated and corrected through the filtering model. Since the navigation solution of the GNSS receiver is input into the filtering model as the observation, the GNSS receiver must be able to capture and track more than 4 navigation satellites at the same time to obtain the navigation solution. The attitude describes the spatial attitude of the IMU, which can be represented by Euler angles (heading angle, pitch angle, roll angle).
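The loose combination can be illustrated with a deliberately simplified one-dimensional Kalman filter: the INS displacement drives the prediction, and each GNSS position fix is the observation that corrects INS drift. This is a sketch of the principle only, not the patent's filter; real loose coupling estimates position, velocity, attitude, and IMU error states, and the noise values below are made up.

```python
def loose_kf(ins_deltas, gnss_fixes, q=0.5, r=4.0):
    """1-D loose-combination sketch.

    ins_deltas: per-epoch displacement from the INS (prediction input)
    gnss_fixes: per-epoch GNSS position (observation)
    q, r:       process and observation noise variances (illustrative values)
    """
    x, p = 0.0, 1.0  # position estimate and its variance
    for d, z in zip(ins_deltas, gnss_fixes):
        # predict: propagate with the INS displacement, inflate uncertainty
        x, p = x + d, p + q
        # update: weigh the GNSS fix by the Kalman gain
        k = p / (p + r)
        x, p = x + k * (z - x), (1 - k) * p
    return x

# INS says ~1 m/s; GNSS fixes drift slightly from the INS track.
est = loose_kf([1.0, 1.0, 1.0], [1.2, 1.9, 3.1])
print(round(est, 3))  # fused position estimate, close to 3 m after 3 s
```

The tight combination differs in that raw pseudoranges, not the position solution, would be the observation, which is why it still works with fewer than 4 satellites.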
  • the error parameters of the IMU refer to the systematic errors of the device, mainly including the bias error and scale factor error of each axis of sensors such as accelerometers and gyroscopes; these examples do not constitute a specific limitation.
  • the tight combination uses the difference between the original information output by the GNSS receiver and the original information estimated based on the INS position and velocity as the observation, and the INS position, velocity, attitude, IMU error parameters, receiver clock errors, etc. are estimated and corrected.
  • the error parameters of attitude and IMU refer to the above, so I won’t go into details here.
  • the original information may be the pseudorange and pseudorange rate output by the GNSS receiver, or the pseudorange and carrier phase output by the GNSS receiver.
  • in the tight combination, the original information output by the GNSS receiver participates in the combination, so when fewer than 4 navigation satellites are tracked, the original information output by the GNSS receiver can still be used for fusion positioning; compared with the loose combination, the tight combination has higher robustness and positioning accuracy.
  • the output of the integrated navigation based on the filtering model can provide a more accurate initial estimated position and attitude for the INS, so that when GNSS fails, the INS alone can maintain high positioning accuracy for a long time.
  • when the mobile terminal 100 locates the vehicle 200 taking the captured satellite signals into account, the mobile terminal 100 determines the signal quality of the captured satellite signals in real time; in other words, it continuously determines the signal quality based on a preset time interval.
  • the signal quality can be determined by several factors that affect it. These factors can include the signal strength of the satellite signals, the number of signals whose carrier phase is in the frequency-locked loop, the number of signals whose carrier phase is in the phase-locked loop, the number of signals whose carrier phase has a cycle slip, the number of signals whose carrier phase has no cycle slip, and the position dilution of precision corresponding to the satellite signals. It should be understood that, usually, after the satellite signals are received, the respective values of the above factors are determined. Exemplarily, several factors are as follows:
  • Factor 1: the first signal number of the satellite signals captured last time; the first signal number indicates the number of satellite signals whose signal strength is not less than 28 dB-Hz.
  • Factor 2: the average of the first signal numbers of the satellite signals captured multiple times in the latest period, that is, the average of multiple first signal numbers.
  • Factor 3: the median of the signal strengths of the satellite signals captured last time.
  • Factor 4: the average of the medians of the signal strengths of the satellite signals captured multiple times in the latest period, that is, the average of multiple medians.
  • Factor 6: the average of the signal strengths of the satellite signals captured multiple times in the latest period, that is, the average of multiple signal strengths.
  • Factor 7: the first relative ratio of the satellite signals captured last time; the first relative ratio indicates the ratio of the number of signals whose carrier phase is in the frequency-locked loop to the number of signals whose carrier phase is in the phase-locked loop.
  • Factor 8: the average of the first relative ratios of the satellite signals captured multiple times in the latest period, that is, the average of multiple first relative ratios.
  • Factor 9: the second relative ratio of the satellite signals captured last time; the second relative ratio indicates the ratio of the number of signals with a carrier phase cycle slip to the number of signals without a cycle slip.
  • Factor 10: the average of the second relative ratios of the satellite signals captured multiple times in the latest period, that is, the average of multiple second relative ratios.
  • Factor 12: the average of the position dilution of precision (PDOP) values of the satellite signals captured multiple times in the latest period, that is, the average of multiple PDOP values.
  • the duration of the latest period referred to in the above factors may be 5 s, 10 s, 15 s, etc., and needs to be determined based on actual needs, which is not specifically limited in this embodiment.
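A few of the factors above can be computed directly from raw per-signal observations of the most recently captured satellite signals, as sketched below. The helper name, input names, and the choice of factors shown are assumptions for illustration.

```python
from statistics import median

def signal_factors(strengths_dbhz, n_fll, n_pll, n_slip, n_no_slip):
    """Compute a few of the signal-quality factors from raw observations.

    strengths_dbhz: per-signal strengths (dB-Hz) of the last capture
    n_fll / n_pll:  signal counts with carrier phase in the FLL / PLL
    n_slip / n_no_slip: signal counts with / without a carrier phase cycle slip
    """
    return {
        "factor_1": sum(1 for s in strengths_dbhz if s >= 28),  # count >= 28 dB-Hz
        "factor_3": median(strengths_dbhz),                     # median strength
        "factor_7": n_fll / n_pll if n_pll else float("inf"),   # FLL : PLL ratio
        "factor_9": n_slip / n_no_slip if n_no_slip else float("inf"),
    }

f = signal_factors([25, 30, 33, 41, 27], n_fll=2, n_pll=8, n_slip=1, n_no_slip=9)
print(f["factor_1"], f["factor_3"], f["factor_7"])  # -> 3 30 0.25
```

The even-numbered factors are simply the running averages of these values over the latest period.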
  • the relationship between these factors and signal quality can be mined, for example, the values of these factors are used as model input, the signal quality is used as the output of the model for model training, and the signal quality is determined through the trained model.
  • the model is a regression model, and the dependent variable is continuous.
  • the signal quality is characterized by continuous values, and a larger value indicates better signal quality.
  • the mobile terminal 100 performs feature extraction on the satellite signal tracking information obtained by tracking the satellite signals to determine feature data, and performs multi-class signal quality classification based on the feature data.
  • the mobile terminal 100 inputs the feature data into the classification model, and the classification result is a vector composed of the probabilities corresponding to multiple types of signal quality.
  • the classification model may be any model capable of classification, such as support vector machine (SVM), neural network, decision tree, and the like. Considering the simplicity, speed and high precision of SVM, SVM is preferred in this embodiment.
  • the feature data includes the feature values of multiple features and is represented as a feature vector. Exemplarily, these features are Factor 1 to Factor 12 above.
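At inference time the classifier maps a feature vector to a signal-quality type. The embodiment prefers an SVM; as a stand-in, the sketch below uses a nearest-centroid rule over a two-feature slice (mean signal strength, PDOP) with entirely made-up centroids, just to show the input/output shape a trained classifier would have.

```python
import math

# Made-up class centroids over (mean signal strength in dB-Hz, PDOP).
# A trained SVM would replace this lookup; the inference interface is the same:
# feature vector in, signal-quality label out.
CENTROIDS = {
    "good": (42.0, 1.5),
    "fair": (33.0, 3.0),
    "poor": (24.0, 6.0),
}

def classify(features):
    """Return the label of the centroid nearest to the feature vector."""
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

print(classify((40.0, 2.0)))  # -> good
print(classify((25.0, 5.0)))  # -> poor
```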
  • the smoothing process needs to be realized by a filter.
  • the filter is equivalent to a window containing weighting coefficients. When using the filter to smooth the classification results, the window is slid over the classification results.
  • the filter may be a normalized block filter, a Gaussian filter, etc., which shall be determined in combination with actual requirements.
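Sliding a normalized box filter over a one-dimensional sequence of classification probabilities reduces jitter between consecutive decisions. A minimal sketch follows; the window size and shrink-at-the-edges handling are assumptions.

```python
def smooth(probs, window=3):
    """Normalized box filter slid over a sequence of classification
    probabilities; each output is the mean of the values under the window."""
    half = window // 2
    out = []
    for i in range(len(probs)):
        lo, hi = max(0, i - half), min(len(probs), i + half + 1)
        out.append(sum(probs[lo:hi]) / (hi - lo))  # window shrinks at the edges
    return out

print(smooth([1.0, 0.0, 1.0, 1.0, 0.0]))
```

A Gaussian filter would use the same sliding-window structure with non-uniform weights.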
  • if the signal quality of the satellite signals is good, it can indicate that the space above the vehicle 200 is open, that is, the type of road environment is "open".
  • the signal quality of the satellite signals differs across different types of road environment. For example, if the type of road environment is an urban canyon, there are fewer satellites directly above the vehicle and the signal is stronger; if the type of road environment is under an elevated road, there are fewer satellites directly above the vehicle and the signal is weaker.
  • if the type of road environment is a tree-lined road, the satellite signals observed by the vehicle are generally weak; if the type of road environment is a tunnel, the satellite signals observed by the vehicle are extremely weak and very few. Therefore, in road environments such as "under an elevated road", "urban canyon", "tree-lined road", and "tunnel", satellite signals may not be tracked, or the signal quality of the tracked satellite signals is poor.
  • the relationship between the signal quality of different types of road environments can be: "open" > "urban canyon" > "under elevated road" > "tree-lined road" > "tunnel".
  • the type of signal quality is determined based on the type of road environment, so that multiple types of signal quality correspond one-to-one to multiple types of road environment. For example, when the type of signal quality is good, the corresponding type of road environment is "open"; when it is fair, "urban canyon"; for the next level of signal quality, "under an elevated road"; when it is poor, "tree-lined road"; and when it is very poor, "tunnel".
  • the type of signal quality can be characterized by the type of road environment. The mobile terminal 100 then determines the type of road environment based on the satellite signal tracking information obtained by tracking the satellite signals in real time; in other words, it continuously determines the type of road environment based on a preset time interval.
  • FIG. 7 is a schematic flowchart of acquiring the type of road environment provided by this embodiment. As shown in FIG. 7, the mobile terminal 100 inputs the feature data into the classification model and outputs a classification result, which is a vector composed of the probabilities corresponding to multiple types of road environment. For the feature data and the classification model, refer to the above; details are not repeated here.
  • the multiple types of road environment may be "open”, “underpass”, “urban canyon”, “tree-lined road”, “tunnel” and the like.
  • identifying the road environment based on the satellite signal tracking information can, on the one hand, reduce power consumption, and on the other hand, owing to the strong correlation between satellite signals and the road environment, identify the type of road environment more accurately.
  • the type of road environment can also be determined based on the environment images. For example, several frames of environmental images collected at the previous image acquisition frequency, possibly combined with previously collected environmental images, can be used to identify the type of road environment. It should be understood that determining the type of road environment in this way increases image processing power consumption, so determining it based on the satellite signal tracking information is preferable.
  • the positioning accuracy information in this embodiment is described below.
  • the positioning accuracy information can be understood as information reflecting the positioning accuracy, which is information from before the current moment. Therefore, in this embodiment, by adapting the image acquisition frequency to the positioning accuracy, the image acquisition frequency can be increased when the positioning accuracy is low to ensure the positioning effect for the vehicle 200, and reduced when the positioning accuracy is high, reducing the power consumption of the mobile terminal 100 on the premise of ensuring positioning accuracy. The positioning accuracy information may reflect the level of positioning accuracy to a certain extent.
  • the positioning accuracy information may include the positioning error values obtained from multiple non-visual positionings in the latest period. When the positioning fails, the positioning error value can be considered infinite; when the positioning error value is less than or equal to the lane-level positioning error, the positioning can be considered successful and the lane-level positioning condition met; when the positioning error value is greater than the lane-level positioning error, the positioning can be considered successful but the lane-level positioning condition not met.
  • the positioning error value can be understood as a positioning accuracy value.
  • the frequency of image acquisition is determined based on the positioning error values obtained by multiple non-visual positioning in the latest period.
  • the current image acquisition frequency being greater than the image acquisition frequency of the previous period reflects the balance between positioning accuracy and power consumption: on the premise of ensuring positioning accuracy, for example positioning accurately to the lane, power consumption is reduced as much as possible.
  • the latest period includes the previous period; for example, the current time is the i-th second, the duration of the latest period is 5 seconds, and the duration of the previous period is 1 second, then the latest period starts from the i-5th second, The end of the i-th second can be expressed as [i-5, i], and the previous period is expressed as [i-1, i].
  • the relationship between the latest time period and the previous time period below is similar.
  • the multiple non-visual positionings in the latest period can be understood as multiple non-visual positionings selected from the M non-visual positionings in the latest period (that is, the last M non-visual positionings), or as the M non-visual positionings themselves. The M non-visual positionings indicate the total number of non-visual positionings in the latest period. For example, assuming that the total number of non-visual positionings in the latest period is 20, 10 of the 20 non-visual positionings can be selected, or all 20 can be taken as the multiple non-visual positionings in the latest period.
  • when the positioning error values of a preset number of the multiple non-visual positionings in the latest period are greater than the lane-level positioning error, the current image acquisition frequency is greater than the image acquisition frequency of the previous period. For example, assuming that the number of non-visual positionings in the latest period is 10 and the preset number is 3, when the positioning error values of 3 of the 10 non-visual positionings in the latest period are greater than the lane-level positioning error, the current image acquisition frequency is determined to be greater than the image acquisition frequency of the previous period.
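The rule in this paragraph can be sketched directly. The step size, frequency cap, and default thresholds below are made up for illustration; the text only specifies that the new frequency must be greater than the previous period's when the preset count is reached.

```python
def next_acquisition_frequency(error_values_cm, prev_freq_hz,
                               lane_level_error_cm=100, preset_count=3,
                               step_hz=5, max_hz=30):
    """Illustrative rule (thresholds are made up): if at least `preset_count`
    of the recent non-visual positionings exceed the lane-level error, raise
    the image acquisition frequency above the previous period's value."""
    exceeded = sum(1 for e in error_values_cm if e > lane_level_error_cm)
    if exceeded >= preset_count:
        return min(prev_freq_hz + step_hz, max_hz)
    return prev_freq_hz

# 10 recent positionings, 3 of which exceed the 100 cm lane-level error
errors = [40, 60, 120, 80, 150, 90, 70, 130, 50, 60]
print(next_acquisition_frequency(errors, prev_freq_hz=10))  # -> 15
```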
  • when the mobile terminal 100 locates the vehicle 200 taking the captured satellite signals into account, the mobile terminal 100 determines the signal quality of the captured satellite signals while tracking them; in other words, it continuously determines the signal quality based on a preset time interval.
  • the positioning accuracy information includes the last determined signal quality.
• the signal quality determined last time can also be understood as the signal quality of the previous period, which reflects the positioning accuracy of the vehicle 200 and, in some possible cases, can also be used to determine it; for example, during non-visual positioning, the signal quality is considered when determining the positioning error value.
• the image acquisition frequency needs to be increased, that is, the current image acquisition frequency determined based on the last determined signal quality is greater than the previous image acquisition frequency, balancing positioning accuracy against power consumption.
  • the power consumption can be reduced as much as possible.
• the satellite signal tracking information indicates the type of the road environment. The mobile terminal 100 determines the type of the road environment based on the satellite signal tracking information obtained by tracking the satellite signal in real time; in other words, it continuously determines the road environment type at the preset time interval.
  • the positioning accuracy information includes the last determined road environment type.
• the road environment type determined last time can also be understood as the road environment type of the previous period, which reflects the positioning accuracy of the vehicle 200 and, in some possible cases, can also be used to determine it; for example, during non-visual positioning, the road environment type is considered when determining the positioning error value.
• the last determined road environment type indicates that there is an obstacle above the road where the vehicle 200 is currently located, for example, the road environment type is "underpass", "urban canyon", "tree-lined road", "tunnel", etc.
• the satellite signal may not be tracked, or the signal quality of the tracked satellite signal may be poor, which further indicates that the positioning accuracy is poor.
• the image acquisition frequency is higher than the previous image acquisition frequency, balancing positioning accuracy against power consumption: on the premise of ensuring positioning accuracy, for example accurately locating the lane, power consumption is reduced as much as possible.
• the mobile terminal 100 determines the image acquisition frequency in real time. If the image acquisition frequency is not equal to 0 Hz, image processing is performed on several frames of environmental images acquired at that frequency to determine the visual positioning result; therefore, the visual positioning result is also constantly updated.
• the positioning accuracy information includes the result error between the first positioning result and the second positioning result, where the first positioning result is determined based on visual positioning and the second positioning result represents a non-visual positioning result of the mobile terminal.
• the positioning accuracy information includes the result error determined in the previous period; the result error indicates the degree of match between the visual positioning result determined in the previous period and the non-visual positioning result determined in the last non-visual positioning, reflects the positioning accuracy of the vehicle 200, and in some possible cases can also be used to determine it; for example, during non-visual positioning, the result error is considered when determining the positioning error value.
• the result error between the non-visual positioning result and the visual positioning result can reflect the level of positioning accuracy: if the error is small, the positioning accuracy is relatively high, for example satisfying the lane-level positioning accuracy; otherwise, the positioning accuracy is low.
  • the image acquisition frequency needs to be increased.
  • the current image acquisition frequency is determined based on the result error determined in the previous period.
• the current image acquisition frequency is greater than the previous image acquisition frequency, balancing positioning accuracy against power consumption: on the premise of ensuring positioning accuracy, for example accurately locating the lane, power consumption is reduced as much as possible.
• the result error may be a lane error value, which may indicate whether the lanes of the mobile terminal indicated by the visual positioning result and by the non-visual positioning result are consistent.
• when the lanes are consistent, the lane error value is 0, which can also indicate that the lane-level positioning requirement is met; otherwise, the lane error value is 1, indicating that the lanes are inconsistent and that the lane-level positioning requirement is not met.
  • the lane error is determined by judging whether the visual lane in the visual positioning result and the non-visual matching lane in the non-visual positioning result are the same.
• the visual lane, the total number of visual lanes, the lane line type of the visual lane, and the lane line position of the visual lane in the visual positioning result are respectively compared with the non-visual matching lane, the total number of non-visual matching lanes, the lane line type of the non-visual matching lane, and the lane line position of the non-visual matching lane; if all of them are the same, the lanes are consistent, and 0 is used as the lane error value.
  • the result error may be a position error value, which may indicate the distance between the position of the vehicle 200 indicated by the visual positioning result and the non-visual positioning result.
• the larger the position error value, the lower the positioning accuracy.
  • the absolute value of the difference between the visual matching position in the visual positioning result and the non-visual matching position in the non-visual positioning result is taken as the position error value.
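The two result errors described above can be sketched as follows (the compared attribute names are illustrative assumptions, not identifiers from the embodiment):

```python
def lane_error_value(visual, non_visual):
    # 0 when every compared lane attribute matches (lanes consistent,
    # lane-level requirement met), 1 otherwise.
    keys = ("lane", "total_lanes", "line_type", "line_position")
    return 0 if all(visual[k] == non_visual[k] for k in keys) else 1

def position_error_value(visual_pos, non_visual_pos):
    # Absolute value of the difference between the visual matching
    # position and the non-visual matching position.
    return abs(visual_pos - non_visual_pos)

v = {"lane": 2, "total_lanes": 3, "line_type": "dashed", "line_position": 1.8}
print(lane_error_value(v, dict(v)))      # 0: lanes consistent
print(position_error_value(10.0, 12.5))  # 2.5
```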
• the mobile terminal 100 determines the visual positioning result based on several frames of environmental images collected at the previous image acquisition frequency, and that visual positioning result is the one determined in the previous period. If the previous image acquisition frequency is equal to 0 Hz, in one example, no visual positioning result is determined in the previous period, and correspondingly, no lane error value is determined in the previous period. In addition, when the image acquisition frequency is determined for the first time, the previous image acquisition frequency can be considered to be 0 Hz.
  • the above positioning accuracy information is only an example.
  • the positioning accuracy information in this embodiment may be other information determined according to actual needs.
• the positioning accuracy information is a positioning accuracy value, for example, the position dilution of precision (PDOP).
  • the mobile terminal 100 stores Y adjustment conditions.
  • the adjustment condition indicates the information of several influencing factors affecting the positioning accuracy.
• the adjustment conditions and the positioning accuracy information should be adapted to each other.
• the adjustment conditions are some possible situations determined based on the positioning accuracy information, for example, any one or more of: the positioning error values of multiple non-visual positionings in the latest period, the last determined signal quality, the last determined road environment type, and the lane error value determined in the previous period.
• the adjustment condition may be a high-precision positioning condition RH (indicating that the positioning accuracy is high).
• when the high-precision positioning condition is met, the positioning accuracy is ensured, for example, the vehicle can be accurately positioned to the lane; in order to reduce power consumption as much as possible, the image acquisition frequency can therefore be reduced.
• the first high-precision positioning condition RH1 may be that multiple non-visual positionings in the latest period have succeeded and satisfied the lane-level positioning condition.
  • the duration of the latest period may be 10 seconds.
  • the second high-precision positioning condition RH2 may be that multiple times of non-visual positioning in the latest period have been successful and the lane-level positioning condition is met, and the lane error value determined in the previous period is 0.
  • the duration of the latest period may be 10 seconds.
  • the third high-precision positioning condition RH3 may be that multiple times of non-visual positioning in the recent period have been successful and the lane-level positioning condition is met, and the type of the road environment determined last time is open.
  • the duration of the latest period may be 10 seconds.
• the fourth high-precision positioning condition RH4 may be that multiple non-visual positionings in the latest period have succeeded and satisfied the lane-level positioning condition, the lane error value determined in the previous period is 0, and the last determined road environment type is open.
  • the duration of the latest period may be 10 seconds.
  • the fifth high-precision positioning condition RH5 may be that the last determined positioning error value is within a preset positioning error interval.
  • the preset positioning error interval is used to indicate a reasonable range of high-precision positioning errors.
• the last determined positioning error value may be the positioning error value of the last non-visual positioning, or a value obtained by fusing the positioning error values of multiple non-visual positionings in the latest period.
• the fusion method may be an arithmetic average or a weighted average.
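The fusion step and the RH5 interval check can be sketched as follows (function names and the sample interval bounds are assumptions for illustration):

```python
def fuse_errors(errors, weights=None):
    # Arithmetic average when no weights are given.
    if weights is None:
        return sum(errors) / len(errors)
    # Weighted average otherwise.
    return sum(e * w for e, w in zip(errors, weights)) / sum(weights)

def rh5_satisfied(error, low, high):
    # RH5: the fused (or last) positioning error value lies in the
    # preset high-precision positioning error interval [low, high].
    return low <= error <= high

print(fuse_errors([1.0, 2.0, 3.0]))         # 2.0 (plain average)
print(fuse_errors([1.0, 3.0], [1.0, 3.0]))  # 2.5 (weighted average)
```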
  • the above-mentioned high-precision positioning condition RH is only an example, and does not constitute a specific limitation.
  • the positioning in the above-mentioned high-precision positioning condition RH is all non-visual positioning.
• the adjustment condition may be a low-precision positioning condition RL (indicating that the positioning accuracy is low).
• the positioning accuracy can be considered low, that is, the positioning is unreliable; therefore, when a low-precision positioning condition is satisfied, in order to ensure the positioning accuracy, such as accurate positioning to the lane, it is necessary to increase the image acquisition frequency.
• the first low-precision positioning condition RL1 may be that multiple non-visual positionings in the latest period have failed or have not satisfied the lane-level positioning condition.
• the duration of the latest period may be 5 seconds. For example, if positioning has been performed 5 times in the last 5 seconds, with 3 failures and 2 positionings that fail to meet the lane-level positioning condition, then the first low-precision positioning condition is met. When this condition is met, in order to ensure the positioning accuracy, for example, to accurately locate the lane, it is necessary to increase the image acquisition frequency.
• the second low-precision positioning condition RL2 may be that a small number of positionings in the latest period fail to satisfy the lane-level positioning condition.
• the ratio of the number of positionings that fail to satisfy the lane-level positioning condition to the total number of positionings is within a preset interval, for example, 0.2 to 0.5.
• the duration of the latest period may be 10 seconds. It should be noted that when only a small number of positionings fail to meet the lane-level positioning condition, it typically means that the satellite signal was briefly blocked while the vehicle passed under obstacles such as overpasses. When this condition is met, in order to ensure the positioning accuracy, for example, to accurately locate the lane, it is necessary to increase the image acquisition frequency.
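The "small number" test of RL2 can be sketched as a ratio check (the 0.2 to 0.5 interval follows the example values given in the text; the function name is an assumption):

```python
def rl2_satisfied(miss_count, total_count, low=0.2, high=0.5):
    # Ratio of positionings that failed the lane-level positioning
    # condition to the total number of positionings in the latest period.
    ratio = miss_count / total_count
    return low <= ratio <= high

print(rl2_satisfied(3, 10))  # True  (ratio 0.3 lies in [0.2, 0.5])
print(rl2_satisfied(6, 10))  # False (ratio 0.6: a "large number" case)
```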
• the third low-precision positioning condition RL3 may be that the last determined road environment type is "underpass", "urban canyon", "tree-lined road" or "tunnel".
• when this condition is met, in order to ensure the positioning accuracy, for example, to accurately locate the lane, it is necessary to increase the image acquisition frequency.
• the fourth low-precision positioning condition RL4 is that a large number of positionings in the latest period fail to satisfy the lane-level positioning condition, and the lane error value determined in the previous period is 1.
  • the ratio of the number of times that the lane-level positioning condition is not satisfied to the total number of positioning times is not less than a preset threshold, for example, the threshold may be 0.6.
  • the duration of the latest period may be 10 seconds.
• the fifth low-precision positioning condition RL5 may be that the last determined positioning error value is within a preset positioning error interval.
• the preset positioning error interval is used to indicate a reasonable range of low-precision positioning errors. For the method of determining the last determined positioning error value, see above. When this condition is met, in order to ensure the positioning accuracy, for example, to accurately locate the lane, it is necessary to increase the image acquisition frequency.
• the above low-precision positioning condition RL is only an example, and does not constitute a specific limitation.
• the positioning in the above-mentioned low-precision positioning conditions RL is all non-visual positioning.
• the duration of the latest period in the low-precision positioning conditions is not greater than the duration of the latest period in the high-precision positioning conditions.
• on the basis of storing the Y adjustment conditions, the mobile terminal 100 also stores a high-frequency limit value and a low-frequency limit value.
  • the high-frequency limit indicates the maximum value of the image acquisition frequency
• the low-frequency limit indicates the minimum value of the image acquisition frequency, which may be a frequency at which acquisition is suspended; it needs to be determined according to the actual situation.
  • the high-frequency limit is 5Hz
  • the low-frequency limit is 0Hz.
  • the current image acquisition frequency is less than or equal to 5Hz and greater than or equal to 0Hz.
• 5 Hz and 0 Hz are only examples.
• the high-frequency limit and the low-frequency limit of this embodiment can be other values determined according to actual needs.
• when any high-precision positioning condition RH among the Y adjustment conditions is satisfied, the positioning accuracy can be considered high and the positioning reliable; if the previous image acquisition frequency is not equal to (that is, greater than) the low-frequency limit, the current image acquisition frequency is less than the previous image acquisition frequency. When any low-precision positioning condition RL among the Y adjustment conditions is satisfied, the positioning accuracy is considered low and the positioning unreliable; if the previous image acquisition frequency is not equal to (that is, less than) the high-frequency limit, the current image acquisition frequency is greater than the previous image acquisition frequency. The Y adjustment conditions are therefore parallel: if Y is greater than 2, they need to be detected sequentially, and at most one adjustment condition is satisfied.
  • the high-precision positioning condition RH and the low-precision positioning condition RL are mutually exclusive. In other words, either the high-precision positioning condition RH or the low-precision positioning condition RL is satisfied, and it is impossible to satisfy both.
• the mobile terminal 100 sequentially detects the Y adjustment conditions to determine which of them is satisfied. When the satisfied adjustment condition is a high-precision positioning condition RH and the previous image acquisition frequency is not equal to (that is, greater than) the low-frequency limit, the determined current image acquisition frequency is less than the previous image acquisition frequency; when the satisfied adjustment condition is a high-precision positioning condition RH and the previous image acquisition frequency is equal to the low-frequency limit, the determined current image acquisition frequency is equal to the previous image acquisition frequency; when the satisfied adjustment condition is a low-precision positioning condition RL and the previous image acquisition frequency is not equal to (that is, less than) the high-frequency limit, the determined current image acquisition frequency is greater than the previous image acquisition frequency; when the satisfied adjustment condition is a low-precision positioning condition RL and the previous image acquisition frequency is equal to the high-frequency limit, the determined current image acquisition frequency is equal to the previous image acquisition frequency.
  • the image acquisition frequency is switched between Z specified frequencies, and the mobile terminal 100 presets frequency levels corresponding to each of the Z specified frequencies and level differences corresponding to each of the Y adjustment conditions.
• the frequency level is proportional to the specified frequency; that is, the higher the frequency level, the greater the specified frequency.
• the level difference indicates a step difference between frequency levels. For example, if one frequency level is 5 and another frequency level is 8, the level difference is 3. It should be noted that the maximum value among the Z specified frequencies is the high-frequency limit, and the minimum value is the low-frequency limit.
  • Z is a positive integer greater than or equal to 2.
• when a high-precision positioning condition is satisfied, the frequency level of the current image acquisition frequency is smaller than that of the previous image acquisition frequency, and the difference between the two frequency levels is the level difference corresponding to the condition; or, the current image acquisition frequency is equal to the low-frequency limit, its frequency level is equal to or less than that of the previous image acquisition frequency, and the difference between the two frequency levels is less than or equal to the level difference corresponding to the condition.
• when a low-precision positioning condition is satisfied, the frequency level of the current image acquisition frequency is greater than that of the previous image acquisition frequency, and the difference between the two frequency levels is the level difference corresponding to the condition; or, the current image acquisition frequency is equal to the high-frequency limit, its frequency level is equal to or greater than that of the previous image acquisition frequency, and the difference between the two frequency levels is less than or equal to the level difference corresponding to the condition.
  • the level differences corresponding to each high-precision positioning condition may be completely the same, for example, 1, or may be partially the same, or may be completely different, and the low-precision positioning conditions are similar. It should be understood that the level difference may be different due to differences in positioning accuracy indicated by different adjustment conditions.
  • the image acquisition frequency will be continuously reduced until reaching the low frequency limit.
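The level-based switching described above can be sketched as follows (the three specified frequencies 0/2/5 Hz follow the examples in the text; the function and variable names are assumptions):

```python
# Z specified frequencies in Hz, ordered so that the list index is the
# frequency level (0 Hz = low-frequency limit, 5 Hz = high-frequency limit).
SPECIFIED = [0.0, 2.0, 5.0]

def switch_by_level(prev_freq, condition_kind, level_diff=1):
    level = SPECIFIED.index(prev_freq)
    if condition_kind == "RH":    # high precision: step levels down
        level = max(0, level - level_diff)
    elif condition_kind == "RL":  # low precision: step levels up
        level = min(len(SPECIFIED) - 1, level + level_diff)
    return SPECIFIED[level]

print(switch_by_level(5.0, "RH"))     # 2.0
print(switch_by_level(0.0, "RL", 2))  # 5.0
print(switch_by_level(0.0, "RH"))     # 0.0 (already at the low limit)
```

The `max`/`min` clamping reproduces the "equal to or less/greater than" cases of the text, where the frequency stays pinned at a limit even if the condition's level difference would step past it.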
  • the mobile terminal 100 is preset with a frequency step corresponding to each of the Y adjustment conditions, for example, 1 Hz.
• when a high-precision positioning condition is satisfied, the current image acquisition frequency is the previous image acquisition frequency minus the frequency step corresponding to the condition; or, the current image acquisition frequency is the low-frequency limit, and the difference between the previous image acquisition frequency and the current image acquisition frequency is less than or equal to the frequency step corresponding to the condition.
• when a low-precision positioning condition is satisfied, the current image acquisition frequency is the previous image acquisition frequency plus the frequency step corresponding to the condition; or, the current image acquisition frequency is the high-frequency limit, and the difference between the current image acquisition frequency and the previous image acquisition frequency is less than or equal to the frequency step corresponding to the condition.
  • the frequency steps corresponding to each high-precision positioning condition may be completely the same, partially the same, or completely different, and the low-precision positioning conditions are similar.
  • the frequency step sizes corresponding to each of the Y adjustment conditions are exactly the same, which is 1 Hz.
  • the image acquisition frequency will be continuously reduced according to the frequency step corresponding to the condition until reaching the low frequency limit.
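The step-based variant can be sketched as follows (the 1 Hz step and the 0 Hz to 5 Hz limits follow the example values in the text; the names are assumptions):

```python
LOW_LIMIT, HIGH_LIMIT = 0.0, 5.0  # example limit values

def step_frequency(prev_freq, condition_kind, step=1.0):
    if condition_kind == "RH":  # high precision: lower the frequency
        return max(LOW_LIMIT, prev_freq - step)
    if condition_kind == "RL":  # low precision: raise the frequency
        return min(HIGH_LIMIT, prev_freq + step)
    return prev_freq            # no satisfied condition: keep unchanged

print(step_frequency(3.0, "RH"))  # 2.0
print(step_frequency(0.5, "RH"))  # 0.0 (clamped at the low-frequency limit)
print(step_frequency(5.0, "RL"))  # 5.0 (already at the high-frequency limit)
```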
• each of the Y positioning error intervals is used as an adjustment condition, and the mobile terminal 100 is preset with a designated frequency corresponding to each of the Y adjustment conditions.
• the mobile terminal 100 may construct an increasing function relating positioning error to image acquisition frequency, determine the frequency range of each positioning error interval based on that function, and determine the specified frequency corresponding to the positioning error interval based on the frequency range.
• the mobile terminal 100 may determine the current positioning error, determine the specified frequency corresponding to the positioning error interval to which the current positioning error belongs, and use that frequency as the current image acquisition frequency.
  • the positioning error indicated by the positioning error interval is proportional to the specified frequency.
  • the mobile terminal 100 is preset with an adjustment condition list for each of the Z designated frequencies, and each adjustment condition in the adjustment condition list corresponds to another designated frequency among the Z designated frequencies.
  • P1 corresponds to two adjustment conditions R1 and R2
• the designated frequency corresponding to R1 is P2, which means that when R1 is satisfied, P1 is switched to P2
  • the specified frequency corresponding to R2 is P3, which means that P1 is switched to P3 when R2 is satisfied.
  • the Z specified frequencies corresponding to the list of adjustment conditions are used as reference frequencies, and another specified frequency corresponding to each adjustment condition in the list of adjustment conditions is used as the adjusted frequency.
• the adjustment condition list includes one or more adjustment conditions; when the adjustment condition list includes multiple adjustment conditions, the order in the list may indicate the detection order. It should be understood that the detection order of the multiple adjustment conditions in the adjustment condition list is preset manually. As a possible situation, the adjustment condition list may correspond to different adjusted frequencies, and the adjustment condition list may indicate the detection order of the adjustment conditions corresponding to the adjusted frequencies. For example, suppose there are three designated frequencies P1, P2 and P3, and the image acquisition frequency is switched between P1, P2 and P3. For P1, there are two adjusted frequencies, P2 and P3; the adjustment conditions corresponding to P2 and P3 are denoted L1 and L2 respectively.
• then the adjustment condition list includes L1 and L2, arranged in the order L1, L2.
• L1 is detected first, and L2 is detected if L1 is not satisfied.
• there can be more than one of each of L1 and L2, which needs to be determined according to the actual situation.
  • the Y adjustment conditions determined by the mobile terminal 100 are all the conditions in the adjustment condition list corresponding to the previous image acquisition frequency, and the Y adjustment conditions are detected sequentially according to the order in the list.
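The per-frequency condition lists can be sketched as ordered tables. The condition names and adjusted frequencies below follow the Figure 8 example later in the text (0/2/5 Hz for pause/low/high); condition evaluation is abstracted into a set of satisfied names:

```python
# For each reference frequency, an ordered list of
# (condition name, adjusted frequency) pairs.
CONDITION_LISTS = {
    5.0: [("RH1", 2.0)],
    2.0: [("RL3", 5.0), ("RL1", 5.0), ("RL4", 5.0), ("RH1", 0.0)],
    0.0: [("RL3", 5.0), ("RL1", 5.0), ("RL4", 5.0), ("RL2", 2.0)],
}

def adjust(prev_freq, satisfied):
    # Detect the conditions in list order; the first satisfied condition
    # determines the adjusted frequency, otherwise keep the previous one.
    for name, target in CONDITION_LISTS[prev_freq]:
        if name in satisfied:
            return target
    return prev_freq

print(adjust(5.0, {"RH1"}))         # 2.0 (high -> low)
print(adjust(2.0, {"RL1", "RH1"}))  # 5.0 (RL1 is detected before RH1)
print(adjust(0.0, set()))           # 0.0 (no condition satisfied)
```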
  • the list of adjustment conditions corresponding to the high-frequency limit can indicate the situation of high positioning accuracy.
  • the high-frequency limit is 5 Hz
• the adjustment condition list corresponding to 5 Hz can include any one of the above-mentioned high-precision positioning conditions RH1 to RH4, so that when the high-precision positioning condition RH is satisfied, the image acquisition frequency is reduced to cut power consumption as much as possible.
• when the reference frequency is not the high-frequency limit among the Z specified frequencies, part of the adjustment conditions in the adjustment condition list may indicate a situation of high positioning accuracy and the other part may indicate a situation of low positioning accuracy
• for example, the adjustment condition list includes any one of the above-mentioned high-precision positioning conditions RH1 to RH5, and any one or more of the above-mentioned low-precision positioning conditions RL1 to RL5; when a high-precision positioning condition RH is met, the image acquisition frequency is reduced or kept at the low-frequency limit to reduce power consumption as much as possible; when a low-precision positioning condition RL is met, the image acquisition frequency is increased or kept at the high-frequency limit to ensure the positioning accuracy.
• as another possibility, all the adjustment conditions in the adjustment condition list may indicate a situation of low positioning accuracy.
• the adjustment condition list includes any one or more of the above-mentioned low-precision positioning conditions RL1 to RL5, so that when any low-precision positioning condition RL is satisfied, the image acquisition frequency is increased or kept at the high-frequency limit to ensure the positioning accuracy.
  • FIG. 8 is a schematic diagram of a frequency switching manner provided by this embodiment.
• the specified frequencies corresponding to the high frequency and the low frequency shown in Figure 8 can be set according to actual needs, as long as the specified frequency of the high frequency is greater than that of the low frequency; for example, the specified frequency corresponding to the high frequency is 5 Hz, and the specified frequency corresponding to the low frequency is 2 Hz.
  • the designated frequency corresponding to pause is 0Hz. In the following, 0Hz, 2Hz, and 5Hz are taken as examples for description.
  • the level difference between high frequency and pause is 2
  • the level difference between high frequency and low frequency is 1
  • the level difference between low frequency and pause is 1.
• six mode switches can be set, as follows:
• Mode switch 1: start image acquisition at the specified frequency corresponding to the high frequency; a corresponding adjustment condition is that the first positioning succeeds.
• Mode switch 2: switch from the specified frequency corresponding to the high frequency (5 Hz) to the specified frequency corresponding to the low frequency (2 Hz); a corresponding adjustment condition is that multiple non-visual positionings in the last 10 s have succeeded and met the lane-level positioning condition (RH1).
• Mode switch 3: switch from the specified frequency corresponding to the low frequency (2 Hz) to the specified frequency corresponding to the high frequency (5 Hz); there are 3 corresponding adjustment conditions.
• Condition 1: the last determined road environment type is "underpass", "urban canyon", "tree-lined road" or "tunnel" (RL3); condition 2: multiple non-visual positionings in the last 5 s have failed or have not met the lane-level positioning condition (RL1); condition 3: a large number of positionings in the last 10 s failed to meet the lane-level positioning condition, and the lane error value determined in the previous period was 1 (RL4).
• Mode switch 4: switch the specified frequency corresponding to the low frequency (2 Hz) to 0 Hz, that is, suspend image acquisition; a corresponding adjustment condition is that multiple non-visual positionings in the last 10 s have succeeded and met the lane-level positioning condition (RH1).
• Mode switch 5: switch from 0 Hz to the specified frequency corresponding to the low frequency (2 Hz); a corresponding adjustment condition is that a small number of positionings in the last 10 s failed to meet the lane-level positioning condition (RL2).
• Mode switch 6: switch from 0 Hz to the specified frequency corresponding to the high frequency (5 Hz); there are 3 corresponding adjustment conditions.
• Condition 1: the last determined road environment type is "underpass", "urban canyon", "tree-lined road" or "tunnel" (RL3); condition 2: multiple non-visual positionings in the last 5 s have failed or have not met the lane-level positioning condition (RL1); condition 3: a large number of positionings in the last 10 s failed to meet the lane-level positioning condition, and the lane error value determined in the previous period was 1 (RL4).
• the initial image acquisition frequency is determined to be 5 Hz, that is, image acquisition is performed according to mode switch 1.
• the adjustment condition list is the adjustment condition corresponding to mode switch 2 above.
  • the current image acquisition frequency is determined to be 2Hz.
• the adjustment condition list is the adjustment conditions corresponding to mode switch 3 and mode switch 4 above.
  • the current image acquisition frequency is determined to be 5Hz.
• the adjustment condition list includes the adjustment conditions corresponding to mode switch 6 and mode switch 5 above.
  • the current image acquisition frequency is determined to be 5Hz.
• when the condition in mode switch 5 is met, the current image acquisition frequency is determined to be 2 Hz.
  • the Y adjustment conditions are usually any one of the above-mentioned high-precision positioning conditions RH1 to RH5, and any one or more of the above-mentioned low-precision positioning conditions RL1 to RL5.
  • it may also be any one of the above-mentioned high-precision positioning conditions RH1 to RH5, or any one or more of the above-mentioned low-precision positioning conditions RL1 to RL5.
• when the Y adjustment conditions are all low-precision positioning conditions RL and none of them is satisfied, the positioning is considered reliable; at this time the image acquisition frequency needs to be reduced, that is, the determined current image acquisition frequency is less than the previous image acquisition frequency, and the reduction method can be seen above, so it is not repeated here.
• when the Y adjustment conditions are all high-precision positioning conditions RH and none of them is satisfied, the positioning is considered unreliable; at this time the image acquisition frequency needs to be increased, that is, the determined current image acquisition frequency is greater than the previous image acquisition frequency, and the increase method can be seen above, so it is not repeated here.
  • when the Y adjustment conditions are composed of both low-precision positioning conditions RL and high-precision positioning conditions RH, and none of the adjustment conditions is satisfied, the frequency may be kept unchanged, for example when the previous image acquisition frequency is the high-frequency limit; that is, the current image acquisition frequency is equal to the previous image acquisition frequency. Alternatively, the specified frequency corresponding to the positioning error interval to which the last determined positioning error belongs is taken as the current image acquisition frequency.
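The adjustment logic described above can be summarized in the following sketch. This is illustrative only, not the application's implementation: the condition labels ("RL"/"RH"), the step size, and the frequency limits (`STEP_HZ`, `LOW_LIMIT_HZ`, `HIGH_LIMIT_HZ`) are hypothetical values chosen for the example.

```python
# Illustrative sketch of the frequency-adjustment logic; all numeric
# values below are assumptions, not values prescribed by the application.

HIGH_LIMIT_HZ = 5.0   # assumed high-frequency limit
LOW_LIMIT_HZ = 2.0    # assumed low-frequency limit
STEP_HZ = 1.0         # assumed adjustment step

def adjust_frequency(prev_hz, conditions):
    """conditions: list of (kind, satisfied) pairs, kind in {"RL", "RH"}.

    RL (low-precision) conditions indicate unreliable positioning;
    RH (high-precision) conditions indicate reliable positioning.
    """
    kinds = {kind for kind, _ in conditions}
    none_satisfied = not any(sat for _, sat in conditions)
    if kinds == {"RL"} and none_satisfied:
        # positioning considered reliable: reduce the acquisition frequency
        return max(prev_hz - STEP_HZ, LOW_LIMIT_HZ)
    if kinds == {"RH"} and none_satisfied:
        # positioning considered unreliable: raise the acquisition frequency
        return min(prev_hz + STEP_HZ, HIGH_LIMIT_HZ)
    # mixed RL/RH list with nothing satisfied: keep the frequency unchanged
    return prev_hz
```

A mixed condition list with nothing satisfied leaves the previous frequency in place, matching the "keep unchanged" branch described above.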
  • the following describes the process of acquiring environmental images according to the current image acquisition frequency.
  • the mobile terminal 100 can control a camera, for example, a built-in camera of the mobile phone 100 , or a vehicle camera other than the driving recorder 201 of the vehicle 200 .
  • the mobile terminal 100 collects environmental images according to the current image collection frequency. It can be understood that the mobile terminal 100 controls the camera 130 it has or the vehicle camera other than the driving recorder 201 to shoot according to the current image collection frequency to collect environmental images.
  • the mobile terminal 100 controls the camera 130 it has or the vehicle camera other than the driving recorder 201 to suspend shooting.
  • the mobile terminal 100 may send an image acquisition request to the camera 130 .
  • the image acquisition request carries the current image acquisition frequency. It should be noted that when the previous image acquisition frequency is consistent with the current image acquisition frequency, there is no need to send the image acquisition request again. In addition, when the current image acquisition frequency is equal to 0 Hz, the mobile terminal 100 may send a pause image acquisition request to the camera 130 .
  • the mobile terminal 100 may be provided with N cameras 130 , where N is a positive integer greater than or equal to 1.
  • one camera 130 may be selected from the N cameras 130, and the camera 130 is controlled to collect environmental images according to the current image collection frequency. Wherein, the selected camera 130 should be used to take pictures of the front, rear, left or right of the vehicle 200 .
  • M cameras 130 may be selected from N cameras 130, and M (greater than or equal to 2, less than or equal to N) cameras 130 may be controlled to collect environmental images according to the current image collection frequency.
  • M cameras may be determined based on shooting directions of the N cameras.
  • the M cameras should be used to shoot in any one or more of the directions in front of, behind, to the left of, and to the right of the vehicle 200. For example, with two cameras, one camera may be used to shoot the front of the vehicle 200 and the other the rear of the vehicle 200, or both cameras may be used to shoot the front of the vehicle 200.
  • the M cameras all collect environmental images according to the current image collection frequency.
  • the current image collection frequency is divided to obtain the respective collection frequencies of the M cameras 130; wherein, a collection frequency refers to the number of frames of the environment image collected by a camera per unit time.
  • the division may take into account the working states of the M cameras 130; for example, when a camera 130 has been working for a longer time, its acquisition frequency can be reduced, and vice versa. The division may also be determined based on the shooting directions of the M cameras 130; for example, if the area ahead in the driving direction of the vehicle 200 is more important, the acquisition frequency of the camera 130 that shoots the front of the vehicle 200 may be increased.
  • only when the camera 130 is not in use, or when the frequency at which the camera 130 is collecting environmental images is lower than the current image collection frequency, can the camera 130 be controlled to shoot according to the current image collection frequency; otherwise, in order not to affect the normal operation of the camera 130, the frames of environmental images already collected by the camera 130 are sampled based on the current image collection frequency, so as to obtain several frames of environment images.
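Dividing the current image collection frequency among the M cameras can be sketched as a simple weighted split. The weights and camera labels below are hypothetical; the application only says the split may follow working state or shooting direction.

```python
# Hypothetical sketch: split the current image collection frequency among
# M cameras by relative importance (e.g. favouring the front-facing camera).

def split_frequency(total_hz, weights):
    """weights: {camera_id: relative importance}. Returns per-camera Hz."""
    total_weight = sum(weights.values())
    return {cam: total_hz * w / total_weight for cam, w in weights.items()}

# the front-facing camera is weighted twice as heavily as the rear one
shares = split_frequency(6.0, {"front": 2, "rear": 1})
```

With a total of 6 Hz and weights 2:1, the front camera collects at 4 Hz and the rear camera at 2 Hz.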
  • the collected environment image is an image captured by a camera in the driving recorder 201. Since the image collection of the driving recorder 201 cannot be interfered with, the mobile terminal 100 collecting environmental images according to the current image collection frequency can be understood as the mobile terminal 100 sampling, according to the current image collection frequency, the surrounding environment video finally collected by the driving recorder 201 to obtain several frames of environment images. In addition, when the current image collection frequency is equal to 0 Hz, the mobile terminal 100 instructs the sampling of the surrounding environment video finally collected by the driving recorder 201 to be suspended.
  • the start time of the surrounding environment video finally collected by the driving recorder 201 is equal to or earlier than the determination time of the current image acquisition frequency, and sampling starts from the frame corresponding to that determination time.
  • the mobile terminal 100 can send an image acquisition request to the driving recorder 201 or the vehicle 200, instructing the driving recorder 201 or the vehicle 200 to sample the last collected surrounding environment video according to the current image collection frequency and to send the several frames of environment images obtained by sampling to the mobile terminal 100.
  • the image acquisition request carries the current image acquisition frequency. It should be noted that if the previous image acquisition frequency is consistent with the current image acquisition frequency, there is no need to send an image acquisition request.
  • the mobile terminal 100 and the driving recorder 201 can access the network through the above-mentioned wireless communication technologies, so that the mobile terminal 100 and the driving recorder 201 can exchange data; a short-distance wireless communication technology and network are preferred.
  • the mobile terminal 100 can realize wireless communication with the driving recorder 201 through the communication module 170 .
  • the mobile terminal 100 and the driving recorder 201 communicate through a Wi-Fi network.
  • the mobile terminal 100 may sample the surrounding environment video finally collected by the driving recorder 201 based on the current image collection frequency to obtain several frames of environment images.
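Sampling the finally collected surrounding environment video at the current image collection frequency can be sketched as picking the video frame nearest each sampling instant. The frame rate and frequencies below are illustrative, not values from the application.

```python
# Minimal sketch of sampling a recorded surrounding-environment video at the
# current image acquisition frequency: keep the frame nearest each sampling
# instant. A frequency of 0 Hz means sampling is suspended.

def sample_frames(video_fps, duration_s, target_hz):
    """Return indices of the video frames kept when sampling at target_hz."""
    if target_hz <= 0:            # 0 Hz: sampling is suspended
        return []
    indices = []
    t = 0.0
    while t < duration_s:
        indices.append(int(round(t * video_fps)))
        t += 1.0 / target_hz
    return indices

# a 30 fps, 1 s video sampled at 5 Hz keeps every 6th frame
```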
  • the vehicle 200 generally includes a driving recorder 201 .
  • the vehicle-mounted terminal may include a driving recorder 201 , and may also be wired or wirelessly connected to the driving recorder 201 .
  • FIG. 9 is a structural diagram of another positioning system provided by this embodiment.
  • the positioning system includes a vehicle 200 and a mobile terminal 100 .
  • the mobile terminal 100 includes a satellite positioning chip, an IMU 120 , a positioning module 401 , a vision module 402 , a navigation application 403 , and a camera 130 .
  • the mobile terminal 100, the vehicle 200 and the reference station terminal 300 can communicate with the network (not shown in the figure) through the above wireless communication technology. Between the mobile terminal 100 and the vehicle 200, short-distance wireless communication technology and network are preferred.
  • the modules in the mobile terminal 100 are connected by an in-device bus, in-device lines, cables, etc., or any combination thereof.
  • the positioning module 401 is used to position the vehicle 200 in real time, and determine a non-visual positioning result; wherein, the non-visual positioning result includes a positioning error.
  • the positioning module 401 performs positioning based on the satellite navigation information demodulated by the satellite positioning chip and the acceleration and angular velocity collected by the IMU 120 of the mobile terminal 100 .
  • the positioning module 401 is further configured to determine the type of road environment based on satellite signal tracking situation information obtained by tracking satellite signals in real time.
  • the positioning module 401 is further configured to determine the current image acquisition frequency based on any one or more of the last determined type of road environment, the positioning error values of multiple non-visual positionings in the latest period, and the result error determined in the previous period (for example, any one or more of the above-mentioned lane error or position error). It should be understood that the result error is only determined when the previous image acquisition frequency is not equal to 0 Hz.
  • the navigation application 403 may be Baidu map, Gaode map, Tencent map and other applications that can be used to implement lane-level navigation.
  • the navigation application 403 stores a high-precision map covering the starting point and end point of the navigation, or a high-precision map requested from the cloud, and controls the touch screen 140 to display the high-precision map and the location of the user's vehicle 200 on it.
  • the vision module 402 is used to control the camera 130 to shoot according to the current image acquisition frequency when the current image acquisition frequency is greater than 0 Hz, so as to obtain several frames of environmental images currently acquired; when the current image acquisition frequency is equal to 0 Hz, the camera 130 is controlled to pause shooting.
  • the mobile terminal 100 can be connected to the driving recorder 201, or include the driving recorder; the vision module 402 is used, when the current image acquisition frequency is greater than 0 Hz, to instruct that the surrounding environment video last collected by the driving recorder 201 be sampled based on the current image acquisition frequency to obtain several frames of currently collected environment images; when the current image acquisition frequency is equal to 0 Hz, it instructs suspension of the sampling of the surrounding environment video last collected by the driving recorder 201.
  • the vision module 402 is used to report the visual positioning result determined in the previous period to the positioning module 401, so that the positioning module 401 can determine the error between the non-visual positioning result determined by the last non-visual positioning and the visual positioning result determined in the previous period, that is, the result error determined in the previous period.
  • the visual positioning result determined in the previous period and reported to the positioning module 401 is a visual matching result.
  • several frames of environment images acquired according to the previous image acquisition frequency are respectively subjected to lane line recognition and/or visual matching positioning to determine a visual positioning result, which is a visual positioning result determined in a previous period.
  • the vision module 402 can obtain a high-precision map from the navigation application 403 to perform visual matching and positioning.
  • the positioning module 401 is configured to report the last determined non-visual positioning result to the navigation application 403, so that the navigation application 403 can determine, based on the non-visual matching position and positioning error value in the reported result, the position of the vehicle 200 that needs to be notified to the user, and display that position on the displayed high-precision map.
  • the visual module 402 is also configured to report the visual positioning result determined in the previous period to the navigation application 403, so that the navigation application 403 can plan a navigation route based on the non-visual matching position in the last reported non-visual positioning result and the visual positioning result determined in the previous period.
  • the visual positioning result determined in the previous period and reported to the navigation application 403 is a visual recognition result; correspondingly, the lane-level navigation route may be planned based on the non-visual matching position and the visual recognition result.
  • the navigation application 403 is used to interact with the user.
  • the positioning module 401 is triggered to determine the image acquisition frequency in real time, and the vision module 402 is used to acquire environment images based on the image acquisition frequency determined in real time and to navigate based on the several frames of environment images collected.
  • the collected environment images may be periodically sent to the vision module 402 for image processing, so as to determine the visual positioning result.
  • the positioning module 401, the vision module 402 and the navigation application 403 can be understood as software instructions stored in the memory 150; the functions of the positioning module 401, the vision module 402 and the navigation application 403 are implemented by the processor executing these software instructions.
  • software instructions may be embedded in hardware forming firmware.
  • Step 101 The processor of the mobile phone receives a user operation, and the user operation indicates to perform navigation based on the environment image.
  • the mobile phone interacts with the user through the user interaction page, and when the user agrees to acquire environmental images for navigation, the image acquisition frequency is determined in real time.
  • Step 102. The processor of the mobile phone determines the image acquisition frequency for the first time based on the first positioning accuracy information; wherein, the first positioning accuracy information includes the last determined type of road environment and/or the respective positioning error values of multiple non-visual positionings in the latest period.
  • the process of determining the image acquisition frequency and the process of non-visual positioning in this embodiment are independent of each other; "the first image acquisition frequency" is used only to indicate the order of changes of the image acquisition frequency.
  • the end time of the latest period is the moment for determining the image acquisition frequency for the first time.
  • the mobile phone executes steps S1 to S2 in parallel; wherein,
  • Step S1. The processor of the mobile phone performs non-visual positioning in real time, and determines non-visual positioning results, including non-visual matching positions, positioning error values and non-visual matching results.
  • Step S2. The processor of the mobile phone determines the type of road environment based on the satellite signal tracking situation information obtained in real time.
  • the last determined road environment and/or the positioning error values of multiple non-visual positionings in the latest period are all updated in real time and obtained before the first image acquisition frequency is determined.
  • the multiple non-visual positionings within the recent period may be understood as the last several non-visual positionings, or as several positionings among the last non-visual positionings.
  • the first image acquisition frequency is a preset high-frequency limit. For example, if the positioning is successful for the first time, or the last determined road environment type is "underpass", "urban canyon", "tree-lined road" or "tunnel", that is, the above-mentioned third low-precision positioning condition RL3 is met, the first image acquisition frequency is set to the preset high-frequency limit.
  • Step 103 Use the first image acquisition frequency as the current image acquisition frequency.
  • Step 104a1. The processor of the mobile phone sends an image acquisition request to the camera of the mobile phone/vehicle; Step 104a2.
  • the camera of the mobile phone/vehicle shoots according to the current image collection frequency to obtain several frames of environmental images currently collected; Step 104a3.
  • the camera of the mobile phone/vehicle sends several frames of environment images currently collected to the processor of the mobile phone.
  • the image acquisition request carries the current image acquisition frequency.
  • the image acquisition request carries the respective identifiers of the multiple cameras that can be controlled and the respective collection frequencies of the multiple cameras.
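As a sketch only, an image acquisition request carrying per-camera collection frequencies might be modelled as below. The class name and fields are hypothetical; the application does not define a concrete message format.

```python
# Hypothetical model of an image acquisition request that carries the
# identifiers and collection frequencies of the controllable cameras.
from dataclasses import dataclass, field

@dataclass
class ImageAcquisitionRequest:
    frequencies_hz: dict = field(default_factory=dict)  # camera id -> Hz

    def is_pause(self):
        # a request in which every frequency is 0 Hz pauses acquisition
        return all(hz == 0 for hz in self.frequencies_hz.values())

req = ImageAcquisitionRequest({"front": 5.0, "rear": 2.0})
```

A request with all frequencies at 0 Hz plays the role of the "pause image acquisition request" mentioned above.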
  • the camera of the vehicle is a camera of the vehicle other than the driving recorder of the vehicle.
  • either the camera of the mobile phone or the camera of the vehicle is controlled, and usually the camera of the mobile phone and the camera of the vehicle are not controlled at the same time.
  • Step 104b1. The processor of the mobile phone sends an image acquisition request to the driving recorder; Step 104b2.
  • the driving recorder samples the last collected surrounding environment video according to the current image collection frequency to obtain several frames of currently collected environment images; Step 104b3.
  • the driving recorder sends the currently collected several frames of environmental images to the processor of the mobile phone.
  • the image acquisition request carries the current image acquisition frequency.
  • the driving recorder also performs the following steps:
  • Y1. The driving recorder records the surrounding environment to obtain the finally collected surrounding environment video; Y2.
  • the driving recorder sends the finally collected surrounding environment video to the vehicle terminal.
  • the driving recorder is used either to store all collected surrounding environment videos or to cache the last collected surrounding environment video; either implementation is possible.
  • Step 104c1 The vehicle-mounted terminal sends an image acquisition request to the driving recorder; Step 104c2.
  • the vehicle-mounted terminal samples the surrounding environment video finally collected by the driving recorder according to the current image collection frequency, so as to obtain several frames of currently collected environment images; Step 104c3.
  • the vehicle-mounted terminal sends the several frames of environment images currently collected to the processor of the mobile phone.
  • the image acquisition request carries the current image acquisition frequency.
  • the vehicle terminal will also perform the following steps:
  • the vehicle-mounted terminal stores the last captured surrounding environment video sent by the driving recorder.
  • this implementation manner may be used when the vehicle-mounted terminal is used to store all collected surrounding environment videos. It should be understood that if the driving recorder is used to store all collected surrounding environment videos and the vehicle-mounted terminal is only used to buffer the last collected surrounding environment video, then adopting this implementation requires the driving recorder to send the last collected surrounding environment video to the vehicle-mounted terminal, which increases the power consumption of video transmission.
  • Step 105 Determine the visual positioning result based on the currently collected frames of environmental images.
  • the visual positioning result is the above-mentioned visual matching position and visual matching result; for example, the visual matching positioning is performed based on several frames of environment images currently collected to determine the visual matching position and visual matching result.
  • the visual positioning result is the above-mentioned visual recognition result.
  • lane line recognition is performed based on several frames of environment images currently collected, so as to determine a visual recognition result.
  • Step 106 The processor of the mobile phone determines a result error based on the visual positioning result and the non-visual positioning result determined by the last non-visual positioning.
  • the resulting error includes the aforementioned lane error and/or position error.
  • Step 107 The processor of the mobile phone performs navigation based on the visual recognition lane and/or visual matching position in the visual positioning result.
  • navigation can be understood as the mobile phone planning a navigation route based on the visually recognized lane in the visual positioning result and accurately guiding the mobile terminal to perform operations such as going straight, turning, and changing lanes.
  • the location of the vehicle can also be determined based on the visual matching position in the visual positioning result and the non-visual matching position determined by the last positioning, and the navigation route can be planned based on the vehicle position and the visually recognized lane in the visual positioning result, so as to accurately guide the mobile terminal to perform operations such as going straight, turning, and changing lanes.
  • step 107 and step 106 may be executed in parallel.
  • Step 108. The processor of the mobile phone determines the image acquisition frequency for the second time based on the second positioning accuracy information; wherein, the second positioning accuracy information includes the last determined type of road environment, the result error, and/or the respective positioning error values of multiple non-visual positionings in the latest period.
  • the detection information that can be used to check the adjustment conditions is determined first; it may include information reflecting the positioning accuracy, such as the above-mentioned type of road environment and the result error between the visual positioning result and the non-visual positioning result, and may also include the positioning accuracy itself, for example, the positioning error values of multiple non-visual positionings in the recent period.
  • the second positioning accuracy information may be understood as detection information corresponding to a satisfied adjustment condition.
  • the satisfied adjustment condition is any one of the above-mentioned high-precision positioning conditions RH1 to RH5, or any one of the low-precision positioning conditions RL1 to RL5.
  • the first positioning accuracy information is used as historical positioning accuracy information.
  • the result error is the result error determined in step 106.
  • the result error determined in step 106 is the result error determined in the previous period.
  • Step 109. Use the second image acquisition frequency as the current image acquisition frequency, and the first image acquisition frequency as the previous image acquisition frequency.
  • if the second image acquisition frequency is not equal to 0 Hz, the second image acquisition frequency is used as the current image acquisition frequency, and steps 104a1-104a3, steps 104b1-104b3, or steps 104c1-104c3 are executed to refresh the last determined visual positioning result. If the second image acquisition frequency is equal to 0 Hz, a pause image acquisition request is sent to the camera of the mobile phone/vehicle, the driving recorder, or the vehicle-mounted terminal.
  • the i-th positioning accuracy information includes the last determined type of road environment, the result error determined in the previous period, and/or the respective positioning error values of multiple non-visual positionings in the latest period.
  • the process of determining the image acquisition frequency and the non-visual positioning process in this embodiment are independent of each other.
  • the previous image acquisition frequency is described relative to the current image acquisition frequency, and can be understood as the last image acquisition frequency determined before the moment at which the current image acquisition frequency is determined; it is not necessarily related to the multiple non-visual positionings in the recent period. Since the positioning error values of multiple non-visual positionings in the latest period are used to determine the current image acquisition frequency, in order to ensure that the current positioning accuracy can be accurately reflected, the latest period includes the moment when the previous image acquisition frequency is determined, and also includes the time period, before the current image acquisition frequency is determined, during which environment images are collected based on the previous image acquisition frequency. In addition, "the i-th image acquisition frequency" is used only to indicate the order of changes of the image acquisition frequency and has no special meaning.
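The loop formed by steps 102 through 109 can be summarized in the following hedged sketch. All helper callables (`determine_frequency`, `acquire_images`, `visual_positioning`, `non_visual_positioning`, `navigate`) are placeholders standing in for the processing described above, not APIs defined by this application.

```python
# High-level sketch of the step 102-109 loop: determine the frequency,
# acquire images (unless paused at 0 Hz), position visually, compute the
# result error, navigate, and carry the frequency into the next round.

def navigation_loop(rounds, determine_frequency, acquire_images,
                    visual_positioning, non_visual_positioning, navigate):
    prev_hz = None
    result_error = None
    for _ in range(rounds):
        cur_hz = determine_frequency(result_error, prev_hz)  # steps 102/108
        if cur_hz == 0:
            prev_hz = cur_hz          # 0 Hz: image acquisition is paused
            continue
        images = acquire_images(cur_hz)                      # steps 104a-104c
        visual = visual_positioning(images)                  # step 105
        non_visual = non_visual_positioning()                # step S1
        result_error = abs(visual - non_visual)              # step 106 (stub)
        navigate(visual)                                     # step 107
        prev_hz = cur_hz                                     # step 109
    return result_error
```

The result error computed in one round feeds the frequency decision of the next round, which is the feedback the embodiment describes.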
  • Another image acquisition solution provided by this embodiment will be described with reference to FIG. 11.
  • This solution is applied to a vehicle/vehicle terminal scenario.
  • “/" in the figure means “or”.
  • the processor of the vehicle may be a processor of the vehicle terminal. It can be understood that, for part or all of the content in this solution, reference can be made to the description of the foregoing embodiments.
  • Step 111 The processor of the vehicle/vehicle terminal receives a user operation, and the user operation indicates navigation based on the environment image.
  • Step 112. The processor of the vehicle/vehicle terminal determines the image acquisition frequency for the first time based on the first positioning accuracy information; wherein, the first positioning accuracy information includes the type of road environment determined last time and/or the respective positioning error values of multiple non-visual positionings in the latest period.
  • the vehicle/vehicle terminal executes steps S1 to S2 in parallel; wherein,
  • Step S1. The processor of the vehicle/vehicle terminal performs non-visual positioning in real time, and determines non-visual positioning results, including non-visual matching positions, positioning error values and non-visual matching results.
  • Step S2. The processor of the vehicle/vehicle terminal determines the type of road environment based on the satellite signal tracking situation information obtained in real time.
  • Step 113 Use the first image acquisition frequency as the current image acquisition frequency.
  • Step 114a1. The processor of the vehicle/vehicle terminal caches the last collected surrounding environment video sent by the driving recorder; Step 114a2. The processor of the vehicle/vehicle terminal samples, according to the current image collection frequency, the surrounding environment video finally collected by the driving recorder, so as to obtain several frames of environment images currently collected.
  • the driving recorder also performs the following steps:
  • Y1. The driving recorder records the surrounding environment to obtain the last collected surrounding environment video; Y2.
  • the driving recorder sends the finally collected surrounding environment video to the vehicle/vehicle terminal.
  • the vehicle camera other than the driving recorder of the vehicle may also shoot according to the current image collection frequency, so as to obtain several frames of environment images currently collected.
  • Step 114b1 The processor of the vehicle/vehicle terminal sends an image acquisition request to the camera of the vehicle; Step 114b2.
  • the camera of the vehicle shoots according to the current image acquisition frequency to obtain several frames of environmental images currently collected; step 114b3.
  • the camera of the vehicle sends several frames of environmental images currently collected to the processor of the vehicle/vehicle terminal.
  • the image acquisition request carries the current image acquisition frequency.
  • the camera of the vehicle is a camera of the vehicle other than the driving recorder of the vehicle.
  • Step 115 The processor of the vehicle/vehicle terminal determines the visual positioning result based on several frames of environment images currently collected.
  • Step 116 The processor of the vehicle/vehicle terminal determines the error of the result based on the visual positioning result and the non-visual positioning result of the last positioning determination.
  • Step 117 The processor of the vehicle/vehicle terminal performs navigation based on the visual recognition lane and/or visual matching position in the visual positioning result.
  • Step 118. The processor of the vehicle/vehicle terminal determines the second image acquisition frequency based on the second positioning accuracy information; wherein, the second positioning accuracy information includes the last determined road environment type, the result error, and/or the respective positioning error values of multiple non-visual positionings in the latest period.
  • Step 119. Use the second image acquisition frequency as the current image acquisition frequency, and the first image acquisition frequency as the previous image acquisition frequency.
  • the second image acquisition frequency is used as the current image acquisition frequency, and steps 114a1-114a2 or steps 114b1-114b3 are executed to update the last determined visual positioning result.
  • This embodiment provides an image acquisition method. It can be understood that, for part or all of the content of the method, reference can be made to the description of the foregoing embodiments.
  • FIG. 12 is a schematic flowchart of the image acquisition method provided in this embodiment. It can be understood that this method can be executed by any mobile terminal with computing capability, for example, a vehicle, a mobile phone, or a vehicle terminal. The following description takes a mobile phone installed in the vehicle as the execution subject. As shown in FIG. 12, the image acquisition method specifically includes:
  • Step 121 Determine the first image acquisition frequency based on the positioning accuracy information, which is used to reflect the positioning accuracy of the mobile terminal.
  • the mobile phone performs non-visual positioning on the vehicle to determine the second positioning result and the positioning error value.
  • the second positioning result may be the non-visual matching position and/or the non-visual matching lane obtained in the above last positioning, or may be the above-mentioned non-visual positioning result;
  • the positioning error value may be the positioning error value when the above-mentioned positioning succeeds; in addition, if the positioning fails, the positioning error value can be considered infinite.
  • the positioning error value can be understood as a positioning accuracy value.
  • the positioning accuracy information includes the positioning error values of multiple non-visual positionings in the latest period.
  • the mobile phone pauses collecting the second environment image; wherein, the second image collection frequency can be understood as the previous image collection frequency, that is, the image collection frequency of the previous period; hereinafter the second image acquisition frequency is described as the previous image acquisition frequency.
  • the second environment image indicates the environment image of the previous period.
  • the positioning accuracy information includes the positioning error values of multiple non-visual positionings in the latest period. It should be understood that the latest time period includes the time period during which the environment image is collected based on the second image collection frequency, in other words, includes the previous time period.
  • when the second image collection frequency is not equal to 0 Hz, the first positioning result is determined based on the second environment image collected at the second image collection frequency.
  • the first positioning result may be the visual matching position and/or visual lane determined in the previous period, or the above visual positioning result.
  • the positioning accuracy information includes the positioning error values of multiple non-visual positionings in the latest period, and/or, the error between the first positioning result and the second positioning result.
  • the error between the first positioning result and the second positioning result is the above result error, for example, may be a lane error value or a position error value, which is used to reflect the positioning accuracy.
  • the non-visual positioning may be positioning based on captured satellite signals.
  • the mobile phone performs positioning based on the captured satellite signal to determine the second positioning result.
  • the mobile phone performs positioning based on the satellite navigation information obtained from the captured satellite signals, so as to determine the second positioning result.
  • the mobile phone performs positioning based on the captured satellite signal in combination with the acceleration and angular velocity of the mobile terminal, so as to determine the second positioning result.
  • the manner of positioning may be the aforementioned GNSS/INS.
  • acceleration and angular velocity are collected by an inertial measurement unit (IMU).
  • the signal quality of the captured satellite signal can also be determined to reflect the positioning accuracy.
  • the mobile phone determines the signal quality based on satellite signal tracking information obtained by tracking the captured satellite signals.
  • the positioning accuracy information includes the positioning error value, signal quality, and/or error between the first positioning result and the second positioning result for multiple non-visual positionings in the latest period.
  • the signal quality is characterized by the type of road environment.
  • the positioning accuracy information includes the positioning error values of multiple non-visual positionings in the latest period, the type of road environment, and/or the error between the first positioning result and the second positioning result.
  • the mobile phone determines the type of the road environment based on the captured satellite signal, and determines the signal quality based on the type of the road environment. In practical applications, the mobile phone determines the type of road environment based on satellite signal tracking information obtained by tracking captured satellite signals.
  • the positioning accuracy information may be detection information corresponding to a satisfied adjustment condition.
  • the detection information indicates the information used to detect the adjustment condition, and may include information reflecting the positioning accuracy, such as the above-mentioned signal quality and the error between the first positioning result and the second positioning result, and may also include a positioning error value, for example, the positioning error values of multiple non-visual positionings in the recent period.
  • the satisfied adjustment condition is any one of the above-mentioned high-precision positioning conditions RH1 to RH5, or any one of the low-precision positioning conditions RL1 to RL5.
  • the positioning accuracy information is the error between the first positioning result and the second positioning result, and the positioning error value of multiple non-visual positionings in the latest period.
  • based on the positioning accuracy information, it can be determined whether the positioning is reliable.
  • if the positioning is unreliable, it is determined that the first image acquisition frequency is greater than the second image acquisition frequency.
  • if the positioning is reliable, it is determined that the first image acquisition frequency is less than the second image acquisition frequency.
  • the second image acquisition frequency is the last determined image acquisition frequency, that is, the current image acquisition frequency.
  • the positioning accuracy information includes an error between the first positioning result and the second positioning result, and when the error is greater than or equal to a first threshold, the positioning may be considered unreliable.
  • the error may be a lane error value, and the first threshold is 1.
  • the first threshold may also be a lane-level positioning error, such as 100 cm, 50 cm, or 10 cm; in this case, when the position error value is greater than the lane-level positioning error, the positioning is unreliable.
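The first-threshold check described above can be sketched as follows. This is a minimal, hypothetical illustration: the function name and parameter defaults are assumptions, using the example values from the text (a lane error of 1, or a lane-level position error such as 50 cm).

```python
def positioning_unreliable(lane_error=None, position_error_cm=None,
                           lane_threshold=1, position_threshold_cm=50):
    """Return True when the error between the first (visual) and second
    (non-visual) positioning results reaches its first threshold."""
    # Lane error: unreliable when greater than or equal to the first threshold.
    if lane_error is not None and lane_error >= lane_threshold:
        return True
    # Position error: unreliable when greater than the lane-level error.
    if position_error_cm is not None and position_error_cm > position_threshold_cm:
        return True
    return False
```

A result-error of one full lane, or a position error above the lane-level bound, marks the positioning as unreliable; anything smaller leaves it reliable.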
  • the positioning accuracy information includes the signal quality of the captured satellite signals; when the signal quality is poor, the positioning is unreliable.
  • for example, when the type of road environment is "underpass", "urban canyon", "tree-lined road", or "tunnel",
  • the positioning is considered to be unreliable.
  • when the above-mentioned third low-precision positioning condition RL3 is satisfied, this reflects that the signal quality is poor, and the positioning can be considered unreliable.
  • the positioning accuracy information includes a positioning error value, that is, a positioning accuracy value.
  • the third threshold and the fourth threshold may be equal or unequal. When they are equal, a positioning error value equal to the third threshold may be regarded as either reliable or unreliable, which needs to be determined according to actual requirements.
  • both the third threshold and the fourth threshold may be lane-level positioning errors.
  • the positioning accuracy information includes the positioning error values of multiple non-visual positionings in the latest period; when the positioning error values of the multiple non-visual positionings are all less than the third threshold, the positioning is considered reliable; when at least one positioning error value is greater than the fourth threshold, the positioning is considered unreliable.
  • when the respective positioning error values of the multiple non-visual positionings are all smaller than the third threshold, it can be considered that the multiple non-visual positionings all succeeded and satisfy the lane-level positioning condition. For example, when the above-mentioned high-precision positioning condition RH1 is satisfied, the positioning is considered to be reliable.
  • when the positioning error value of at least one positioning is greater than the fourth threshold, it can be considered that some of the multiple non-visual positionings failed, that the multiple non-visual positionings do not meet the lane-level positioning condition, or that a small or large number of them fail to meet it. For example, when the above-mentioned low-precision positioning condition RL1, RL2, or RL3 is satisfied, the positioning is considered to be unreliable.
  • the positioning accuracy information includes a positioning error value of the last non-visual positioning.
  • the positioning accuracy information includes a positioning error value obtained by fusing the positioning error values of multiple non-visual positionings within the recent period. For example, when the above-mentioned low-precision positioning condition RL5 is satisfied, the positioning is considered unreliable; when the above-mentioned high-precision positioning condition RH5 is satisfied, the positioning is considered reliable.
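The reliability decision over multiple recent error values can be sketched as below. This is an illustrative reading of the text, not the patented implementation; a failed positioning is represented as an infinite error value, as the text suggests.

```python
def classify_positioning(error_values, third_threshold, fourth_threshold):
    """Classify recent non-visual positioning results.

    All error values below the third threshold -> "reliable";
    any error value above the fourth threshold -> "unreliable";
    otherwise the result is left "indeterminate".
    A failed positioning can be passed as float("inf").
    """
    if all(e < third_threshold for e in error_values):
        return "reliable"
    if any(e > fourth_threshold for e in error_values):
        return "unreliable"
    return "indeterminate"
```

With equal third and fourth thresholds, a value exactly on the threshold falls into the indeterminate branch, which matches the text's note that the boundary case must be decided per actual requirements.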
  • the positioning accuracy information may include a preset positioning error value corresponding to a satisfied adjustment condition.
  • the adjustment condition is a low-precision positioning condition RL or a high-precision positioning condition RH
  • the preset positioning error of the low-precision positioning condition RL is greater than the fourth threshold
  • the preset positioning error of the high-precision positioning condition RH is smaller than the third threshold.
  • the positioning errors corresponding to the above-mentioned low-precision positioning conditions RL1 to RL4 and high-precision positioning conditions RH1 to RH4 can be set in advance, and the positioning accuracy information is the preset positioning error corresponding to a satisfied low-precision positioning condition RL or high-precision positioning condition RH.
  • the positioning accuracy information may include the positioning accuracy values of multiple non-visual positionings in the recent period, and the positioning accuracy value may indicate whether the lane-level positioning requirement is met. For example, a positioning accuracy value of 0 indicates that the lane-level positioning requirement is not met or that positioning failed, and a positioning accuracy value of 1 indicates that positioning succeeded and meets the lane-level positioning requirement.
  • the positioning accuracy information may include any two of the error between the first positioning result and the second positioning result, the signal quality of the captured satellite signal, and the positioning error value of non-visual positioning. For example, when the above-mentioned low-precision positioning condition RL4 is satisfied, the positioning is unreliable, and when the above-mentioned high-precision positioning condition RH2 or RH3 is satisfied, the positioning is reliable.
  • the image acquisition frequency is switched among Z specified frequencies, and the first image acquisition frequency and the second image acquisition frequency are any one of the Z specified frequencies.
  • each of the Z specified frequencies corresponds to an adjustment condition, and whether the positioning is reliable is determined based on the corresponding adjustment condition.
  • one or more adjustment conditions corresponding to the second image acquisition frequency are determined, and the adjustment conditions are sequentially detected to determine whether the positioning is reliable.
  • the mobile phone presets, for each of the Z specified frequencies, a first frequency step size used when the positioning is reliable and a second frequency step size used when the positioning is unreliable; wherein, for any one or more specified frequencies, the first frequency step size and the second frequency step size can be equal or different, which needs to be determined according to actual needs.
  • the determined first image acquisition frequency is the frequency obtained by subtracting the corresponding first frequency step from the second image acquisition frequency.
  • the determined first image acquisition frequency is the frequency obtained by adding the corresponding second frequency step to the second image acquisition frequency.
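The per-frequency step adjustment above can be sketched as a small lookup table. The frequencies and step sizes below are hypothetical; the patent does not fix concrete values, only that each specified frequency carries its own pair of steps.

```python
# Map each specified frequency (Hz) to
# (first step, subtracted when reliable; second step, added when unreliable).
FREQ_STEPS = {
    0.0: (0.0, 1.0),
    1.0: (1.0, 4.0),
    5.0: (4.0, 25.0),
    30.0: (25.0, 0.0),   # already at the top: no further increase
}

def next_frequency(current_freq, reliable):
    """First image acquisition frequency derived from the second (current) one."""
    step_down, step_up = FREQ_STEPS[current_freq]
    return current_freq - step_down if reliable else current_freq + step_up
```

Because each entry's steps land exactly on another table key, repeated calls keep the frequency inside the set of Z specified frequencies.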
  • the mobile terminal 100 presets the frequency level corresponding to each of the Z specified frequencies, and the mobile phone presets, for each of the Z specified frequencies, a first level difference used when the positioning is reliable and a second level difference used when the positioning is unreliable; wherein, for any one or more specified frequencies, the first level difference and the second level difference may be equal or unequal, which needs to be determined in combination with actual requirements.
  • if the positioning is reliable, the determined frequency level of the first image acquisition frequency is lower than the frequency level of the second image acquisition frequency, and the level difference between the two is the first level difference corresponding to the second image acquisition frequency.
  • if the positioning is unreliable, the determined frequency level of the first image acquisition frequency is higher than the frequency level of the second image acquisition frequency, and the level difference between the two is the second level difference corresponding to the second image acquisition frequency.
  • the mobile terminal 100 presets a first frequency and a second frequency corresponding to each of the Z specified frequencies.
  • the first frequency is the frequency lower than the specified frequency among the Z specified frequencies when the positioning is reliable; the second frequency is the frequency higher than the specified frequency among the Z specified frequencies when the positioning is unreliable.
  • the determined first image acquisition frequency is the first frequency corresponding to the second image acquisition frequency.
  • the determined first image acquisition frequency is the second frequency corresponding to the second image acquisition frequency.
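The direct mapping described above, where each specified frequency carries a preset lower "first frequency" and higher "second frequency", can be sketched as follows. All numeric values are illustrative assumptions.

```python
# Map the second (current) image acquisition frequency to
# (first frequency, used when reliable; second frequency, used when unreliable).
NEXT_FREQUENCY = {
    0.0: (0.0, 1.0),
    1.0: (0.0, 5.0),
    5.0: (1.0, 30.0),
    30.0: (5.0, 30.0),
}

def select_frequency(current_freq, reliable):
    """Pick the preset lower or higher specified frequency."""
    lower, higher = NEXT_FREQUENCY[current_freq]
    return lower if reliable else higher
```

Compared with the step-size variant, this table encodes the target frequency directly, so no arithmetic is needed at switch time.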
  • the first image acquisition frequency is determined based on the positioning accuracy, the high-frequency limit, and the low-frequency limit.
  • the low frequency limit is the minimum value of the image acquisition frequency
  • the high frequency limit is the maximum value of the image acquisition frequency.
  • the low frequency limit may be equal to 0 Hz, in which case acquisition is suspended.
  • the first image collection frequency is equal to the second image collection frequency
  • the first image collection frequency is greater than the second image collection frequency.
  • the second image acquisition frequency is increased according to the preset frequency increment, and the frequency level is increased by one level.
  • the first image collection frequency is less than the second image collection frequency.
  • the second image acquisition frequency is decreased according to the preset frequency decrement, and the frequency level is reduced by one level.
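The increase-and-decrease behavior bounded by the high- and low-frequency limits can be sketched as a clamped update. Function name, increments, and default limits are assumptions for illustration only.

```python
def adjust_frequency(current, reliable, increment, decrement,
                     low_limit=0.0, high_limit=30.0):
    """Lower the acquisition frequency when positioning is reliable, raise it
    when unreliable, and clamp the result to [low_limit, high_limit].
    At a limit the frequency simply stays equal to the previous value."""
    candidate = current - decrement if reliable else current + increment
    return max(low_limit, min(high_limit, candidate))
```

With a low limit of 0 Hz, repeated reliable periods drive the frequency down to a full pause, while sustained unreliable periods saturate at the high limit.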
  • the first image collection frequency is equal to the second image collection frequency
  • the magnitude of the first image acquisition frequency may be determined based on the above-mentioned feasible implementation manners A1 to A4; how to determine whether the positioning is reliable is described above.
  • Step 122: Collect a first environment image based on the first image acquisition frequency, where the first environment image is an image of the current surrounding environment of the mobile terminal.
  • when the low frequency limit indicates suspended acquisition and the first image acquisition frequency is equal to the low frequency limit, the acquisition of the environment image is suspended. If the first image acquisition frequency is not equal to the low frequency limit, several frames of the first environment image are collected based on the first image acquisition frequency.
  • when the low frequency limit value is greater than 0 Hz, since the low frequency limit value is the minimum value of the image acquisition frequency, several frames of the first environment image are always acquired based on the first image acquisition frequency.
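The pause-versus-collect decision can be reduced to a simple frame-count sketch, assuming a 0 Hz low limit suspends acquisition (names and the per-period duration are illustrative):

```python
def frames_to_capture(freq_hz, duration_s, low_limit_hz=0.0):
    """Number of first-environment-image frames collected over duration_s
    seconds; a frequency at or below the low limit suspends acquisition."""
    if freq_hz <= low_limit_hz:
        return 0
    return int(freq_hz * duration_s)
```

For example, at 5 Hz a 10-second period yields 50 frames, whereas a frequency equal to the 0 Hz low limit yields none.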
  • the collected first environment image is an image captured by a target camera.
  • the target camera is a camera communicating with the mobile phone, or the target camera is a camera of a mobile terminal.
  • the target camera is a camera of a mobile phone, or a vehicle camera other than a driving recorder of the vehicle.
  • the mobile phone collects the first environment image according to the first image acquisition frequency. It can be understood that the mobile phone controls its own camera to collect the first environment image according to the first image acquisition frequency, or the mobile phone controls the vehicle camera of the vehicle, other than the driving recorder, to acquire the first environment image according to the first image acquisition frequency.
  • the target camera is a camera of a driving recorder communicating with a mobile phone
  • the camera of the driving recorder is used to collect video of the surrounding environment.
  • the mobile phone collects the first environment image according to the first image collection frequency, which can be understood as sampling the collected surrounding environment video based on the first image collection frequency to obtain the first environment image.
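The sampling of the recorder's video at the first image acquisition frequency can be sketched as keeping every Nth frame of the stream. This is a hypothetical helper; the patent only states that the video is sampled at the target frequency.

```python
def sample_indices(video_fps, target_fps, num_frames):
    """Indices of video frames to keep so the sampled rate is close to
    target_fps. A non-positive target frequency keeps no frames (pause)."""
    if target_fps <= 0:
        return []
    # Keep one frame out of every `step` frames of the source video.
    step = max(1, round(video_fps / target_fps))
    return list(range(0, num_frames, step))
```

For a 30 fps dash-cam stream sampled down to 10 Hz, every third frame is retained.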
  • the mobile phone may instruct the vehicle, the vehicle-mounted terminal, or the driving recorder to sample the collected surrounding environment video to obtain the first environment image.
  • the communication between the mobile phone and the vehicle, between the mobile phone and the driving recorder, and between the mobile phone and the vehicle-mounted terminal may be wireless communication, for example, Wi-Fi.
  • the mobile phone performs image processing based on the collected first environment image.
  • the mobile phone does not always perform image processing at a high frequency, thereby further reducing power consumption of the mobile terminal.
  • the mobile phone performs image processing on the collected first environment image to determine and update the first positioning result.
  • the mobile phone receives a first operation, the first operation indicates that the user agrees to navigate based on the environment image, and in response to the user operation, navigates the mobile terminal based on the collected first environment image.
  • the first operation is the above user operation.
  • the mobile phone collects the first environment images based on the first image collection frequency, performs lane line recognition on these first environment images respectively, to determine the result of visual recognition of the lane, and plans the navigation route based on the result.
  • the method steps in the embodiments of the present invention may be implemented by means of hardware, or may be implemented by means of a processor executing software instructions.
  • the software instructions can be composed of corresponding software modules, and the software modules can be stored in random access memory (random access memory, RAM), flash memory, read-only memory (read-only memory, ROM), programmable read-only memory (programmable ROM, PROM), erasable programmable read-only memory (erasable PROM, EPROM), electrically erasable programmable read-only memory (electrically EPROM, EEPROM), a register, a hard disk, a mobile hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may also be a component of the processor.
  • the processor and storage medium can be located in the ASIC.
  • all or part of them may be implemented by software, hardware, firmware or any combination thereof.
  • When implemented using software, it may be implemented in whole or in part in the form of a computer program product.
  • the computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on the computer, all or part of the processes or functions described in this embodiment will be generated.
  • the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in or transmitted via a computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (such as coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (such as infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center integrated with one or more available media.
  • the available medium may be a magnetic medium (such as a floppy disk, a hard disk, or a magnetic tape), an optical medium (such as a DVD), or a semiconductor medium (such as a solid state disk (solid state disk, SSD)), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

Provided are an image acquisition method and apparatus, a mobile terminal (100), and a computer storage medium. The method comprises: determining a first image acquisition frequency based on positioning accuracy information, the positioning accuracy information being used to reflect the positioning accuracy of a mobile terminal (100) (121); and acquiring a first environment image based on the first image acquisition frequency, the first environment image being an image of the current surrounding environment of the mobile terminal (100) (122). The image acquisition frequency can be adjusted based on the positioning accuracy so as to adapt the image acquisition frequency to the positioning accuracy, such that the mobile terminal (100) does not always acquire an environment image at a high frequency during positioning, thereby reducing the power consumption of the mobile terminal (100). In addition, when the positioning accuracy is relatively low, navigation is performed by means of the environment image, so that the positioning accuracy can be guaranteed.
PCT/CN2022/114291 2021-10-20 2022-08-23 Procédé et appareil d'acquisition d'images, terminal mobile et support d'enregistrement informatique WO2023065810A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111221316.X 2021-10-20
CN202111221316.XA CN115993626A (zh) 2021-10-20 2021-10-20 图像采集方法、装置、移动终端及计算机存储介质

Publications (1)

Publication Number Publication Date
WO2023065810A1 true WO2023065810A1 (fr) 2023-04-27

Family

ID=85992942

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/114291 WO2023065810A1 (fr) 2021-10-20 2022-08-23 Procédé et appareil d'acquisition d'images, terminal mobile et support d'enregistrement informatique

Country Status (2)

Country Link
CN (1) CN115993626A (fr)
WO (1) WO2023065810A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103559711A (zh) * 2013-11-05 2014-02-05 余洪山 基于三维视觉系统图像特征和三维信息的运动估计方法
CN107478221A (zh) * 2017-08-11 2017-12-15 黄润芳 一种用于移动终端的高精度定位方法
CN108303721A (zh) * 2018-02-12 2018-07-20 北京经纬恒润科技有限公司 一种车辆定位方法及系统
US10178154B2 (en) * 2012-10-23 2019-01-08 Telefonaktiebolaget Lm Ericsson (Publ) Method and system for cloud service deployment
CN109255817A (zh) * 2018-09-14 2019-01-22 北京猎户星空科技有限公司 一种智能设备的视觉重定位方法及装置
CN109606358A (zh) * 2018-12-12 2019-04-12 禾多科技(北京)有限公司 应用于智能驾驶汽车的图像采集装置及其采集方法
CN211180212U (zh) * 2019-11-29 2020-08-04 北京四维图新科技股份有限公司 采集地图数据的移动终端及地图采集系统
CN112769877A (zh) * 2019-10-21 2021-05-07 比亚迪股份有限公司 团雾预警方法、云端服务器及车辆和介质


Also Published As

Publication number Publication date
CN115993626A (zh) 2023-04-21

Similar Documents

Publication Publication Date Title
JP5871952B2 (ja) ナビゲーションのためのカメラ対応ヘッドセット
US10082583B2 (en) Method and apparatus for real-time positioning and navigation of a moving platform
US8589070B2 (en) Apparatus and method for compensating position information in portable terminal
WO2019119289A1 (fr) Procédé et dispositif de positionnement, appareil électronique et produit-programme d'ordinateur
CN111366161B (zh) 车辆定位方法及电子设备
KR20210118119A (ko) 차량 센서들 및 카메라 어레이들로부터의 구조화된 지도 데이터의 생성
WO2020146283A1 (fr) Estimation de pose de véhicule et correction d'erreur de pose
US10132915B2 (en) System and method for integrated navigation with wireless dynamic online models
JP2016188806A (ja) 移動体及びシステム
CN114466308B (zh) 一种定位方法和电子设备
US11169280B2 (en) Systems and methods for direction estimation in indoor and outdoor locations
US11651598B2 (en) Lane mapping and localization using periodically-updated anchor frames
WO2015035501A1 (fr) Système et procédé pour une meilleure navigation intégrée avec angle d'arrivée sans fil
WO2021238785A1 (fr) Procédé de positionnement, équipement utilisateur, support d'enregistrement et dispositif électronique
CN114216457A (zh) 一种基于超宽带信号的多源数据融合定位方法及系统
WO2023065810A1 (fr) Procédé et appareil d'acquisition d'images, terminal mobile et support d'enregistrement informatique
CN109737957A (zh) 一种采用级联FIR滤波的INS/LiDAR组合导航方法及系统
Wei et al. A High-Precision Indoor Positioning Algorithm via Integration of Pedestrian Dead Reckoning and Visible Light Positioning on Mobile Devices
US20190204455A1 (en) Method of adaptive weighting adjustment positioning
RU2772620C1 (ru) Создание структурированных картографических данных с помощью датчиков транспортного средства и массивов камер
EP4220580A1 (fr) Procédé d'assistance à la conduite d'un véhicule dans une zone délimitée
CN115014337A (zh) 一种定位方法、装置、芯片、用户设备及存储介质
WO2024138110A2 (fr) Procédé et système de construction de carte à l'aide de capteurs radar et de mouvement
CN118042408A (zh) 一种室内定位指纹的采集方法、装置、设备及存储介质
CN117809463A (zh) 车辆定位方法及电子设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22882429

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE