CN110702139B - Time delay calibration method and device, electronic equipment and medium - Google Patents


Info

Publication number
CN110702139B
CN110702139B (application CN201910933197.7A)
Authority
CN
China
Prior art keywords
time delay
image
image collector
calibration
relative
Prior art date
Legal status
Active
Application number
CN201910933197.7A
Other languages
Chinese (zh)
Other versions
CN110702139A (en)
Inventor
李冰
周志鹏
张丙林
Current Assignee
Apollo Zhilian Beijing Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201910933197.7A
Publication of CN110702139A
Application granted
Publication of CN110702139B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34: Route searching; Route guidance
    • G01C21/36: Input/output arrangements for on-board computers
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00: Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass

Abstract

The embodiments of the application disclose a time delay calibration method, a time delay calibration device, an electronic device, and a medium, relating to the technical field of autonomous driving. The specific implementation scheme is as follows: a terminal device simultaneously captures the actual motion trajectory of a calibration plate and the viewfinder picture of an image collector in an autonomous mobile carrier, obtaining a calibration image, where the viewfinder picture is obtained by the image collector capturing the actual motion trajectory in real time; the actual motion trajectory and the viewfinder picture in the calibration image are then processed to obtain the absolute time delay of the image collector. Through this technical scheme, the time delay of the sensing devices involved in an ARHUD system can be accurately calibrated.

Description

Time delay calibration method and device, electronic equipment and medium
Technical Field
The embodiments of the application relate to the field of computer technology, in particular to the field of autonomous driving, and specifically to a time delay calibration method, a time delay calibration device, an electronic device, and a medium.
Background
An ARHUD (Augmented Reality Head-Up Display) system can display, in real time, AR animations that fit the actual road scene within the driver's visual area. Specifically, while the vehicle is driving, the ARHUD can generate real-time navigation AR animations from the navigation data given by a navigation map and the positioning data of a navigation positioning system, thereby providing the driver with accurate driving-direction guidance.
However, an ARHUD system involves many sensors, such as an image collector, an IMU (Inertial Measurement Unit), a speed sensor, and a navigation positioning system, and because of transmission and system overheads the received sensing data carries a time delay. The ARHUD is sensitive to such delays: a delay can make the route or recognition frame drawn by the ARHUD fail to match the real scene, resulting in a poor user experience.
Disclosure of Invention
The embodiment of the application discloses a time delay calibration method, a time delay calibration device, electronic equipment and a medium, which can accurately calibrate the time delay of sensing equipment related to an ARHUD system.
In a first aspect, an embodiment of the present application discloses a time delay calibration method, including:
acquiring, through a terminal device, the actual motion trajectory of a calibration plate and the viewfinder picture of an image collector in an autonomous mobile carrier simultaneously, to obtain a calibration image, where the viewfinder picture is obtained by the image collector capturing the actual motion trajectory in real time; and
processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector.
One embodiment in the above application has the following advantages or benefits: because the terminal device simultaneously captures the actual motion trajectory of the calibration plate and the viewfinder picture in which the image collector records that trajectory in real time, the two are guaranteed to lie on the same time axis. Processing the actual motion trajectory and the viewfinder picture in the acquired calibration image therefore yields an accurate absolute time delay of the image collector, so that the ARHUD can make predictions based on this delay, avoid mismatches with the actual scene, and improve the user experience.
Optionally, processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector includes:
extracting the target motion trajectory of the calibration plate from the viewfinder picture of the calibration image; and
obtaining the absolute time delay of the image collector according to the actual motion trajectory and the target motion trajectory in the calibration image.
Optionally, obtaining the absolute time delay of the image collector according to the actual motion trajectory and the target motion trajectory in the calibration image includes:
determining the cross-correlation error between the actual motion trajectory and the target motion trajectory in the calibration image to obtain the absolute time delay of the image collector.
The above alternative has the following advantages or benefits: it provides an approach for accurately determining the absolute time delay of the image collector based on the actual motion trajectory and the target motion trajectory in the calibration image.
Optionally, after obtaining the absolute time delay of the image collector, the method further includes:
determining the relative time delays of the other sensing devices according to the displacement curve of the image collector and the acceleration curves and/or velocity curves of the other sensing devices.
Optionally, after determining the relative time delays of the other sensing devices according to the displacement curve of the image collector and the acceleration curves and/or velocity curves of the other sensing devices, the method further includes:
obtaining the absolute time delays of the other sensing devices according to the absolute time delay of the image collector and the relative time delays of the other sensing devices.
The above alternative has the following advantages or benefits: the absolute time delay of each of the other sensing devices can be accurately calibrated from the absolute time delay of the image collector together with the determined relative time delay between the image collector and that device, providing an approach for calibrating the absolute time delays of the other sensing devices involved in the ARHUD.
In a second aspect, an embodiment of the present application discloses a delay calibration apparatus, including:
a calibration image determining module, configured to simultaneously acquire, through a terminal device, the actual motion trajectory of a calibration plate and the viewfinder picture of an image collector in an autonomous mobile carrier to obtain a calibration image, where the viewfinder picture is obtained by the image collector capturing the actual motion trajectory in real time; and
an image collector absolute time delay determining module, configured to process the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector.
In a third aspect, an embodiment of the present application further discloses an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the time delay calibration method described in any embodiment of the present application.
In a fourth aspect, the embodiments of the present application further disclose a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the time delay calibration method according to any embodiment of the present application.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
fig. 1 is a flowchart of a delay calibration method according to a first embodiment of the present application;
fig. 2 is a flowchart of a delay calibration method according to a second embodiment of the present application;
fig. 3 to 5 are flowcharts of a delay calibration method according to a third embodiment of the present application;
fig. 6 is a schematic structural diagram of a delay calibration apparatus according to a fourth embodiment of the present application;
fig. 7 is a block diagram of an electronic device for implementing the delay calibration method provided in the embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding, and these details are to be considered merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
First embodiment
Fig. 1 is a flowchart of a time delay calibration method according to a first embodiment of the present application; the method is applicable to calibrating the time delay of the sensing devices related to an ARHUD system. It can be executed by a time delay calibration apparatus, which can be implemented in software and/or hardware and integrated on any device with computing capability, such as a terminal device or a server. As shown in fig. 1, the time delay calibration method provided in this embodiment may include:
and S110, acquiring the actual motion track of the calibration plate and the view finding picture of the image collector in the automatic driving mobile carrier simultaneously through the terminal equipment to obtain a calibration image.
In this embodiment, the calibration plate is a flat board bearing an array of fixed-pitch patterns (such as solid circles or a checkerboard) and is commonly used to calibrate a camera's intrinsic parameters; the image collector may be a camera or a video camera.
Optionally, a point on the calibration plate may be selected as a reference point, and the motion of the reference point regarded as the motion of the calibration plate; further, the reference point's color may be chosen to distinguish it from the other points on the calibration plate. The calibration image is a picture-in-picture image comprising the actual motion trajectory of the calibration plate and the viewfinder picture of the image collector, where the viewfinder picture is obtained by the image collector capturing the actual motion trajectory in real time. Further, the calibration image may be a single frame, a video stream of multiple frames, or a set of discrete frames.
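As an illustration of extracting such a color-distinguished reference point from a frame, the sketch below locates its centroid in a synthetic image with plain NumPy; a real pipeline would typically use a full image processing library, and the frame size, color, and tolerance here are invented for the example:

```python
import numpy as np

def reference_point_centroid(frame, color, tol=10):
    """Locate the calibration plate's reference point in one frame.

    `frame` is an (H, W, 3) uint8 RGB image; the reference point is assumed
    to be the only region whose color lies within `tol` of `color`.
    Returns the (row, col) centroid of the matching pixels.
    """
    mask = np.all(np.abs(frame.astype(int) - np.asarray(color)) <= tol, axis=-1)
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic 100x100 white frame with a 5x5 red reference point at rows 40-44, cols 60-64.
frame = np.full((100, 100, 3), 255, dtype=np.uint8)
frame[40:45, 60:65] = (255, 0, 0)
print(reference_point_centroid(frame, (255, 0, 0)))  # (42.0, 62.0)
```

Collecting this centroid frame by frame yields the track points from which the displacement curves discussed below are fitted.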
Specifically, with the image collector and a terminal device such as a mobile phone both aimed at the calibration plate, and with the terminal device positioned so that it can simultaneously film the viewfinder (i.e., the screen) of the image collector and the calibration plate itself, the calibration plate is moved at a certain speed; the terminal device can then capture the actual motion trajectory of the calibration plate and the viewfinder picture of the image collector at the same time, obtaining a calibration image.
It should be noted that, in this embodiment, using the terminal device to capture the actual motion trajectory of the calibration plate and the viewfinder picture of the image collector simultaneously ensures that acquisition of the two starts at the same instant, i.e. that the actual motion trajectory and the viewfinder picture lie on the same time axis.
S120, processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector.
Optionally, the time delay of the terminal device is taken to be zero by default in this embodiment; the absolute time delay of the image collector can then be obtained by processing the actual motion trajectory and the viewfinder picture in the calibration image, specifically by applying image processing techniques to them.
For example, processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector may include: extracting the target motion trajectory of the calibration plate from the viewfinder picture of the calibration image; and obtaining the absolute time delay of the image collector according to the actual motion trajectory and the target motion trajectory in the calibration image.
Specifically, if the calibration image is a single frame, the actual motion trajectory of the calibration plate and the target motion trajectory of the calibration plate in the viewfinder picture can be extracted from it simultaneously using image processing techniques; the time difference between the actual motion trajectory and the target motion trajectory on the same time axis can then be taken as the absolute time delay of the image collector. Further, to improve the accuracy of this delay, multiple frames may be collected (that is, the calibration image may be a video stream or a set of discrete frames), the time difference between the actual and target motion trajectories determined for each frame, and the average of these per-frame time differences taken as the absolute time delay of the image collector.
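The multi-frame averaging described above reduces to a mean of per-frame time differences; the numbers below are invented purely to illustrate the step:

```python
# Hypothetical per-frame time differences (s) between the actual trajectory
# and the trajectory extracted from the viewfinder picture of each frame.
frame_time_diffs = [0.052, 0.048, 0.051, 0.049, 0.050]

# The absolute time delay of the image collector is taken as their mean.
absolute_delay = sum(frame_time_diffs) / len(frame_time_diffs)
print(round(absolute_delay, 3))  # 0.05
```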
Optionally, obtaining the absolute time delay of the image collector according to the actual motion trajectory and the target motion trajectory in the calibration image may be performed by determining the cross-correlation error between the actual motion trajectory and the target motion trajectory in the calibration image.
If the calibration image is a video stream, the actual motion trajectory of the calibration plate in each frame can be regarded as an actual track point of the reference point's motion, and correspondingly the target motion trajectory of the calibration plate in each frame's viewfinder picture can be regarded as a target track point of the reference point's motion. Specifically, image processing techniques can be used to extract, frame by frame in acquisition order, the actual track point and the target track point of the reference point from each frame. The actual track points are then fitted to obtain the actual displacement curve of the reference point, i.e. of the calibration plate, and the target track points are fitted to obtain its target displacement curve; the absolute time delay of the image collector is then determined from these two displacement curves. For example, the cross-correlation function for continuous signals can be used to compute the cross-correlation between the actual and target displacement curves, and the time difference that minimises the cross-correlation error gives the absolute time delay of the image collector.
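A minimal sketch of this cross-correlation step, assuming uniformly sampled displacement curves; the sampling rate, signal shape, and 5-sample shift below are illustrative, not values from the patent:

```python
import numpy as np

def estimate_delay(reference, delayed, dt):
    """Return the time shift (s) at which `delayed` best matches `reference`.

    Both signals must be sampled at the same interval `dt`. The lag that
    maximises the cross-correlation (equivalently, minimises the
    cross-correlation error) is taken as the delay estimate.
    """
    ref = reference - np.mean(reference)
    dly = delayed - np.mean(delayed)
    corr = np.correlate(dly, ref, mode="full")
    lags = np.arange(-len(ref) + 1, len(dly))
    return lags[np.argmax(corr)] * dt

# Illustrative check: a displacement curve and a copy delayed by 5 samples.
dt = 0.01                                  # 100 Hz sampling (assumed)
t = np.arange(0, 2, dt)
actual = np.sin(2 * np.pi * 1.5 * t)       # actual motion trajectory of the plate
target = np.roll(actual, 5)                # trajectory seen in the viewfinder
print(estimate_delay(actual, target, dt))  # 0.05 (s), the absolute time delay
```

The same routine applies unchanged to the discrete-frame case below, since sampled track points are already a discrete signal.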
If the calibration image consists of discrete frames, image processing techniques can likewise be used to extract, frame by frame in acquisition order, the actual track point of the reference point and its target track point in the viewfinder picture; the cross-correlation function for discrete signals can then be used to compute the cross-correlation between the actual and target track points, and the time difference that minimises the cross-correlation error gives the absolute time delay of the image collector.
According to the technical scheme provided by this embodiment, the terminal device simultaneously captures the actual motion trajectory of the calibration plate and the viewfinder picture in which the image collector in the autonomous mobile carrier records that trajectory in real time, ensuring that the two lie on the same time axis; processing the actual motion trajectory and the viewfinder picture in the acquired calibration image then yields an accurate absolute time delay of the image collector, so that the ARHUD can make predictions based on this delay, avoid mismatches with the actual scene, and improve the user experience.
Second embodiment
Fig. 2 is a flowchart of a time delay calibration method according to a second embodiment of the present application, and this embodiment provides a method for calibrating absolute time delays of other sensing devices in an ARHUD system, such as an inertial measurement unit, a navigation positioning system, and a speed sensor device, based on the above embodiments. As shown in fig. 2, the method for calibrating a delay provided in this embodiment may include:
S210, acquiring the actual motion trajectory of the calibration plate and the viewfinder picture of the image collector in the autonomous mobile carrier simultaneously through the terminal device to obtain a calibration image.
S220, processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector.
S230, determining the relative time delays of the other sensing devices according to the displacement curve of the image collector and the acceleration curves and/or velocity curves of the other sensing devices.
Optionally, a landmark point may be selected on an object that is stationary in the scene where the autonomous mobile carrier is located, for example a point on a landmark building or on a tree. Then, while the autonomous mobile carrier moves through the scene, one can obtain the displacement curve of the image collector's motion relative to the landmark point, the acceleration curve of the inertial measurement unit's motion relative to the landmark point, the displacement curve of the navigation positioning system's motion relative to the landmark point, the velocity curve of the speed sensor device's motion relative to the landmark point, and so on.
The displacement curve of the image collector's motion relative to the landmark point can then be differentiated twice to obtain its acceleration curve; using the cross-correlation function for continuous signals, the cross-correlation between this acceleration curve and the acceleration curve of the inertial measurement unit's motion relative to the landmark point is computed, and the time difference that minimises the cross-correlation error gives the relative time delay between the image collector and the inertial measurement unit.
Likewise, the cross-correlation function for continuous signals can be used to compute the cross-correlation between the displacement curve of the image collector's motion relative to the landmark point and the displacement curve of the navigation positioning system's motion relative to the landmark point; the time difference that minimises the cross-correlation error gives the relative time delay between the image collector and the navigation positioning system. In addition, the image collector's displacement curve relative to the landmark point can be differentiated once to obtain its velocity curve; the cross-correlation between this velocity curve and the velocity curve of the speed sensor device's motion relative to the landmark point is computed with the continuous-signal cross-correlation function, and the time difference that minimises the cross-correlation error gives the relative time delay between the image collector and the speed sensor device.
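The repeated "derivation" of the displacement curve used in these steps is numerical differentiation; a sketch with NumPy, assuming a constant-acceleration displacement curve sampled at 100 Hz (both assumptions are for illustration only):

```python
import numpy as np

dt = 0.01                          # sampling interval (assumed)
t = np.arange(0, 2, dt)
displacement = 0.5 * 3.0 * t**2    # image collector displacement relative to the
                                   # landmark point, for an assumed 3 m/s^2 motion

velocity = np.gradient(displacement, dt)   # first derivative  -> velocity curve
acceleration = np.gradient(velocity, dt)   # second derivative -> acceleration curve

# Central differences recover the constant acceleration at interior samples, so
# this curve can now be cross-correlated with the IMU's acceleration curve.
print(round(float(acceleration[100]), 6))  # 3.0
```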
Optionally, when determining the relative time delays of the other sensing devices, the relative time delay between each device and the image collector may be determined directly from that device's displacement, acceleration, or velocity curve together with the displacement curve of the image collector; it may also be determined indirectly. For example, after the relative time delay between the image collector and the navigation positioning system has been determined from their respective displacement curves relative to the landmark point, the navigation positioning system's displacement curve can be differentiated once to obtain its velocity curve relative to the landmark point, the relative time delay between the navigation positioning system and the speed sensor device determined from this velocity curve and that of the speed sensor device, and the relative time delay between the image collector and the speed sensor device then derived indirectly from the relative time delay between the navigation positioning system and the speed sensor device together with the relative time delay between the image collector and the navigation positioning system.
S240, obtaining the absolute time delays of the other sensing devices according to the absolute time delay of the image collector and the relative time delays of the other sensing devices.
Specifically, after the relative time delays of the other sensing devices have been determined according to the displacement curve of the image collector and their acceleration and/or velocity curves, the absolute time delay of the inertial measurement unit can be determined from the absolute time delay of the image collector and the relative time delay between the image collector and the inertial measurement unit; the absolute time delay of the navigation positioning system can be determined from the absolute time delay of the image collector and the relative time delay between the image collector and the navigation positioning system; and the absolute time delay of the speed sensor device can be determined from the absolute time delay of the image collector and the relative time delay between the image collector and the speed sensor device.
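With a fixed sign convention, this combination is a single addition per device; the values below are invented, and the convention assumed here is (sensor delay minus image-collector delay):

```python
# Assumed example values, in seconds.
camera_absolute_delay = 0.050   # from the calibration-image procedure
imu_relative_delay = -0.012     # IMU delay relative to the image collector
gps_relative_delay = 0.020      # navigation positioning system delay relative to it

imu_absolute_delay = camera_absolute_delay + imu_relative_delay
gps_absolute_delay = camera_absolute_delay + gps_relative_delay
print(round(imu_absolute_delay, 3), round(gps_absolute_delay, 3))  # 0.038 0.07
```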
According to the technical scheme provided by this embodiment, the terminal device simultaneously captures the actual motion trajectory of the calibration plate and the viewfinder picture in which the image collector in the autonomous mobile carrier records that trajectory in real time, ensuring that the two lie on the same time axis; processing the actual motion trajectory and the viewfinder picture in the acquired calibration image then yields an accurate absolute time delay of the image collector. On that basis, the absolute time delay of each of the other sensing devices can be accurately calibrated from the absolute time delay of the image collector and the determined relative time delay between the image collector and that device, providing an approach for calibrating the absolute time delays of the other sensing devices involved in the ARHUD. The ARHUD can then make predictions based on the absolute time delays of the sensing devices, avoiding mismatches with the actual scene and improving the user experience.
Third embodiment
Fig. 3 to 5 are flowcharts of a time delay calibration method according to a third embodiment of the present application; on the basis of the above embodiments, this embodiment further explains how the relative time delays of the other sensing devices are determined.
Referring to fig. 3, determining the relative time delay between the image collector and the inertial measurement unit may specifically include:
S310, acquiring the actual motion trajectory of the calibration plate and the viewfinder picture of the image collector in the autonomous mobile carrier simultaneously through the terminal device to obtain a calibration image.
S320, processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector.
S330, determining the acceleration curve of the image collector's motion relative to the calibration plate according to the displacement curve of the image collector's motion relative to the calibration plate.
To ensure the accuracy of the finally determined absolute time delay of the inertial measurement unit, in this embodiment the displacement curve of the image collector's motion relative to the calibration plate and the acceleration curve of the inertial measurement unit's motion relative to the calibration plate are obtained in a setting where the calibration plate is fixed and the image collector is moved while filming it. Note that in this embodiment the image collector and the inertial measurement unit are rigidly connected, so they move synchronously.
Specifically, the obtained displacement curve of the image collector's motion relative to the calibration plate can be differentiated twice to yield the acceleration curve of the image collector's motion relative to the calibration plate.
S340, determining the cross-correlation error between the acceleration curve of the image collector's motion relative to the calibration plate and the acceleration curve of the inertial measurement unit's motion relative to the calibration plate, to obtain the relative time delay between the image collector and the inertial measurement unit.
Specifically, the cross-correlation function for continuous signals can be used to compute the cross-correlation between the two acceleration curves, and the time difference that minimises the cross-correlation error gives the relative time delay between the image collector and the inertial measurement unit.
Referring to fig. 4, determining the relative time delay between the image collector and the navigation positioning system may specifically include:
S410, acquiring the actual motion trajectory of the calibration plate and the viewfinder picture of the image collector in the autonomous mobile carrier simultaneously through the terminal device to obtain a calibration image.
S420, processing the actual motion trajectory and the viewfinder picture in the calibration image to obtain the absolute time delay of the image collector.
S430, determining the relative time delay between the image collector and the inertial measurement unit according to the displacement curve of the image collector's motion relative to the calibration plate and the acceleration curve of the inertial measurement unit's motion relative to the calibration plate.
Specifically, for the method of determining the relative time delay between the image collector and the inertial measurement unit, refer to fig. 3; the details are not repeated here.
S440, determining the cross-correlation error between the acceleration curve of the navigation positioning system's motion relative to the landmark point and the acceleration curve of the inertial measurement unit's motion relative to the landmark point, to obtain the relative time delay between the navigation positioning system and the inertial measurement unit.
Specifically, while the autonomous mobile carrier is moving, the displacement curve of the navigation positioning system relative to the landmark point and the acceleration curve of the inertial measurement unit relative to the landmark point can be obtained; the former is then differentiated twice to yield the acceleration curve of the navigation positioning system's motion relative to the landmark point, the cross-correlation between the two acceleration curves is computed with the continuous-signal cross-correlation function, and the time difference that minimises the cross-correlation error gives the relative time delay between the navigation positioning system and the inertial measurement unit.
S450, determining the relative time delay between the image collector and the navigation positioning system according to the relative time delay between the image collector and the inertial measurement unit and the relative time delay between the navigation positioning system and the inertial measurement unit.
Specifically, once the relative time delay between the navigation positioning system and the inertial measurement unit has been determined, the relative time delay between the image collector and the navigation positioning system can be obtained from the relative time delay between the image collector and the inertial measurement unit and the relative time delay between the navigation positioning system and the inertial measurement unit; for example, the two relative time delays may be subtracted.
Referring to fig. 5, determining the relative time delay between the image collector and the speed sensing device may specifically include:
S510, acquiring the actual motion trajectory of the calibration plate and the view-finding picture of the image collector in the autonomous driving mobile carrier simultaneously through the terminal device to obtain a calibration image.
S520, processing the actual motion trajectory and the view-finding picture in the calibration image to obtain the absolute time delay of the image collector.
S530, determining the relative time delay between the image collector and the navigation positioning system according to a displacement curve of the image collector moving relative to the calibration plate, an acceleration curve of the inertial measurement unit moving relative to the landmark point, and an acceleration curve of the navigation positioning system moving relative to the landmark point.
Specifically, the relative time delay between the image collector and the inertial measurement unit can be determined from a displacement curve of the image collector moving relative to the calibration plate and an acceleration curve of the inertial measurement unit moving relative to the calibration plate; the relative time delay between the inertial measurement unit and the navigation positioning system is determined from the acceleration curve of the inertial measurement unit moving relative to the landmark point and the acceleration curve of the navigation positioning system moving relative to the landmark point; the relative time delay between the image collector and the navigation positioning system can then be determined from these two relative time delays. For the specific implementation, reference may be made to the descriptions of fig. 3 and fig. 4.
S540, determining the cross-correlation error between a speed curve of the navigation positioning system moving relative to the landmark point and a displacement curve of the speed sensing device moving relative to the landmark point, to obtain the relative time delay between the navigation positioning system and the speed sensing device.
Specifically, the speed curve of the navigation positioning system moving relative to the landmark point can be obtained by differentiating once the displacement curve of the navigation positioning system moving relative to the landmark point that is acquired during the movement of the autonomous driving mobile carrier. The cross-correlation between the speed curve of the navigation positioning system and the speed curve of the speed sensing device is then solved by using the cross-correlation function of continuous signals, and the time difference with the minimum cross-correlation error is calculated to obtain the relative time delay between the navigation positioning system and the speed sensing device.
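As a numerical sketch of this step: differentiate the navigation displacement curve once to obtain a speed curve, then find the time difference with the minimum error between the two speed curves. Here the minimum-error shift is found by scanning candidate lags directly, which is equivalent to locating the minimum of the cross-correlation error; the names below are illustrative, not from the patent:

```python
import numpy as np

def relative_delay_velocity(displacement_nav, vel_sensor, dt, max_lag_s=1.0):
    """Illustrative sketch: positive result means the speed sensing
    device's curve lags the navigation positioning system's curve."""
    # One differentiation turns the navigation displacement curve
    # into a speed curve.
    vel_nav = np.gradient(displacement_nav, dt)
    n = len(vel_nav)
    max_lag = int(round(max_lag_s / dt))
    best_lag, best_err = 0, np.inf
    # Scan candidate time shifts and keep the one with the smallest
    # mean squared error over the overlapping samples.
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = vel_nav[:n - lag], vel_sensor[lag:]
        else:
            a, b = vel_nav[-lag:], vel_sensor[:n + lag]
        err = np.mean((a - b) ** 2)
        if err < best_err:
            best_err, best_lag = err, lag
    return best_lag * dt
```

With a smooth displacement curve and the corresponding speed curve delayed by 0.2 s, the function recovers a delay of about 0.2 s.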
S550, determining the relative time delay between the image collector and the speed sensing device according to the relative time delay between the image collector and the navigation positioning system and the relative time delay between the navigation positioning system and the speed sensing device.
Specifically, once the relative time delay between the navigation positioning system and the speed sensing device has been determined, the relative time delay between the image collector and the speed sensing device can be determined from the relative time delay between the image collector and the navigation positioning system and the relative time delay between the navigation positioning system and the speed sensing device; for example, the two relative time delays may be subtracted.
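The composition of the two relative delays depends only on a sign convention, which the tiny sketch below makes explicit (names are illustrative). The patent phrases it as a subtraction, which corresponds to expressing both delays against the common middle device, i.e. delay(img, nav) - delay(speed, nav); under the chained convention used here it is an addition, and both give the same number:

```python
def compose_delays(delay_img_nav, delay_nav_speed):
    # With delay(x, y) defined as t_x - t_y, the middle device cancels:
    # (t_img - t_nav) + (t_nav - t_speed) = t_img - t_speed.
    return delay_img_nav + delay_nav_speed
```

For instance, if the image collector lags the navigation positioning system by 50 ms and the navigation positioning system leads the speed sensing device by 20 ms, the image collector lags the speed sensing device by 30 ms.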
It should be noted that the processes provided in this embodiment for determining the relative time delay between the image collector and the inertial measurement unit, between the image collector and the navigation positioning system, and between the image collector and the speed sensing device may coexist in a single embodiment, or may each stand alone as an embodiment.
The technical solution provided by this embodiment of the application offers an approach for directly or indirectly determining the relative time delay between another sensing device and the image collector, and lays the foundation for subsequently and accurately determining the absolute time delays of other sensing devices based on the absolute time delay of the image collector.
Fourth embodiment
Fig. 6 is a schematic structural diagram of a time delay calibration apparatus according to a fourth embodiment of the present application. The apparatus can execute the time delay calibration method provided by any embodiment of the present application, and has the corresponding functional modules and beneficial effects of the executed method. Optionally, the apparatus can be implemented in software and/or hardware, and can be integrated in a vehicle-mounted device. As shown in fig. 6, the apparatus 600 may include:
a calibration image determining module 610, configured to acquire, simultaneously through the terminal device, the actual motion trajectory of the calibration plate and the view-finding picture of the image collector in the autonomous driving mobile carrier, so as to obtain a calibration image; the view-finding picture is obtained by the image collector capturing the actual motion trajectory in real time;
and an image collector absolute time delay determining module 620, configured to process the actual motion trajectory and the view-finding picture in the calibration image to obtain the absolute time delay of the image collector.
According to the technical solution provided by this embodiment of the application, the terminal device simultaneously acquires the actual motion trajectory of the calibration plate and the view-finding picture obtained by the image collector in the autonomous driving mobile carrier capturing that actual motion trajectory in real time, which ensures that the actual motion trajectory and the view-finding picture lie on the same time axis. The actual motion trajectory and the view-finding picture in the acquired calibration image are then processed, so that the absolute time delay of the image collector can be obtained accurately. The ARHUD can thus make predictions based on the absolute time delay of the image collector, the phenomenon of the ARHUD failing to match the actual scene is avoided, and the user experience is improved.
For example, the image collector absolute time delay determining module 620 may include:
the processing unit is used for extracting a target motion track of the calibration plate from a framing picture of the calibration image;
and the image collector absolute time delay determining unit is used for obtaining the absolute time delay of the image collector according to the actual motion track and the target motion track in the calibration image.
Illustratively, the image collector absolute time delay determining unit may be specifically configured to:
and determining the cross-correlation error between the actual motion track and the target motion track in the calibration image to obtain the absolute time delay of the image collector.
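The absolute-delay determination can be sketched as a trajectory-matching search: because the terminal device records the actual trajectory and the view-finding pictures on one clock, sliding the target trajectory (extracted from the frames) against the actual trajectory and taking the shift with the minimum error yields the absolute time delay. How the target trajectory is extracted from the frames (e.g., detecting the calibration plate in each picture) is outside this sketch, and all names are illustrative:

```python
import numpy as np

def absolute_delay_2d(actual_xy, target_xy, dt, max_lag_s=0.5):
    """Illustrative sketch: actual_xy and target_xy are (N, 2) arrays of
    trajectory points sampled on the terminal device's common time axis."""
    n = len(actual_xy)
    max_lag = int(round(max_lag_s / dt))
    best_lag, best_err = 0, np.inf
    # The view-finding picture can only lag reality, so scan
    # non-negative shifts of the target trajectory.
    for lag in range(0, max_lag + 1):
        a = actual_xy[:n - lag]
        b = target_xy[lag:]
        # Mean squared 2-D distance between corresponding points.
        err = np.mean(np.sum((a - b) ** 2, axis=1))
        if err < best_err:
            best_err, best_lag = err, lag
    return best_lag * dt
```

For a circular test trajectory whose image-side copy is delayed by 0.1 s, the search recovers 0.1 s.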
Illustratively, the above-mentioned apparatus may further include:
and the relative time delay determining module is used for, after the absolute time delay of the image collector is obtained, determining the relative time delay of the other sensing device according to the displacement curve of the image collector and the acceleration curve and/or speed curve of the other sensing device.
For example, the relative delay determination module may be specifically configured to:
determining an acceleration curve of the image collector moving relative to the calibration plate according to a displacement curve of the image collector moving relative to the calibration plate;
and determining the cross-correlation error between the acceleration curve of the image collector moving relative to the calibration plate and an acceleration curve of the inertial measurement unit moving relative to the calibration plate, to obtain the relative time delay between the image collector and the inertial measurement unit.
For example, the relative delay determination module may be further specifically configured to:
determining the relative time delay between the image collector and the inertial measurement unit according to a displacement curve of the image collector moving relative to the calibration plate and an acceleration curve of the inertial measurement unit moving relative to the calibration plate;
determining the cross-correlation error between an acceleration curve of the navigation positioning system moving relative to the landmark point and an acceleration curve of the inertial measurement unit moving relative to the landmark point, to obtain the relative time delay between the navigation positioning system and the inertial measurement unit;
and determining the relative time delay between the image collector and the navigation positioning system according to the relative time delay between the image collector and the inertial measurement unit and the relative time delay between the navigation positioning system and the inertial measurement unit.
For example, the relative delay determination module may be further specifically configured to:
determining the relative time delay between the image collector and the navigation positioning system according to a displacement curve of the image collector moving relative to the calibration plate, an acceleration curve of the inertial measurement unit moving relative to the landmark point, and an acceleration curve of the navigation positioning system moving relative to the landmark point;
determining the cross-correlation error between a speed curve of the navigation positioning system moving relative to the landmark point and a displacement curve of the speed sensing device moving relative to the landmark point, to obtain the relative time delay between the navigation positioning system and the speed sensing device;
and determining the relative time delay between the image collector and the speed sensing device according to the relative time delay between the image collector and the navigation positioning system and the relative time delay between the navigation positioning system and the speed sensing device.
Illustratively, the apparatus may further include:
and the other-sensing-device absolute time delay determining module is used for, after the relative time delay of the other sensing device is determined according to the displacement curve of the image collector and the acceleration curve and/or speed curve of the other sensing device, obtaining the absolute time delay of the other sensing device according to the absolute time delay of the image collector and the relative time delay of the other sensing device.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
Fig. 7 is a block diagram of an electronic device for the time delay calibration method according to the embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the present application described and/or claimed herein.
As shown in fig. 7, the electronic device includes: one or more processors 701, a memory 702, and interfaces for connecting the various components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information for a Graphical User Interface (GUI) on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations, e.g., as a server array, a group of blade servers, or a multi-processor system. In fig. 7, one processor 701 is taken as an example.
The memory 702 is a non-transitory computer readable storage medium provided herein. The memory stores instructions executable by at least one processor, so that the at least one processor performs the time delay calibration method provided by the present application. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to perform the time delay calibration method provided herein.
The memory 702, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and modules, such as the program instructions/modules corresponding to the time delay calibration method in the embodiment of the present application, for example, the calibration image determining module 610 and the image collector absolute time delay determining module 620 shown in fig. 6. The processor 701 executes various functional applications of the server and time delay calibration by running the non-transitory software programs, instructions, and modules stored in the memory 702, so as to implement the time delay calibration method in the above method embodiment.
The memory 702 may include a storage program area and a storage data area, wherein the storage program area may store an operating system and an application program required for at least one function, and the storage data area may store data created from the use of the electronic device for implementing the time delay calibration method, and the like. Further, the memory 702 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or another non-transitory solid state storage device. In some embodiments, the memory 702 may optionally include memories remotely located from the processor 701, and these remote memories may be connected via a network to the electronic device for implementing the time delay calibration method. Examples of such networks include, but are not limited to, the internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
The electronic device for implementing the time delay calibration method may further include: an input device 703 and an output device 704. The processor 701, the memory 702, the input device 703 and the output device 704 may be connected by a bus or other means, and fig. 7 illustrates an example of a connection by a bus.
The input device 703 may receive input numeric or character information and generate key signal inputs related to user settings and function controls of the electronic device used to implement the time delay calibration method, such as a touch screen, keypad, mouse, track pad, touch pad, pointer stick, one or more mouse buttons, track ball, joystick, or other input device. The output device 704 may include a display apparatus, an auxiliary lighting device such as a Light Emitting Diode (LED), a tactile feedback device such as a vibration motor, and the like. The Display device may include, but is not limited to, a Liquid Crystal Display (LCD), an LED Display, and a plasma Display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, Integrated circuitry, Application Specific Integrated Circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs, also known as programs, software applications, or code, include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or Device for providing machine instructions and/or data to a Programmable processor, such as a magnetic disk, optical disk, memory, Programmable Logic Device (PLD), including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device for displaying information to the user, for example, a Cathode Ray Tube (CRT) or an LCD monitor; and a keyboard and a pointing device, such as a mouse or a trackball, by which the user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback, e.g., visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here may be implemented in a computing system that includes a back-end component, e.g., as a data server; or in a computing system that includes middleware components, e.g., an application server; or in a computing system that includes a front-end component, e.g., a user computer with a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described herein, or in a computing system that includes any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication, e.g., a communication network. Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
According to the technical solution of the embodiment of the application, the terminal device simultaneously acquires the actual motion trajectory of the calibration plate and the view-finding picture obtained by the image collector in the autonomous driving mobile carrier capturing that actual motion trajectory in real time, which ensures that the actual motion trajectory and the view-finding picture lie on the same time axis. The actual motion trajectory and the view-finding picture in the acquired calibration image are then processed, so that the absolute time delay of the image collector can be obtained accurately. The ARHUD can thus make predictions based on the absolute time delay of the image collector, the phenomenon of the ARHUD failing to match the actual scene is avoided, and the user experience is improved.
It should be understood that the various forms of flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders; the present application is not limited in this respect, as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (11)

1. A time delay calibration method is characterized by comprising the following steps:
acquiring the actual motion track of the calibration plate and a view finding picture of an image collector in the automatic driving moving carrier simultaneously through the terminal equipment to obtain a calibration image; the view-finding picture is obtained by acquiring the actual motion track in real time by the image acquisition device;
and processing the actual motion track and the view-finding picture in the calibration image to obtain the absolute time delay of the image collector.
2. The method of claim 1, wherein processing the actual motion trajectory and the view frame in the calibration image to obtain the absolute time delay of the image collector comprises:
extracting a target motion track of a calibration plate from a framing picture of the calibration image;
and obtaining the absolute time delay of the image collector according to the actual motion track and the target motion track in the calibration image.
3. The method of claim 2, wherein obtaining the absolute time delay of the image collector according to the actual motion trajectory and the target motion trajectory in the calibration image comprises:
and determining the cross-correlation error between the actual motion track and the target motion track in the calibration image to obtain the absolute time delay of the image collector.
4. The method of claim 1, wherein after obtaining the absolute time delay of the image collector, the method further comprises:
determining the relative time delay of other sensing equipment according to the displacement curve of the image collector and the acceleration curve and/or the speed curve of other sensing equipment;
the other sensing devices include at least one of: an inertial measurement unit, a navigational positioning system, and a velocity sensor device.
5. The method of claim 4, wherein determining the relative time delay of the inertial measurement unit according to the displacement curve of the image collector and the acceleration curve of the inertial measurement unit comprises:
determining an acceleration curve of the image collector moving relative to the calibration plate according to a displacement curve of the image collector moving relative to the calibration plate;
and determining the cross-correlation error of the acceleration curve of the image collector moving relative to the calibration plate and the acceleration curve of the inertia measurement unit moving relative to the calibration plate to obtain the relative time delay of the image collector and the inertia measurement unit.
6. The method of claim 4, wherein determining the relative time delay of the navigational positioning system according to the displacement curve of the image acquisition device, the acceleration curve of the inertial measurement unit, and the acceleration curve of the navigational positioning system comprises:
determining the relative time delay of the image collector and the inertial measurement unit according to a displacement curve of the image collector moving relative to the calibration plate and an acceleration curve of the inertial measurement unit moving relative to the calibration plate;
determining a cross-correlation error of an acceleration curve of a navigation positioning system relative to the movement of a road mark point and an acceleration curve of an inertia measurement unit relative to the movement of the road mark point to obtain the relative time delay of the navigation positioning system and the inertia measurement unit;
and determining the relative time delay of the image collector and the navigation positioning system according to the relative time delay of the image collector and the inertial measurement unit and the relative time delay of the navigation positioning system and the inertial measurement unit.
7. The method of claim 4, wherein determining the relative time delay of the navigational positioning system according to the displacement curve of the image acquisition device, the acceleration curve of the inertial measurement unit, the acceleration curve and the velocity curve of the navigational positioning system, and the displacement curve of the velocity sensing device comprises:
determining the relative time delay of the image collector and the navigation positioning system according to a displacement curve of the image collector moving relative to the calibration plate, an acceleration curve of the inertia measuring unit moving relative to a road marking point and an acceleration curve of the navigation positioning system moving relative to the road marking point;
determining a speed curve of the navigation positioning system moving relative to the road mark point and a cross-correlation error of a displacement curve of the speed sensing equipment moving relative to the road mark point to obtain the relative time delay of the navigation positioning system and the speed sensing equipment;
and determining the relative time delay of the image collector and the speed sensing equipment according to the relative time delay of the image collector and the navigation positioning system and the relative time delay of the navigation positioning system and the speed sensing equipment.
8. The method of claim 4, wherein after determining the relative time delays of the other sensing devices according to the displacement curve of the image collector and the acceleration curve and/or the velocity curve of the other sensing devices, the method further comprises:
and obtaining the absolute time delay of other sensing equipment according to the absolute time delay of the image collector and the relative time delay of other sensing equipment.
9. A delay calibration apparatus, comprising:
the calibration image determining module is used for simultaneously acquiring the actual motion track of the calibration plate and the view finding picture of the image collector in the automatic driving moving carrier through the terminal equipment to obtain a calibration image; the view-finding picture is obtained by acquiring the actual motion track in real time by the image acquisition device;
and the image collector absolute time delay determining module is used for processing the actual motion track in the calibration image and the view finding picture to obtain the absolute time delay of the image collector.
10. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the time delay calibration method of any one of claims 1-8.
11. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the time delay calibration method of any one of claims 1-8.
CN201910933197.7A 2019-09-29 2019-09-29 Time delay calibration method and device, electronic equipment and medium Active CN110702139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910933197.7A CN110702139B (en) 2019-09-29 2019-09-29 Time delay calibration method and device, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910933197.7A CN110702139B (en) 2019-09-29 2019-09-29 Time delay calibration method and device, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN110702139A CN110702139A (en) 2020-01-17
CN110702139B true CN110702139B (en) 2021-08-27

Family

ID=69197450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910933197.7A Active CN110702139B (en) 2019-09-29 2019-09-29 Time delay calibration method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN110702139B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111551191B (en) * 2020-04-28 2022-08-09 浙江商汤科技开发有限公司 Sensor external parameter calibration method and device, electronic equipment and storage medium
CN111537995B (en) * 2020-05-19 2022-08-12 北京爱笔科技有限公司 Time delay obtaining method and device and electronic equipment
CN112835341B (en) * 2020-12-31 2022-02-01 北京国家新能源汽车技术创新中心有限公司 Real vehicle test evaluation method of automatic driving area controller
CN112873209B (en) * 2021-02-05 2022-04-15 深圳市普渡科技有限公司 Positioning sensor time delay calibration method and device, computer equipment and storage medium
CN114399555B (en) * 2021-12-20 2022-11-11 禾多科技(北京)有限公司 Data online calibration method and device, electronic equipment and computer readable medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105579811A (en) * 2013-09-27 2016-05-11 高通股份有限公司 Exterior hybrid photo mapping
CN106802716A (en) * 2016-12-30 2017-06-06 维沃移动通信有限公司 The data processing method and virtual reality terminal of a kind of virtual reality terminal
CN108253964A (en) * 2017-12-29 2018-07-06 齐鲁工业大学 A kind of vision based on Time-Delay Filter/inertia combined navigation model building method
CN108629793A (en) * 2018-03-22 2018-10-09 中国科学院自动化研究所 The vision inertia odometry and equipment demarcated using line duration
CN109147059A (en) * 2018-09-06 2019-01-04 联想(北京)有限公司 A kind of determination method and apparatus for the numerical value that is delayed

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7576684B2 (en) * 2007-05-10 2009-08-18 Honeywell International Inc. Integrated attitude altimeter
CA2953335C (en) * 2014-06-14 2021-01-05 Magic Leap, Inc. Methods and systems for creating virtual and augmented reality
CN106873768B (en) * 2016-12-30 2020-05-05 中兴通讯股份有限公司 Augmented reality method, device and system
CN109325456B (en) * 2018-09-29 2020-05-12 佳都新太科技股份有限公司 Target identification method, target identification device, target identification equipment and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Towards large scale high fidelity collaborative augmented reality; Damien Constantine Rompapas, Christian Sandor, Hirokazu Kato; Computers & Graphics; 2019-08-27; full text *
Research on key technologies of predictive-display teleoperation in unknown environments; Hu Huan; China Doctoral Dissertations Full-text Database, Information Science and Technology series; 2018-09-15; full text *

Also Published As

Publication number Publication date
CN110702139A (en) 2020-01-17

Similar Documents

Publication Publication Date Title
CN110702139B (en) Time delay calibration method and device, electronic equipment and medium
CN110246182B (en) Vision-based global map positioning method and device, storage medium and equipment
CN111220154A (en) Vehicle positioning method, device, equipment and medium
CN111649739B (en) Positioning method and device, automatic driving vehicle, electronic equipment and storage medium
CN110806215B (en) Vehicle positioning method, device, equipment and storage medium
CN111959495B (en) Vehicle control method and device and vehicle
CN112415552A (en) Vehicle position determining method and device and electronic equipment
CN111666891B (en) Method and device for estimating movement state of obstacle
EP3842749A2 (en) Positioning method, positioning device and electronic device
CN111578839B (en) Obstacle coordinate processing method and device, electronic equipment and readable storage medium
CN111523471B (en) Method, device, equipment and storage medium for determining lane where vehicle is located
CN111721305B (en) Positioning method and apparatus, autonomous vehicle, electronic device, and storage medium
CN111402609A (en) Special lane driving reminding method, device, equipment and storage medium
CN111784834A (en) Point cloud map generation method and device and electronic equipment
CN112785715A (en) Virtual object display method and electronic device
CN112147632A (en) Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm
CN111652103B (en) Indoor positioning method, device, equipment and storage medium
CN111462179A (en) Three-dimensional object tracking method and device and electronic equipment
CN113610702B (en) Picture construction method and device, electronic equipment and storage medium
CN111783611B (en) Unmanned vehicle positioning method and device, unmanned vehicle and storage medium
CN111612851B (en) Method, apparatus, device and storage medium for calibrating camera
CN112577524A (en) Information correction method and device
CN111949816A (en) Positioning processing method and device, electronic equipment and storage medium
CN110689575B (en) Image collector calibration method, device, equipment and medium
CN113628284A (en) Pose calibration data set generation method, device and system, electronic equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20211021

Address after: 100176 101, floor 1, building 1, yard 7, Ruihe West 2nd Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing

Patentee after: Apollo Zhilian (Beijing) Technology Co.,Ltd.

Address before: 100085 Baidu Building, 10 Shangdi Tenth Street, Haidian District, Beijing

Patentee before: BAIDU ONLINE NETWORK TECHNOLOGY (BEIJING) Co.,Ltd.