CN110146869A - Method, apparatus, electronic device and storage medium for determining coordinate system conversion parameters - Google Patents
Method, apparatus, electronic device and storage medium for determining coordinate system conversion parameters
- Publication number
- CN110146869A (application CN201910423326.8A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- imaging device
- offset
- image
- map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
Embodiments of the disclosure provide a method, an apparatus, an electronic device and a computer-readable storage medium for determining the coordinate system conversion parameters of an imaging device. In the method, a reference value of the coordinate system conversion parameters and a reference image captured by the imaging device are obtained. The reference value is determined while the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image is captured by the imaging device at that reference position and reference orientation. In response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, an offset of a target image captured after the change relative to the reference image is determined. The offset is corrected based on the coordinates in the world coordinate system that correspond to a reference pixel in the reference image. A target value of the coordinate system conversion parameters is then obtained from the reference value and the corrected offset. The calibration of the disclosed embodiments is simple to operate and efficient, and provides an accuracy guarantee for roadside perception in automatic driving and autonomous parking.
Description
Technical field
Embodiments of the disclosure relate generally to the technical fields of imaging devices and automatic driving, and more particularly to a method, an apparatus, an electronic device and a computer-readable storage medium for determining coordinate system conversion parameters.
Background

In recent years, technologies such as automatic driving and autonomous parking have gradually come to prominence. The basis of these technologies is the perception of the environment around a vehicle, that is, identifying the specific conditions of the environment near the vehicle. It has been suggested that, in addition to on-board (also called "vehicle-side") sensor devices (for example, vehicle-mounted lidar, imaging devices, and the like), sensor devices outside the vehicle (also called "roadside" devices, for example imaging devices mounted at the roadside or in a parking lot) can be used to acquire data about the vehicle's environment, so as to better support automatic driving or autonomous parking. Because a vehicle that drives or parks autonomously is usually positioned with reference to a world coordinate system (for example, the Universal Transverse Mercator (UTM) coordinate system), an imaging device outside the vehicle first needs to have its extrinsic parameters calibrated in order to support automatic driving or autonomous parking, that is, the conversion parameters between the world coordinate system and the camera coordinate system of the imaging device need to be determined.

At present, the extrinsic calibration of a vehicle-mounted imaging device is usually achieved by calibrating the relationship between a vehicle-mounted lidar and the imaging device, while an imaging device outside the vehicle can complete extrinsic calibration based on measurements of global positioning system (GPS) signals within GPS coverage. However, when the position or orientation of an imaging device outside the vehicle changes, the extrinsic parameters of the imaging device also change. Therefore, an effective solution is needed to determine the extrinsic parameters of an imaging device online (that is, in real time), for use in technologies such as assisting automatic driving or autonomous parking.
Summary of the invention
Embodiments of the disclosure relate to a technical solution for determining the coordinate system conversion parameters of an imaging device.
In a first aspect of the disclosure, a method of determining the coordinate system conversion parameters of an imaging device is provided. The method includes: obtaining a reference value of the coordinate system conversion parameters and a reference image captured by the imaging device, the reference value being determined while the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured by the imaging device at the reference position and the reference orientation. The method further includes: in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, determining an offset of a target image captured by the imaging device after the change relative to the reference image. The method further includes: correcting the offset based on the coordinates in the world coordinate system that correspond to a reference pixel in the reference image. The method further includes: obtaining a target value of the coordinate system conversion parameters based on the reference value and the corrected offset.
In a second aspect of the disclosure, an apparatus for determining the coordinate system conversion parameters of an imaging device is provided. The apparatus includes: a first obtaining module configured to obtain a reference value of the coordinate system conversion parameters and a reference image captured by the imaging device, the reference value being determined while the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured by the imaging device at the reference position and the reference orientation; an offset determination module configured to, in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, determine an offset of a target image captured by the imaging device after the change relative to the reference image; an offset correction module configured to correct the offset based on the coordinates in the world coordinate system that correspond to a reference pixel in the reference image; and a second obtaining module configured to obtain a target value of the coordinate system conversion parameters based on the reference value and the corrected offset.
In a third aspect of the disclosure, an electronic device is provided. The electronic device includes one or more processors and a storage device for storing one or more programs. The one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of the first aspect.
In a fourth aspect of the disclosure, a computer-readable storage medium is provided, on which a computer program is stored. The computer program, when executed by a processor, implements the method of the first aspect.
It should be appreciated that the content described in this Summary is not intended to limit the key or essential features of the embodiments of the disclosure, nor to limit the scope of the disclosure. Other features of the disclosure will become readily understandable from the following description.
Brief description of the drawings

The above and other objects, features and advantages of the embodiments of the disclosure will become readily understandable by reading the following detailed description with reference to the accompanying drawings. In the drawings, several embodiments of the disclosure are shown by way of example and not limitation, in which:

Fig. 1 shows a schematic diagram of an example environment in which some embodiments of the disclosure can be implemented;

Fig. 2 shows a schematic flowchart of an example method of determining the coordinate system conversion parameters of an imaging device according to an embodiment of the disclosure;

Fig. 3 shows a schematic diagram of a reference image and a target image of an imaging device according to an embodiment of the disclosure;

Fig. 4 shows a schematic flowchart of an example method for determining the coordinates, in the world coordinate system, of a pixel in a target image according to an embodiment of the disclosure;

Fig. 5 shows a schematic block diagram of an apparatus for determining the coordinate system conversion parameters of an imaging device according to an embodiment of the disclosure; and

Fig. 6 shows a schematic block diagram of a device that can be used to implement embodiments of the disclosure.

Throughout the drawings, the same or similar reference numerals are used to denote the same or similar components.
Detailed description

The principles and spirit of the disclosure will be described below with reference to several exemplary embodiments shown in the accompanying drawings. It should be understood that these specific embodiments are described merely to enable those skilled in the art to better understand and implement the disclosure, and not to limit the scope of the disclosure in any way.
As used herein, the term "coordinate system conversion parameters" may refer, for example, to the parameters required to convert between a camera coordinate system, an image coordinate system, a pixel coordinate system and a world coordinate system, such as a translation matrix, a rotation matrix, and the like. In the context of the disclosure, a world coordinate system may refer to a reference coordinate system whose range covers the whole world and which may be used, for example, to assist the automatic driving or autonomous parking of vehicles; examples include the UTM coordinate system, a latitude-longitude coordinate system, and the like. The origin of the camera coordinate system may be located at the optical center of the imaging device, with its vertical axis (z-axis) coinciding with the optical axis of the imaging device and its horizontal axis (x-axis) and longitudinal axis (y-axis) parallel to the imaging plane. The origin of the pixel coordinate system may be at the upper-left corner of the image, with its horizontal and vertical axes along the pixel rows and pixel columns of the image, respectively, and its unit being the pixel. The origin of the image coordinate system may be at the center of the image (that is, the midpoint of the pixel coordinate system), with its horizontal and vertical axes parallel to those of the pixel coordinate system and its unit being, for example, millimeters. It will be understood, however, that in other examples these coordinate systems may also be defined in other reasonable manners accepted in the art.

In embodiments of the disclosure, the "coordinate system conversion parameters" may include or refer to what is called in the field of camera calibration the "extrinsics", "extrinsic parameters", "external parameters", "extrinsic parameter matrix", and so on. In general, "extrinsic parameters" may refer to the conversion parameters between the camera coordinate system associated with a specific imaging device and a world coordinate system (for example, the UTM coordinate system), and "extrinsic calibration" may refer to the determination of the conversion parameters between the camera coordinate system and the world coordinate system. Therefore, in the description of the embodiments of the disclosure, the term "extrinsic parameters" may be used interchangeably with the term "coordinate system conversion parameters" for convenience.
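For concreteness, the standard pinhole relation between these coordinate systems can be summarized as follows; this is a general textbook formulation given here for reference rather than a formula reproduced from the patent, with $R$ and $t$ denoting the extrinsic rotation and translation and $K$ the intrinsic matrix of the imaging device:

$$
s\begin{bmatrix}u\\ v\\ 1\end{bmatrix}
= K\!\left(R\begin{bmatrix}X_w\\ Y_w\\ Z_w\end{bmatrix}+t\right),
\qquad
K=\begin{bmatrix}f_x & 0 & c_x\\ 0 & f_y & c_y\\ 0 & 0 & 1\end{bmatrix},
$$

where $(X_w, Y_w, Z_w)$ are the coordinates of a point in the world coordinate system, $(u, v)$ are its pixel coordinates, and $s$ is a depth-dependent scale. Extrinsic calibration, as discussed below, is the determination of $(R, t)$.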
As noted above, when the position or orientation of an imaging device outside the vehicle changes, the extrinsic parameters of the imaging device also change. In particular, an imaging device outside the vehicle used for automatic driving or autonomous parking may shake in severe weather (for example, strong wind), causing its extrinsics to vary. Where there is no GPS signal or lidar sensor in the region of the imaging device outside the vehicle, it may be difficult to directly obtain the conversion relation from the camera coordinate system of the imaging device to the world coordinate system (for example, the UTM coordinate system). However, only after the extrinsics of the imaging device are obtained can the imaging device be better used to assist the automatic driving or autonomous parking of vehicles, for example to execute algorithms that recover three dimensions (3D) from monocular vision. Therefore, an extrinsic calibration method is needed that obtains the conversion relation between the camera coordinate system of the imaging device and the world coordinate system in real time.

In view of the above problems and other potential problems of traditional solutions, embodiments of the disclosure propose a method, an apparatus, an electronic device and a computer-readable storage medium for determining the coordinate system conversion parameters of an imaging device, so as to realize real-time calibration of the extrinsic parameters used to convert between the camera coordinate system of the imaging device and a world coordinate system (for example, the UTM coordinate system). Embodiments of the disclosure can effectively obtain the extrinsics of the imaging device with respect to the world coordinate system in real time without GPS or field-side lidar sensors, and in particular address the problem of extrinsic variation caused by shaking of the imaging device due to weather changes. The scheme of the embodiments of the disclosure is simple to operate and efficient to calibrate, with a mean pixel error that can be less than or equal to two pixels, and provides an accuracy guarantee for roadside perception in automatic driving and autonomous parking. Several embodiments of the disclosure are described below with reference to the accompanying drawings.
Fig. 1 shows a schematic diagram of an example environment 100 in which some embodiments of the disclosure can be implemented. As shown in Fig. 1, the example environment 100 schematically depicts the scene of a parking lot. Specifically, a plurality of parking spaces are provided in the depicted parking lot, for example the parking space "CW185" indicated by the parking space number 108. In addition, a lane line 101, a guide symbol 104, a parking space line 106, and the like are drawn on the ground of the parking lot. It should be appreciated that the facilities and markings depicted in Fig. 1 are only examples; other parking lots will have different or additional facilities and markings, and embodiments of the disclosure are not limited in this respect. It should be further understood that embodiments of the disclosure are not limited to the parking-lot scene depicted in Fig. 1, but are generally applicable to any scene associated with automatic driving or autonomous parking. More generally, embodiments of the disclosure are also applicable to imaging devices of any purpose and are not limited to imaging devices that assist automatic driving or autonomous parking.

In the example of Fig. 1, a plurality of vehicles 110-1 to 110-5 (collectively referred to hereinafter as vehicles 110) are parked on corresponding parking spaces. A vehicle 110 may be any type of vehicle that can carry people and/or objects and that is moved by a power system such as an engine, including but not limited to a car, a truck, a bus, an electric vehicle, a motorcycle, a recreational vehicle, a train, and the like. One or more vehicles 110 in the example environment 100 may be vehicles with automatic driving or autonomous parking capability; such vehicles are also referred to as autonomous vehicles. Of course, one or some of the vehicles 110 in the example environment 100 may also be vehicles without automatic driving or autonomous parking capability.
An imaging device (also referred to as an image sensor) 105 is also provided in the example environment 100 for capturing images of the example environment. In the context of the disclosure, an imaging device generically refers to any device with an imaging function, including vehicle-mounted imaging devices, imaging devices outside the vehicle, and imaging devices for any other purpose. Such imaging devices include, but are not limited to, cameras, video cameras, camcorders, surveillance cameras, dashboard cameras, mobile devices with photographing or video functions, and the like. In some embodiments, the imaging device 105 may be independent of the vehicles 110 and may monitor the conditions of the example environment 100 to obtain perception information about the example environment 100, in order to assist the automatic driving or autonomous parking of the vehicles 110. To reduce occlusion, the imaging device 105 may be arranged at a higher position in the example environment 100, for example at a higher position on a fixed pole or a wall, so as to better monitor the example environment 100.

Although only one imaging device 105 is shown in Fig. 1, it will be understood that a plurality of imaging devices may be deployed in various regions of the example environment 100. In some embodiments, in addition to the imaging device 105 fixed at a specific position, movable or rotatable imaging devices and the like may also be arranged in the example environment 100. Moreover, although the imaging device 105 in Fig. 1 is depicted as being arranged outside the vehicles 110, it will be understood that embodiments of the disclosure are equally applicable to an imaging device arranged on a vehicle 110, that is, a vehicle-mounted imaging device. As shown, the imaging device 105 is communicatively connected (for example, by wire or wirelessly) to a computing device 120. When extrinsic calibration is performed on the imaging device 105, information on the position and/or orientation of the imaging device 105 and the captured image data can be provided to the computing device 120 for use in determining the coordinate system conversion parameters of the imaging device 105. In addition, the computing device 120 may send various control signals to the imaging device 105 to control its operations, for example to control the imaging device 105 to capture images, to move or to rotate.

It will be understood that the computing device 120 may be any type of mobile terminal, fixed terminal or portable terminal, including a mobile phone, a station, a unit, a device, a multimedia computer, a multimedia tablet, an internet node, a communicator, a desktop computer, a laptop computer, a notebook computer, a netbook computer, a tablet computer, a personal communication system (PCS) device, a personal navigation device, a personal digital assistant (PDA), an audio/video player, a digital camera/camcorder, a positioning device, a television receiver, a radio broadcast receiver, an electronic book device, a gaming device, or any combination thereof, including accessories and peripherals of these devices or any combination thereof. It is also contemplated that the computing device 120 can support any type of user interface (such as "wearable" circuitry). More generally, the computing device 120 may be any server or client device that can be used to determine the coordinate system conversion parameters of an imaging device. An example method of determining the coordinate system conversion parameters of an imaging device according to an embodiment of the disclosure is described below with reference to Fig. 2.
Fig. 2 shows a schematic flowchart of an example method 200 of determining the coordinate system conversion parameters of the imaging device 105 according to an embodiment of the disclosure. In some embodiments, the method 200 can be implemented by the computing device 120 in Fig. 1, for example by a processor or processing unit of the computing device 120. In other embodiments, all or part of the method 200 can also be implemented by a computing device independent of the example environment 100, or by other units in the example environment 100. For ease of discussion, the method 200 will be described in conjunction with Fig. 1.

As mentioned above, the imaging device 105 in the example environment 100 can be used to assist the vehicles 110 in parking autonomously or driving automatically in the parking lot. More generally, in an automatic driving scene on a traffic route, an imaging device outside the vehicle can similarly assist a vehicle in automatic driving. Because the vehicles 110 usually park autonomously or drive automatically with reference to the world coordinate system, the extrinsic parameters of the imaging device 105, namely the coordinate system conversion parameters between the camera coordinate system of the imaging device 105 and the world coordinate system, may need to be calibrated in order to assist the autonomous parking or automatic driving of the vehicles 110. With the determined coordinate system conversion parameters, the computing device 120 can use the image information captured by the imaging device 105 to assist the vehicles 110 in automatic driving or autonomous parking.
Therefore, at 210, the computing device 120 obtains a reference value of the coordinate system conversion parameters of the imaging device 105 and a reference image captured by the imaging device 105. The reference value is determined while the imaging device 105 is at a reference position and a reference orientation in the world coordinate system. For example, the reference position and reference orientation may be the position and orientation at which the imaging device 105 was initially mounted. In other embodiments, the reference position and reference orientation may be any position and orientation at which the imaging device 105 performs image capture. Correspondingly, the reference image is captured while the imaging device 105 is at the reference position and the reference orientation. The reference image of the imaging device 105 is therefore associated with the reference value of the coordinate system conversion parameters. In other words, when the imaging device 105 is set up or installed at the reference position and reference orientation in the example environment 100, the computing device 120 can calibrate the extrinsic parameters of the imaging device 105 to determine the reference value of the extrinsics, and can control the imaging device 105 to capture an image as the reference image associated with that reference value.

The computing device 120 can determine the reference value of the extrinsic parameters of the imaging device 105 in various ways. For example, where a GPS signal or a lidar sensor is available in the example environment 100, the computing device 120 can complete the calibration of the extrinsic parameters of the imaging device 105, that is, determine the reference value of the coordinate system conversion parameters, by detecting the GPS signal at the position of the imaging device 105 or by radar detection using the lidar sensor. However, this approach is limited by the availability of GPS signals and lidar sensors, so it lacks universality and is not suitable for scenes without GPS signals and lidar sensors.
To this end, embodiments of the disclosure also propose possible ways of obtaining the reference value of the extrinsics in the case where there is neither a GPS signal nor a lidar sensor at the position of the imaging device 105. For example, because the vehicles 110 need to drive automatically or park autonomously, a high-precision map may have been created in advance for the imaging region of the imaging device 105, and the high-precision map records the coordinates, in the world coordinate system, of various objects in the imaging region. In this case, the computing device 120 can use the high-precision map to calculate the reference value of the extrinsic parameters of the imaging device 105. More specifically, a measuring tool can first be used to roughly measure the orientation of the imaging device 105 in the world coordinate system to obtain initial extrinsic parameters of the imaging device 105; the computing device 120 can then use the three-dimensional coordinate information in the high-precision map to optimize the initial extrinsic parameters and obtain the final extrinsic parameters.

Additionally or alternatively, a point cloud map (for example, a laser point cloud map), a reflectance map or another map containing the coordinates of objects in the world coordinate system may have been created in advance for the imaging region of the imaging device 105. In this case, the computing device 120 can similarly use the point cloud map or reflectance map of the imaging region of the imaging device 105 to calculate the reference value of the extrinsic parameters of the imaging device 105. More specifically, a measuring tool can first be used to roughly measure the orientation of the imaging device 105 in the world coordinate system to obtain initial extrinsic parameters; the computing device 120 can then use the three-dimensional coordinate information in the point cloud map or reflectance map to optimize the initial extrinsic parameters and obtain the final extrinsic parameters. In this way, the reference value of the extrinsic parameters of the imaging device 105 can be determined without depending on GPS positioning signals and/or lidar sensors, which improves the flexibility and universality of extrinsic calibration.
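One way this map-based calibration step could be realized is sketched below, assuming OpenCV and NumPy are available and that correspondences between 3D map points and their pixel observations in the reference image have already been established; the names map_points_3d, image_points_2d and K are illustrative only and do not come from the patent:

```python
import cv2
import numpy as np

def estimate_reference_extrinsics(map_points_3d, image_points_2d, K, dist_coeffs=None):
    """Estimate the reference extrinsics (world -> camera) of the imaging device
    from 3D map points (e.g. HD-map or point-cloud points in world/UTM coordinates)
    and the pixels at which they are observed in the reference image.  This is an
    illustrative stand-in for the map-based optimization described above."""
    obj = np.asarray(map_points_3d, dtype=np.float64).reshape(-1, 3)
    img = np.asarray(image_points_2d, dtype=np.float64).reshape(-1, 2)
    dist = np.zeros(5) if dist_coeffs is None else np.asarray(dist_coeffs, dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj, img, K, dist, flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        raise RuntimeError("PnP failed to converge")
    R_ref, _ = cv2.Rodrigues(rvec)           # 3x3 rotation matrix, world -> camera
    return R_ref, tvec.reshape(3)            # reference value of the extrinsics
```

A rough initial estimate obtained with a measuring tool, as described above, can be supplied to such a solver as an initial guess (for example via the useExtrinsicGuess argument of solvePnP) before refinement against the map.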
At 220, the computing device 120 determines whether the position and/or orientation of the imaging device 105 in the world coordinate system has changed. In general, the camera coordinate system of the imaging device 105 is established with the optical center of the imaging device 105 as its origin and the optical axis of the imaging device 105 as its vertical axis (z-axis). Therefore, the camera coordinate system of the imaging device 105 changes as the position and/or orientation of the imaging device 105 change. That is, when the position and/or orientation of the imaging device 105 in the world coordinate system change, the position and/or orientation of the camera coordinate system relative to the world coordinate system also change correspondingly. This means that the extrinsic parameters of the imaging device 105 also change and therefore need to be determined anew.

The position and/or orientation of the imaging device 105 in the world coordinate system may change for various reasons. In some example scenes, the imaging device 105 is firmly fixed and its position and orientation are intended to remain constant, but they may nevertheless change due to factors such as severe weather (for example, wind causing the imaging device to shake) or maintenance of the mounting infrastructure (for example, a mounting pole or wall). In other example scenes, the imaging device 105 may be configured to move or rotate in the world coordinate system so as to capture images of the example environment 100 from different positions or angles.

As described above, a change in the position and/or orientation of the imaging device 105 in the world coordinate system leads to an offset of its camera coordinate system relative to the world coordinate system. In this case, the imaging device 105 needs to be re-calibrated, that is, the value of the extrinsic parameters needs to be determined again. In other words, if the computing device 120 detects that the position and/or orientation of the imaging device 105 in the world coordinate system have changed, namely that the imaging device 105 is no longer at the reference position and reference orientation associated with the reference value of the extrinsics, then the current value of the coordinate system conversion parameters of the imaging device 105 is no longer the reference value. The computing device 120 can therefore redetermine the value of the coordinate system conversion parameters of the imaging device 105; in the context of the disclosure, this value may also be referred to as the current value or the target value.
At 230, after determining that the position and/or orientation of the imaging device 105 in the world coordinate system have changed, the computing device 120 determines an offset of a target image, captured by the imaging device 105 after the change, relative to the reference image. It will be appreciated that, because the position and/or orientation of the imaging device 105 in the world coordinate system have changed, the target image currently captured by the imaging device 105 is captured at a position different from the reference position and/or an orientation different from the reference orientation. This difference in position and/or orientation causes an offset between the target image and the reference image, and this offset results from the offset of the current camera coordinate system relative to the reference camera coordinate system associated with the reference position and reference orientation. In other words, by determining the offset between the target image and the reference image, the computing device 120 can indirectly determine the change of the current position and/or orientation of the imaging device 105 relative to the reference position and/or reference orientation associated with the reference value of the extrinsics, that is, the offset of the current camera coordinate system relative to the reference camera coordinate system. This is described in further detail below in conjunction with Fig. 3.
Fig. 3 shows a schematic diagram of a reference image 310 and a target image 320 of the imaging device 105 according to an embodiment of the disclosure. As shown in Fig. 3, the reference image 310 captured by the imaging device 105 at the reference position and reference orientation schematically includes an image 312 of a pillar (not shown in Fig. 1) in the example environment 100, an image 317 of the parking space number 108, and an image 319 of the parking space line 106. Correspondingly, the target image 320 captured by the imaging device 105 at its current position and current orientation schematically includes an image 322 of the pillar (not shown in Fig. 1), an image 327 of the parking space number 108, and an image 329 of the parking space line 106.

As shown in Fig. 3, because the target image 320 is captured while the imaging device 105 is at its current position and orientation, and the current position and/or orientation differ from the reference position and/or reference orientation at which the imaging device 105 captured the reference image 310, there is an offset between the target image 320 and the reference image 310. This offset manifests itself as an offset between the images of the same objects in the target image 320 and in the reference image 310; for example, there are offsets between the pillar image 322, parking space number image 327 and parking space line image 329 and the pillar image 312, parking space number image 317 and parking space line image 319, respectively.

Physically, the offset can be characterized by a rotation amount and a translation amount, where the rotation amount indicates the rotation of the target image 320 relative to the reference image 310 and the translation amount indicates the translation of the target image 320 relative to the reference image 310. It is noted that, because the undesired range of motion of the imaging device 105 is usually small, it is assumed here that the reference image 310 and the target image 320 still capture images of essentially the same objects. It will be understood, however, that embodiments of the disclosure remain feasible in scenes where the range of motion of the imaging device 105 is larger, as long as the reference image 310 and the target image 320 still share captured objects or regions.
The computing device 120 can determine the offset of the target image 320 relative to the reference image 310 in various ways. For example, the computing device 120 can simply measure, with a measuring tool, the offset of the image of an object in the target image 320 relative to its image in the reference image 310. In other embodiments, in order to obtain the offset of the target image 320 relative to the reference image 310, the computing device 120 can determine, in the reference image 310 and the target image 320 respectively, a first feature point 315 and a second feature point 325 that match each other. In the example of Fig. 3, the first feature point 315 may be a corner point of the pillar image 312, and the second feature point 325 may be the corresponding corner point of the pillar image 322. According to the geometric relation between the image points of the same physical point in different images, an epipolar constraint holds between the matched first feature point 315 and second feature point 325. In some examples, the above feature points can be found and matched by existing image feature matching algorithms, such as the ORB algorithm or the SIFT algorithm.

The computing device 120 can then use the epipolar constraint between the first feature point 315 and the second feature point 325 to calculate the rotation amount and translation amount included in the offset of the target image 320 relative to the reference image 310. For example, the computing device 120 can obtain the rotation amount and translation amount by solving, from the epipolar constraint, the conversion relation between the pixel coordinates of the first feature point 315 and the pixel coordinates of the second feature point 325. Compared with measuring or comparing the reference image 310 and the target image 320 with a measuring tool, this approach using matched feature points can improve the accuracy of the determined offset. Furthermore, although the determination of the offset through feature matching is described here by taking one feature point as an example, in other embodiments two matching sets of feature points can be determined in the reference image 310 and the target image 320 respectively and used to calculate the offset, so as to further improve the accuracy of the calculated offset.
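A minimal sketch of this feature-matching step is given below, assuming OpenCV and NumPy; ORB matching followed by an essential-matrix decomposition is one common way to realize the epipolar-constraint computation described above, and the function and variable names are illustrative rather than taken from the patent:

```python
import cv2
import numpy as np

def relative_offset(ref_img, tgt_img, K):
    """Estimate the rotation and up-to-scale translation of the current camera
    pose relative to the reference camera pose from feature points matched
    between the reference image and the target image."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(ref_img, None)
    kp2, des2 = orb.detectAndCompute(tgt_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]
    pts_ref = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts_tgt = np.float32([kp2[m.trainIdx].pt for m in matches])
    # The essential matrix encodes the epipolar constraint between the two views.
    E, mask = cv2.findEssentialMat(pts_ref, pts_tgt, K,
                                   method=cv2.RANSAC, prob=0.999, threshold=1.0)
    # R_rel, t_unit map points from the reference camera frame to the current
    # camera frame; |t_unit| = 1, so the translation scale is still unknown.
    _, R_rel, t_unit, _ = cv2.recoverPose(E, pts_ref, pts_tgt, K, mask=mask)
    return R_rel, t_unit
```

Using many matched points with RANSAC, rather than a single pair of feature points, corresponds to the multi-point variant mentioned above and makes the estimated offset more robust to mismatches.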
From the imaging geometry of the imaging device 105 and the coordinate conversion relations, it can be seen that the rotation amount in the offset of the target image 320 relative to the reference image 310, as determined by the computing device 120, is the rotation matrix of the current camera coordinate system of the imaging device 105 relative to the reference camera coordinate system. However, the translation amount included in the offset is not exactly equal to the translation vector of the current camera coordinate system relative to the reference camera coordinate system; the two differ by a scale factor. For example, the translation amount obtained from the epipolar constraint between the first feature point 315 and the second feature point 325 still satisfies the epipolar constraint after being multiplied by an arbitrary scale factor. This is because the above offset is determined from the two-dimensional reference image 310 and target image 320, whereas the conversion from the current camera coordinate system to the reference camera coordinate system is a three-dimensional conversion relation, so the offset determined from two two-dimensional images lacks the information of the third dimension. Therefore, in order to finally determine the current value of the coordinate system conversion parameters of the imaging device 105, the computing device 120 needs to correct the offset determined from the reference image 310 and the target image 320.
Referring back to Fig. 2, at 240, the computing device 120 corrects the offset based on the coordinates in the world coordinate system that correspond to the reference pixel 315 in the reference image 310. As described above, the offset determined from the reference image 310 and the target image 320 needs to be corrected because it lacks the information of one dimension, and the computing device 120 can use the coordinates in the world coordinate system of any pixel in the target image 320 to correct the offset, for example to determine the scale factor of the translation amount included in the offset. In some embodiments, the coordinates in the world coordinate system of a pixel in the target image 320 can be determined from the coordinates in the world coordinate system of the corresponding pixel in the reference image 310, because their coordinates in the world coordinate system are the same. It will be understood that the computing device 120 can use any appropriate method to determine the coordinates in the world coordinate system of the corresponding pixel in the reference image 310; for example, the coordinates of the corresponding pixel can be obtained with the aid of a map that carries the coordinates of the world coordinate system. This is described in detail below in conjunction with Fig. 4.
Fig. 4 shows a schematic flowchart of an example method 400 for determining the coordinates, in the world coordinate system, of a pixel in the target image 320 according to an embodiment of the disclosure. In some embodiments, the method 400 can be implemented by the computing device 120 in Fig. 1, for example by a processor or processing unit of the computing device 120. In other embodiments, all or part of the method 400 can also be implemented by a computing device independent of the example environment 100, or by other units in the example environment 100. Through the example method 400, the computing device 120 can effectively determine the coordinates in the world coordinate system of a pixel in the target image 320, so as to correct the offset of the target image 320 relative to the reference image 310.

At 410, the computing device 120 can obtain a map of the imaging region of the imaging device 105, the map carrying the coordinates of the world coordinate system. As an example, the map can be a point cloud map built from points collected by a lidar on an autonomous vehicle. The point cloud map records the coordinates in the world coordinate system, and the corresponding reflection intensities, of the reflection points formed by objects in the imaging region reflecting the detection laser. The point cloud map therefore carries the coordinates of the world coordinate system and can be used to determine the coordinates in the world coordinate system of a pixel in the target image 320. Alternatively or additionally, similar maps carrying the coordinates of the world coordinate system include, but are not limited to, high-precision maps, reflectance maps, and the like.

At 420, the computing device 120 can use the known reference value of the coordinate system conversion parameters of the imaging device 105 to project the map, which carries coordinates in the world coordinate system, onto the pixel coordinate system of the reference image 310 to obtain a projected image. It will be understood that, besides the extrinsic parameters of the imaging device 105, this projection may also involve other parameters such as the intrinsic parameters of the imaging device 105. In the context of the disclosure, since the focus is on the determination of the extrinsic parameters of the imaging device 105, the intrinsic parameters or other parameters of the imaging device 105 can be assumed to be known. In this case, in the above projection process, the computing device 120 can obtain, through the extrinsic and intrinsic parameters of the imaging device 105, the one-to-one mapping or correspondence between the pixel coordinates in the projected image and the three-dimensional coordinates in the world coordinate system.

At 430, the computing device 120 can determine, in the projected image obtained by the projection, a projected pixel corresponding to the reference pixel 315. For example, the computing device 120 can compare the projected image with the reference image to determine the projected pixel corresponding to the reference pixel 315. As another example, the computing device 120 can determine the corresponding projected pixel by a pixel matching algorithm (for example, the ORB algorithm or the SIFT algorithm). The corresponding projected pixel can be considered to be the pixel formed in the projected image by the same one or more object points that correspond to the reference pixel 315, so they correspond to the same coordinates in the world coordinate system. In view of this, at 440, the computing device 120 can determine the coordinates in the world coordinate system of the projected pixel as the coordinates of the reference pixel 315.
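The sketch below illustrates steps 410 to 440 under simplifying assumptions (OpenCV and NumPy; the map is reduced to an array of 3D points in world coordinates, and a nearest-projection lookup stands in for the image comparison or pixel-matching step); the function and variable names are illustrative only:

```python
import cv2
import numpy as np

def world_coordinate_of_reference_pixel(map_points_3d, R_ref, t_ref, K, ref_pixel):
    """Project map points (world coordinates) into the reference image using the
    known reference extrinsics, then return the world coordinate whose projection
    lands closest to the given reference pixel."""
    pts = np.asarray(map_points_3d, dtype=np.float64).reshape(-1, 3)
    rvec, _ = cv2.Rodrigues(np.asarray(R_ref, dtype=np.float64))
    tvec = np.asarray(t_ref, dtype=np.float64).reshape(3, 1)
    projected, _ = cv2.projectPoints(pts, rvec, tvec, K, np.zeros(5))   # step 420
    projected = projected.reshape(-1, 2)
    dist = np.linalg.norm(projected - np.asarray(ref_pixel, dtype=np.float64), axis=1)
    idx = int(np.argmin(dist))              # stand-in for the matching of step 430
    return pts[idx]                         # step 440: world coordinate of the pixel
```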
After obtaining the coordinates in the world coordinate system that correspond to the reference pixel 315 in the reference image 310, the computing device 120 can use them to correct the offset between the target image 320 and the reference image 310. For example, as mentioned above, the computing device 120 can determine, in the target image 320, a target pixel 325 corresponding to the reference pixel 315, and the coordinates of the target pixel 325 in the world coordinate system can be considered identical to those of the reference pixel 315. The computing device 120 can then determine, based on the pixel coordinates of the target pixel 325 in the target image 320 and its determined coordinates in the world coordinate system, the scale factor to be applied to the translation amount in the offset, according to the coordinate conversion relation between pixel coordinates and coordinates in the world coordinate system. That is, the current value or target value of the translation vector in the extrinsic parameters of the imaging device 105 is finally determined. In this way, the computing device 120 can more accurately determine the translation vector in the extrinsic parameters of the imaging device 105, thereby improving the accuracy of the online calibration of the imaging device 105.
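One possible realization of this scale recovery is sketched below; it assumes the up-to-scale relative pose (R_rel, t_unit) from the essential-matrix step, the reference extrinsics (R_ref, t_ref), and the convention that extrinsics map world coordinates to camera coordinates, and it solves in closed form for the scale that places the known world point on the viewing ray of the observed target pixel. The formulation is an illustration, not a procedure quoted from the patent:

```python
import numpy as np

def scaled_relative_translation(X_w, target_pixel, K, R_ref, t_ref, R_rel, t_unit):
    """Recover the unknown scale of the relative translation so that the world
    point X_w (shared by the reference pixel and the target pixel) projects onto
    the observed target pixel of the target image."""
    u, v = target_pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])       # viewing ray of the target pixel
    a = R_rel @ (R_ref @ np.asarray(X_w, dtype=np.float64) + np.ravel(t_ref))
    t_unit = np.ravel(t_unit)
    # The point in the current camera frame is a + s * t_unit and must lie on the
    # ray: (a + s * t_unit) x ray = 0.  Solve for s in the least-squares sense.
    num = np.cross(a, ray)
    den = np.cross(t_unit, ray)
    s = -float(num @ den) / float(den @ den)
    return s * t_unit                                     # corrected translation amount
```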
Referring back to Fig. 2, at 250, the computing device 120 obtains the target value of the coordinate system conversion parameters based on the reference value of the coordinate system conversion parameters and the corrected offset. As noted above, the rotation amount in the offset between the target image 320 and the reference image 310 indicates the rotation between the two images, and the translation amount indicates the translation between the two images. Therefore, the corrected offset characterizes the conversion relation between the current position and orientation of the imaging device 105 and the reference position and orientation, that is, the conversion relation between the current camera coordinate system and the reference camera coordinate system of the imaging device 105. Further, the conversion relation between the reference camera coordinate system and the world coordinate system is known (namely, the reference value of the extrinsics), so the current value or target value of the extrinsics can be derived from the corrected offset and the reference value.

For example, the computing device 120 can determine a transformation matrix representing the corrected offset, for example by forming the transformation matrix from the rotation amount and translation amount in the corrected offset. The transformation matrix characterizes the transformation of the current position and current orientation of the imaging device 105 relative to the reference position and reference orientation, that is, the conversion relation between the current camera coordinate system and the reference camera coordinate system. The computing device 120 can then apply the transformation matrix to the reference value of the coordinate system conversion parameters to obtain the target value of the coordinate system conversion parameters. For example, the transformation matrix and the reference value of the coordinate system conversion parameters (which can also be characterized by a matrix) can be directly multiplied to obtain the target value. In this way, the computing device 120 can simply obtain the current target value from the reference value of the extrinsic parameters of the imaging device 105 by means of matrix operations.
Fig. 5 shows a schematic block diagram of an apparatus 500 for determining the coordinate system conversion parameters of an imaging device according to an embodiment of the disclosure. In some embodiments, the apparatus 500 can be included in the computing device 120 of Fig. 1 or implemented as the computing device 120.

As shown in Fig. 5, the apparatus 500 includes a first obtaining module 510, an offset determination module 520, an offset correction module 530 and a second obtaining module 540. The first obtaining module 510 is configured to obtain a reference value of the coordinate system conversion parameters and a reference image captured by the imaging device. The reference value is determined while the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image is captured by the imaging device at the reference position and the reference orientation.

The offset determination module 520 is configured to, in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, determine an offset of a target image captured by the imaging device after the change relative to the reference image. The offset correction module 530 is configured to correct the offset based on the coordinates in the world coordinate system that correspond to a reference pixel in the reference image. The second obtaining module 540 is configured to obtain a target value of the coordinate system conversion parameters based on the reference value and the corrected offset.

In some embodiments, the first obtaining module 510 may include a reference value calculation module configured to calculate the reference value based on at least one of a high-precision map, a point cloud map and a reflectance map of the imaging region of the imaging device.

In some embodiments, the offset determination module 520 may include a feature point determination module and a rotation amount and translation amount calculation module. The feature point determination module is configured to determine, in the reference image and the target image respectively, a first feature point and a second feature point that match each other, an epipolar constraint holding between the first feature point and the second feature point. The rotation amount and translation amount calculation module is configured to use the epipolar constraint to calculate the rotation amount and translation amount of the offset.

In some embodiments, the apparatus 500 may further include a map obtaining module, a projection module, a projected pixel determination module and a coordinate determination module. The map obtaining module is configured to obtain a map of the imaging region of the imaging device, the map carrying the coordinates of the world coordinate system. The projection module is configured to use the reference value to project the map onto the pixel coordinate system of the reference image, to obtain a projected image. The projected pixel determination module is configured to determine, in the projected image, a projected pixel corresponding to the reference pixel. The coordinate determination module is configured to determine the coordinates in the world coordinate system of the projected pixel as the coordinates of the reference pixel.

In some embodiments, the offset correction module 530 may include a target pixel determination module and a scale factor determination module. The target pixel determination module is configured to determine, in the target image, a target pixel corresponding to the reference pixel. The scale factor determination module is configured to determine, based on the conversion relation between the pixel coordinates of the target pixel and its coordinates in the world coordinate system, the scale factor to be applied to the translation amount in the offset.

In some embodiments, the second obtaining module 540 may include a transformation matrix determination module and a transformation matrix application module. The transformation matrix determination module is configured to determine a transformation matrix representing the offset. The transformation matrix application module is configured to apply the transformation matrix to the reference value to obtain the target value.
Fig. 6 schematically shows a block diagram of a device 600 that can be used to implement embodiments of the disclosure. As shown in Fig. 6, the device 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to computer program instructions stored in a read-only memory (ROM) 602 or loaded from a storage unit 608 into a random access memory (RAM) 603. The RAM 603 can also store various programs and data required for the operation of the device 600. The CPU 601, the ROM 602 and the RAM 603 are connected to one another through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.

A plurality of components in the device 600 are connected to the I/O interface 605, including: an input unit 606, such as a keyboard, a mouse and the like; an output unit 607, such as various types of displays, loudspeakers and the like; a storage unit 608, such as a magnetic disk, an optical disk and the like; and a communication unit 609, such as a network card, a modem, a wireless communication transceiver and the like. The communication unit 609 allows the device 600 to exchange information/data with other devices through computer networks such as the internet and/or various telecommunication networks.

The processes and processing described above, such as the example methods 200 and 400, can be executed by the processing unit 601. For example, in some embodiments, the example methods 200 and 400 can be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program can be loaded into and/or installed on the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the CPU 601, one or more steps of the example methods 200 and 400 described above can be executed.
As it is used herein, term " includes " and its similar term should be understood as that opening includes, i.e., " including but not
It is limited to ".Term "based" should be understood as " being based at least partially on ".Term " one embodiment " or " embodiment " should manage
Solution is " at least one embodiment ".Term " first ", " second " etc. may refer to different or identical object.May be used also herein
It can include other specific and implicit definition.
As it is used herein, term " determination " covers various movements.For example, " determination " may include operation,
It calculates, processing, export, investigation, searches (for example, searching in table, database or another data structure), finds out.In addition,
" determination " may include receiving (for example, receiving information), access (for example, data in access memory) etc..In addition, " determination "
It may include parsing, selection, selection, foundation etc..
It should be noted that embodiment of the disclosure can be realized by the combination of hardware, software or software and hardware.Firmly
Part part can use special logic to realize;Software section can store in memory, by instruction execution system appropriate,
Such as microprocessor or special designs hardware execute.It will be appreciated by those skilled in the art that above-mentioned device and method can
It is realized with using computer executable instructions and/or being included in the processor control code, such as in programmable memory
Or such code is provided in the data medium of such as optics or electrical signal carrier.
Furthermore, although the operations of the disclosed methods are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in that particular order, or that all of the illustrated operations must be performed, to achieve the desired results. On the contrary, the steps depicted in the flowcharts may be performed in a different order. Additionally or alternatively, certain steps may be omitted, multiple steps may be combined into a single step, and/or a single step may be decomposed into multiple steps. It should also be noted that the features and functions of two or more devices according to the present disclosure may be embodied in a single device. Conversely, the features and functions of a single device described above may be further divided so as to be embodied by multiple devices.
Although the present disclosure has been described with reference to several specific embodiments, it should be understood that the present disclosure is not limited to the specific embodiments disclosed. The present disclosure is intended to cover various modifications and equivalent arrangements included within the spirit and scope of the appended claims.
Claims (14)
1. A method of determining a coordinate system conversion parameter of an imaging device, comprising:
obtaining a reference value of the coordinate system conversion parameter and a reference image captured by the imaging device, the reference value being determined while the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured while the imaging device is at the reference position and the reference orientation;
in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, determining an offset, relative to the reference image, of a target image captured by the imaging device after the change occurs;
correcting the offset based on coordinates in the world coordinate system corresponding to reference pixel points in the reference image; and
obtaining a target value of the coordinate system conversion parameter based on the reference value and the corrected offset.
2. The method according to claim 1, wherein obtaining the reference value comprises:
calculating the reference value based on at least one of a high-precision map, a point cloud map, and a reflectance map of an imaging region of the imaging device.
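For illustration only, a minimal Python sketch of one way such a reference value could be computed from map data, assuming 3D map points (e.g., from a point cloud or high-precision map) and their matched pixel points in the reference image are available, and assuming a known intrinsic matrix K; the use of a PnP solver here is an assumption, not a procedure recited in the claims:

```python
import cv2
import numpy as np

def reference_value_from_map(map_pts_3d, ref_px, K):
    """Estimate the reference extrinsics (world -> camera) from 3D map
    points and their corresponding pixel points in the reference image.

    map_pts_3d: Nx3 world coordinates taken from the map.
    ref_px    : Nx2 matching pixel coordinates in the reference image.
    K         : 3x3 camera intrinsic matrix (no lens distortion assumed).
    """
    ok, rvec, tvec = cv2.solvePnP(map_pts_3d.astype(np.float64),
                                  ref_px.astype(np.float64),
                                  K, None)          # None = no distortion
    R, _ = cv2.Rodrigues(rvec)                      # rotation vector -> matrix
    return R, tvec.reshape(3)
```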
3. The method according to claim 1, wherein determining the offset comprises:
determining first feature points in the reference image and second feature points in the target image that match each other, an epipolar constraint relationship holding between the first feature points and the second feature points; and
calculating a rotation amount and a translation amount of the offset by using the epipolar constraint relationship.
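A minimal sketch, assuming matched pixel points and a known intrinsic matrix K, of how the rotation amount and translation amount in claim 3 could be recovered from the epipolar constraint using OpenCV's essential-matrix routines; the translation obtained this way is only determined up to scale, which is why a scale correction such as the one in claim 5 is needed:

```python
import cv2
import numpy as np

def estimate_offset(pts_ref, pts_tgt, K):
    """Recover the relative rotation R and unit-scale translation t between
    the reference and target images from matched pixel points.

    pts_ref, pts_tgt: Nx2 float arrays of matched pixel coordinates.
    K               : 3x3 camera intrinsic matrix.
    """
    # The essential matrix encodes the epipolar constraint x_tgt^T E x_ref = 0.
    E, inliers = cv2.findEssentialMat(pts_ref, pts_tgt, K,
                                      cv2.RANSAC, 0.999, 1.0)
    # Decompose E into rotation and translation (translation up to scale).
    _, R, t, _ = cv2.recoverPose(E, pts_ref, pts_tgt, K, inliers)
    return R, t
```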
4. The method according to claim 1, further comprising:
obtaining a map of the imaging region of the imaging device, the map carrying coordinates of the world coordinate system;
projecting the map, by using the reference value, into the pixel coordinate system in which the reference image lies, to obtain a projected image;
determining, in the projected image, projected pixel points corresponding to the reference pixel points; and
determining coordinates of the projected pixel points in the world coordinate system as the coordinates of the reference pixel points.
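A minimal sketch of the projection step in claim 4, assuming the reference value is available as a world-to-camera rotation R and translation t and that K is the intrinsic matrix; representing the map simply as an array of 3D world points is an assumption made for illustration:

```python
import numpy as np

def project_map_points(world_pts, R, t, K):
    """Project Nx3 world-coordinate map points into the pixel coordinate
    system of the reference image using the reference extrinsics (R, t)."""
    cam_pts = R @ world_pts.T + t.reshape(3, 1)   # world -> camera frame
    pix = K @ cam_pts                             # camera -> image plane
    pix = pix[:2] / pix[2]                        # perspective division
    return pix.T                                  # Nx2 pixel coordinates
```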
5. The method according to claim 1, wherein correcting the offset comprises:
determining, in the target image, target pixel points corresponding to the reference pixel points; and
determining, based on a conversion relationship between pixel coordinates of the target pixel points and their coordinates in the world coordinate system, a scale factor to be applied to the translation amount in the offset.
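A minimal sketch of one possible way to obtain the scale factor of claim 5, assuming the unit-scale relative pose from the epipolar estimate, the reference extrinsics, and target pixel points whose world coordinates are known; this per-point closed-form solution is an illustrative formulation, not the exact procedure of the disclosure:

```python
import numpy as np

def estimate_translation_scale(world_pts, target_px, R_ref, t_ref,
                               R_rel, t_unit, K):
    """Solve for the scale s of the unit-norm relative translation so that
    the known world points reproject onto the observed target pixel points.

    Composed target pose: X_cam = R_rel @ (R_ref @ X + t_ref) + s * t_unit
    """
    scales = []
    for X, (u, v) in zip(world_pts, target_px):
        a = K @ (R_rel @ (R_ref @ X + t_ref))   # scale-independent term
        c = K @ t_unit                          # term multiplied by s
        # From u = (a_x + s*c_x) / (a_z + s*c_z), s is linear per point.
        s_u = (a[0] - u * a[2]) / (u * c[2] - c[0])
        s_v = (a[1] - v * a[2]) / (v * c[2] - c[1])
        scales.extend([s_u, s_v])
    return float(np.median(scales))             # robust aggregate over points
```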
6. The method according to claim 1, wherein obtaining the target value comprises:
determining a transformation matrix representing the offset; and
applying the transformation matrix to the reference value to obtain the target value.
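A minimal sketch of claim 6, assuming both the reference value and the offset are represented as 4x4 homogeneous transforms, with the offset carrying the scale-corrected translation; this representation is an assumption for illustration:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply_offset(T_ref, R_rel, t_rel_scaled):
    """Apply the transformation matrix representing the corrected offset to
    the reference value to obtain the target value of the parameter."""
    T_offset = to_homogeneous(R_rel, t_rel_scaled)
    return T_offset @ T_ref            # target extrinsics (world -> camera)
```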
7. An apparatus for determining a coordinate system conversion parameter of an imaging device, comprising:
a first obtaining module configured to obtain a reference value of the coordinate system conversion parameter and a reference image captured by the imaging device, the reference value being determined while the imaging device is at a reference position and a reference orientation in a world coordinate system, and the reference image being captured while the imaging device is at the reference position and the reference orientation;
an offset determining module configured to, in response to a change in at least one of the position and the orientation of the imaging device in the world coordinate system, determine an offset, relative to the reference image, of a target image captured by the imaging device after the change occurs;
an offset correcting module configured to correct the offset based on coordinates in the world coordinate system corresponding to reference pixel points in the reference image; and
a second obtaining module configured to obtain a target value of the coordinate system conversion parameter based on the reference value and the corrected offset.
8. The apparatus according to claim 7, wherein the first obtaining module comprises:
a reference value calculating module configured to calculate the reference value based on at least one of a high-precision map, a point cloud map, and a reflectance map of an imaging region of the imaging device.
9. The apparatus according to claim 7, wherein the offset determining module comprises:
a feature point determining module configured to determine first feature points in the reference image and second feature points in the target image that match each other, an epipolar constraint relationship holding between the first feature points and the second feature points; and
a rotation and translation calculating module configured to calculate a rotation amount and a translation amount of the offset by using the epipolar constraint relationship.
10. The apparatus according to claim 7, further comprising:
a map obtaining module configured to obtain a map of the imaging region of the imaging device, the map carrying coordinates of the world coordinate system;
a projecting module configured to project the map, by using the reference value, into the pixel coordinate system in which the reference image lies, to obtain a projected image;
a projected pixel point determining module configured to determine, in the projected image, projected pixel points corresponding to the reference pixel points; and
a coordinate determining module configured to determine coordinates of the projected pixel points in the world coordinate system as the coordinates of the reference pixel points.
11. The apparatus according to claim 7, wherein the offset correcting module comprises:
a target pixel point determining module configured to determine, in the target image, target pixel points corresponding to the reference pixel points; and
a scale factor determining module configured to determine, based on a conversion relationship between pixel coordinates of the target pixel points and their coordinates in the world coordinate system, a scale factor to be applied to the translation amount in the offset.
12. The apparatus according to claim 7, wherein the second obtaining module comprises:
a transformation matrix determining module configured to determine a transformation matrix representing the offset; and
a transformation matrix applying module configured to apply the transformation matrix to the reference value to obtain the target value.
13. An electronic device, comprising:
one or more processors; and
a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1-6.
14. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910423326.8A CN110146869B (en) | 2019-05-21 | 2019-05-21 | Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910423326.8A CN110146869B (en) | 2019-05-21 | 2019-05-21 | Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110146869A true CN110146869A (en) | 2019-08-20 |
CN110146869B CN110146869B (en) | 2021-08-10 |
Family
ID=67592598
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910423326.8A Active CN110146869B (en) | 2019-05-21 | 2019-05-21 | Method and device for determining coordinate system conversion parameters, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110146869B (en) |
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110788858A (en) * | 2019-10-23 | 2020-02-14 | 武汉库柏特科技有限公司 | Image-based object position correction method, intelligent robot and position correction system |
CN111145259A (en) * | 2019-11-28 | 2020-05-12 | 上海联影智能医疗科技有限公司 | System and method for automatic calibration |
CN111192308A (en) * | 2019-12-31 | 2020-05-22 | 浙江商汤科技开发有限公司 | Image processing method and device, electronic equipment and computer storage medium |
CN111311743A (en) * | 2020-03-27 | 2020-06-19 | 北京百度网讯科技有限公司 | Three-dimensional reconstruction precision testing method and device and electronic equipment |
CN111323751A (en) * | 2020-03-25 | 2020-06-23 | 苏州科达科技股份有限公司 | Sound source positioning method, device and storage medium |
CN111460071A (en) * | 2020-03-31 | 2020-07-28 | 北京百度网讯科技有限公司 | Deflection method, device and equipment of high-precision map and readable storage medium |
CN111477013A (en) * | 2020-04-01 | 2020-07-31 | 清华大学苏州汽车研究院(吴江) | Vehicle measuring method based on map image |
CN111578839A (en) * | 2020-05-25 | 2020-08-25 | 北京百度网讯科技有限公司 | Obstacle coordinate processing method and device, electronic equipment and readable storage medium |
CN111612852A (en) * | 2020-05-20 | 2020-09-01 | 北京百度网讯科技有限公司 | Method and apparatus for verifying camera parameters |
CN111681281A (en) * | 2020-04-16 | 2020-09-18 | 北京诺亦腾科技有限公司 | Calibration method and device for limb motion capture, electronic equipment and storage medium |
CN111741214A (en) * | 2020-05-13 | 2020-10-02 | 北京迈格威科技有限公司 | Image processing method and device and electronic equipment |
CN111832642A (en) * | 2020-07-07 | 2020-10-27 | 杭州电子科技大学 | Image identification method based on VGG16 in insect taxonomy |
CN111914048A (en) * | 2020-07-29 | 2020-11-10 | 北京天睿空间科技股份有限公司 | Automatic generation method for longitude and latitude coordinate and image coordinate corresponding point |
CN112150542A (en) * | 2020-09-24 | 2020-12-29 | 上海联影医疗科技股份有限公司 | Method and device for measuring radiation field, electronic equipment and storage medium |
CN112313640A (en) * | 2019-11-01 | 2021-02-02 | 深圳市大疆创新科技有限公司 | Data storage and processing method, related device and storage medium |
CN112509058A (en) * | 2020-11-30 | 2021-03-16 | 北京百度网讯科技有限公司 | Method and device for calculating external parameters, electronic equipment and storage medium |
CN112560769A (en) * | 2020-12-25 | 2021-03-26 | 北京百度网讯科技有限公司 | Method for detecting obstacle, electronic device, road side device and cloud control platform |
CN112819896A (en) * | 2019-11-18 | 2021-05-18 | 商汤集团有限公司 | Calibration method and device of sensor, storage medium and calibration system |
CN112924955A (en) * | 2021-01-29 | 2021-06-08 | 同济大学 | Dynamic correction method for point cloud coordinates of roadside laser radar |
CN113129382A (en) * | 2019-12-31 | 2021-07-16 | 华为技术有限公司 | Method and device for determining coordinate conversion parameters |
CN113311422A (en) * | 2020-02-27 | 2021-08-27 | 富士通株式会社 | Coordinate conversion method and device and data processing equipment |
CN113329181A (en) * | 2021-06-08 | 2021-08-31 | 厦门四信通信科技有限公司 | Angle switching method, device, equipment and storage medium of camera |
CN113420581A (en) * | 2020-10-19 | 2021-09-21 | 杨宏伟 | Correction method and device for written document image, electronic equipment and readable medium |
CN113777589A (en) * | 2021-08-18 | 2021-12-10 | 北京踏歌智行科技有限公司 | LIDAR and GPS/IMU combined calibration method based on point characteristics |
CN113822943A (en) * | 2021-09-17 | 2021-12-21 | 中汽创智科技有限公司 | External parameter calibration method, device and system of camera and storage medium |
CN114252884A (en) * | 2020-09-24 | 2022-03-29 | 北京万集科技股份有限公司 | Method and device for positioning and monitoring roadside radar, computer equipment and storage medium |
CN114347917A (en) * | 2021-12-28 | 2022-04-15 | 华人运通(江苏)技术有限公司 | Vehicle and vehicle-mounted camera system calibration method and device |
CN114452545A (en) * | 2021-08-25 | 2022-05-10 | 西安大医集团股份有限公司 | Method, device and system for confirming coordinate system conversion relation |
TWI811954B (en) * | 2022-01-13 | 2023-08-11 | 緯創資通股份有限公司 | Positioning system and calibration method of object location |
Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004236199A (en) * | 2003-01-31 | 2004-08-19 | Canon Inc | Image processor and image processing method |
CN101699313A (en) * | 2009-09-30 | 2010-04-28 | 北京理工大学 | Method and system for calibrating external parameters based on camera and three-dimensional laser radar |
CN101839692A (en) * | 2010-05-27 | 2010-09-22 | 西安交通大学 | Method for measuring three-dimensional position and stance of object with single camera |
CN103871071A (en) * | 2014-04-08 | 2014-06-18 | 北京经纬恒润科技有限公司 | Method for camera external reference calibration for panoramic parking system |
CN103997637A (en) * | 2014-05-30 | 2014-08-20 | 天津大学 | Correcting method of multi-view-point images of parallel camera array |
CN107170010A (en) * | 2017-05-11 | 2017-09-15 | 四川大学 | System calibration method, device and three-dimensional reconstruction system |
CN107481292A (en) * | 2017-09-05 | 2017-12-15 | 百度在线网络技术(北京)有限公司 | The attitude error method of estimation and device of vehicle-mounted camera |
CN108052910A (en) * | 2017-12-19 | 2018-05-18 | 深圳市保千里电子有限公司 | A kind of automatic adjusting method, device and the storage medium of vehicle panoramic imaging system |
CN108198219A (en) * | 2017-11-21 | 2018-06-22 | 合肥工业大学 | Error compensation method for camera calibration parameters for photogrammetry |
CN108535753A (en) * | 2018-03-30 | 2018-09-14 | 北京百度网讯科技有限公司 | Vehicle positioning method, device and equipment |
CN108765498A (en) * | 2018-05-30 | 2018-11-06 | 百度在线网络技术(北京)有限公司 | Monocular vision tracking, device and storage medium |
CN108876826A (en) * | 2017-05-10 | 2018-11-23 | 深圳先进技术研究院 | A kind of image matching method and system |
CN109087382A (en) * | 2018-08-01 | 2018-12-25 | 宁波发睿泰科智能科技有限公司 | A kind of three-dimensional reconstruction method and 3-D imaging system |
CN109186616A (en) * | 2018-09-20 | 2019-01-11 | 禾多科技(北京)有限公司 | Lane line assisted location method based on high-precision map and scene search |
CN109685855A (en) * | 2018-12-05 | 2019-04-26 | 长安大学 | A kind of camera calibration optimization method under road cloud monitor supervision platform |
Patent Citations (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004236199A (en) * | 2003-01-31 | 2004-08-19 | Canon Inc | Image processor and image processing method |
CN101699313A (en) * | 2009-09-30 | 2010-04-28 | 北京理工大学 | Method and system for calibrating external parameters based on camera and three-dimensional laser radar |
CN101839692A (en) * | 2010-05-27 | 2010-09-22 | 西安交通大学 | Method for measuring three-dimensional position and stance of object with single camera |
CN103871071A (en) * | 2014-04-08 | 2014-06-18 | 北京经纬恒润科技有限公司 | Method for camera external reference calibration for panoramic parking system |
CN103997637A (en) * | 2014-05-30 | 2014-08-20 | 天津大学 | Correcting method of multi-view-point images of parallel camera array |
CN108876826A (en) * | 2017-05-10 | 2018-11-23 | 深圳先进技术研究院 | A kind of image matching method and system |
CN107170010A (en) * | 2017-05-11 | 2017-09-15 | 四川大学 | System calibration method, device and three-dimensional reconstruction system |
CN107481292A (en) * | 2017-09-05 | 2017-12-15 | 百度在线网络技术(北京)有限公司 | The attitude error method of estimation and device of vehicle-mounted camera |
CN108198219A (en) * | 2017-11-21 | 2018-06-22 | 合肥工业大学 | Error compensation method for camera calibration parameters for photogrammetry |
CN108052910A (en) * | 2017-12-19 | 2018-05-18 | 深圳市保千里电子有限公司 | A kind of automatic adjusting method, device and the storage medium of vehicle panoramic imaging system |
CN108535753A (en) * | 2018-03-30 | 2018-09-14 | 北京百度网讯科技有限公司 | Vehicle positioning method, device and equipment |
CN108765498A (en) * | 2018-05-30 | 2018-11-06 | 百度在线网络技术(北京)有限公司 | Monocular vision tracking, device and storage medium |
CN109087382A (en) * | 2018-08-01 | 2018-12-25 | 宁波发睿泰科智能科技有限公司 | A kind of three-dimensional reconstruction method and 3-D imaging system |
CN109186616A (en) * | 2018-09-20 | 2019-01-11 | 禾多科技(北京)有限公司 | Lane line assisted location method based on high-precision map and scene search |
CN109685855A (en) * | 2018-12-05 | 2019-04-26 | 长安大学 | A kind of camera calibration optimization method under road cloud monitor supervision platform |
Cited By (50)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110788858A (en) * | 2019-10-23 | 2020-02-14 | 武汉库柏特科技有限公司 | Image-based object position correction method, intelligent robot and position correction system |
CN110788858B (en) * | 2019-10-23 | 2023-06-13 | 武汉库柏特科技有限公司 | Object position correction method based on image, intelligent robot and position correction system |
CN112313640A (en) * | 2019-11-01 | 2021-02-02 | 深圳市大疆创新科技有限公司 | Data storage and processing method, related device and storage medium |
WO2021098448A1 (en) * | 2019-11-18 | 2021-05-27 | 商汤集团有限公司 | Sensor calibration method and device, storage medium, calibration system, and program product |
CN112819896A (en) * | 2019-11-18 | 2021-05-18 | 商汤集团有限公司 | Calibration method and device of sensor, storage medium and calibration system |
CN112819896B (en) * | 2019-11-18 | 2024-03-08 | 商汤集团有限公司 | Sensor calibration method and device, storage medium and calibration system |
CN111145259A (en) * | 2019-11-28 | 2020-05-12 | 上海联影智能医疗科技有限公司 | System and method for automatic calibration |
US11676305B2 (en) | 2019-11-28 | 2023-06-13 | Shanghai United Imaging Intelligence Co., Ltd. | Systems and methods for automated calibration |
CN111145259B (en) * | 2019-11-28 | 2024-03-08 | 上海联影智能医疗科技有限公司 | System and method for automatic calibration |
CN111192308A (en) * | 2019-12-31 | 2020-05-22 | 浙江商汤科技开发有限公司 | Image processing method and device, electronic equipment and computer storage medium |
CN113129382A (en) * | 2019-12-31 | 2021-07-16 | 华为技术有限公司 | Method and device for determining coordinate conversion parameters |
CN111192308B (en) * | 2019-12-31 | 2023-11-03 | 浙江商汤科技开发有限公司 | Image processing method and device, electronic equipment and computer storage medium |
CN113311422A (en) * | 2020-02-27 | 2021-08-27 | 富士通株式会社 | Coordinate conversion method and device and data processing equipment |
CN111323751B (en) * | 2020-03-25 | 2022-08-02 | 苏州科达科技股份有限公司 | Sound source positioning method, device and storage medium |
CN111323751A (en) * | 2020-03-25 | 2020-06-23 | 苏州科达科技股份有限公司 | Sound source positioning method, device and storage medium |
CN111311743B (en) * | 2020-03-27 | 2023-04-07 | 北京百度网讯科技有限公司 | Three-dimensional reconstruction precision testing method and device and electronic equipment |
CN111311743A (en) * | 2020-03-27 | 2020-06-19 | 北京百度网讯科技有限公司 | Three-dimensional reconstruction precision testing method and device and electronic equipment |
CN111460071B (en) * | 2020-03-31 | 2023-09-26 | 北京百度网讯科技有限公司 | Deflection method, deflection device, deflection equipment and readable storage medium for high-precision map |
CN111460071A (en) * | 2020-03-31 | 2020-07-28 | 北京百度网讯科技有限公司 | Deflection method, device and equipment of high-precision map and readable storage medium |
CN111477013A (en) * | 2020-04-01 | 2020-07-31 | 清华大学苏州汽车研究院(吴江) | Vehicle measuring method based on map image |
CN111681281A (en) * | 2020-04-16 | 2020-09-18 | 北京诺亦腾科技有限公司 | Calibration method and device for limb motion capture, electronic equipment and storage medium |
CN111681281B (en) * | 2020-04-16 | 2023-05-09 | 北京诺亦腾科技有限公司 | Calibration method and device for limb motion capture, electronic equipment and storage medium |
CN111741214A (en) * | 2020-05-13 | 2020-10-02 | 北京迈格威科技有限公司 | Image processing method and device and electronic equipment |
CN111612852A (en) * | 2020-05-20 | 2020-09-01 | 北京百度网讯科技有限公司 | Method and apparatus for verifying camera parameters |
CN111612852B (en) * | 2020-05-20 | 2023-06-09 | 阿波罗智联(北京)科技有限公司 | Method and apparatus for verifying camera parameters |
CN111578839A (en) * | 2020-05-25 | 2020-08-25 | 北京百度网讯科技有限公司 | Obstacle coordinate processing method and device, electronic equipment and readable storage medium |
CN111578839B (en) * | 2020-05-25 | 2022-09-20 | 阿波罗智联(北京)科技有限公司 | Obstacle coordinate processing method and device, electronic equipment and readable storage medium |
CN111832642A (en) * | 2020-07-07 | 2020-10-27 | 杭州电子科技大学 | Image identification method based on VGG16 in insect taxonomy |
CN111914048A (en) * | 2020-07-29 | 2020-11-10 | 北京天睿空间科技股份有限公司 | Automatic generation method for longitude and latitude coordinate and image coordinate corresponding point |
CN111914048B (en) * | 2020-07-29 | 2024-01-05 | 北京天睿空间科技股份有限公司 | Automatic generation method for corresponding points of longitude and latitude coordinates and image coordinates |
CN112150542A (en) * | 2020-09-24 | 2020-12-29 | 上海联影医疗科技股份有限公司 | Method and device for measuring radiation field, electronic equipment and storage medium |
CN114252884A (en) * | 2020-09-24 | 2022-03-29 | 北京万集科技股份有限公司 | Method and device for positioning and monitoring roadside radar, computer equipment and storage medium |
CN113420581A (en) * | 2020-10-19 | 2021-09-21 | 杨宏伟 | Correction method and device for written document image, electronic equipment and readable medium |
CN112509058A (en) * | 2020-11-30 | 2021-03-16 | 北京百度网讯科技有限公司 | Method and device for calculating external parameters, electronic equipment and storage medium |
CN112509058B (en) * | 2020-11-30 | 2023-08-22 | 北京百度网讯科技有限公司 | External parameter calculating method, device, electronic equipment and storage medium |
CN112560769B (en) * | 2020-12-25 | 2023-08-29 | 阿波罗智联(北京)科技有限公司 | Method for detecting obstacle, electronic device, road side device and cloud control platform |
US12125287B2 (en) | 2020-12-25 | 2024-10-22 | Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd. | Detecting obstacle |
CN112560769A (en) * | 2020-12-25 | 2021-03-26 | 北京百度网讯科技有限公司 | Method for detecting obstacle, electronic device, road side device and cloud control platform |
CN112924955A (en) * | 2021-01-29 | 2021-06-08 | 同济大学 | Dynamic correction method for point cloud coordinates of roadside laser radar |
CN113329181A (en) * | 2021-06-08 | 2021-08-31 | 厦门四信通信科技有限公司 | Angle switching method, device, equipment and storage medium of camera |
CN113329181B (en) * | 2021-06-08 | 2022-06-14 | 厦门四信通信科技有限公司 | Angle switching method, device, equipment and storage medium of camera |
CN113777589A (en) * | 2021-08-18 | 2021-12-10 | 北京踏歌智行科技有限公司 | LIDAR and GPS/IMU combined calibration method based on point characteristics |
CN113777589B (en) * | 2021-08-18 | 2024-04-02 | 北京踏歌智行科技有限公司 | LIDAR and GPS/IMU combined calibration method based on point characteristics |
CN114452545A (en) * | 2021-08-25 | 2022-05-10 | 西安大医集团股份有限公司 | Method, device and system for confirming coordinate system conversion relation |
CN114452545B (en) * | 2021-08-25 | 2024-07-09 | 西安大医集团股份有限公司 | Method, device and system for confirming coordinate system conversion relation |
CN113822943A (en) * | 2021-09-17 | 2021-12-21 | 中汽创智科技有限公司 | External parameter calibration method, device and system of camera and storage medium |
CN113822943B (en) * | 2021-09-17 | 2024-06-11 | 中汽创智科技有限公司 | External parameter calibration method, device and system of camera and storage medium |
CN114347917A (en) * | 2021-12-28 | 2022-04-15 | 华人运通(江苏)技术有限公司 | Vehicle and vehicle-mounted camera system calibration method and device |
CN114347917B (en) * | 2021-12-28 | 2023-11-10 | 华人运通(江苏)技术有限公司 | Calibration method and device for vehicle and vehicle-mounted camera system |
TWI811954B (en) * | 2022-01-13 | 2023-08-11 | 緯創資通股份有限公司 | Positioning system and calibration method of object location |
Also Published As
Publication number | Publication date |
---|---|
CN110146869B (en) | 2021-08-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110146869A (en) | Determine method, apparatus, electronic equipment and the storage medium of coordinate system conversion parameter | |
CN110148185A (en) | Determine method, apparatus, electronic equipment and the storage medium of coordinate system conversion parameter | |
JP7054803B2 (en) | Camera parameter set calculation device, camera parameter set calculation method and program | |
US10909395B2 (en) | Object detection apparatus | |
US11842516B2 (en) | Homography through satellite image matching | |
CN110378965A (en) | Determine the method, apparatus, equipment and storage medium of coordinate system conversion parameter | |
CN110135376A (en) | Determine method, equipment and the medium of the coordinate system conversion parameter of imaging sensor | |
CN110927708A (en) | Calibration method, device and equipment of intelligent road side unit | |
US20190073542A1 (en) | Vehicle lane detection system | |
CN110119698A (en) | For determining the method, apparatus, equipment and storage medium of Obj State | |
CN110766760B (en) | Method, device, equipment and storage medium for camera calibration | |
CN114755662B (en) | Road-vehicle fusion perception laser radar and GPS calibration method and device | |
CN111444845A (en) | Non-motor vehicle illegal parking identification method, device and system | |
CN113029128A (en) | Visual navigation method and related device, mobile terminal and storage medium | |
CN111932627B (en) | Marker drawing method and system | |
CN112116655A (en) | Method and device for determining position information of image of target object | |
CN110766761A (en) | Method, device, equipment and storage medium for camera calibration | |
CN109883433A (en) | Vehicle positioning method in structured environment based on 360 degree of panoramic views | |
KR20130034528A (en) | Position measuring method for street facility | |
CN112766068A (en) | Vehicle detection method and system based on gridding labeling | |
CN115294204B (en) | Outdoor target positioning method and system | |
KR102637972B1 (en) | Apparatus for automatically acquiring the coordinates of distribution facilities and method thereof | |
CN111667531B (en) | Positioning method and device | |
CN117572446A (en) | Vehicle-mounted panoramic imaging measurement device and method and unmanned vehicle | |
CN115327512A (en) | Calibration method, device, server and storage medium for laser radar and camera |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | | Effective date of registration: 2021-10-11. Patentee after: Apollo Intelligent Technology (Beijing) Co.,Ltd., 105/F, Building 1, No. 10 Shangdi 10th Street, Haidian District, Beijing 100085. Patentee before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co.,Ltd., 2/F, Baidu Building, No. 10 Shangdi 10th Street, Haidian District, Beijing 100094. |
TR01 | Transfer of patent right |