CN110133663B - Distributed acoustic image joint calibration positioning method - Google Patents

Distributed acoustic image joint calibration positioning method

Info

Publication number
CN110133663B
CN110133663B (application CN201910378075.6A)
Authority
CN
China
Prior art keywords
camera
target
coordinate system
center
air sonar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910378075.6A
Other languages
Chinese (zh)
Other versions
CN110133663A (en)
Inventor
项彬
马石磊
陈建峰
蔺贝
李晓强
温洋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lianfeng Acoustic Technologies Co ltd
Original Assignee
Lianfeng Acoustic Technologies Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lianfeng Acoustic Technologies Co ltd filed Critical Lianfeng Acoustic Technologies Co ltd
Priority to CN201910378075.6A priority Critical patent/CN110133663B/en
Publication of CN110133663A publication Critical patent/CN110133663A/en
Application granted granted Critical
Publication of CN110133663B publication Critical patent/CN110133663B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/02 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems using reflection of acoustic waves
    • G01S15/06 Systems determining the position data of a target
    • G01S15/08 Systems for measuring distance only
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention provides a distributed acoustic image joint calibration positioning method. A camera and an air sonar are installed at equal heights above a road, the distance between the air sonar and the camera center is determined, and three-dimensional rectangular coordinate systems are constructed with the air sonar array center and the camera video center as origins; the position coordinates of the target in the air sonar coordinate system are determined by the air sonar, and the included angles between the target and the horizontal plane and the vertical plane in the camera coordinate system are calculated from the lateral horizontal distance between the two origins using inverse trigonometric formulas; the corresponding pixel coordinates of the target in the camera picture are calculated according to the camera parameters; and the sound source target position is reflected in the camera picture through the pixel coordinates, realizing joint acoustic-image calibration and positioning. The invention combines urban vision and hearing and provides technical support for the development of intelligent transportation and smart cities.

Description

Distributed acoustic image joint calibration positioning method
Technical Field
The invention belongs to the field of signal processing, and relates to sonar beamforming, acoustic signal detection, acoustic-image combination, target positioning and related theory.
Background
With the progress of science and technology, cameras and air sonars play increasingly important roles in fields such as traffic, security and environmental protection. By operating a camera and an air sonar jointly, the video image and the real-time audio can be fused and jointly calibrated, so that a sounding target can be accurately located in the video image.
An air sonar is generally a microphone array composed of a plurality of high-precision microphone modules, and provides functions such as acoustic positioning, sound recognition and sound enhancement. Since the 1980s, array signal processing technology has been widely applied to the passive positioning of sound sources, but in existing urban sensing equipment the acoustic positioning result is abstract and cannot be presented visually; combining array signal processing with a camera system therefore allows the specific position of a sound source to be shown more intuitively.
At present there is no good joint calibration method for sonar positioning and video. In practical applications the air sonar and the camera are installed in a distributed manner, and because the center origin of the air sonar does not coincide with that of the camera, the sonar positioning result cannot be mapped accurately onto the camera picture; the conversion has to be marked manually in the video image, the accuracy is insufficient, and automatic, accurate joint positioning cannot be achieved.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a distributed acoustic-image joint calibration method which combines urban vision and hearing, provides technical support for the development of intelligent transportation and smart cities, and is of great significance for accurate positioning of vehicle targets in cities and for practical engineering applications.
The technical scheme adopted by the invention to solve the technical problem comprises the following steps: installing a camera and an air sonar above a road, determining the distance between the air sonar and the camera center, and constructing three-dimensional rectangular coordinate systems with the air sonar array center and the camera video center as origins; determining the position coordinates of the target in the air sonar coordinate system by means of the air sonar, and calculating the included angles between the target and the horizontal plane and the vertical plane in the camera coordinate system from the lateral horizontal distance between the two origins using inverse trigonometric formulas; calculating the corresponding pixel coordinates of the target in the camera picture according to the camera parameters; and reflecting the sound source target position in the camera picture through the pixel coordinates, thereby realizing joint acoustic-image calibration and positioning.
Expanding the above steps, the method specifically comprises:
firstly, the camera is installed at the transverse center above the road, the air sonar and the camera are installed at equal height, and three-dimensional rectangular coordinate systems o₁ and o₂ are established with the air sonar array center and the camera video center, respectively, as origins; the lateral horizontal distance between the two origins is d;
secondly, the included angle α between the target s and the horizontal plane and the included angle β between the target s and the vertical plane of the mounting rod are determined in the air sonar coordinate system o₁ by passive air sonar positioning, and the position coordinates (a, b, h) of the target in the air sonar coordinate system are calculated through trigonometric functions [formulas given as images in the original publication], where h is the height of the mounting rod;
thirdly, calculating the included angle α₁ between the target and the horizontal plane and the included angle β₁ between the target and the vertical plane of the mounting rod in the camera coordinate system [formulas given as images in the original publication];
fourthly, determining the picture pixel size of the camera as m×n, adjusting the field angle of the camera to θ, and measuring the distance d₁ from the bottom edge of the camera field of view to the monitoring rod;
fifthly, calculating the Y-axis pixel coordinate n₁ and the X-axis pixel coordinate of the target in the camera picture [formulas given as images in the original publication],
where xl is the pixel distance along the X-axis from the picture center to the left road boundary, xr is the pixel distance from the center to the right road boundary, each road boundary is represented in the picture as a straight line, kl is the slope of the left boundary in the coordinate system, and kr is the slope of the right boundary;
and sixthly, determining and displaying the sound source target position on the image through the pixel coordinates, thereby realizing joint acoustic-image calibration and positioning.
The beneficial effects of the invention are: the method is simple and practical; various acoustic targets on the road can be reflected in the camera picture in real time; it can be used for locating illegally whistling vehicles, locating suspicious targets, monitoring environmental noise and similar applications; and it is of lasting significance for building intelligent traffic, improving urban security and constructing smart cities, promoting the application and development of passive positioning technology in traffic, security, monitoring and related fields.
Drawings
FIG. 1 is a schematic block diagram of the process flow of the present invention;
FIG. 2 is a schematic view of an air sonar and camera mounting location;
FIG. 3 is a schematic view of a camera mounting;
fig. 4 is a schematic view of a monitor screen display.
Detailed Description
The present invention will be further described with reference to the following drawings and examples, which include, but are not limited to, the following examples.
In practical applications, when an air sonar and a camera are installed in a distributed manner, their center origins do not coincide and the air sonar positioning result is difficult to map accurately onto the camera picture; to address this problem, the invention provides the distributed acoustic-image joint calibration and positioning method.
As shown in fig. 1, the distributed acoustic image joint calibration method provided by the present invention mainly includes the following steps:
the first step is as follows: determining the distance between the air sonar and the center of the camera
As shown in FIG. 2, the camera and the air sonar are installed in a distributed manner: the camera is installed at the transverse center of the road and the air sonar is located beside the camera. Three-dimensional rectangular coordinate systems o₁ and o₂ are established with the air sonar array center and the camera video center, respectively, as origins; the lateral horizontal distance between the two origins is d.
The second step is that: air sonar positioning
Through passive air sonar positioning, the included angle α between the target s and the horizontal plane and the included angle β between the target s and the vertical plane of the mounting rod are determined in the air sonar coordinate system o₁, and the position coordinates (a, b, h) of the target in the air sonar coordinate system are calculated through trigonometric functions [formulas given as images in the original publication], where h is the height of the mounting rod, i.e., the z-axis coordinate of the target.
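Because the closed-form expressions of this step are available only as images in the original publication, the following Python sketch illustrates one plausible reconstruction, using the projection-angle convention described later in the embodiment (b = h/tan α, a = h/tan β); the function name and the exact trigonometric relations are assumptions, not the patent's verbatim formulas.

```python
import math

def sonar_target_coords(alpha_deg: float, beta_deg: float, h: float):
    """Convert the air-sonar bearing angles of a road-level target into
    Cartesian coordinates (a, b, h) in the sonar coordinate system o1.

    alpha_deg : angle between the target direction and the horizontal plane
    beta_deg  : angle between the target direction and the vertical plane
                of the mounting rod
    h         : height of the mounting rod (the target's z-axis coordinate)

    The exact formulas in the patent are image-only, so the trigonometric
    relations below are an assumed reconstruction.
    """
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    b = h / math.tan(alpha)   # distance along the road (y-axis)
    a = h / math.tan(beta)    # lateral offset across the road (x-axis)
    return a, b, h
```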
The third step: calculating the included angle of the target in the coordinate system of the camera
The camera and the air sonar are mounted at the same height h. Using the lateral horizontal distance between the two origins and inverse trigonometric formulas, the included angle α₁ between the target and the horizontal plane and the included angle β₁ between the target and the vertical plane of the mounting rod are calculated in the camera coordinate system [formulas given as images in the original publication], where d is the lateral horizontal distance between the two coordinate-system origins.
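The corresponding camera-frame computation can be sketched in the same spirit: the embodiment states that the target's camera-frame coordinates are (a − d, b, h), so the sketch below shifts the lateral coordinate by d and re-derives the two angles with inverse trigonometric functions. The exact patent formulas are again image-only, so this is an assumed reconstruction.

```python
import math

def camera_frame_angles(a: float, b: float, h: float, d: float):
    """Re-express the sonar-frame target position in the camera frame and
    recover the two bearing angles there.

    Uses the same projection-angle convention assumed above; the patent's
    closed-form inverse-trigonometric expressions are not reproduced here.
    """
    x_cam = a - d   # lateral offset after removing the sonar/camera spacing
    alpha1 = math.degrees(math.atan2(h, b))      # angle of the Y0-Z0 projection w.r.t. the Y axis
    beta1 = math.degrees(math.atan2(h, x_cam))   # angle of the X0-Z0 projection w.r.t. the X axis
    return alpha1, beta1
```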
The fourth step: determining parameters of a camera
The picture pixel size of the camera is m×n, the field angle of the camera is adjusted to θ, and the distance d₁ from the bottom edge of the camera field of view to the monitoring rod is measured, as shown in FIG. 3.
The fifth step: calculating the corresponding pixel position of the target in the camera picture
As shown in FIG. 4, the corresponding pixel position n₁ of the target on the Y-axis of the picture and the pixel coordinate of the target in the X-axis direction are computed [formulas given as images in the original publication],
where xl is the pixel distance along the X-axis from the picture center to the left road boundary, xr is the pixel distance from the center to the right road boundary, each road boundary is represented in the picture as a straight line, kl is the slope of the left boundary in the coordinate system, and kr is the slope of the right boundary.
And a sixth step: sound image joint localization
The sound source target position is determined and displayed on the image through the pixel coordinates, realizing joint acoustic-image calibration and positioning.
An embodiment of the invention comprises the following steps:
the first step is as follows: determining the distance between the air sonar and the center of the camera
The camera and the air sonar are installed in a distributed manner: the camera is installed at the transverse center of the road and the air sonar is located beside the camera. Three-dimensional rectangular coordinate systems o₁ and o₂ are established with the air sonar array center and the camera video center, respectively, as origins; the lateral horizontal distance between the two origins is d.
In these coordinate systems, any point s on the road corresponds to a unique point in both the air sonar coordinate system and the camera coordinate system. In the three-dimensional coordinate system, denote by os the vector from the origin o to an arbitrary point s on the road plane. The projection of os onto the plane Y₀O₀Z₀ makes an angle α with the Y axis, and the projection of os onto the plane X₀O₀Z₀ makes an angle β with the X axis. With these definitions, a point s is uniquely identified by the pair (α, β).
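As a check on this parametrization, the short sketch below computes (α, β) for a sample road point and inverts them back with the relations assumed earlier; all numbers are purely illustrative.

```python
import math

def bearing_angles(a: float, b: float, h: float):
    """Forward direction of the parametrization: from a road point s with
    sonar-frame coordinates (a, b, h), recover the projection angles
    (alpha, beta) described above.  The original gives these relations only
    as images, so this is an assumed reconstruction of the same convention.
    """
    alpha = math.degrees(math.atan2(h, b))  # projection onto Y0-O0-Z0 vs. the Y axis
    beta = math.degrees(math.atan2(h, a))   # projection onto X0-O0-Z0 vs. the X axis
    return alpha, beta

# A quick round trip with the inverse relations assumed earlier
# (a = h / tan(beta), b = h / tan(alpha)) confirms that (alpha, beta)
# uniquely identify a point on the road plane for a fixed rod height h.
alpha, beta = bearing_angles(a=3.0, b=20.0, h=6.0)
a_back = 6.0 / math.tan(math.radians(beta))
b_back = 6.0 / math.tan(math.radians(alpha))
print(round(a_back, 6), round(b_back, 6))   # 3.0 20.0
```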
The second step is that: air sonar positioning
Through passive air sonar positioning, the included angle α between the target s and the horizontal plane and the included angle β between the target s and the vertical plane of the mounting rod are determined in the air sonar coordinate system o₁, and the position coordinates (a, b, h) of the target in the air sonar coordinate system are calculated through trigonometric functions [formulas given as images in the original publication], where h is the height of the mounting rod, i.e., the z-axis coordinate of the target.
The third step: calculating the included angle of the target in the camera coordinate system
The camera and the air sonar are mounted at the same height h. Using the lateral horizontal distance between the two origins and inverse trigonometric formulas, the included angle α₁ between the target and the horizontal plane and the included angle β₁ between the target and the vertical plane of the mounting rod are calculated in the camera coordinate system [formulas given as images in the original publication], where d is the lateral horizontal distance between the two coordinate-system origins.
The fourth step: determining parameters of a camera
The picture pixel size of the camera is m×n, the field angle of the camera is adjusted to θ, and the distance d₁ from the bottom edge of the camera field of view to the monitoring rod is measured, as shown in FIG. 3.
the fifth step: calculating the corresponding pixel position of the target in the camera picture:
Because the picture pixels of the camera correspond one-to-one with the field angle in the vertical direction, for any given target position in the camera coordinate system the corresponding pixel position n₁ of the target on the Y-axis of the picture can be determined [formula given as an image in the original publication].
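Since the formula itself is image-only in the original, the sketch below assumes the simplest reading of the stated angle-to-pixel correspondence: the n pixel rows span the vertical field angle θ linearly, with the bottom row aimed at the road point a horizontal distance d₁ from the rod. The linear mapping and the use of depression angles are assumptions.

```python
import math

def y_axis_pixel(alpha1_deg: float, h: float, d1: float, theta_deg: float, n: int) -> float:
    """Map the camera-frame elevation angle of the target to a row index n1.

    Assumptions (the patent's formula is not reproduced here):
      - the bottom edge of the field of view touches the road at horizontal
        distance d1 from the monitoring rod, i.e. at a depression angle of
        atan(h / d1) below the horizontal;
      - the target lies at depression angle alpha1;
      - the n pixel rows span the vertical field angle theta linearly.
    """
    bottom_angle = math.degrees(math.atan2(h, d1))   # depression angle of the picture's bottom edge
    angle_above_bottom = bottom_angle - alpha1_deg   # how far the target sits above that edge
    return n * angle_above_bottom / theta_deg        # assumed linear angle-to-row mapping
```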
In the X-axis direction, the angles do not correspond to the pixels one to one, but the distances do. Let sl denote the actual distance from the left road boundary to the Y-axis and sr the actual distance from the right road boundary to the Y-axis; then the left road boundary can be represented in the camera coordinate system as a straight line perpendicular to the X-axis:
x = sl
and the right road boundary as another straight line perpendicular to the X-axis:
x = -sr
The coordinate system shown in FIG. 4 is re-established in the camera's road monitoring picture, and the target's pixel position on the Y-axis of the picture becomes:
n₁₁ = n − n₁
assuming that the center of the camera image is O, the pixel distance from the center on the X-axis to the left road boundary in the image is xl, and the pixel distance from the center to the right road boundary in the image is xr, it can be seen that the pixel position of the road boundary in the image can be represented as a straight line, the slope of the left boundary in the coordinate system is kl, and the slope of the right boundary in the coordinate system is kr.
The left boundary in pixel coordinates can be expressed as:
y = kl × x + b, with b = -kl × xl
and the right boundary in pixel coordinates as:
y = kr × x + b, with b = -kr × xr
the pixel distance xl from the center of the X-axis picture to the left road boundary, the pixel distance xr from the right road boundary, the left boundary slope kl and the right boundary slope kr can be calibrated and obtained on the picture of the camera.
With the Y-axis pixel position n₁ known, the pixel distances to the road boundaries at that row can be obtained. The left-boundary pixel distance X_left and the right-boundary pixel distance X_right follow from the boundary-line equations above [formulas given as images in the original publication].
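With the boundary lines calibrated, the two pixel distances can be recovered by intersecting each line with the target's row; the sketch below does exactly that, with sign conventions assumed from FIG. 4 (not reproduced here).

```python
def boundary_pixel_distances(n1: float, n: int, xl: float, xr: float,
                             kl: float, kr: float) -> tuple[float, float]:
    """Pixel distances from the picture's vertical center line to the left
    and right road boundaries at the target's row.

    The patent's closed-form expressions are image-only; this sketch simply
    inverts the calibrated boundary lines y = k*x + b (with b = -k*x0) at the
    re-established row coordinate n11 = n - n1.
    """
    n11 = n - n1
    x_left = abs(n11 / kl + xl)    # row n11 intersected with the left boundary line
    x_right = abs(n11 / kr + xr)   # row n11 intersected with the right boundary line
    return x_left, x_right
```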
when the coordinates of the target s in the camera coordinate system are (a-d, b, h), the pixel coordinates in the X-axis direction can be expressed as:
Figure GDA0002114889350000063
namely:
Figure GDA0002114889350000064
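The X-axis formula is likewise image-only; the sketch below assumes the stated distance-to-pixel correspondence amounts to a linear interpolation of the target's lateral offset a − d between the real boundary positions sl and −sr and the calibrated boundary pixels.

```python
def x_axis_pixel(a: float, d: float, sl: float, sr: float,
                 x_left: float, x_right: float) -> float:
    """X-axis pixel coordinate of the target, relative to the picture's
    vertical center line (pixel x increasing toward the right road boundary).

    Assumed reconstruction: the lateral offset (a - d) is mapped linearly
    from the real road span [sl, -sr] onto the pixel span [-x_left, x_right],
    following the embodiment's convention that the left boundary sits at
    x = sl and the right boundary at x = -sr.
    """
    x_cam = a - d                                 # lateral offset in the camera frame
    frac = (sl - x_cam) / (sl + sr)               # 0 at the left boundary, 1 at the right
    return -x_left + frac * (x_left + x_right)    # interpolate across the pixel span
```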
and a sixth step: sound image joint localization
The sound source target position is determined and displayed on the image through the pixel coordinates, realizing joint acoustic-image calibration and positioning.
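Putting the pieces together, the following usage example runs the six steps end to end with the helper sketches defined above; every numeric value (heights, field angle, calibration constants, bearings) is hypothetical and serves only to illustrate the data flow.

```python
# End-to-end sketch of the six steps, using the helper functions above and
# hypothetical installation/calibration values (all numbers are illustrative,
# not taken from the patent).
h, d = 6.0, 0.5                            # rod height and sonar/camera spacing, metres
m, n, theta, d1 = 1920, 1080, 40.0, 8.0    # picture size, vertical FOV (deg), FOV bottom-edge distance
sl, sr = 7.0, 7.0                          # real distances from the road boundaries to the Y axis
xl, xr, kl, kr = 400.0, 400.0, 2.0, -2.0   # calibrated boundary pixel offsets and slopes

alpha, beta = 16.7, 63.4                   # hypothetical air-sonar bearings of a whistling vehicle
a, b, _ = sonar_target_coords(alpha, beta, h)      # step 2: sonar-frame coordinates
alpha1, beta1 = camera_frame_angles(a, b, h, d)    # step 3: camera-frame angles
n1 = y_axis_pixel(alpha1, h, d1, theta, n)         # step 5: row index
x_left, x_right = boundary_pixel_distances(n1, n, xl, xr, kl, kr)
m1 = x_axis_pixel(a, d, sl, sr, x_left, x_right)   # step 5: column offset from center
print(f"target marked near pixel ({m/2 + m1:.0f}, {n1:.0f})")   # step 6: overlay on the picture
```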

Claims (2)

1. A distributed acoustic-image joint calibration and positioning method, characterized by comprising: installing a camera and an air sonar above a road, determining the distance between the air sonar and the camera center, and constructing three-dimensional rectangular coordinate systems with the air sonar array center and the camera video center as origins; determining the position coordinates of the target in the air sonar coordinate system by means of the air sonar, and calculating the included angles between the target and the horizontal plane and the vertical plane in the camera coordinate system from the lateral horizontal distance between the two origins using inverse trigonometric formulas; calculating the corresponding pixel coordinates of the target in the camera picture according to the camera parameters; and reflecting the sound source target position in the camera picture through the pixel coordinates, realizing joint acoustic-image calibration and positioning.
2. The distributed acoustic-image joint calibration and positioning method according to claim 1, characterized by comprising the steps of:
firstly, the camera is installed at the transverse center above the road, the air sonar and the camera are installed at equal heights, and three-dimensional rectangular coordinate systems o₁ and o₂ are established with the air sonar array center and the camera video center, respectively, as origins; the lateral horizontal distance between the two origins is d;
secondly, the included angle α between the target s and the horizontal plane and the included angle β between the target s and the vertical plane of the mounting rod are determined in the air sonar coordinate system o₁ by passive air sonar positioning, and the position coordinates (a, b, h) of the target in the air sonar coordinate system are calculated through trigonometric functions [formulas given as images in the original publication], where h is the height of the mounting rod;
thirdly, calculating the included angle α₁ between the target and the horizontal plane and the included angle β₁ between the target and the vertical plane of the mounting rod in the camera coordinate system [formulas given as images in the original publication];
fourthly, determining the picture pixel size of the camera as m×n, adjusting the field angle of the camera to θ, and measuring the distance d₁ from the bottom edge of the camera field of view to the monitoring rod;
fifthly, calculating the Y-axis pixel coordinate n₁ and the X-axis pixel coordinate of the target in the camera picture [formulas given as images in the original publication],
where xl is the pixel distance along the X-axis from the picture center to the left road boundary, xr is the pixel distance from the center to the right road boundary, each road boundary is represented in the picture as a straight line, kl is the slope of the left boundary in the coordinate system, kr is the slope of the right boundary, n₁₁ is the pixel position of the target on the Y-axis of the picture, sl is the actual distance from the left road boundary to the Y-axis, and sr is the actual distance from the right road boundary to the Y-axis;
and sixthly, determining and displaying the sound source target position on the image through the pixel coordinates, thereby realizing joint acoustic-image calibration and positioning.
CN201910378075.6A 2019-05-08 2019-05-08 Distributed acoustic image joint calibration positioning method Active CN110133663B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910378075.6A CN110133663B (en) 2019-05-08 2019-05-08 Distributed acoustic image joint calibration positioning method

Publications (2)

Publication Number Publication Date
CN110133663A CN110133663A (en) 2019-08-16
CN110133663B (en) 2023-03-10

Family

ID=67576474

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910378075.6A Active CN110133663B (en) 2019-05-08 2019-05-08 Distributed acoustic image joint calibration positioning method

Country Status (1)

Country Link
CN (1) CN110133663B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110441737B (en) * 2019-08-26 2023-02-24 上海声茵科技有限公司 Sound source positioning method and equipment adopting fisheye lens
CN110672313B (en) * 2019-10-22 2021-06-01 上海声茵科技有限公司 Fault diagnosis method and equipment based on sound signals
CN110807901B (en) * 2019-11-08 2021-08-03 西安联丰迅声信息科技有限责任公司 Non-contact industrial abnormal sound detection method
CN111915918A (en) * 2020-06-19 2020-11-10 中国计量大学 System and method for calibrating automobile whistling snapshot device on site based on dynamic characteristics

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101101333A (en) * 2006-07-06 2008-01-09 三星电子株式会社 Apparatus and method for producing assistant information of driving vehicle for driver
CN101359043A (en) * 2008-09-05 2009-02-04 清华大学 Determining method for sound field rebuilding plane in acoustics video camera system
CN106886017A (en) * 2017-01-11 2017-06-23 浙江大学 Submarine target locus computational methods based on double frequency identification sonar
JP2018189463A (en) * 2017-05-01 2018-11-29 株式会社Soken Vehicle position estimating device and program
CN109479088A (en) * 2017-06-02 2019-03-15 深圳市大疆创新科技有限公司 The system and method for carrying out multiple target tracking based on depth machine learning and laser radar and focusing automatically
CN108051007A (en) * 2017-10-30 2018-05-18 上海神添实业有限公司 AGV navigation locating methods based on ultrasonic wave networking and stereoscopic vision

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Implementation of target two-dimensional coordinate positioning based on computer vision; He Jing; Modern Computer (Professional Edition) 《现代计算机(专业版)》; 2012-05-25 (No. 15); pp. 18-20 *

Also Published As

Publication number Publication date
CN110133663A (en) 2019-08-16

Similar Documents

Publication Publication Date Title
CN110133663B (en) Distributed acoustic image joint calibration positioning method
US12033388B2 (en) Positioning method, apparatus, device, and computer-readable storage medium
CN106651990B (en) Indoor map construction method and indoor positioning method based on indoor map
WO2018196391A1 (en) Method and device for calibrating external parameters of vehicle-mounted camera
JP3906194B2 (en) CALIBRATION METHOD, CALIBRATION SUPPORT DEVICE, CALIBRATION DEVICE, AND CAMERA SYSTEM MANUFACTURING METHOD
US20180003498A1 (en) Visual positioning system and method based on high reflective infrared identification
CN104809718B (en) A kind of vehicle-mounted camera Auto-matching scaling method
CN111508027B (en) Method and device for calibrating external parameters of camera
CN104064055B (en) A kind of inland river navigation boats and ships superelevation detects early warning system and method for work thereof
CN108221603A (en) Road surface three-dimensional information detection device, the method and system of a kind of road
JP6260891B2 (en) Image processing apparatus and image processing method
JP2014098683A (en) Method for remotely measuring crack
JP2012075060A (en) Image processing device, and imaging device using the same
CN112967344B (en) Method, device, storage medium and program product for calibrating camera external parameters
CN109544628A (en) A kind of the accurate reading identifying system and method for pointer instrument
CN113223075A (en) Ship height measuring system and method based on binocular camera
CN109375195A (en) Parameter quick calibrating method outside a kind of multi-line laser radar based on orthogonal normal vector
KR20200083301A (en) Method for calibrating the alignment of moving object sensor
CN110645973A (en) Vehicle positioning method
CN112017238A (en) Method and device for determining spatial position information of linear object
TW201329426A (en) Camera testing device and test method thereof
CN112598756B (en) Roadside sensor calibration method and device and electronic equipment
CN111538008A (en) Transformation matrix determining method, system and device
CN112254646B (en) Push bench posture recognition system and method and storage medium
CN103884332B (en) A kind of barrier decision method, device and mobile electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A Distributed Audio Image Joint Calibration and Localization Method

Granted publication date: 20230310

Pledgee: Xi'an Caijin Financing Guarantee Co.,Ltd.

Pledgor: LIANFENG ACOUSTIC TECHNOLOGIES Co.,Ltd.

Registration number: Y2024980011038