CN103472434A - Robot sound positioning method - Google Patents

Robot sound positioning method

Info

Publication number
CN103472434A
CN103472434A
Authority
CN
China
Prior art keywords
robot
kinect
sound
sound source
gravity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310455238.9A
Other languages
Chinese (zh)
Other versions
CN103472434B (en)
Inventor
莫宏伟
孟龙龙
徐立芳
梁作玉
蒋兴洲
雍升
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanhai Innovation And Development Base Of Sanya Harbin Engineering University
Original Assignee
Harbin Engineering University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Engineering University filed Critical Harbin Engineering University
Priority to CN201310455238.9A priority Critical patent/CN103472434B/en
Publication of CN103472434A publication Critical patent/CN103472434A/en
Application granted granted Critical
Publication of CN103472434B publication Critical patent/CN103472434B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Landscapes

  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The invention discloses a robot sound positioning method, relating to sound localization and robot navigation. At least two Kinect sensors are arranged and the sound source direction detected by each Kinect is obtained. The deviation sector region containing the sound source is determined for every pair of Kinect sensors, giving three regions in total. The center of gravity of each region is found by the gravity-center method, and the mean of the three centers of gravity is taken as the optimal position of the sound source. The method improves positioning accuracy, is practical and flexible, and can be applied to fields such as sound localization and robot navigation motion control.

Description

Robot sound localization method
Technical field
The invention belongs to the field of robotics and relates to a robot sound localization method that can be used in fields such as robot motion control and robot indoor positioning and navigation.
Background technology
Kinect is a three-dimensional (3D) motion-sensing camera that integrates functions such as real-time motion capture, image recognition, microphone input, speech recognition, and community interaction. When Kinect was first released as a peripheral for the Xbox 360, skeleton tracking and speech recognition were the Kinect SDK features most welcomed by developers, but compared with skeleton tracking, the power of the microphone array for speech recognition was largely overlooked. Part of the reason lies in the exciting skeleton tracking system of the Kinect; another part is that the Xbox game control panel and Kinect motion-sensing games did not give full play to the advantages of Kinect audio processing.
The microphone array of the Kinect is located at the bottom of the device and consists of four independent, horizontally distributed microphones. Although each microphone captures the same audio signal, together they form an array that can detect the direction of the sound source, making it possible to identify sound arriving from a specific direction. The audio stream captured by the microphone array is processed by sophisticated audio-enhancement algorithms that remove irrelevant background noise. All of this processing takes place between the Kinect hardware and the Kinect SDK, which makes it possible to recognize voice commands and judge the direction of the sound source over a fairly large space, even when the speaker is some distance from the Kinect.
Robot indoor positioning is both a hot topic and a difficult problem in robotics research, and researchers have proposed a variety of methods. A typical one is the RFID technique: first, an intelligent space (also called a sensor-network space) is built indoors by laying RFID tags on the floor at fixed intervals, each tag storing the absolute coordinates of its position; second, the mobile robot is equipped with an RFID reader, so that when the robot moves over a tag it reads the coordinates stored in the tag and thereby knows its current position. However, this localization method places certain requirements on the environment, and the positioning accuracy of the robot varies with the spacing of the RFID tags. The accuracy of other localization technologies is also affected by many factors: dead reckoning depends heavily on the accuracy of the sensors and of the robot's own kinematic system; technologies such as WiFi and Bluetooth also place requirements on the environment; sound localization has been applied in robotics, but limited by its processing complexity and by ambient noise, its positioning accuracy is not high and it is difficult to popularize; indoor map building achieves high positioning accuracy, but the map-building process is complex, the computational load is heavy, and real-time requirements are hard to meet.
Summary of the invention
The invention provides a robot sound localization method to solve the problem of low positioning accuracy in the prior art.
In one aspect, a robot sound localization method is provided, comprising: using at least two Kinects as sound transducers to obtain the direction angles of the sound emitted by the robot; determining, from the direction angles and the positions of the at least two Kinects, the deviation sector region in which the sound source determined by every two Kinects lies; determining the geometric center of gravity of the intersection region of every two deviation sectors; and calculating, from the determined centers of gravity, the optimal sound source position by the geometric gravity-center method, this optimal position being the located position of the robot.
Preferably, after the optimal sound source position is calculated by the geometric gravity-center method, the route of the robot's movement is determined from the target position and the optimal sound source position, and the robot is controlled to move along this route to the target position.
With the above scheme, the position of the robot can be located accurately and conveniently.
Description of the drawings
Fig. 1 shows the basic principle of Kinect sound localization;
Fig. 2 is a schematic diagram of the sound source region jointly determined by Kinect sensors No. 1 and No. 3;
Fig. 3 is a schematic diagram of the method for finding the center of gravity of an irregular quadrilateral;
Fig. 4 is a schematic diagram of the sound source region jointly determined by Kinect sensors No. 1 and No. 2;
Fig. 5 is a schematic diagram of the sound source region jointly determined by Kinect sensors No. 2 and No. 3;
Fig. 6 is a schematic diagram of robot navigation realized by Kinect sound localization;
Fig. 7 is the block diagram of the robot sound localization navigation control.
Embodiment
The specific implementation of the present invention is described in detail below with reference to the accompanying drawings.
The embodiment of the present invention provides a robot positioning method, comprising:
(1) Three Kinect sensors are arranged in a rectangular coordinate system. A sound source at some position in this coordinate system emits a sound of a certain duration, and the sound source direction angles obtained by the three Kinect sensors are recorded.
(2) Because each Kinect sensor receives the sound signal with a certain deviation, denoted by the deviation angle α, the detected sound source direction has a deviation range of [-α, α] around the detected line; this range is called the deviation sector. Every two Kinect sensors jointly determine a sound source region; this is called the deviation sector method. The center of gravity of the intersection region formed by the deviation sectors of every two Kinect sensors is then found by geometric means; this is called the geometric gravity-center method.
(3) Using the deviation sector method and the geometric gravity-center method, the centers of gravity of the sound source regions determined by every two Kinect sensors are found, three in total, and the mean of the three centroid coordinates is taken as the optimal position of the sound source.
(4) The optimal sound source position is sent to the mobile robot. The robot thus obtains its own position information, adjusts its direction of motion, and moves to the target position by sound localization, thereby realizing sound-localization navigation of the mobile robot.
The embodiment of the present invention uses the audio processing capability of the Kinect sensor to detect the sound source direction and thereby realize robot indoor positioning.
Compared with the prior art, the embodiment of the present invention has the following advantages:
The sound localization method is simple. A sound positioning system has strict real-time requirements, and an overly complex algorithm cannot satisfy them. The present invention adopts the common gravity-center method, which can determine the position of the sound source quickly.
The sound localization accuracy is high. The microphone array of the Kinect sensor and the background-noise suppression and echo cancellation of the Kinect software driver are fully exploited to eliminate the influence of virtual sound sources and background noise, detect the true sound source direction, and obtain the optimal position of the sound source.
The localization method is general. Because the sound source direction is detected acoustically by the Kinect, the direction can still be obtained even when there are obstacles between the Kinect and the sound source; moreover, the method can be used not only for indoor sound localization but equally outdoors.
It can be applied to accurate indoor robot positioning and navigation motion control.
The embodiment of the present invention also provides a robot positioning and navigation method, comprising:
1. Kinect sound-region gravity-center determination method
With reference to Fig. 1, the three Kinect sensors are laid out as shown so that the sound source is always detected within the optimal detection angle range of each sensor. Each Kinect sensor reports the sound source direction angle with respect to its center line, i.e. the dashed sound-source-angle reference line marked in the figure; facing the Kinect sensor, angles to the left of the dashed line are negative and angles to the right are positive. The sound source emits, at some position, a sound lasting 50 ms at a frequency of 16 kHz. The process of obtaining the centers of gravity of the intersection regions determined by the three Kinect sensors, combining the deviation sector method with the region gravity-center method, is as follows:
β denotes the sound source direction angle detected by the No. 1 Kinect sensor, and l_k1 denotes the detected line on which the sound source lies; the true sound source therefore lies in the sector region within α degrees of l_k1, and the No. 2 and No. 3 Kinect sensors are treated similarly. ψ denotes the sound source direction angle detected by the No. 2 Kinect sensor, and l_k2 the line on which the sound source lies as detected by that sensor; γ denotes the sound source direction angle detected by the No. 3 Kinect sensor, and l_k3 the corresponding detected line. From the above, β and ψ are negative values and γ is positive. (x_k1, y_k1), (x_k2, y_k2) and (x_k3, y_k3) denote the position coordinates of Kinect sensors No. 1, No. 2 and No. 3; these are easy to measure in practice and are therefore known parameters.
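As an illustration of the deviation sector just defined, the two bounding rays of one sensor's sector can be sketched in Python; the function name and the convention of absolute headings in degrees are illustrative assumptions, not part of the patent.

```python
import math

def sector_rays(sensor_xy, center_line_deg, detected_deg, alpha_deg=5.0):
    """Unit direction vectors bounding one Kinect's deviation sector.

    sensor_xy       -- (x, y) position of the sensor (measured, known)
    center_line_deg -- absolute heading of the sensor's center line
    detected_deg    -- reported source angle relative to the center line
                       (negative = left of the line, positive = right)
    alpha_deg       -- detection error, so the true source direction lies
                       within [detected - alpha, detected + alpha]
    Both rays emanate from sensor_xy.
    """
    lo = math.radians(center_line_deg + detected_deg - alpha_deg)
    hi = math.radians(center_line_deg + detected_deg + alpha_deg)
    ray_lo = (math.cos(lo), math.sin(lo))
    ray_hi = (math.cos(hi), math.sin(hi))
    return sensor_xy, ray_lo, ray_hi
```

With α = 0 the two rays coincide with the detected line itself; with α = 5° they straddle it symmetrically.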
Fig. 1 shows the sound source directions detected by the three Kinect sensors and the sound source regions determined by the pairwise intersection of the deviation sectors. Taking the region formed by the intersection of the deviation sectors of the No. 1 and No. 3 Kinect sensors as an example, the center of gravity of the intersection region is found as follows.
As in Fig. 2, l_k1^{+α} denotes the line l_k1 detected by the No. 1 Kinect sensor rotated counterclockwise by α degrees, and l_k1^{-α} the same line rotated clockwise by α degrees; likewise, l_k3^{+α} denotes the line l_k3 detected by the No. 3 Kinect sensor rotated counterclockwise by α degrees, and l_k3^{-α} the same line rotated clockwise by α degrees. A(x_A, y_A) denotes the intersection of l_k1^{+α} and l_k3^{-α}; B(x_B, y_B) the intersection of l_k1^{-α} and l_k3^{-α}; C(x_C, y_C) the intersection of l_k1^{-α} and l_k3^{+α}; and D(x_D, y_D) the intersection of l_k1^{+α} and l_k3^{+α}. The quadrilateral ABCD is the sound source region jointly determined by the No. 1 and No. 3 Kinect sensors. For convenience of describing how its center of gravity is found, the quadrilateral is drawn enlarged.
With reference to Fig. 3, N(x_N, y_N) denotes the center of gravity of triangle DAB, O(x_O, y_O) that of triangle ABC, P(x_P, y_P) that of triangle BCD, Q(x_Q, y_Q) that of triangle CDA, and R(x_R, y_R) that of quadrilateral ABCD.
Connecting one diagonal AC of quadrilateral ABCD divides it into triangles ABC and CDA, so the center of gravity of ABCD lies on the segment OQ joining the centroid O of triangle ABC and the centroid Q of triangle CDA; similarly, connecting the other diagonal BD divides ABCD into triangles DAB and BCD, so the center of gravity of ABCD likewise lies on segment NP. The center of gravity of quadrilateral ABCD is therefore the intersection point R of segments OQ and NP.
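The gravity-center construction above (intersection of segments NP and OQ) can be sketched in Python as follows; the helper names are illustrative, and the lines through the triangle centroids are assumed non-parallel, which holds for a proper convex quadrilateral.

```python
def tri_centroid(p, q, r):
    """Centroid of a triangle: the mean of its three vertices."""
    return ((p[0] + q[0] + r[0]) / 3.0, (p[1] + q[1] + r[1]) / 3.0)

def line_intersection(p1, p2, p3, p4):
    """Intersection of line p1-p2 with line p3-p4 (assumed non-parallel)."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    px = ((x1 * y2 - y1 * x2) * (x3 - x4) - (x1 - x2) * (x3 * y4 - y3 * x4)) / denom
    py = ((x1 * y2 - y1 * x2) * (y3 - y4) - (y1 - y2) * (x3 * y4 - y3 * x4)) / denom
    return (px, py)

def quad_centroid(a, b, c, d):
    """Center of gravity of uniform quadrilateral ABCD as the intersection
    of NP and OQ, where N, O, P, Q are the centroids of triangles
    DAB, ABC, BCD, CDA (the construction of Fig. 3)."""
    n = tri_centroid(d, a, b)
    o = tri_centroid(a, b, c)
    p = tri_centroid(b, c, d)
    q = tri_centroid(c, d, a)
    return line_intersection(n, p, o, q)
```

For a unit square the construction returns the expected center (0.5, 0.5).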
With reference to Fig. 4 and Fig. 5, quadrilateral EFGH is the sound source region jointly determined by the No. 1 and No. 2 Kinect sensors, and quadrilateral IJKL is the sound source region jointly determined by the No. 2 and No. 3 Kinect sensors. By the same method as above, the centers of gravity of the regions formed by the intersections of Kinect sensors No. 1 and No. 2, and of No. 2 and No. 3, are likewise found. Denoting the centers of gravity of quadrilaterals ABCD, EFGH and IJKL by R1(x_R1, y_R1), R2(x_R2, y_R2) and R3(x_R3, y_R3), the average of the three coordinates is the optimal position S(x_S, y_S) of the sound source.
2. Optimal sound source position algorithm
The method of determining the optimal sound source position comprises:
Step 1: initialize the parameters.
The position coordinates of the three Kinect sensors can all be measured in practice, i.e. (x_k1, y_k1), (x_k2, y_k2) and (x_k3, y_k3) are known parameters. The error angle α is set to 5° according to the technical specifications of the Kinect and actual experiments. The sound source and the three Kinect sensors are then started.
Step 2: find the intersection coordinates of the lines.
The sound source stops after emitting a sound lasting 50 ms. The sound source direction angle β obtained by the No. 1 Kinect sensor, the angle ψ obtained by the No. 2 Kinect sensor, and the angle γ obtained by the No. 3 Kinect sensor are recorded.
The following line equations can be written in point-slope form:
Line l_k1^{+α}: y = tan(45° + β + α)x (1)
Line l_k1^{-α}: y = tan(45° + β - α)x (2)
Line l_k3^{+α}: y = y_k3 + tan(135° + γ + α)(x - x_k3) (3)
Line l_k3^{-α}: y = y_k3 + tan(135° + γ - α)(x - x_k3) (4)
The intersection of lines l_k1^{+α} and l_k3^{-α} is A(x_A, y_A); solving the system formed by equations (1) and (4) gives the coordinates of A. The intersection of lines l_k1^{-α} and l_k3^{-α} is B(x_B, y_B); solving equations (2) and (4) gives the coordinates of B. The intersection of lines l_k1^{-α} and l_k3^{+α} is C(x_C, y_C); solving equations (2) and (3) gives the coordinates of C. The intersection of lines l_k1^{+α} and l_k3^{+α} is D(x_D, y_D); solving equations (1) and (3) gives the coordinates of D.
Step 3: find the center of gravity of the irregular quadrilateral.
By the triangle centroid coordinate formula:
Centroid N(x_N, y_N) of triangle DAB: x_N = (x_D + x_A + x_B)/3, y_N = (y_D + y_A + y_B)/3.
Centroid O(x_O, y_O) of triangle ABC: x_O = (x_A + x_B + x_C)/3, y_O = (y_A + y_B + y_C)/3.
Centroid P(x_P, y_P) of triangle BCD: x_P = (x_B + x_C + x_D)/3, y_P = (y_B + y_C + y_D)/3.
Centroid Q(x_Q, y_Q) of triangle CDA: x_Q = (x_C + x_D + x_A)/3, y_Q = (y_C + y_D + y_A)/3.
The following line equations can be written in two-point form:
Line through segment OQ: y = y_O + ((y_Q - y_O)/(x_Q - x_O))(x - x_O) (5)
Line through segment NP: y = y_N + ((y_P - y_N)/(x_P - x_N))(x - x_N) (6)
The intersection of segments OQ and NP is R1(x_R1, y_R1); solving the system formed by equations (5) and (6) gives the center of gravity R1 of quadrilateral ABCD. In the same way, following the method of steps 2 and 3, the centers of gravity R2(x_R2, y_R2) and R3(x_R3, y_R3) of quadrilaterals EFGH and IJKL are found.
Step 4: find the optimal sound source position.
The coordinates of the optimal sound source position S(x_S, y_S) are x_S = (x_R1 + x_R2 + x_R3)/3 and y_S = (y_R1 + y_R2 + y_R3)/3, and this point is taken to represent the position of the real sound source.
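Step 4 is a plain average of the three region centroids; a minimal sketch (function name illustrative):

```python
def optimal_source_position(r1, r2, r3):
    """Estimated source S: the mean of the three region centroids
    R1, R2, R3 (each an (x, y) pair)."""
    xs = (r1[0] + r2[0] + r3[0]) / 3.0
    ys = (r1[1] + r2[1] + r3[1]) / 3.0
    return (xs, ys)
```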
3. Kinect sound localization navigation algorithm for the robot
With reference to Fig. 6, a sound source is mounted on the robot and emits sound into the space. At some moment the robot is located at W(x_W, y_W), and the robot's target position is V(x_V, y_V). A magnetic compass is mounted on the robot, so the angle between the robot's own coordinate axis y1 and north can be measured. Without loss of generality, let δ be the angle between the coordinate axis y and north.
φ denotes the angle by which the robot deviates from its theoretical walking path. If the robot's control system and wheel mechanism were entirely free of error, giving the robot a go-straight command would make it walk a certain distance and arrive at the target V(x_V, y_V); the theoretical route of the robot is shown by the solid line WV in the figure. In practice, however, when a go-straight command is given, the robot always drifts from its original heading because of wheel mechanism errors, and it actually walks along the dashed line from W.
ε denotes the radius of the error circle centered on the target point V(x_V, y_V). Because of various other practical factors, such as the size of the robot itself, the position finally reached by the robot does not necessarily coincide exactly with the target point V(x_V, y_V), so a circle of uncertainty is defined according to the practical accuracy requirement: as long as the robot reaches the circle of uncertainty of radius ε centered on the target point V(x_V, y_V), it is considered to have arrived at the target point. The steps of the Kinect sound localization robot navigation algorithm are as follows:
Step 1:
From the robot's current position W(x_W, y_W) and the target position V(x_V, y_V), the angle θ between line WV and the positive horizontal x axis and the length d of segment WV can be calculated: θ = arctan((y_V - y_W)/(x_V - x_W)), d = sqrt((x_V - x_W)² + (y_V - y_W)²). To move toward the target V(x_V, y_V), the robot adjusts the angle between its y1 axis and north to 90° - θ - δ. The robot then starts walking from the starting point W(x_W, y_W).
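The angle and distance computation of Step 1 can be sketched in Python; atan2 is used instead of a bare arctangent for quadrant safety, and the function name and return convention are illustrative assumptions.

```python
import math

def heading_to_target(w, v, delta_deg):
    """Step 1 of the navigation algorithm.

    w, v      -- current position W and target position V as (x, y)
    delta_deg -- angle delta between the coordinate axis y and north
    Returns (theta, d, y1_north_angle): the angle of line WV with the
    +x axis in degrees, the length of WV, and the y1-axis/north angle
    the robot should set, i.e. 90 - theta - delta per the text.
    """
    theta = math.degrees(math.atan2(v[1] - w[1], v[0] - w[0]))
    d = math.hypot(v[0] - w[0], v[1] - w[1])
    return theta, d, 90.0 - theta - delta_deg
```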
Step 2:
The robot emits a sound lasting 50 ms every 1 s. Suppose at some moment the robot is located at U(x_U, y_U). The computer processes the sound source direction information obtained by the three Kinect sensors to obtain the robot's current position U(x_U, y_U), and sends this position to the robot wirelessly.
Step 3:
The robot calculates the deviation angle φ between line WU, joining the current position U(x_U, y_U) and the starting point W(x_W, y_W), and line WV, and automatically adjusts its y1 axis to move in the direction that reduces the deviation angle φ.
Step 4:
Steps 2 to 3 are repeated. Finally, when the computer, processing the sound source direction information obtained by the three Kinect sensors, finds that the robot's current position lies within the circle of uncertainty of radius ε centered on the target point V(x_V, y_V), the robot is considered to have arrived at the target position, completing the navigation task from the starting point W(x_W, y_W) to the target point V(x_V, y_V).
Step 5:
For the navigation task from point V(x_V, y_V) to the next target point Z(x_Z, y_Z), steps 1 to 4 are repeated.
By repeating the above steps continuously, the Kinect-based sound localization can realize continuous navigation tasks for the robot, which demonstrates that the above Kinect sound localization algorithm is correct and effective.
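The arrival test against the circle of uncertainty of radius ε can be sketched as follows; the function name is illustrative.

```python
import math

def reached_target(u, v, eps):
    """The robot is considered to have arrived when its estimated
    position u lies within the circle of uncertainty of radius eps
    centered on the target point v."""
    return math.hypot(u[0] - v[0], u[1] - v[1]) <= eps
```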
Fig. 7 is the block diagram of the robot's overall sound localization navigation control. A personal computer (PC) is connected to three Kinect sensors and a wireless data transmission module; the wireless data transmission module handles communication between the PC and the robot. The robot carries a core processor, motor drivers, a magnetic compass, a wireless data transmission module, and a sound source module. The core processor handles the robot's motion control, its communication with the PC, sensor data acquisition, and so on; the motor drivers amplify the power for the robot's motors; the magnetic compass measures the robot's angle relative to geographic north; and the sound source module emits the robot's sound signal so that the three Kinect sensors can realize robot sound localization.
The foregoing is only a preferred embodiment of the present invention. On this basis, those skilled in the art can make certain variations, and such variations, provided they do not depart from the spirit of the present invention, shall also fall within the protection scope of the present invention.

Claims (2)

1. A robot sound localization method, characterized by comprising:
obtaining, with at least two Kinects serving as sound transducers, the direction angles of the sound emitted by the robot;
determining, from the direction angles and the positions of the at least two Kinects, the deviation sector region in which the sound source determined by every two Kinects lies;
determining the geometric center of gravity of the intersection region of every two deviation sectors;
calculating, from the determined geometric centers of gravity, the optimal sound source position by the geometric gravity-center method, the optimal sound source position being the located position of the robot.
2. The robot sound localization method according to claim 1, characterized in that, after the optimal sound source position is calculated by the geometric gravity-center method, the method further comprises:
determining the route of the robot's movement from the target position and the optimal sound source position;
controlling the robot to move along the route to the target position.
CN201310455238.9A 2013-09-29 2013-09-29 Robot sound positioning method Expired - Fee Related CN103472434B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310455238.9A CN103472434B (en) 2013-09-29 2013-09-29 Robot sound positioning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310455238.9A CN103472434B (en) 2013-09-29 2013-09-29 Robot sound positioning method

Publications (2)

Publication Number Publication Date
CN103472434A true CN103472434A (en) 2013-12-25
CN103472434B CN103472434B (en) 2015-05-20

Family

ID=49797354

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310455238.9A Expired - Fee Related CN103472434B (en) 2013-09-29 2013-09-29 Robot sound positioning method

Country Status (1)

Country Link
CN (1) CN103472434B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105163209A (en) * 2015-08-31 2015-12-16 深圳前海达闼科技有限公司 Voice receiving processing method and voice receiving processing device
CN106291469A (en) * 2016-10-18 2017-01-04 武汉轻工大学 A kind of three dimensions source of sound localization method and system
CN106483504A (en) * 2015-08-31 2017-03-08 松下知识产权经营株式会社 Sound source detection device
CN108525259A (en) * 2018-04-27 2018-09-14 湖南环境生物职业技术学院 A kind of system for football positioning ball test
CN108579057A (en) * 2018-04-27 2018-09-28 长沙修恒信息科技有限公司 A kind of football positioning ball test method
CN109270491A (en) * 2018-08-17 2019-01-25 安徽信息工程学院 Indoor acoustic location device based on Kinect
CN110288984A (en) * 2019-05-17 2019-09-27 南昌大学 A kind of audio recognition method based on Kinect

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001023104A2 (en) * 1999-09-29 2001-04-05 1...Limited Method and apparatus to direct sound using an array of output transducers
CN201210187Y (en) * 2008-06-13 2009-03-18 河北工业大学 Robot automatically searching sound source
CN102707262A (en) * 2012-06-20 2012-10-03 太仓博天网络科技有限公司 Sound localization system based on microphone array

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001023104A2 (en) * 1999-09-29 2001-04-05 1...Limited Method and apparatus to direct sound using an array of output transducers
CN201210187Y (en) * 2008-06-13 2009-03-18 河北工业大学 Robot automatically searching sound source
CN102707262A (en) * 2012-06-20 2012-10-03 太仓博天网络科技有限公司 Sound localization system based on microphone array

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
居太亮 et al.: "基于任意麦克风阵列的声源二维DOA估计算法研究" [Research on 2-D DOA estimation algorithms for sound sources based on arbitrary microphone arrays], 《通信学报》 (Journal on Communications), vol. 26, no. 08, 31 August 2005 (2005-08-31), pages 129-133 *

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105163209A (en) * 2015-08-31 2015-12-16 深圳前海达闼科技有限公司 Voice receiving processing method and voice receiving processing device
CN106483504A (en) * 2015-08-31 2017-03-08 松下知识产权经营株式会社 Sound source detection device
US10306360B2 (en) 2015-08-31 2019-05-28 Cloudminds (Shenzhen) Technologies Co., Ltd. Method and device for processing received sound and memory medium, mobile terminal, robot having the same
CN106483504B (en) * 2015-08-31 2021-07-30 松下知识产权经营株式会社 Sound source detecting device
CN106291469A (en) * 2016-10-18 2017-01-04 武汉轻工大学 A kind of three dimensions source of sound localization method and system
CN106291469B (en) * 2016-10-18 2018-11-23 武汉轻工大学 A kind of three-dimensional space source of sound localization method and system
CN108525259A (en) * 2018-04-27 2018-09-14 湖南环境生物职业技术学院 A kind of system for football positioning ball test
CN108579057A (en) * 2018-04-27 2018-09-28 长沙修恒信息科技有限公司 A kind of football positioning ball test method
CN108525259B (en) * 2018-04-27 2020-11-27 湖南环境生物职业技术学院 System for be used for football location ball to test
CN109270491A (en) * 2018-08-17 2019-01-25 安徽信息工程学院 Indoor acoustic location device based on Kinect
CN110288984A (en) * 2019-05-17 2019-09-27 南昌大学 A kind of audio recognition method based on Kinect

Also Published As

Publication number Publication date
CN103472434B (en) 2015-05-20

Similar Documents

Publication Publication Date Title
CN103472434B (en) Robot sound positioning method
EP3168705B1 (en) Domestic robotic system
EP3507572B1 (en) Apparatus and method for providing vehicular positioning
CN106383596B (en) Virtual reality anti-dizzy system and method based on space positioning
TWI505801B (en) Indoor robot and method for indoor robot positioning
Kriegman et al. A mobile robot: Sensing, planning and locomotion
EP3136128A1 (en) Trajectory matching using peripheral signal
CN103412565B (en) A kind of robot localization method with the quick estimated capacity of global position
Harapanahalli et al. Autonomous Navigation of mobile robots in factory environment
CN103822625B (en) Line-tracking navigation method and device for intelligent robot
CN111149072A (en) Magnetometer for robot navigation
KR20180080498A (en) Robot for airport and method thereof
JP2012137909A (en) Movable body remote control system and control program for the same
CN104703118A (en) System of indoor robot for locating mobile terminal based on bluetooth technology
JP6636260B2 (en) Travel route teaching system and travel route teaching method for autonomous mobile object
CN107562054A (en) The independent navigation robot of view-based access control model, RFID, IMU and odometer
RU2740229C1 (en) Method of localizing and constructing navigation maps of mobile service robot
CN103616025A (en) Three-dimensional field staff positioning navigation system
CN103389486A (en) Control method and electronic device
Ross et al. Toward refocused optical mouse sensors for outdoor optical flow odometry
CN107450546A (en) Obstacle Avoidance based on GPS and ultrasonic radar
JP2016080460A (en) Moving body
TW201831920A (en) Auto moving device
JP2018173707A (en) Person estimation system and estimation program
Tuvshinjargal et al. Hybrid motion planning method for autonomous robots using kinect based sensor fusion and virtual plane approach in dynamic environments

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201126

Address after: Area A129, 4th floor, building 4, Baitai Industrial Park, Yazhou Bay science and Technology City, Yazhou District, Sanya City, Hainan Province, 572024

Patentee after: Nanhai innovation and development base of Sanya Harbin Engineering University

Address before: 150001 Heilongjiang, Nangang District, Nantong street,, Harbin Engineering University, Department of Intellectual Property Office

Patentee before: HARBIN ENGINEERING University

TR01 Transfer of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20150520

Termination date: 20210929

CF01 Termination of patent right due to non-payment of annual fee