CN106042005B - Working method of a bionic eye location tracking system - Google Patents

Working method of a bionic eye location tracking system

Info

Publication number
CN106042005B
CN106042005B (application number CN201610379179.5A)
Authority
CN
China
Prior art keywords
head, module, sub, coordinate, speaker
Prior art date
Legal status
Active
Application number
CN201610379179.5A
Other languages
Chinese (zh)
Other versions
CN106042005A (en)
Inventor
樊炳辉
李镇
刘勰
柏山清
尹逊敏
Current Assignee
Shandong University of Science and Technology
Original Assignee
Shandong University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Shandong University of Science and Technology
Priority to CN201610379179.5A
Publication of CN106042005A
Application granted
Publication of CN106042005B

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 19/00: Accessories fitted to manipulators, e.g. for monitoring, for viewing; safety devices combined with or specially adapted for use in connection with manipulators
    • B25J 19/02: Sensing devices
    • B25J 19/04: Viewing devices
    • B25J 19/026: Acoustical sensing devices
    • B25J 11/00: Manipulators not otherwise provided for
    • B25J 13/00: Controls for manipulators
    • B25J 13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices

Abstract

The invention discloses a bionic eye location tracking system and its working method. The system mainly consists of six parts: a bionic eye head module, a sound collection sensor array, a camera module, a head motion control module, a controller module and an upper computer module. By combining three-dimensional sound source localization, visual processing algorithms and automatic control through these subsystems, the invention realizes a bionic eye location tracking system and working method that keep the bionic eye oriented toward the centre of the speaker's face contour in real time, so that a bionic device appears more vivid, lively and friendly when interacting with people and the interaction becomes more engaging.

Description

Working method of a bionic eye location tracking system
Technical field
The present invention relates to location tracking systems and working methods by which a bionic eye tracks a target, and in particular to a bionic eye location tracking system and working method that direct the binocular gaze of the bionic eye onto the face of the person speaking.
Background art
At present, various bio-robots (humanoid, animal-like, etc.) perceive their surroundings mainly through bionic eyes and bionic ears, which acquire information about the external environment.
In bionic eye design, two cameras are usually installed to act as the bionic eyes and acquire video information. A mathematical model of the overall bionic eye structure is established and, using visual processing algorithms, three-dimensional information about target objects is computed and faces are recognized. The related technology is comparatively mature and has been widely applied in industry, the military, medicine and many other fields.
However, the cameras currently serving as bionic eyes are either fixedly mounted in one position or have their direction adjusted manually by an operator. In general they cannot automatically orient toward and track the current speaker, so the interaction lacks a vivid and friendly communicative effect.
As for sound source localization techniques analogous to a bionic ear, sound signal acquisition devices such as microphones are already used to localize sound sources; the technique is well known in the art, see for example: Zheng Zhenzhen, Feng Huajun, Shen Changyu, et al. A three-dimensional sound source localization algorithm based on coordinate system transformation [J]. Journal of Zhejiang University, 2008, 42(2): 341-343 (hereinafter reference 1), which describes sound source localization based on the time difference of arrival (TDOA) and gives a three-dimensional localization algorithm based on coordinate system transformation.
However, existing sound source localization methods rely solely on sound signal acquisition devices such as microphones to collect the signal. The information source is single, the stability is low and the error is large; when the sound source signal is disturbed by external factors, such methods cannot locate the speaker precisely.
At an exhibition or on other public occasions where people gather around, there has so far been no technique or method for making the two cameras of a bionic eye gaze vividly and accurately at the speaker's face, automatically follow the speaker's movement by rotating the eyes (the two cameras), and keep the face of the person in conversation at the centre of the captured image in real time.
Summary of the invention
To overcome the technical deficiency that existing bionic eyes cannot automatically gaze at and track a speaking face, the present invention builds on the prior art of bionic eyes and bionic ears and provides a bionic eye location tracking system that makes the two cameras of the bionic eye automatically gaze at and track the speaking face.
The present invention also provides a working method for this location tracking system.
To this end, the bionic eye location tracking system of the invention mainly consists of the following six parts: a bionic eye head module, a sound collection sensor array, a camera module, a head motion control module, a controller module and an upper computer module, wherein:
The bionic eye head module comprises a head base, a carrying platform and left and right sub-heads. The head base and the carrying platform are connected by a horizontal rotation joint, so that the horizontal rotation of the carrying platform on the head base imitates the neck rotation of a bio-robot. The left and right sub-heads are mounted on the carrying platform, and each sub-head has its own horizontal rotation frame and pitch rotation frame, so every sub-head has two rotary joints, one for horizontal rotation and one for pitch. Two cameras are mounted on the pitch rotation frames of the left and right sub-heads and act as the eyes of the bionic eye; through the horizontal and pitch rotations of the two sub-heads, the pointing directions of the two cameras can be adjusted independently to imitate the rotation of the eyes. The bionic eye head module therefore has five degrees of freedom in total, and a neck coordinate system that rotates horizontally together with the carrying platform is bound at the centre of the carrying platform.
The sound collection sensor array acts as the bionic ears of the system. Four microphones with identical characteristics are arranged around the centre of the head base in the spatial geometry of a regular tetrahedron, centred on the pivot of the bionic eye head base, and the centre of the head base pivot is taken as the origin of the bionic eye head base coordinate system. From the collected sound signals and a sound source localization algorithm (for example the localization method of reference 1 mentioned in the background section), the three-dimensional coordinates of the speaker's mouth relative to the base coordinate system are obtained.
The camera module consists of two independent cameras mounted on the pitch rotation frames of the left and right sub-heads. They act as the eyes of the bionic eye, acquire video information and transmit the video data to the upper computer module.
The head motion control module comprises the horizontal rotation drive motors and pitch rotation drive motors mounted on the left and right sub-heads, the horizontal rotation drive motor of the carrying platform, and a data transmission module. According to the motor rotation signals output by the controller module, the head motion control module drives the motor of each corresponding joint.
The controller module, on the one hand, collects the output signals of the sound collection sensor array, runs the sound source localization algorithm, obtains the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system and transmits these coordinates to the upper computer module; on the other hand, it receives the control instructions and data sent by the upper computer module, drives each motor to rotate by the corresponding angle, and returns the motion state of the system to the upper computer module.
The upper computer module includes: a program for deriving the eye horizontal rotation angles α1 and β1 of the bionic eye head, a program for deriving the horizontal rotation angle θ of the carrying platform, a program for solving the joint spaces of the left and right sub-heads of the bionic eye head module, the visual processing algorithm program, and the communication program with the controller.
The working method of the present invention mainly comprises the following steps:
Step 1, system initialization: the system is powered on, the sound collection sensor array enters the working state, the upper computer module displays the video images acquired by the camera module, and the controller module resets all motors to their initial angles.
Step 2, sound source localization: the controller module reads the audio data collected by the sound collection sensor array, runs the sound source localization algorithm to obtain the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system, and transmits these coordinates to the upper computer module.
Step 3, solving the eye horizontal rotation angles, i.e. solving the horizontal rotation angle of each of the two sub-heads of the bionic eye head module.
The upper computer module converts the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system into coordinates relative to the neck coordinate system, and solves, with the neck in its current un-rotated state, the horizontal rotation angle each eye would need in order to turn toward the speaker's mouth. If either angle exceeds ±30 degrees (the set range), the neck must be rotationally adjusted, and the method proceeds to step 4; if neither horizontal angle exceeds ±30 degrees (the set range), no neck adjustment is needed and the method proceeds to step 5.
Step 4, rotational adjustment of the neck joint: solving the horizontal rotation angle of the carrying platform.
The upper computer module first uses the three-dimensional coordinates of the speaker's mouth relative to the neck coordinate system to solve, by inverse calculation, the horizontal rotation angle the neck needs so that the carrying platform turns to face the speaker's mouth squarely. The upper computer module sends this neck rotation angle to the controller module, which drives the horizontal rotation drive motor of the neck by the corresponding angle, so that the carrying platform, and with it the neck coordinate system, turns to face the speaker's mouth squarely. The method then proceeds to step 5, where the joint spaces of the left and right sub-heads are solved relative to the current neck coordinate system so that the two eyes each turn toward the speaker's mouth.
Step 5, solving the joint spaces of the left and right sub-heads of the bionic eye head module: according to the position of the speaker's mouth relative to the current neck coordinate system, the upper computer module solves the joint spaces of the left and right sub-heads, obtaining by inverse calculation the angles through which the pitch rotation drive motor and horizontal rotation drive motor of each sub-head should rotate. These angles are sent to the controller module, which drives the pitch and horizontal rotation motors of the two sub-heads by the corresponding angles, so that the two eyes of the bionic eye are preliminarily aligned with the speaker's mouth.
At this point, the preliminary alignment of the two cameras (eyes) with the speaker's mouth is complete.
Step 6, adjustment of the eye directions, i.e. correction of the rotation angle of each rotary joint of the left and right sub-heads of the bionic eye head module. The upper computer module runs a face recognition algorithm to obtain the rectangular outer contour of the speaking face and the two-dimensional coordinates of its centre in the video image coordinate system. From the offset between these coordinates and the image centre, correction values are obtained for the rotation angles of the pitch rotation drive motors and horizontal rotation drive motors of the left and right sub-heads. These correction values are then sent to the controller module, which fine-tunes the pitch and horizontal rotation motors of the two sub-heads by the corresponding angles, so that the two eyes of the bionic eye each point precisely at the centre of the speaker's face contour and the centre of the rectangular face contour appears at the centre of the video image captured by each camera.
Step 7: if the controller module detects that a new speaker has appeared, steps 2 to 6 are executed in a loop so that the newly appearing speaking face is tracked in real time; if no new speaker appears, step 6 is executed in a loop so that the corresponding degrees of freedom of the two eyes follow the movement of the current speaking face.
Step 8: whether the whole system needs to continue working is checked; if so, the method returns to step 7 and again checks whether a new speaker has appeared; if not, the system exits and shuts down.
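Read together, steps 1 to 8 form a closed control loop. The following Python sketch summarizes only the decision flow of steps 3 to 5 under stated assumptions: the ±30° limit is the set range above, the 120 mm sub-head offset is taken from Embodiment 1, and the atan2 geometry is a reconstruction rather than the patent's printed formulas; motor commands and the step 6 correction are omitted.

```python
import math

PAN_LIMIT = 30.0   # the ±30° set range of step 3
PY = 120.0         # assumed lateral offset of each sub-head from the neck centre (mm)

def plan_gaze(mouth_base):
    """Decision flow of steps 3-5 for a mouth position (x, y, z) given in the
    head base frame with the neck initially un-rotated. Returns the neck
    rotation and the left/right eye pan angles in degrees. Only a sketch."""
    x, y, _ = mouth_base
    left  = math.degrees(math.atan2(y - PY, x))          # step 3
    right = math.degrees(math.atan2(y + PY, x))
    neck = 0.0
    if max(abs(left), abs(right)) > PAN_LIMIT:           # step 4
        neck = math.degrees(math.atan2(y, x))            # face the mouth squarely
        th = math.radians(neck)                          # re-express the mouth in the
        xn = x * math.cos(th) + y * math.sin(th)         # rotated neck frame and
        yn = -x * math.sin(th) + y * math.cos(th)        # re-solve the pans (step 5)
        left  = math.degrees(math.atan2(yn - PY, xn))
        right = math.degrees(math.atan2(yn + PY, xn))
    return neck, left, right

print(plan_gaze((6000.0, 2000.0, 3000.0)))   # neck stays 0°, pans ≈ 17.4°, 19.5°
print(plan_gaze((3000.0, 3000.0, 3000.0)))   # neck ≈ 45°, pans ≈ ∓1.6°
```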
The advantages of the present invention are explained below with reference to its operating principle:
The two cameras serving as the bionic eyes are mounted on a head with five degrees of freedom. Three-dimensional sound source localization gives a preliminary three-dimensional location of the speaker's mouth. By solving the head joint space, the corresponding rotation angle of each joint is obtained and the joints are rotated so that the bionic eyes stare in the preliminarily located direction. Face recognition is then used to find the precise direction of the centre of the speaker's face contour, and the angles of the relevant head joints are fine-tuned until the speaker's face contour is moved to the centre of the images captured by the two cameras serving as the bionic eyes. When the speaker walks about, the bionic eyes automatically follow the changing position of the speaking face, ensuring that the speaker's face stays at the centre of the captured images in real time, until another person starts to speak, whereupon the bionic eyes gaze at the new speaking face according to the same working process.
In short, by combining three-dimensional sound source localization, visual processing algorithms and automatic control, the invention realizes a bionic eye location tracking system and working method that keep the bionic eye oriented toward the centre of the speaker's face contour in real time, so that a bionic device appears more vivid, lively and friendly when interacting with people and the interaction becomes more engaging.
Brief description of the drawings
Fig. 1 is an overall block diagram of the bionic eye location tracking system;
Fig. 2 is a schematic diagram of the overall structure of the bionic eye location tracking system;
Fig. 3 is a schematic diagram of the structure of the bionic eye head module;
Fig. 4 is a schematic diagram of the structure of the head motion control module;
Fig. 5 is a schematic diagram of the spatial relationship between the centre of the speaker's face contour and the two camera image coordinate systems;
Fig. 6 is a schematic diagram of the relationships between the coordinate systems at the initial position;
Fig. 7 is a workflow diagram of the bionic eye location tracking system.
Reference numerals:
Fig. 1: 100 sound collection sensor array; 200 camera module; 300 bionic eye head module; 400 head motion control module; 500 controller module; 600 upper computer module;
Fig. 2: 100 sound collection sensor array; 101 No. 1 microphone; 102 No. 2 microphone; 103 No. 3 microphone; 104 No. 4 microphone; 200 camera module; 201 left camera; 202 right camera; 300 bionic eye head module;
Fig. 3: 301 head base; 302 carrying platform; 303 left sub-head; 304 right sub-head; 305 horizontal rotation frame of the left sub-head; 306 pitch rotation frame of the left sub-head; 307 horizontal rotation frame of the right sub-head; 308 pitch rotation frame of the right sub-head;
Fig. 4: 401 horizontal rotation drive motor of the carrying platform; 402 horizontal rotation drive motor of the left sub-head; 403 pitch rotation drive motor of the left sub-head; 404 horizontal rotation drive motor of the right sub-head; 405 pitch rotation drive motor of the right sub-head.
Detailed description of the embodiments
The bionic eye location tracking system and working method of the present invention are described below with reference to the accompanying drawings.
The bionic eye location tracking system of the present invention mainly consists of the following six parts: the sound collection sensor array 100, the camera module 200, the bionic eye head module 300, the head motion control module 400, the controller module 500 and the upper computer module 600, as shown in Fig. 1, wherein:
The sound collection sensor array 100 acts as the bionic ears of the system and consists of No. 1 microphone 101, No. 2 microphone 102, No. 3 microphone 103 and No. 4 microphone 104. As shown in Fig. 2, the microphones are centred on the pivot of the bionic eye head base 301 and arranged around the centre of the head base in the spatial geometry of a regular tetrahedron, and the centre of the head base 301 pivot is taken as the origin of the bionic eye head base coordinate system O0, as shown in Fig. 6. From the collected sound signals and the sound source localization algorithm, the three-dimensional coordinates of the speaker's mouth relative to the base coordinate system are obtained.
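The localization itself follows the TDOA-based method of reference 1 and is not reproduced in this patent text. Purely as an illustration of how four microphones around the base can yield a direction, the sketch below uses a far-field least-squares estimate; the microphone spacing, the speed of sound and the far_field_direction helper are assumptions, and the real algorithm also recovers the distance to the mouth.

```python
import numpy as np

C = 343.0   # speed of sound in m/s (assumed)
a = 0.20    # assumed edge length of the regular tetrahedron of microphones (m)

# Microphone positions centred on the head base pivot; the actual spacing used
# in the patent is not stated.
MICS = np.array([[ 1,  1,  1],
                 [ 1, -1, -1],
                 [-1,  1, -1],
                 [-1, -1,  1]]) * (a / (2 * np.sqrt(2)))

def far_field_direction(tdoas):
    """Least-squares far-field direction of arrival from the three time
    differences measured against microphone No. 1 (seconds). Returns a unit
    vector in the base frame; range is not recovered in this simplification."""
    A = MICS[0] - MICS[1:]                 # rows: m1 - mi
    b = C * np.asarray(tdoas, dtype=float)
    u, *_ = np.linalg.lstsq(A, b, rcond=None)
    return u / np.linalg.norm(u)

# Synthetic check: a mouth 7 m away should give back roughly its bearing.
mouth = np.array([6.0, 2.0, 3.0])
t = np.linalg.norm(MICS - mouth, axis=1) / C
print(far_field_direction(t[1:] - t[0]))   # ~ mouth / |mouth|
```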
The camera module 200 consists of the left camera 201 and the right camera 202. As shown in Fig. 2, the left camera 201 is mounted on the pitch rotation frame 306 of the left sub-head 303 and the right camera 202 is mounted on the pitch rotation frame 308 of the right sub-head 304. They act as the eyes of the bionic eye, acquire video information and transmit the video data to the upper computer module 600.
The bionic eye head module 300 comprises a head base 301, a carrying platform 302, a left sub-head 303 and a right sub-head 304, as shown in Fig. 3. The head base 301 and the carrying platform 302 are connected by a horizontal rotation joint, so that the horizontal rotation of the carrying platform on the head base 301 imitates the neck rotation of a bio-robot. The left sub-head 303 and the right sub-head 304 are mounted on the carrying platform 302; the left sub-head 303 has a horizontal rotation frame 305 and a pitch rotation frame 306, and the right sub-head 304 has a horizontal rotation frame 307 and a pitch rotation frame 308, so every sub-head has two rotary joints, one for horizontal rotation and one for pitch. The two cameras mounted on the pitch rotation frames of the left and right sub-heads act as the eyes of the bionic eye; through the horizontal and pitch rotations of the two sub-heads, the pointing directions of the cameras on the two sub-heads can be adjusted independently to imitate the rotation of the eyes. The bionic eye head module 300 therefore has five degrees of freedom in total, and a neck coordinate system O1 that rotates horizontally together with the carrying platform 302 is bound at the centre of the carrying platform 302, as shown in Fig. 6.
The head motion control module 400 comprises the horizontal rotation drive motor 402 and pitch rotation drive motor 403 on the left sub-head 303, the horizontal rotation drive motor 404 and pitch rotation drive motor 405 on the right sub-head 304, the horizontal rotation drive motor 401 of the carrying platform 302, and the data transmission module 406, as shown in Fig. 4. According to the motor rotation signals output by the controller module 500, the head motion control module 400 drives the motor of each corresponding joint.
The controller module 500, on the one hand, collects the output signals of the sound collection sensor array 100, runs the sound source localization algorithm, obtains the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0 and transmits these coordinates to the upper computer module 600, as shown in Fig. 1; on the other hand, it receives the control instructions and data sent by the upper computer module 600, drives each motor to rotate by the corresponding angle, and returns the motion state of the system to the upper computer module 600.
The upper computer module 600 includes: a program for deriving the eye horizontal rotation angles α1 and β1 of the bionic eye head, a program for deriving the horizontal rotation angle θ of the carrying platform 302, a program for solving the joint spaces of the left and right sub-heads of the bionic eye head module, the visual processing algorithm program, and the communication program with the controller 500.
The specific working method by which the present invention tracks and locates the speaking face with the bionic eyes mainly comprises the following steps:
Step 1, system initialization: the system is powered on, the sound collection sensor array 100 enters the working state, the upper computer module 600 displays the video images acquired by the camera module 200, and the controller module 500 resets the horizontal rotation drive motor 402 and pitch rotation drive motor 403 of the left sub-head 303, the horizontal rotation drive motor 404 and pitch rotation drive motor 405 of the right sub-head 304, and the horizontal rotation drive motor 401 of the carrying platform 302 to their initial angles.
Step 2, sound source localization: the controller module 500 reads the data collected by the sound collection sensor array 100, runs the sound source localization algorithm to obtain the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0, and transmits these coordinates to the upper computer module 600.
Step 3, solving the eye horizontal rotation angles, i.e. solving the horizontal rotation angles α1 and β1 of the two sub-heads of the bionic eye head module 300.
The upper computer module 600 converts the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0 into coordinates relative to the neck coordinate system O1, and solves, with the carrying platform 302 in its current un-rotated state, the rotation angle α1 of the horizontal rotation drive motor of the left sub-head 303 and the rotation angle β1 of the horizontal rotation drive motor of the right sub-head 304 needed for the left camera 201 and the right camera 202 to turn toward the speaker's mouth. If either angle exceeds ±30° (the set range), the neck of the carrying platform 302 must be rotationally adjusted, and the method proceeds to step 4; if neither the angle α1 of the left sub-head 303 nor the angle β1 of the right sub-head 304 exceeds ±30° (the set range), no rotational adjustment of the carrying platform 302 is needed and the method proceeds to step 5.
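Expressing the mouth coordinates in the neck coordinate system O1 only requires removing the translation Pz1 and the current neck rotation θ. The patent does not print the transform A0 explicitly, so the following snippet is a reconstruction under that reading; the value Pz1 = 150 mm is taken from Embodiment 1.

```python
import numpy as np

def base_to_neck(p_base, theta_deg, pz1=150.0):
    """Coordinates of a point given in the head base frame O0 (mm), expressed
    in the neck frame O1 that sits pz1 above the base origin and is rotated
    by theta (degrees, counter-clockwise seen from above) about the vertical
    axis. A sketch of the inverse of the transform A0 described in step 3.2."""
    th = np.radians(theta_deg)
    rz_inv = np.array([[ np.cos(th), np.sin(th), 0.0],
                       [-np.sin(th), np.cos(th), 0.0],
                       [ 0.0,        0.0,        1.0]])
    return rz_inv @ (np.asarray(p_base, dtype=float) - np.array([0.0, 0.0, pz1]))

print(base_to_neck([6000, 2000, 3000], theta_deg=0.0))    # [6000. 2000. 2850.]
print(base_to_neck([3000, 3000, 3000], theta_deg=45.0))   # [~4242.6, ~0, 2850.]
```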
Step 4, rotational adjustment of the neck joint: solving the horizontal rotation angle θ of the carrying platform 302.
The upper computer module 600 first uses the three-dimensional coordinates of the speaker's mouth relative to the neck coordinate system O1 to solve, by inverse calculation, the horizontal rotation angle θ the neck of the carrying platform 302 needs so that the carrying platform 302 turns to face the speaker's mouth squarely. The upper computer module 600 sends this angle θ to the controller module 500, which drives the horizontal rotation drive motor 401 of the carrying platform 302 by the corresponding angle θ, so that the carrying platform 302, and with it the X axis of the neck coordinate system O1, turns to face the speaker's mouth squarely. The method then proceeds to step 5, where the joint spaces of the left and right sub-heads of the bionic eye head module are solved relative to the current neck coordinate system O1 so that the left camera 201 and the right camera 202 each turn toward the speaker's mouth.
Step 5, solving the joint spaces of the left and right sub-heads of the bionic eye head module 300: according to the position of the speaker's mouth relative to the current neck coordinate system O1, the upper computer module 600 solves the joint spaces of the left and right sub-heads, obtaining by inverse calculation the rotation angles α1 and α2 of the horizontal rotation drive motor 402 and pitch rotation drive motor 403 of the left sub-head 303 and the rotation angles β1 and β2 of the horizontal rotation drive motor 404 and pitch rotation drive motor 405 of the right sub-head 304. These angles are sent to the controller module 500, which drives the horizontal rotation drive motor 402 and pitch rotation drive motor 403 of the left sub-head 303 and the horizontal rotation drive motor 404 and pitch rotation drive motor 405 of the right sub-head 304 by the corresponding angles, so that the left camera 201 and the right camera 202 of the bionic eye are preliminarily aligned with the speaker's mouth.
At this point, the preliminary alignment of the two cameras (eyes) with the speaker's mouth is complete.
Step 6, adjustment of the eye directions, i.e. correction of the rotation angle of each rotary joint of the left and right sub-heads of the bionic eye head module.
The upper computer module 600 runs a face recognition algorithm to obtain the rectangular outer contour of the speaking face and the two-dimensional coordinates (Xl, Yl) and (Xr, Yr) of its centre in the video image coordinate system Ol-XlYl of the left camera 201 and the video image coordinate system Or-XrYr of the right camera 202, as shown in Fig. 5. With the image width W and height H, the offsets of these coordinates from the image centre (W/2, H/2) give the correction value Δα1 for the rotation angle α1 of the horizontal rotation drive motor 402 of the left sub-head 303, the correction value Δα2 for the rotation angle α2 of the pitch rotation drive motor 403, the correction value Δβ1 for the rotation angle β1 of the horizontal rotation drive motor 404 of the right sub-head 304, and the correction value Δβ2 for the rotation angle β2 of the pitch rotation drive motor 405. These correction values are then sent to the controller module 500, which fine-tunes the horizontal rotation drive motor 402 and pitch rotation drive motor 403 of the left sub-head 303 and the horizontal rotation drive motor 404 and pitch rotation drive motor 405 of the right sub-head 304 by the corresponding angles, so that the left camera 201 and the right camera 202 of the bionic eye each point precisely at the centre of the rectangular contour of the speaking face and this centre appears at the centre of the video image captured by each camera.
Step 7: if the controller module 500 detects that a new speaker has appeared, steps 2 to 6 are executed in a loop so that the newly appearing speaking face is tracked in real time; if no new speaker appears, step 6 is executed in a loop so that the corresponding degrees of freedom of the two eyes follow the movement of the current speaking face.
Step 8: whether the whole system needs to continue working is checked; if so, the method returns to step 7 and again checks whether a new speaker has appeared; if not, the system exits and shuts down.
In the working method of the present invention, the specific algorithm for solving the eye horizontal rotation angles in step 3, the calculation method for the rotational adjustment of the neck in step 4, and the method for solving the joint spaces of the left and right sub-heads of the bionic eye head module in step 5 are techniques commonly used in the art. For ease of examination, they are described in detail below.
Throughout the description of the invention, all angles are in degrees and all lengths are in millimetres (mm).
The specific algorithm for solving the eye horizontal rotation angles in step 3 comprises the following steps:
Step 3.1: as shown in Fig. 6, the following coordinate systems are established on the bionic eye head module 300: the fixed head base coordinate system O0-X0Y0Z0, the horizontally rotating neck coordinate system O1-X1Y1Z1, the base coordinate system O2-X2Y2Z2 of the left sub-head 303, the horizontal rotation coordinate system O3-X3Y3Z3 of the left sub-head 303, the pitch rotation coordinate system O4-X4Y4Z4 of the left sub-head 303, the base coordinate system O5-X5Y5Z5 of the right sub-head 304, the horizontal rotation coordinate system O6-X6Y6Z6 of the right sub-head 304, and the pitch rotation coordinate system O7-X7Y7Z7 of the right sub-head 304, briefly referred to as the O0, O1, O2, O3, O4, O5, O6 and O7 coordinate systems, wherein:
The O0 coordinate system is built at the centre of the bionic eye head base 301 pivot; the three-dimensional coordinates of the speaker's mouth obtained by the sound source localization algorithm are expressed in this coordinate system, which is the base coordinate system of the whole bionic eye head.
The O1 coordinate system is built at the centre of the top of the carrying platform 302 and is bound to the carrying platform 302; it is equivalent to a neck coordinate system. The O1 coordinate system follows the horizontal rotation of the carrying platform 302, and its rotation angle is denoted θ.
The O2 coordinate system is built in the bottom centre plane of the horizontal rotation joint axis of the left sub-head 303 and is bound to the carrying platform 302; it is the base coordinate system of the left sub-head 303.
The origin of the O3 coordinate system is built directly above the horizontal rotation joint axis of the left sub-head 303, level with the pitch joint axis. The O3 coordinate system is bound to the horizontal rotation frame 305 of the left sub-head 303 and rotates horizontally with it; its rotation angle is denoted α1.
The origin of the O4 coordinate system coincides with the origin of the O3 coordinate system. The O4 coordinate system is bound to the pitch rotation frame 306 of the left sub-head 303 and rotates with its pitch motion. Because the positive X4 axis of the O4 coordinate system coincides with the optical axis of the left camera 201 fixed on the pitch rotation frame 306, as shown in Fig. 5, the O4 coordinate system is also called the coordinate system of the left camera 201; the left camera 201 rotates together with the horizontal rotation frame 305 and the pitch rotation frame 306 of the left sub-head 303. The rotation of the O4 coordinate system with the pitch rotation frame 306 of the left sub-head 303 is denoted α2.
The O5 coordinate system is built in the bottom centre plane of the horizontal rotation joint axis of the right sub-head 304 and is bound to the carrying platform 302; it is the base coordinate system of the right sub-head 304.
The origin of the O6 coordinate system is built directly above the horizontal rotation joint axis of the right sub-head 304, level with the pitch joint axis. The O6 coordinate system is bound to the horizontal rotation frame 307 of the right sub-head and rotates horizontally with it; its rotation angle is denoted β1.
The origin of the O7 coordinate system coincides with the origin of the O6 coordinate system. The O7 coordinate system is bound to the pitch rotation frame 308 of the right sub-head 304 and rotates with its pitch motion. Because the positive X7 axis of the O7 coordinate system coincides with the optical axis of the right camera 202 fixed on the pitch rotation frame 308, the O7 coordinate system is also called the right camera coordinate system; the right camera 202 rotates together with the horizontal rotation frame 307 and the pitch rotation frame 308 of the right sub-head 304. The rotation of the O7 coordinate system with the pitch rotation frame 308 of the right sub-head 304 is denoted β2.
Step 3.2, description of the spatial relationships:
As shown in Fig. 6, the translation and horizontal rotation of the neck coordinate system O1 relative to the head base coordinate system O0 are described by matrix A0; the translation of the O2 coordinate system relative to the O1 coordinate system by matrix A1; the translation and horizontal rotation of the O3 coordinate system relative to the O2 coordinate system by matrix A2; the pitch rotation of the O4 coordinate system relative to the O3 coordinate system by matrix A3; the translation of the O5 coordinate system relative to the O1 coordinate system by matrix A4; the translation and horizontal rotation of the O6 coordinate system relative to the O5 coordinate system by matrix A5; the pitch rotation of the O7 coordinate system relative to the O6 coordinate system by matrix A6; the vector of the speaker's mouth relative to the head base coordinate system O0 by the matrix P; the vector of the speaker's mouth relative to the left camera coordinate system O4 by the matrix PL; and the vector of the speaker's mouth relative to the right camera coordinate system O7 by the matrix PR.
The spatial relationships of the above coordinate systems can be expressed as the series of matrices (1) to (8), together with
PL = [XL YL ZL 1]^T (9) and PR = [XR YR ZR 1]^T (10).
In these matrices: Pz1 is the height of the O1 origin relative to the O0 origin and is known; Py is the distance between the O1 origin and the origins of the base coordinate systems of the left and right sub-heads and is known; Pz2 is the height of the O3 origin relative to the O2 origin and of the O6 origin relative to the O5 origin and is known.
θ is the horizontal rotation angle of the O1 coordinate system relative to the O0 coordinate system, i.e. the rotation angle of the horizontal rotation drive motor 401 of the neck; seen from above, counter-clockwise is positive and clockwise is negative. It can be read at any time and is a known quantity.
α1 is the horizontal rotation angle of the O3 coordinate system relative to the O2 coordinate system, i.e. the rotation angle of the horizontal rotation drive motor 402 of the left sub-head; this angle acts as the horizontal rotation of the left eye. Seen from above, counter-clockwise is positive and clockwise is negative; it is a quantity to be solved.
α2 is the pitch rotation angle of the O4 coordinate system relative to the O3 coordinate system, i.e. the rotation angle of the pitch rotation drive motor 403 of the left sub-head; this angle acts as the pitch rotation of the left eye. Seen from the side, nodding down is positive and tilting up is negative; it is a quantity to be solved.
β1 is the horizontal rotation angle of the O6 coordinate system relative to the O5 coordinate system, i.e. the rotation angle of the horizontal rotation drive motor 404 of the right sub-head; this angle acts as the horizontal rotation of the right eye. Seen from above, counter-clockwise is positive and clockwise is negative; it is a quantity to be solved.
β2 is the pitch rotation angle of the O7 coordinate system relative to the O6 coordinate system, i.e. the rotation angle of the pitch rotation drive motor 405 of the right sub-head; this angle acts as the pitch rotation of the right eye. Seen from the side, nodding down is positive and tilting up is negative; it is a quantity to be solved.
X, Y and Z are the three-dimensional coordinates of the speaker's mouth, obtained in real time by the controller module 500 running the sound source localization algorithm; they are treated as known quantities in the calculations here.
When the two sub-heads face straight ahead, their horizontal rotation angles are α1 = 0° and β1 = 0°, respectively.
When the left camera 201 squarely faces the acoustic target, necessarily YL = 0 and ZL = 0, and XL is a quantity to be solved.
When the right camera 202 squarely faces the acoustic target, necessarily YR = 0 and ZR = 0, and XR is a quantity to be solved.
Step 3.3: with the neck not rotated, the rotation angle α1 of the horizontal rotation drive motor of the left sub-head 303 and the rotation angle β1 of the horizontal rotation drive motor of the right sub-head 304 are derived as follows.
From the spatial relationships between the above coordinate systems, relation (11) can be established between the vector P describing the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system and the vectors PL and PR describing the mouth when the X axes of the coordinate systems of the left camera 201 and the right camera 202 squarely face the speaker's mouth.
At this point only the horizontal rotation angles α1 and β1 required by the two sub-heads need to be calculated; from the foregoing formulas and the known conditions they are obtained as formula (12).
Thus, when the neck joint is not rotated (θ = 0°), the horizontal rotation angles α1 and β1 required for the two eyes to look at the position of the speaker's mouth are obtained, i.e. the rotation angle α1 of the horizontal rotation drive motor of the left sub-head 303 and the rotation angle β1 of the horizontal rotation drive motor of the right sub-head 304.
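Formulas (11) and (12) themselves are not reproduced in this text. A form consistent with the coordinate-chain description of step 3.2 and with the numbers of Embodiment 1 (P = [6000 2000 3000 1]^T giving α1 = 17.398° and β1 = 19.461° with Py = 120 mm) would be the following; it is a reconstruction, not the patent's own typesetting:

$$P = A_0 A_1 A_2 A_3 P_L = A_0 A_4 A_5 A_6 P_R \qquad (11)$$

$$\alpha_1 = \arctan\frac{Y - P_y}{X}, \qquad \beta_1 = \arctan\frac{Y + P_y}{X} \qquad (12)$$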
The calculation method for the rotational adjustment of the neck mentioned in step 4 is as follows.
After the neck joint has been rotationally adjusted, the neck coordinate system O1 should squarely face the speaker's mouth. Let the vector of the speaker's mouth relative to the adjusted neck coordinate system O1 be described by the matrix P1, and let the translation and horizontal rotation of the adjusted neck coordinate system O1 relative to the head base coordinate system O0 be described by B; P1 and B can then be written as the matrices (13) and (14).
In these matrices: (X1, Y1, Z1) are the coordinates of the speaker's mouth in the neck coordinate system O1; when the X1 axis of the neck coordinate system O1 squarely faces the speaker's mouth, Y1 = 0 is a fixed value, and X1 and Z1 are quantities to be solved. θ is the rotation angle of the horizontal rotation drive motor 401 of the neck; seen from above, counter-clockwise is positive and clockwise is negative; it is a quantity to be solved. Pz1 is the height of the O1 coordinate system relative to the O0 coordinate system; it can be measured and is a known quantity.
From the spatial relationships between the above coordinate systems, the vector P describing the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0 and the vector P1 describing the mouth relative to the adjusted neck coordinate system O1 satisfy the relation:
P = B P1 (15)
Substituting formula (8) and the known conditions into formula (15) yields formula (16).
The bionic eye neck joint horizontal rotation angle θ so obtained is the rotation angle the carrying platform 302 must make, i.e. the rotation angle θ required of the horizontal rotation drive motor 401 of the neck, if the carrying platform 302 is to face the direction of the speaker's mouth.
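Formula (16) is likewise not printed in this text. Since the neck frame differs from the base frame only by the vertical offset Pz1, the inverse solution reduces to a plausible reconstruction of

$$\theta = \arctan\frac{Y}{X} \qquad (16)$$

which reproduces the θ ≈ 45° of Embodiment 2 (the patent states 45.0033°, presumably from its numerical solution).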
The method for solving the joint spaces of the left and right sub-heads of the bionic eye head module mentioned in step 5 is as follows.
At this point the neck joint needs no further rotational adjustment. According to the relations (1) to (10) established between the coordinate systems above, formula (11) again holds when the left and right cameras are both aimed at the mouth of the same speaker.
From the foregoing formulas and the known conditions, formula (17) is derived.
Solving the rotation angles α1, α2, β1 and β2 of the corresponding degrees of freedom of the left and right sub-heads is the process of inversely solving the rotation angle α1 of the horizontal rotation drive motor 402 and the rotation angle α2 of the pitch rotation drive motor 403 of the left sub-head 303, and the rotation angle β1 of the horizontal rotation drive motor 404 and the rotation angle β2 of the pitch rotation drive motor 405 of the right sub-head 304; here XL is the distance from the optical centre of the left camera 201 to the speaker's mouth and XR is the distance from the optical centre of the right camera to the speaker's mouth.
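Formula (17) is not reproduced in this text either. The sketch below is a geometric reconstruction of the joint-space solution, written so that it reproduces the numbers of Embodiment 1; the structural parameters and the sign convention (nodding down positive) follow the definitions above, but the code is an illustration rather than the patent's formula.

```python
import math

PY, PZ1, PZ2 = 120.0, 150.0, 120.0   # structural parameters of Embodiment 1 (mm)

def solve_sub_head(p_neck, py, pz2=PZ2):
    """Horizontal angle, pitch angle (degrees) and camera-to-mouth distance (mm)
    for one sub-head whose base sits at lateral offset `py` in the neck frame.
    `p_neck` is the mouth position expressed in the neck frame O1."""
    x, y, z = p_neck
    dy = y - py
    dz = z - pz2                     # height of the mouth above the pitch axis
    horiz = math.hypot(x, dy)
    pan  = math.degrees(math.atan2(dy, x))
    tilt = -math.degrees(math.atan2(dz, horiz))   # looking up is negative
    return pan, tilt, math.hypot(horiz, dz)

# Embodiment 1: mouth at (6000, 2000, 3000) in the base frame with the neck at 0°,
# i.e. (6000, 2000, 2850) in the neck frame after removing Pz1.
mouth_neck = (6000.0, 2000.0, 3000.0 - PZ1)
print(solve_sub_head(mouth_neck, py=+PY))   # ~(17.398, -23.472, 6854.7)  left eye
print(solve_sub_head(mouth_neck, py=-PY))   # ~(19.461, -23.221, 6924.4)  right eye
```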
The calculation of the eye direction adjustment values in step 6 proceeds as follows. In what follows, all angles are in degrees (°) and all parameters related to the video image are in pixels.
Step 6.1: the upper computer module 600 reads and displays the video images acquired by the left camera 201 and the right camera 202 of the bionic eye; let the image width be W and the height be H.
Step 6.2: the upper computer module 600 runs the face recognition algorithm on the video images of the left camera 201 and the right camera 202 and calculates the two-dimensional coordinates (Xl, Yl) and (Xr, Yr) of the centre of the rectangular outer contour of the speaking face in the video image coordinate system Ol-XlYl of the left camera 201 and the video image coordinate system Or-XrYr of the right camera 202, as shown in Fig. 5.
Step 6.3: the centre coordinates of the video image coordinate system Ol-XlYl of the left camera 201 and the video image coordinate system Or-XrYr of the right camera 202 are taken as (W/2, H/2). In the video image identified by the left camera 201, the offset of the coordinates (Xl, Yl) of the centre of the speaking face from the image centre (W/2, H/2) is:
ΔXl = Xl - W/2, ΔYl = Yl - H/2 (18)
In the video image identified by the right camera 202, the offset of the coordinates (Xr, Yr) of the centre of the speaking face from the image centre (W/2, H/2) is:
ΔXr = Xr - W/2, ΔYr = Yr - H/2 (19)
Step 6.4: the rotation angle corrections of the corresponding degrees of freedom of the two eyes are obtained from the offsets:
Correction of the horizontal rotation of the left eye: Δα1 = arctan(ΔXl / length); (20)
Correction of the pitch rotation of the left eye: Δα2 = arctan(ΔYl / length); (21)
Correction of the horizontal rotation of the right eye: Δβ1 = arctan(ΔXr / length); (22)
Correction of the pitch rotation of the right eye: Δβ2 = arctan(ΔYr / length); (23)
where length is the focal length parameter of the camera, set according to the actual camera parameters; the correction values Δα1, Δα2, Δβ1 and Δβ2 may be positive or negative.
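Formulas (20) to (23) translate directly into code. The snippet below applies them to the pixel coordinates of Embodiment 2; the helper name is illustrative, the focal-length parameter length = 3000 is the value used in the embodiments, and the 640x480 image size is inferred from the image centre (320, 240) quoted there.

```python
import math

def gaze_corrections(face_left, face_right, width, height, length=3000.0):
    """Rotation-angle corrections (degrees) of formulas (20)-(23) from the
    face-centre pixel coordinates in the left and right camera images."""
    dxl, dyl = face_left[0] - width / 2, face_left[1] - height / 2
    dxr, dyr = face_right[0] - width / 2, face_right[1] - height / 2
    return (math.degrees(math.atan(dxl / length)),   # Δα1, left pan
            math.degrees(math.atan(dyl / length)),   # Δα2, left tilt
            math.degrees(math.atan(dxr / length)),   # Δβ1, right pan
            math.degrees(math.atan(dyr / length)))   # Δβ2, right tilt

# Embodiment 2: face centres at (400, 340) and (240, 350) in a 640x480 image.
print(gaze_corrections((400, 340), (240, 350), width=640, height=480))
# ~(1.53, 1.91, -1.53, 2.10), matching Embodiment 2 up to rounding
```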
After the respective drive motors rotate through the above correction angles, the rectangular outer contour of the speaking face can be brought to the centre position of the image.
Two embodiments are given below to further illustrate the application of the invention in practice.
Embodiments:
Embodiment 1: the structural design parameters of the bionic eye head are Pz1 = 150 mm, Pz2 = 120 mm, Py = 120 mm. Assume that the controller module 500 runs the sound source localization algorithm and obtains the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0 as P = [6000 2000 3000 1]^T. First, formula (12) in step 3 of the invention and the known conditions give:
α1 = 17.398°, β1 = 19.461°
Both angles are within ±30°, so no rotational adjustment of the neck of the carrying platform 302 is needed.
Then, according to step 5 of the invention, substituting the known conditions into formula (17) gives:
α1 = 17.398°, α2 = -23.472°, XL = 6854.7 mm, β1 = 19.461°, β2 = -23.2214°, XR = 6924.4 mm.
Substituting the above results and the known conditions back into formula (11) for verification yields the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0:
P′ = [6000.1 1999.7 2999.8 1]^T
Since P′ ≈ P and the error is within the allowed range, the correctness of the angle-solving method of the invention is verified.
Finally, according to step 6 of the invention, the two-dimensional coordinates of the centre of the rectangular outer contour of the speaking face in the video image coordinate system Ol-XlYl of the left camera 201 and the video image coordinate system Or-XrYr of the right camera 202 are (520, 100) and (130, 110), respectively; their offsets from the image centre coordinates (320, 240) are (ΔXl, ΔYl) = (200, -140) and (ΔXr, ΔYr) = (-190, -130), respectively.
With the camera focal length parameter taken as length = 3000, substituting the above conditions into formulas (20) to (23) gives the rotation angle corrections:
Δα1 = 3.81°, Δα2 = -2.61°, Δβ1 = -3.62°, Δβ2 = -2.48°
After the respective drive motors rotate through the above correction angles, the rectangular outer contour of the speaking face can be brought to the centre position of the image.
Embodiment 2: the structural design parameters of the bionic eye head are Pz1 = 150 mm, Pz2 = 120 mm, Py = 120 mm. Assume that the controller module 500 runs the sound source localization algorithm and obtains the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0 as P = [3000 3000 3000 1]^T. Formula (12) in step 3 of the invention and the known conditions give:
α1 = 43.834°, β1 = 46.1267°
Both exceed the ±30° range, so according to step 4 of the invention the neck of the carrying platform 302 must first be rotationally adjusted. Substituting the known conditions into formula (16) gives θ = 45.0033°; after the carrying platform 302 is rotated by θ = 45.0033°, it faces the direction of the speaker's mouth. Then, relative to the current neck coordinate system O1, the step 5 computation is carried out: substituting P = [3000 3000 3000 1]^T, θ = 45.0033° and the other known conditions into formula (17) gives α1 = -1.6204°, α2 = -32.7519°, XL = 5046.5 mm, β1 = 1.6204°, β2 = -32.7519°, XR = 5046.5 mm; now α1 = -1.6204° and β1 = 1.6204° are within ±30°. Substituting the above results and the known conditions back into formula (11) for verification yields the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system O0:
P′ = [3000.1 2999.8 2999.9 1]^T
Since P′ ≈ P and the error is within the allowed range, the correctness of the angle-solving method of the invention is verified.
Finally, according to step 6 of the invention, the two-dimensional coordinates of the centre of the rectangular outer contour of the speaking face in the video image coordinate system Ol-XlYl of the left camera 201 and the video image coordinate system Or-XrYr of the right camera 202 are (400, 340) and (240, 350), respectively; their offsets from the image centre coordinates (320, 240) are (ΔXl, ΔYl) = (80, 100) and (ΔXr, ΔYr) = (-80, 110), respectively.
With the camera focal length parameter taken as length = 3000, substituting the above conditions into formulas (20) to (23) gives the rotation angle corrections:
Δα1 = 1.53°, Δα2 = 1.90°, Δβ1 = -1.53°, Δβ2 = 2.09°
After the respective drive motors rotate through the above correction angles, the rectangular outer contour of the speaking face can be brought to the centre position of the image.
Sound source localization, face recognition, visual processing and similar techniques and algorithms are known and are not described in detail here. The invention combines these techniques with the bionic eye head structure design and the joint-space solving algorithms to achieve accurate positioning and tracking of the rectangular outer contour of the speaking face by the bionic eyes.

Claims (2)

  1. A working method of a bionic eye location tracking system, characterized in that the bionic eye location tracking system mainly consists of the following six parts: a bionic eye head module, a sound collection sensor array, a camera module, a head motion control module, a controller module and an upper computer module; wherein:
    the bionic eye head module comprises a head base, a carrying platform and left and right sub-heads, wherein the head base and the carrying platform are connected by a horizontal rotation joint, so that the horizontal rotation of the carrying platform on the head base imitates the neck rotation of a bio-robot; the left and right sub-heads are mounted on the carrying platform, and each sub-head has its own horizontal rotation frame and pitch rotation frame, so every sub-head has two rotary joints, one for horizontal rotation and one for pitch; two cameras are mounted on the pitch rotation frames of the left and right sub-heads and act as the eyes of the bionic eye, and through the horizontal and pitch rotations of the two sub-heads the pointing directions of the cameras on the two sub-heads can be adjusted independently to imitate the rotation of the eyes;
    the sound collection sensor array acts as the bionic ears of the system; four microphones with identical characteristics are arranged around the centre of the head base in the spatial geometry of a regular tetrahedron, centred on the pivot of the bionic eye head base, and the centre of the head base pivot is taken as the origin of the bionic eye head base coordinate system, so that the three-dimensional coordinates of the speaker's mouth relative to the base coordinate system are obtained from the collected sound signals and a sound source localization algorithm;
    the camera module consists of two independent cameras mounted on the pitch rotation frames of the left and right sub-heads; they act as the eyes of the bionic eye, acquire video information and transmit the video data to the upper computer module;
    the head motion control module comprises the horizontal rotation drive motors and pitch rotation drive motors mounted on the left and right sub-heads, the horizontal rotation drive motor of the carrying platform, and a data transmission module; according to the motor rotation signals output by the controller module, the head motion control module drives the motor of each corresponding joint;
    the controller module, on the one hand, collects the output signals of the sound collection sensor array, runs the sound source localization algorithm, obtains the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system and transmits these coordinates to the upper computer module; on the other hand, it receives the control instructions and data sent by the upper computer module, drives each motor to rotate by the corresponding angle, and returns the motion state of the system to the upper computer module;
    the upper computer module includes: a program for deriving the eye horizontal rotation angles α1 and β1 of the bionic eye head, a program for deriving the horizontal rotation angle θ of the carrying platform, a program for solving the joint spaces of the left and right sub-heads of the bionic eye head module, a visual processing algorithm program and a communication program with the controller;
    the working method of the bionic eye location tracking system mainly comprises the following steps:
    step 1, system initialization: the system is powered on, the sound collection sensor array enters the working state, the upper computer module displays the video images acquired by the camera module, and the controller module resets all motors to their initial angles;
    step 2, sound source localization: the controller module reads the audio data collected by the sound collection sensor array, runs the sound source localization algorithm to obtain the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system, and transmits these coordinates to the upper computer module;
    step 3, solving the eye horizontal rotation angles, i.e. solving the horizontal rotation angle of each of the two sub-heads of the bionic eye head module;
    the upper computer module converts the three-dimensional coordinates of the speaker's mouth relative to the head base coordinate system into coordinates relative to the neck coordinate system, and solves, with the neck in its current un-rotated state, the horizontal rotation angle each eye needs to turn toward the speaker's mouth; if either angle exceeds ±30 degrees, the neck must be rotationally adjusted and the method proceeds to step 4; if neither horizontal angle exceeds ±30 degrees, no neck adjustment is needed and the method proceeds to step 5;
    step 4, rotational adjustment of the neck joint: solving the horizontal rotation angle of the carrying platform;
    the upper computer module first uses the three-dimensional coordinates of the speaker's mouth relative to the neck coordinate system to solve, by inverse calculation, the horizontal rotation angle the neck needs so that the carrying platform turns to face the speaker's mouth squarely; the upper computer module sends this neck rotation angle to the controller module, which drives the horizontal rotation drive motor of the neck by the corresponding angle, so that the carrying platform, and with it the neck coordinate system, turns to face the speaker's mouth squarely; the method then proceeds to step 5, where the joint spaces of the left and right sub-heads of the bionic eye head module are solved relative to the current neck coordinate system so that the two eyes each turn toward the speaker's mouth;
    Step 5: the sub- head joint space of bionic eye head module or so two solves:
    Upper computer module carries out bionic eye head module or so according to the relation of the relatively current neck coordinate system of speaker's mouth Two sub- head joint space derivations, the respectively anti-pitching rotary drive motor for solving the two sub- heads in left and right, horizontal rotation Turn the angle information that motor should rotate, be then issued to controller module, the two sub- clouds in left and right are controlled by controller module The respective pitching rotary drive motor of platform, motor rotation respective angles are rotated horizontally, to realize that the eyes of bionic eye are preliminary To the mouth of speaker;
    At this point, the preliminary alignment of the two cameras towards the speaker's mouth is complete;
    Step 6: adjusting the direction of the eyes: correcting the rotation angle of each rotary joint of the left and right sub-heads of the bionic eye head module;
    The upper computer module runs a face recognition algorithm to obtain the rectangular outer contour of the speaking face and compute the two-dimensional coordinates of its center in the video image coordinate system. From these coordinates and their offset from the center of the video image, it obtains correction values for the rotation angles of the pitch drive motor and the horizontal rotation motor of each of the left and right sub-heads, and sends these correction values to the controller module. The controller module then fine-tunes the pitch drive motor and the horizontal rotation motor of each of the two sub-heads by the corresponding angles, so that the two eyes of the bionic eye are each aimed precisely at the center of the speaker's face contour and the center of the speaking face's rectangular contour appears at the center of the video image captured by each camera;
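    A minimal sketch of the face-center offset measurement for one camera, using an OpenCV Haar cascade as a stand-in for the face recognition algorithm, which the claim does not name; the cascade file is the stock OpenCV frontal-face model and is an assumption:

```python
import cv2

_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_center_offset(frame):
    """Return (dx, dy) of the detected face-rectangle center from the image
    center, in pixels, or None if no face is found in the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = _cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest face
    img_h, img_w = gray.shape
    return (x + w / 2.0 - img_w / 2.0, y + h / 2.0 - img_h / 2.0)
```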
    Step 7: if the controller module detects that a new speaker has appeared, steps 2, 3, 4, 5 and 6 are executed again in a loop, so that the newly appearing speaking face is tracked in real time; if no new speaker appears, step 6 is executed in a loop, so that the corresponding degrees of freedom of the eyes are adjusted to follow the movement of the current speaking face;
    Step 8: checking whether the whole system should continue working; if so, the method returns to step 7 and again determines whether a new speaker has appeared; if not, the system exits and shuts down.
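    Taken together, steps 2 through 8 amount to a control loop of the following shape; `system` and its methods are hypothetical placeholders that bundle the sketches above with the hardware interfaces described in claim 1:

```python
def tracking_loop(system):
    """Top-level loop corresponding to steps 2-8 of claim 1 (sketch only)."""
    while system.keep_running():                          # step 8
        if system.new_speaker_detected():                 # step 7
            mouth = system.localize_speaker()             # step 2
            left, right = system.solve_eye_pan(mouth)     # step 3
            if max(abs(left), abs(right)) > 30.0:
                mouth = system.rotate_neck_toward(mouth)  # step 4
            system.aim_eyes(mouth)                        # step 5
        system.correct_from_face_detection()              # step 6
```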
2. The working method according to claim 1, characterized in that the calculation of the eye direction adjustment values in step 6 comprises the following steps:
    In the following, all angles are expressed in degrees (°) and all parameters related to the video image are expressed in pixels;
    Step 6.1: the upper computer module (600) reads and displays the video images acquired by the left camera (201) and the right camera (202) of the bionic eye; let the image width be W and the height be H;
    Step 6.2: the upper computer module (600) runs the face recognition algorithm on the video images of the left camera (201) and the right camera (202) respectively, and computes the two-dimensional coordinates of the center of the speaking face's rectangular outer contour in the video image coordinate system Ol-XlYl of the left camera (201) and in the video image coordinate system Or-XrYr of the right camera (202): (Xl, Yl) and (Xr, Yr);
    Step 6.3: taking the center coordinates of both the left camera (201) video image coordinate system Ol-XlYl and the right camera (202) video image coordinate system Or-XrYr as (W/2, H/2), the offsets of the face center coordinates (Xl, Yl) from the image center coordinates (W/2, H/2) in the video image of the left camera (201) are:
    ΔXl = Xl − W/2
    ΔYl = Yl − H/2
    In the video image of the right camera (202), the offsets of the face center coordinates (Xr, Yr) from the image center coordinates (W/2, H/2) are:
    ΔXr = Xr − W/2
    ΔYr = Yr − H/2
    Step 6.4: the rotation angle corrections for the corresponding degrees of freedom of the two eyes are obtained from these offsets:
    Horizontal rotation correction for the left eye: Δα1 = arctan(ΔXl / length);
    Pitch rotation correction for the left eye: Δα2 = arctan(ΔYl / length);
    Horizontal rotation correction for the right eye: Δβ1 = arctan(ΔXr / length);
    Pitch rotation correction for the right eye: Δβ2 = arctan(ΔYr / length);
    where length is the focal length parameter of the camera, set according to the camera's actual parameters; each of the correction values Δα1, Δα2, Δβ1 and Δβ2 may be positive or negative;
    After the corresponding drive motors rotate through these correction angles, the rectangular outer contour of the speaking face is brought to the center of the image.
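    A sketch of the claim 2 computation for one camera; `focal_length_px` plays the role of the length parameter and must be taken from the actual camera calibration:

```python
import math

def eye_corrections(face_center, image_size, focal_length_px):
    """Pan and tilt correction angles, in degrees, from the face-center
    offset, following the arctan(offset / length) relation of claim 2."""
    (x, y), (w, h) = face_center, image_size
    dx = x - w / 2.0   # horizontal offset in pixels
    dy = y - h / 2.0   # vertical offset in pixels
    pan_correction = math.degrees(math.atan(dx / focal_length_px))
    tilt_correction = math.degrees(math.atan(dy / focal_length_px))
    return pan_correction, tilt_correction
```

    For example, with a 640×480 image, a detected face center at (400, 300) and length = 500 pixels, the offsets are (80, 60) pixels and the corrections are roughly +9.1 degrees of horizontal rotation and +6.8 degrees of pitch.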
CN201610379179.5A 2016-06-01 2016-06-01 The method of work of bionic eye location tracking system Active CN106042005B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610379179.5A CN106042005B (en) 2016-06-01 2016-06-01 The method of work of bionic eye location tracking system

Publications (2)

Publication Number Publication Date
CN106042005A CN106042005A (en) 2016-10-26
CN106042005B (en) 2018-02-06

Family

ID=57171669

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610379179.5A Active CN106042005B (en) 2016-06-01 2016-06-01 The method of work of bionic eye location tracking system

Country Status (1)

Country Link
CN (1) CN106042005B (en)

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant