CN105844673A - All-angle human tracking system and control method based on natural human-computer interaction technology - Google Patents

All-angle human tracking system and control method based on natural human-computer interaction technology

Info

Publication number
CN105844673A
CN105844673A (application CN201610342160.3A; granted as CN105844673B)
Authority
CN
China
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610342160.3A
Other languages
Chinese (zh)
Other versions
CN105844673B (en)
Inventor
王晖
成颜锋
毛自旺
万元芳
钟君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BEIJING TRAINSFER TECHNOLOGY DEVELOPMENT Co Ltd
Original Assignee
BEIJING TRAINSFER TECHNOLOGY DEVELOPMENT Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BEIJING TRAINSFER TECHNOLOGY DEVELOPMENT Co Ltd filed Critical BEIJING TRAINSFER TECHNOLOGY DEVELOPMENT Co Ltd
Priority to CN201610342160.3A priority Critical patent/CN105844673B/en
Publication of CN105844673A publication Critical patent/CN105844673A/en
Application granted granted Critical
Publication of CN105844673B publication Critical patent/CN105844673B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an all-angle human tracking system and control method based on natural human-computer interaction technology. The system comprises a natural human-computer interaction data processing unit that analyzes the collected natural human-computer interaction data, processes it into a control instruction packet, and outputs the packet. The control method comprises the following steps: collecting the natural human-computer interaction data; analyzing the data, determining the main tracked person, and outputting the control instruction packet; and converting the control instruction packet into a control signal conforming to the all-angle human tracking system. A control instruction packet conversion unit converts the control instruction packet into this control signal and thereby controls the monitoring device. The system and method not only enable the monitoring device to track the followed person from all angles, but also intelligently eliminate other external factors so that the followed person stays locked.

Description

All-angle human tracking system and control method based on natural human-computer interaction technology
Technical field
The present invention relates to the field of human-computer interaction control systems, and in particular to an all-angle human tracking system and control method based on natural human-computer interaction technology.
Background art
At the present stage, video tracking is widely used in fields such as live video streaming, video conferencing, and remote teaching; it can shorten the distance between two places and enable interactive functions.
In the prior art, patent document CN101290681B discloses a video target tracking method, device, and automatic video tracking system, which applies gradient vector flow (GVF) deformation to each candidate position in the current frame to obtain deformation curves, computes the video features of those deformation curves, and selects one candidate position as the target location according to the computed features. That patent also discloses a video target tracking device and automatic video tracking system that can improve tracking accuracy. Its defect is that it cannot monitor the tracked person through a full 360 degrees, and when several people are present it cannot monitor the tracked person accurately.
Summary of the invention
The present invention uses natural human-computer interaction technology to designate the first person entering the monitored picture as the tracked person, so that the tracking device follows the tracked person at all times and keeps the tracked person accurately locked even when other people enter the monitored picture. When the tracked person leaves the monitoring range of the tracking device, the device returns to its initial position and waits for the tracked person to re-enter the monitored picture. The invention can monitor the tracked person through a full 360 degrees, solving the blind-spot problem of the prior art, and can lock onto the first person entering the monitored picture, solving the prior-art problem that the tracked person is easily confused with others when several people enter the picture.
The specific technical scheme of the present invention is as follows:
A first aspect of the present invention proposes an all-angle human tracking system based on natural human-computer interaction technology, comprising a natural human-computer interaction data acquisition unit for collecting an operator's natural human-computer interaction data, and preferably comprising the following modules:
a tracking identification module, which identifies and locks the tracked person;
a control center module, which parses the collected natural human-computer interaction data and processes it into a control instruction packet for output;
an instruction conversion module, which converts the control instruction packet into a control signal conforming to the all-angle human tracking system;
an intelligent tracking module, which receives the control signal and automatically tracks the movement of the human body.
Preferably in the above scheme, the tracking identification module locks the tracked person according to an identification principle.
Preferably in the above scheme, the identification principle includes at least one of a position identification principle, a posture identification principle, an action identification principle, a face identification principle, a voice identification principle, a first-to-enter locking principle, and an intelligent-ignore principle.
Preferably in the above scheme, the position identification principle means determining the tracked person according to the relative position between a person in the identification region and the sensor.
Preferably in the above scheme, the posture identification principle means that when a person in the identification region shows a static limb posture identical to a system preset, that person is determined to be the tracked person.
Preferably in the above scheme, the action identification principle means that when a person in the identification region performs an action identical to a system preset, that person is determined to be the tracked person; the action identification includes at least one of identifying the action trajectory and limiting the action duration.
Preferably in the above scheme, the face identification principle determines whether a person is the tracked person according to face data pre-stored in the system.
Preferably in the above scheme, the voice identification principle determines whether a person is the tracked person according to voice data pre-stored in the system.
Preferably in the above scheme, the first-to-enter locking principle means that the first person to enter the identification region is determined to be the tracked person.
Preferably in the above scheme, the intelligent-ignore principle means that after the tracked person is locked, other people entering the identification region are intelligently ignored.
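The first-to-enter locking and intelligent-ignore principles together amount to a simple locking rule, sketched below. This is a minimal illustration, not the patent's implementation: the integer person IDs and the per-frame detection list are hypothetical stand-ins for whatever the sensor actually reports.

```python
from typing import Optional

class TrackLock:
    """Sketch of first-to-enter locking plus intelligent ignore.
    Person IDs and the detection feed are hypothetical."""

    def __init__(self) -> None:
        self.locked_id: Optional[int] = None  # ID of the locked tracked person

    def observe(self, person_ids: list[int]) -> Optional[int]:
        # First-to-enter locking: the first person seen in the
        # identification region becomes the tracked person.
        if self.locked_id is None and person_ids:
            self.locked_id = person_ids[0]
        # Intelligent ignore: once locked, everyone else is discarded;
        # return the locked ID only while that person is still visible.
        if self.locked_id is not None and self.locked_id in person_ids:
            return self.locked_id
        return None

lock = TrackLock()
first = lock.observe([7])           # person 7 enters first and is locked
still = lock.observe([3, 7, 9])     # newcomers 3 and 9 are ignored
```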
Preferably in the above scheme, the tracking identification module also tracks the tracked person according to a tracking principle.
Preferably in the above scheme, the tracking principle includes at least one of a skeleton-data tracking principle, a depth-data-assisted tracking principle, a voice tracking principle, and an auto-return principle.
Preferably in the above scheme, the skeleton-data tracking principle means extracting the skeleton data of the tracked person and tracking the person by it.
Preferably in the above scheme, the depth-data-assisted tracking principle means that when the skeleton data becomes confused, depth data is retrieved from the collected data to assist in continuing to track the tracked person.
Preferably in the above scheme, the voice tracking principle means that when the tracked person walks out of the identification range, the system can automatically trace the tracked person's voice and then re-run person identification.
Preferably in the above scheme, the auto-return principle means that if no data about the tracked person is collected within a set time interval, the device automatically returns to a preset position and re-runs person identification.
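The skeleton/depth fallback and the auto-return timer can be sketched as follows. The sensor payloads, the choice of `time.monotonic()`, and the timeout value are all assumptions for illustration; the patent does not specify data formats or intervals.

```python
import time
from typing import Optional

AUTO_RETURN_INTERVAL = 5.0  # seconds without data (hypothetical value)

def choose_source(skeleton: Optional[dict], depth: Optional[dict]) -> str:
    """Depth-data-assisted tracking: prefer skeleton data; when it is
    confused or missing, fall back to depth data. The dict payloads
    are placeholders for real sensor frames."""
    if skeleton is not None:
        return "skeleton"
    if depth is not None:
        return "depth"
    return "none"

class AutoReturn:
    """Auto-return principle: if no data about the tracked person arrives
    within the set interval, return to the preset position."""

    def __init__(self, interval: float = AUTO_RETURN_INTERVAL) -> None:
        self.interval = interval
        self.last_seen = time.monotonic()

    def on_data(self) -> None:
        # Called whenever the tracked person is detected again.
        self.last_seen = time.monotonic()

    def should_return(self) -> bool:
        return time.monotonic() - self.last_seen > self.interval

source = choose_source(None, {"depth_frame": b"..."})  # skeleton lost, use depth
```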
Preferably in the above scheme, the control center module receives the tracked-person information sent by the tracking identification module.
Preferably in the above scheme, the control center module also records the tracked-person information.
Preferably in the above scheme, the tracked-person information refers to at least one of skeleton data, position data, posture data, action data, face data, and voice data.
Preferably in the above scheme, the control center module also parses the tracked person's voice instructions, action instructions, and limb instructions.
Preferably in the above scheme, the control center module also predicts the tracked person's real-time actions.
Preferably in the above scheme, the control center module also outputs instructions that correct the pan-tilt head's automatic real-time actions.
Preferably in the above scheme, the control center module processes the parsed instructions into control instructions and outputs them.
Preferably in the above scheme, the instruction conversion module receives the control instructions from the control center.
Preferably in the above scheme, the instruction conversion module converts the control instructions of the control center into action instructions that control the intelligent tracking module.
Preferably in the above scheme, the intelligent tracking module tracks the tracked person according to the action instructions.
Preferably in the above scheme, the intelligent tracking module controls the actions of the pan-tilt head.
Preferably in the above scheme, the pan-tilt actions include at least one of basic actions, switching actions, and advanced actions.
Preferably in the above scheme, the basic actions include at least one of opening a module, closing a module, and pausing a module.
Preferably in the above scheme, the switching actions include at least one of mode switching and multi-camera switching.
Preferably in the above scheme, the advanced actions include at least one of rotating to a specified angle, adjusting the camera focus, adjusting the speed, moving forward and backward, moving left and right, and lifting movement.
Preferably in the above scheme, the specified rotation angle is 0-360 degrees.
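The instruction conversion module's job, turning a control instruction packet into a pan-tilt action instruction, can be sketched as below. The action names follow the basic/switching/advanced groupings above, but the packet layout (`"action"`/`"params"` keys) is an assumption made for this sketch.

```python
from enum import Enum, auto

class PTZAction(Enum):
    """Pan-tilt actions named in the description, grouped as the
    patent groups them (basic / switching / advanced)."""
    OPEN_MODULE = auto()    # basic
    CLOSE_MODULE = auto()   # basic
    PAUSE_MODULE = auto()   # basic
    SWITCH_MODE = auto()    # switching
    SWITCH_CAMERA = auto()  # switching
    ROTATE = auto()         # advanced: rotate to a specified angle, 0-360 degrees
    FOCUS = auto()          # advanced: camera focus adjustment
    ADJUST_SPEED = auto()   # advanced

def convert(instruction: dict) -> tuple[PTZAction, dict]:
    """Sketch of the instruction conversion module: turn a control
    instruction packet from the control center into a pan-tilt action
    instruction. The packet layout is a hypothetical format."""
    action = PTZAction[instruction["action"]]
    params = dict(instruction.get("params", {}))
    if action is PTZAction.ROTATE:
        # Clamp the specified angle into the 0-360 degree range.
        params["angle"] = max(0.0, min(360.0, float(params.get("angle", 0.0))))
    return action, params

act, params = convert({"action": "ROTATE", "params": {"angle": 400}})
```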
A second aspect of the present invention also proposes an all-angle human tracking control method based on natural human-computer interaction technology, comprising collecting an operator's natural human-computer interaction data, and preferably comprising the following steps:
Step 1: identify and lock the tracked person;
Step 2: parse the collected natural human-computer interaction data and process it into a control instruction packet for output;
Step 3: convert the control instruction packet into a control signal conforming to the all-angle human tracking system;
Step 4: receive the control signal and automatically track the movement of the human body.
Preferably in the above scheme, step 1 locks the tracked person according to an identification principle.
Preferably in the above scheme, the identification principle includes at least one of a position identification principle, a posture identification principle, an action identification principle, a face identification principle, a voice identification principle, a first-to-enter locking principle, and an intelligent-ignore principle.
Preferably in the above scheme, the position identification principle means determining the tracked person according to the relative position between a person in the identification region and the sensor.
Preferably in the above scheme, the posture identification principle means that when a person in the identification region shows a static limb posture identical to a system preset, that person is determined to be the tracked person.
Preferably in the above scheme, the action identification principle means that when a person in the identification region performs an action identical to a system preset, that person is determined to be the tracked person; the action identification includes at least one of identifying the action trajectory and limiting the action duration.
Preferably in the above scheme, the face identification principle determines whether a person is the tracked person according to face data pre-stored in the system.
Preferably in the above scheme, the voice identification principle determines whether a person is the tracked person according to voice data pre-stored in the system.
Preferably in the above scheme, the first-to-enter locking principle means that the first person to enter the identification region is determined to be the tracked person.
Preferably in the above scheme, the intelligent-ignore principle means that after the tracked person is locked, other people entering the identification region are intelligently ignored.
Preferably in the above scheme, step 1 tracks the tracked person according to a tracking principle.
Preferably in the above scheme, the tracking principle includes at least one of a skeleton-data tracking principle, a depth-data-assisted tracking principle, a voice tracking principle, and an auto-return principle.
Preferably in the above scheme, the skeleton-data tracking principle means extracting the skeleton data of the tracked person and tracking the person by it.
Preferably in the above scheme, the depth-data-assisted tracking principle means that when the skeleton data becomes confused, depth data is retrieved from the collected data to assist in continuing to track the tracked person.
Preferably in the above scheme, the voice tracking principle means that when the tracked person walks out of the identification range, the system can automatically trace the tracked person's voice and then re-run person identification.
Preferably in the above scheme, the auto-return principle means that if no data about the tracked person is collected within a set time interval, the device automatically returns to a preset position and re-runs person identification.
Preferably in the above scheme, step 2 receives the tracked-person information sent by the tracking identification module.
Preferably in the above scheme, step 2 records the tracked-person information.
Preferably in the above scheme, the tracked-person information refers to at least one of skeleton data, position data, posture data, action data, face data, and voice data.
Preferably in the above scheme, step 2 parses the tracked person's voice instructions, action instructions, and limb instructions.
Preferably in the above scheme, step 2 also predicts the tracked person's real-time actions.
Preferably in the above scheme, step 2 also outputs instructions that correct the pan-tilt head's automatic real-time actions.
Preferably in the above scheme, step 2 processes the parsed instructions into control instructions and outputs them.
Preferably in the above scheme, step 3 receives the control instructions from the control center.
Preferably in the above scheme, step 3 converts the control instructions of the control center into action instructions that control the intelligent tracking module.
Preferably in the above scheme, step 4 tracks the tracked person according to the action instructions.
Preferably in the above scheme, step 4 controls the actions of the pan-tilt head.
Preferably in the above scheme, the pan-tilt actions include at least one of basic actions, switching actions, and advanced actions.
Preferably in the above scheme, the basic actions include at least one of opening a module, closing a module, and pausing a module.
Preferably in the above scheme, the switching actions include at least one of mode switching and multi-camera switching.
Preferably in the above scheme, the advanced actions include at least one of rotating to a specified angle, adjusting the camera focus, adjusting the speed, moving forward and backward, moving left and right, and lifting movement.
Preferably in the above scheme, the specified rotation angle is 0-360 degrees.
The present invention is applicable to a wide range of fields; it effectively solves the problem of monitoring the tracked person through 360 degrees from all angles and can monitor the tracked person accurately even when several people are present, and therefore has very broad market prospects.
Brief description of the drawings
Figure 1A is a flow diagram of the all-angle human tracking control method based on natural human-computer interaction technology of the present invention.
Figure 1B is a module working diagram of the all-angle human tracking control method based on natural human-computer interaction technology of the present invention.
Figure 2 is a schematic diagram of recording a demonstration lesson with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention.
Figure 3 is a schematic diagram of a remote video conference with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention.
Figure 4 is a schematic diagram of remote interactive video teaching with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention.
Figure 5 is a schematic diagram of course recording and broadcasting with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention.
Detailed description of the invention
Figure 1A is a flow diagram of the all-angle human tracking control method based on natural human-computer interaction technology of the present invention, and Figure 1B is the corresponding module working diagram. As shown in Figures 1A-1B, in step 110 the tracking system is started and the tracking identification module 191 begins working. Step 120 is then executed to connect and open the tracking device, and in step 130 the tracking system is initialized. Initialization includes setting the initial monitoring region, the identification principle, and the tracking principle; the identification principle includes one or more of the position, posture, action, face, and voice identification principles and the first-to-enter locking principle, and the tracking principle includes one or more of the skeleton-data tracking principle, the depth-data-assisted tracking principle, the voice tracking principle, and the auto-return principle.
In step 140 the tracking system waits for the tracked person to enter the initial monitoring region. Step 150 is executed when the tracked person enters the initial monitoring region and triggers the tracking condition: the tracking identification module 191 sends the tracked-person information to the control center module 192, which parses and records it while sending a control instruction to the instruction conversion module 193. The instruction conversion module 193 converts the control instruction from the control center module 192 into a lock-target instruction and sends it to the intelligent tracking module 194, which controls the tracking device to lock the tracked person according to the lock-target instruction.
In step 160 the control center module 192 sends a control instruction to the instruction conversion module 193, which converts it into a tracking-movement instruction and sends it to the intelligent tracking module 194; the intelligent tracking module 194 then controls the tracking device to rotate following the movement of the tracked person.
Step 165 is executed when the tracked person leaves the monitoring region: the tracking identification module 191 sends the information to the control center module 192, which parses and processes it and sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts it into a wait-decision instruction and sends it to the intelligent tracking module 194, which puts the tracking device into a waiting state.
Step 170 determines whether a person enters the monitoring region. If nobody enters within the waiting time and no voice instruction from the tracked person is received, the control center module 192 sends a control instruction to the instruction conversion module 193, which converts it into a return-to-initial-state instruction and sends it to the intelligent tracking module 194; the intelligent tracking module 194 controls the tracking device to return to its initial position, and execution continues from step 130.
If someone enters the monitoring region, step 180 is executed: the tracking device automatically judges, according to the configured principles, whether the person is the tracked person. The tracking identification module 191 sends the information to the control center module 192, which parses and processes it and sends a control instruction to the instruction conversion module 193; the instruction conversion module 193 converts it into a wait-decision instruction and sends it to the intelligent tracking module 194, which puts the tracking device into a waiting state.
If the person is not the tracked person, the control center module 192 sends a control instruction which the instruction conversion module 193 converts into a continue-waiting instruction for the intelligent tracking module 194; the intelligent tracking module 194 keeps the tracking device waiting, and execution continues from step 165.
If the person is the tracked person, the control center module 192 sends a control instruction which the instruction conversion module 193 converts into a continue-tracking instruction for the intelligent tracking module 194; the intelligent tracking module 194 controls the tracking device to keep tracking the tracked person, and execution continues from step 160.
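The flow of steps 130-180 above is essentially a small state machine, sketched here as a transition table. The state names and event strings are simplifications invented for this sketch; they stand in for the instruction packets exchanged between modules 191-194.

```python
# Sketch of the control flow in Figure 1A (steps 130-180) as a state machine.
# State and event names are simplified stand-ins for the patent's instructions.

def next_state(state: str, event: str) -> str:
    """Transition table: INIT -> WAIT_ENTER -> TRACKING <-> WAIT_RETURN,
    with a timeout path back to INIT (the return-to-initial-state branch)."""
    table = {
        ("INIT", "initialized"): "WAIT_ENTER",           # step 130 -> step 140
        ("WAIT_ENTER", "person_entered"): "TRACKING",    # step 150: lock target
        ("TRACKING", "person_left"): "WAIT_RETURN",      # step 165: wait decision
        ("WAIT_RETURN", "tracked_person"): "TRACKING",   # step 180: continue tracking
        ("WAIT_RETURN", "other_person"): "WAIT_RETURN",  # step 180: continue waiting
        ("WAIT_RETURN", "timeout"): "INIT",              # step 170: return to initial state
    }
    return table.get((state, event), state)  # unknown events leave the state unchanged

state = "INIT"
for event in ["initialized", "person_entered", "person_left", "tracked_person"]:
    state = next_state(state, event)
# the tracked person left and came back, so we are tracking again
```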
Figure 2 is a schematic diagram of recording a demonstration lesson with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention. Referring to Figure 2, a video recording device 201 is suspended from the center of the classroom ceiling, and the tracking system is installed on a computer 202. The identification principles are set to the posture identification principle, the face identification principle, and the first-to-enter locking principle; the tracking principles are set to the skeleton-data tracking principle, the depth-data-assisted tracking principle, the voice tracking principle, and the auto-return principle; and the initial monitoring region is the area near the lectern. When the teacher 203 enters the lens capture region, the teacher is automatically locked as the tracked person. If any uncontrollable factor occurs, the lens automatically returns to the blackboard position and waits for the teacher to regain control; the focal length is adjusted automatically according to the teacher's position, and the tracking action can be adjusted automatically according to the teacher's voice instructions. The system is not affected by ambient brightness and can still track accurately even in very dim light.
Figure 3 is a schematic diagram of a remote video conference with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention. Referring to Figure 3, a video recording device 301 is placed flat on the conference-room desk, and the tracking system is installed on a computer 302. The identification principles are set to the posture identification principle, the action identification principle, and the face identification principle; the tracking principles are set to the skeleton-data tracking principle and the voice tracking principle; and the initial monitoring region is the area around the seat of honor. When the company leader 303 enters the lens capture region, the leader is automatically locked as the tracked person. The system tracks the leader's steps in real time, keeps the leader at the center of the lens at all times, and automatically adjusts the focal length according to the leader's position so that the picture remains clear.
Figure 4 is a schematic diagram of remote interactive video teaching with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention. Referring to Figure 4, a video recording device 401 is placed flat on the desk, and the tracking system is installed on a computer 402. The identification principles are set to the posture identification principle, the face identification principle, and the voice identification principle; the tracking principles are set to the skeleton-data tracking principle and the voice tracking principle; and the initial monitoring region is the area around the lectern. When the teacher 403 enters the lens capture region, the teacher is automatically locked as the tracked person. After remote teaching starts, the video signal is transmitted over the network 404 to the computer 408 at the site of the students 405 and projected onto the large screen 409 by the projector 407. The resolution of the transmitted video is adjusted automatically according to the network bandwidth: when the network speed is low, the resolution is automatically reduced, sacrificing image quality to reduce frame loss and keep the picture smooth.
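The bandwidth-adaptive resolution described for Figure 4 can be sketched as a simple resolution ladder. The bandwidth thresholds and resolution rungs below are illustrative assumptions, not values taken from the patent.

```python
# Sketch of bandwidth-adaptive video resolution: when the measured network
# speed drops, transmit at a lower resolution to reduce frame loss.
# Thresholds (Mbps) and rungs are hypothetical example values.

RESOLUTION_LADDER = [
    (8.0, (1920, 1080)),  # >= 8 Mbps: full HD
    (4.0, (1280, 720)),   # >= 4 Mbps: HD
    (1.5, (854, 480)),    # >= 1.5 Mbps: SD
    (0.0, (640, 360)),    # anything slower: lowest rung
]

def pick_resolution(bandwidth_mbps: float) -> tuple[int, int]:
    """Choose the highest resolution the measured bandwidth supports."""
    for threshold, resolution in RESOLUTION_LADDER:
        if bandwidth_mbps >= threshold:
            return resolution
    return RESOLUTION_LADDER[-1][1]  # below every threshold: lowest rung

res = pick_resolution(2.0)  # a slow link gets the SD rung
```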
Figure 5 is a schematic diagram of course recording and broadcasting with the all-angle human tracking control method based on natural human-computer interaction technology of the present invention. Referring to Figure 5, a video recording device 501 is placed flat on the desk, and the tracking system is installed on a computer 502. The identification principles are set to the posture identification principle, the face identification principle, and the voice identification principle; the tracking principles are set to the skeleton-data tracking principle and the voice tracking principle; and the initial monitoring region is the area around the lectern. When the teacher 503 enters the lens capture region, the teacher is automatically locked as the tracked person. After the course recording ends, it is uploaded to a cloud server 505 over the network 504. Students 506 then watch the course video on the cloud server online or download it over the network and view it on a personal computer 507.

Claims (10)

1. full angle people based on natural human-computer interaction technology follows the tracks of system, including the natural human-machine interaction data collection of the natural human-machine interaction data for acquisition operations person, it is characterised in that include with lower module:
Track identification module, identifies and locks tracked people;
Control centre's module, resolves the natural human-machine interaction data of described collection, and is processed into the output of control instruction bag;
Instruction modular converter, is converted into described control instruction bag and meets full angle people and follow the tracks of the control signal of system;
Intelligence tracing module: receive control signal and the movement according to human body carries out automatic tracing.
Full angle people based on natural human-computer interaction technology the most according to claim 1 follows the tracks of system, it is characterised in that described track identification module has according to identifying that principle locks the function of tracked people.
Full angle people based on natural human-computer interaction technology the most according to claim 1 follows the tracks of system, it is characterised in that described control centre module also has the tracked people's phonetic order of parsing, action command, the function of limbs instruction.
Full angle people based on natural human-computer interaction technology the most according to claim 1 follows the tracks of system, it is characterised in that described instruction modular converter has the function control instruction of control centre being changed into the action command controlling intelligence tracing module.
Full angle people based on natural human-computer interaction technology the most according to claim 1 follows the tracks of system, it is characterised in that described intelligence tracing module has the function following the trail of tracked people according to action command.
6. full angle people's tracking and controlling method based on natural human-computer interaction technology, including the natural human-machine interaction data collection of the natural human-machine interaction data for acquisition operations person, it is characterised in that: comprise the steps of
Step 1: identify and lock onto the tracked person;
Step 2: parse the collected natural human-computer interaction data and process it into control instruction packets for output;
Step 3: convert the control instruction packets into control signals compatible with the all-angle human tracking system;
Step 4: receive the control signals and automatically track the movement of the human body.
7. The all-angle human tracking control method based on natural human-computer interaction technology according to claim 6, characterized in that step 1 locks onto the tracked person according to the recognition principle.
8. The all-angle human tracking control method based on natural human-computer interaction technology according to claim 6, characterized in that step 2 parses the tracked person's voice commands, motion commands, and limb commands.
9. The all-angle human tracking control method based on natural human-computer interaction technology according to claim 6, characterized in that step 3 converts the control instructions from the control center into action commands that control the intelligent tracking module.
10. The all-angle human tracking control method based on natural human-computer interaction technology according to claim 6, characterized in that step 4 tracks the tracked person according to the action commands.
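The four steps of the claimed method form a simple pipeline: lock onto a target, parse the collected interaction data into an instruction packet, convert the packet into a device-level control signal, and let the tracking module act on it. A minimal sketch of that flow follows; all class, function, and command names here are hypothetical illustrations, since the patent does not specify the recognition principle or the signal format:

```python
from dataclasses import dataclass

@dataclass
class InstructionPacket:
    """A parsed command (voice / motion / limb, per claim 8) bound to a locked target."""
    command: str    # e.g. "follow" or "stop" (illustrative values)
    target_id: int  # identity locked in step 1

def identify_and_lock(raw_frame: dict) -> int:
    # Step 1: identify the tracked person and lock the target ID
    # (the recognition principle itself is left unspecified in the patent).
    return raw_frame["person_id"]

def parse_interaction(data: dict, target_id: int) -> InstructionPacket:
    # Step 2: parse collected natural human-computer interaction data
    # into a control instruction packet.
    return InstructionPacket(command=data["gesture"], target_id=target_id)

def to_control_signal(packet: InstructionPacket) -> str:
    # Step 3: convert the instruction packet into a control signal
    # the tracking hardware understands (mapping is invented for illustration).
    return {"follow": "MOTOR_TRACK", "stop": "MOTOR_HALT"}[packet.command]

def track(signal: str) -> str:
    # Step 4: the intelligent tracking module acts on the control signal.
    return f"executing {signal}"

# One pass through the pipeline:
tid = identify_and_lock({"person_id": 7})
pkt = parse_interaction({"gesture": "follow"}, tid)
print(track(to_control_signal(pkt)))  # executing MOTOR_TRACK
```

In a real system each stage would run continuously (camera frames in, pan-tilt commands out); the sketch only shows how the claim's instruction-packet abstraction decouples recognition from actuation.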
CN201610342160.3A 2016-05-20 2016-05-20 Full-angle human tracking system based on natural human-computer interaction technology and control method Active CN105844673B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610342160.3A CN105844673B (en) 2016-05-20 2016-05-20 Full-angle human tracking system based on natural human-computer interaction technology and control method


Publications (2)

Publication Number Publication Date
CN105844673A true CN105844673A (en) 2016-08-10
CN105844673B CN105844673B (en) 2020-03-24

Family

ID=56593026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610342160.3A Active CN105844673B (en) 2016-05-20 2016-05-20 Full-angle human tracking system based on natural human-computer interaction technology and control method

Country Status (1)

Country Link
CN (1) CN105844673B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6175382B1 (en) * 1997-11-24 2001-01-16 Shell Oil Company Unmanned fueling facility
CN101290681A (en) * 2008-05-26 2008-10-22 华为技术有限公司 Video frequency object tracking method, device and automatic video frequency following system
CN102045497A (en) * 2009-10-26 2011-05-04 鸿富锦精密工业(深圳)有限公司 Image videotaping equipment and method for monitoring sound event
CN102799191A (en) * 2012-08-07 2012-11-28 北京国铁华晨通信信息技术有限公司 Method and system for controlling pan/tilt/zoom based on motion recognition technology
CN102860041A (en) * 2010-04-26 2013-01-02 剑桥机电有限公司 Loudspeakers with position tracking
CN103024344A (en) * 2011-09-20 2013-04-03 佳都新太科技股份有限公司 Automatic PTZ (Pan/Tilt/Zoom) target tracking method based on particle filter
CN103136899A (en) * 2013-01-23 2013-06-05 宁凯 Intelligent alarming monitoring method based on Kinect somatosensory equipment
CN104065923A (en) * 2014-06-23 2014-09-24 苏州阔地网络科技有限公司 On-line synchronization classroom tracking control method and system
CN104125433A (en) * 2014-07-30 2014-10-29 西安冉科信息技术有限公司 Moving object video surveillance method based on multi-PTZ (pan-tilt-zoom)-camera linkage structure
CN104635209A (en) * 2014-12-01 2015-05-20 李士斌 Sound localization system
CN105025260A (en) * 2015-07-07 2015-11-04 合肥指南针电子科技有限责任公司 Monitoring system intelligent tracing-back method
CN105407283A (en) * 2015-11-20 2016-03-16 成都因纳伟盛科技股份有限公司 Multi-target active recognition tracking and monitoring method


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
R. Patrick Goebel: "ROS By Example" (Chinese edition, 《ROS入门实例》), 31 January 2016 *
Gu Zhanzhu: "Design and Implementation of an Intelligent Tracking Platform Combining Sound and Vision", Information & Communications (《信息通信》) *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant