CN103984315A - Domestic multifunctional intelligent robot - Google Patents

Domestic multifunctional intelligent robot

Info

Publication number
CN103984315A
Authority
CN
China
Prior art keywords
robot
module
human body
identification
gesture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410205535.2A
Other languages
Chinese (zh)
Inventor
黄鹏宇
周建雄
何跃凯
彭元华
郭振中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CHENGDU BESTVISION TECHNOLOGY Co Ltd
Original Assignee
CHENGDU BESTVISION TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CHENGDU BESTVISION TECHNOLOGY Co Ltd filed Critical CHENGDU BESTVISION TECHNOLOGY Co Ltd
Priority to CN201410205535.2A priority Critical patent/CN103984315A/en
Priority to PCT/CN2014/084138 priority patent/WO2015172445A1/en
Publication of CN103984315A publication Critical patent/CN103984315A/en
Pending legal-status Critical Current


Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 — Programme-control systems
    • G05B19/02 — Programme-control systems, electric
    • G05B19/418 — Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]

Abstract

A domestic multifunctional intelligent robot comprises a processing and control system and a plurality of functional subsystems, the functional subsystems comprising a movement system, a data acquisition system and a communication system. The robot is characterized in that: the movement system comprises a locating module and a driving module, the locating module recognizing and locating the environment in which the robot is situated and the driving module controlling the movement of the robot; the data acquisition system comprises a vision module, a voice module and a data collection module, the vision module comprising a camera for collecting video images and the voice module comprising a sound pickup for collecting audio information; the communication system comprises a wireless communication module for realizing remote communication of the robot; and the processing and control system comprises a processing module and a control module, the processing module receiving data from the functional subsystems in real time and processing them according to preset algorithms, and the control module issuing control instructions to all the functional subsystems based on the processing result to control the action of the robot.

Description

Multifunctional domestic intelligent robot
Technical field
The present invention relates to machine intelligent recognition, and in particular to a domestic multifunctional intelligent robot.
Background technology
In recent years, as people's quality of life has steadily improved, intelligent robot technology has developed rapidly and has become the fastest-growing direction of smart-home technology. Intelligent robots have begun to enter the home-service industry, for example replacing people in cleaning, controlling home entertainment and answering the telephone. Nevertheless, central control terminals, traditional burglar alarms and stand-alone water, electricity and gas leakage alarms remain the principal market direction of current smart-home systems, and that market is growing rapidly, at an annual rate exceeding 20%.
A so-called smart-home central control terminal is in essence an inexpensive "universal" remote controller with a self-learning function. It is usually deployed in each room of the home and uses short-range wireless control technology. Through its human-machine interface, people can control the lights, motorized curtains, appliances, security alarms, background music and home theatre of the room in which it is placed; it can also access the public network through 3G/4G or a wired network, so that household appliances can be controlled remotely from a mobile terminal such as a mobile phone.
Although such systems contribute to home intelligence and home security, they realize only simple transmission of control signals, cannot help people complete further home-service activities, and require one unit to be deployed in every room. This falls short of today's demand for a powerful, high-"IQ" smart-home product. Consequently, capable intelligent domestic robots that can "walk about" freely between rooms have begun to enter people's field of vision; sweeping robots, for example, have started to enter ordinary families and clean rooms in place of people.
However, almost all practical robots currently on the market are developed for a specific household function, and their applicability is limited. For the functions that are in urgent demand — a single robot that can raise a help alarm when a family member falls because of illness or accidental injury, give alarms on burglary or violent incidents, support telemedicine and entertainment services, transport small articles indoors, and give family notepad reminders — no product exists at all.
For example, patent document 1 (201310079408.8) discloses a household service robot that uses a background server and various devices connected to it, as shown in Fig. 1: a sensing device (2), a motion and expression device (3), a power supply and charging device (4) and a mechanical arm (5). The sensing device (2) comprises a smoke alarm (21), a gas concentration monitor (22), a temperature sensor (23), an infrared sensor (24), an infrared signal receiver (25) and an ultrasonic sensing device (26); the motion and expression device (3) comprises a basic motion control device (31), a balance control device (32), an expression control device (33) and a body-language control device (34); the power supply and charging device (4) comprises a wired charging device (41), a wireless charging device (42) and an automatic charging device (43); the background server (1) is also connected with an audio-visual device (6) and a wireless network connection device (7), the audio-visual device (6) comprising a camera (61), a microphone (62) and a loudspeaker (63).
Although this service robot can collect various surrounding information through its server and its rich sensing devices, it cannot use the collected information to intelligently recognize the surrounding environment, nor can it make judgements and decisions accordingly.
In short, that robot is merely a collection of sensors pooled together and cannot perform intelligent analysis, recognition and comprehensive judgement.
Patent document 2 (201110191167.7) proposes a household mobile security robot based on target recognition. The robot is bound to a mobile phone or remote computer; labels are posted at target points; the robot learns and finds the position of each label; the monitoring mode, time and frequency of the robot are set; the robot starts the monitoring routine on schedule according to the set mode, detects the marked points and sends images and other information to the master mobile phone or computer; during fixed-point monitoring, the robot continuously collects smoke, sound and personnel information and sends abnormal information to the master mobile phone or computer; the robot, the intelligent appliances and the fixed monitoring system communicate with each other over the Internet of Things; and the robot receives control information sent by the master mobile phone or computer to interrupt a set task or carry out a task outside the routine.
Although this robot is said to be capable of target recognition, the targets are obviously only some predefined ones, and the robot cannot cope with complex situations and the various accidents that arise in a real environment. In essence, the target-recognition robot proposed by that patent is merely a fixed-point monitor or sensor with remote information transmission.
Patent document 3 (201210156595.0) proposes an all-weather domestic robot. Like patent documents 1 and 2, it is a movable device carrying a large number of sensors: the rotating head carries an infrared camera, an infrared lamp, an optical camera and LED lamps; the front and rear of the body carry electromagnetic sensors and short-range infrared sensors; the left and right carry CO2, formaldehyde, smoke, CO, infrared heat, dust and temperature sensors; and the body houses an ultrasonic generator, a dehumidifying and cooling device, an air purifier, a vacuum cleaner, a control circuit board, a touch display screen and a flip keyboard.
Judging from its disclosed working mode and functions, it has no essential difference from the robots of patents 1 and 2: it is again a collection of sensors and does not comprehensively use the sensor data for integrated analysis and decision-making.
Patent document 4 (201310135363.1) proposes a robot integrated with a smart-home system, but according to its description it merely adds, on top of the above sensor-laden robots, the function of communicating with various intelligent appliances. That is, the home service robot comprises a wireless remote-control terminal; both the robot and the terminal are provided with wireless transmitting and receiving units; the terminal receives and stores the remote-control signals of each household appliance through its receiving unit; the robot receives, through its receiving unit, the control signals of a remote monitoring system and the remote-control signals sent by the terminal's transmitting unit; the robot sends control signals to the terminal through its transmitting unit; and the terminal sends control signals to each household appliance through its transmitting unit.
In other words, the robot of patent document 4 adds, on the basis of the aforementioned robots, a function of exchanging, reading and relaying intelligent-appliance information. It likewise lacks the ability of automatic recognition and judgement and cannot play the role of a capable little helper in the modern smart home.
In addition, whether the monitoring device used today is a surveillance camera or a robot, its camera unit must cover the surrounding space omnidirectionally to leave no blind angle in video monitoring, which inevitably raises the problem of personal privacy leakage. Moreover, in the surveillance cameras and security robots that have appeared so far, the shooting angle and orientation of the camera unit cannot be adjusted in real time according to changes in the surrounding situation.
Therefore, existing technical means cannot satisfy both requirements at once: providing sufficient safety assurance while also keeping personal privacy inviolate.
Summary of the invention
In view of the above problems, and in order to meet people's needs for a humanized smart-home machine, the present invention proposes a multifunctional domestic intelligent robot. The robot comprises a processing and control system and a plurality of functional subsystems, the functional subsystems including a movement system, a data acquisition system and a communication system, and is characterized in that:
the movement system comprises a locating module and a driving module, the locating module recognizing and locating the environment in which the robot is situated, and the driving module controlling the movement of the robot;
the data acquisition system comprises a vision module, a voice module and a data collection module; the vision module comprises a camera device and collects video images through the camera device; the voice module comprises a sound pickup that collects audio information;
the communication system comprises a wireless communication module that realizes remote communication of the robot;
the processing and control system comprises a processing module and a control module; the processing module receives data from the functional subsystems in real time and processes them according to predetermined algorithms; based on the processing result, the control module issues control instructions to each functional subsystem to control the action of the robot.
Robot as above, is characterized in that:
The functional subsystems further comprise a power supply system, and the power supply system comprises an intelligent power management module and a charging module;
the intelligent power management module detects the robot's remaining battery level in real time; when the level is insufficient it issues a charging instruction, and the charging module starts charging the robot.
Robot as above, is characterized in that:
The charging instruction is sent to the movement system; after the locating module has recognized and located the current environment, the driving module controls the robot to move automatically to the charging position, and the charging module starts charging the robot.
Robot as above, is characterized in that:
The robot is charged by contact charging or contactless charging.
Robot as above, is characterized in that:
Contact charging includes providing a socket or charging pile at the charging position; after the robot has moved automatically to the charging position, its charging module is electrically connected to the socket or charging pile;
contactless charging includes providing an electromagnetic charging device at the charging position; after the robot has moved automatically to the charging position, its charging module and the charging device are electromagnetically coupled.
The robot as described in any of the above, characterized in that: the functional subsystems further comprise a human-machine interaction system including one or more of a touch display module, a remote-control module and an I/O interface module.
Robot as above, is characterized in that:
Through the human-machine interaction system, the working parameters and/or working modes of the robot can be set, modified or reset, and system maintenance and upgrading of the robot can be carried out;
and, through the human-machine interaction system and based on the communication system, video calls can be made and the Internet can be accessed.
Robot as above, is characterized in that:
The locating module comprises a self-state sensing unit and an environment sensing unit;
the self-state sensing unit comprises one or more of an acceleration sensor, an electronic compass and a cliff sensor, for determining the robot's own state;
the environment sensing unit comprises one or more of a distance-measuring sensor, an obstacle-avoidance sensor and an intelligent positioning system, for determining the external state of the robot.
Robot as above, is characterized in that:
Described driver module comprises drive motor and wheeled mobile device.
Robot as above, is characterized in that:
The wheeled moving device consists of three or more wheels, which are symmetrically supported and fixed to the bottom of the robot by universal joints.
Robot as above, is characterized in that:
The drive motor drives at least one of the wheels to move independently.
Robot as above, is characterized in that:
The vision module further comprises a light-compensating device that switches on automatically when ambient light is insufficient, ensuring that the camera device obtains clear images under all illumination conditions.
Robot as above, is characterized in that:
The vision module further comprises a pan-tilt mechanism, and the camera device is mounted on the robot through the pan-tilt mechanism.
Robot as above, is characterized in that:
The pan-tilt mechanism comprises a control motor and a gear train and rotates freely in the horizontal and vertical directions, so that the processing and control system can freely control the shooting angle of the camera device.
As above arbitrary described robot, is characterized in that:
The light-compensating device surrounds the lens of the camera device and moves together with it.
As above arbitrary described robot, is characterized in that: described light compensating apparatus is near infrared supplementary lighting sources.
As above arbitrary described robot, is characterized in that:
The vision module comprises three camera devices, and the three camera devices are cameras arranged in a triangle.
Robot as above, is characterized in that:
The voice module further comprises a loudspeaker, which outputs the robot's output information in the manner of simulated human speech.
Robot as above, is characterized in that:
Described voice module comprises two symmetrically arranged acoustic pickups and two symmetrically arranged loudspeakers.
Robot as above, is characterized in that:
The data collection module interacts with smart devices in the surrounding environment, reading data from the smart devices and relaying control instructions to them.
Robot as above, is characterized in that:
The interaction uses a wireless local area network, including one or more of Bluetooth, ZigBee and WiFi.
Robot as above, is characterized in that:
The smart devices include various intelligent sensors, domestic intelligent medical devices and various remotely controllable household appliances.
Robot as above, is characterized in that:
The intelligent sensors include at least one of a smoke sensor, a gas (methane) sensor and an infrared sensor;
the domestic intelligent medical devices include one or more of a sphygmomanometer, a pulse oximeter, a blood glucose meter and other wearable intelligent medical equipment;
the remotely controllable household appliances include one or more of intelligent lighting, a refrigerator, a television, a washing machine, intelligent cooking equipment and an air conditioner.
Robot as above, is characterized in that:
The wireless communication module comprises a wireless communication unit through which the robot communicates with terminal devices;
and, through the wireless communication unit, action instructions are delivered to the robot via the terminal devices.
Robot as above, is characterized in that:
The terminal devices include portable devices and servers;
after the processing and control system has processed the data from the functional subsystems, the robot sends the safety-related data through the wireless communication unit to the background server for storage and filing and to the intelligent terminal for alarm.
Robot as above, is characterized in that:
The intelligent terminal and the server send action instructions to the robot through the wireless communication system.
Robot as above, is characterized in that:
The wireless communication unit comprises one or more communication modules of the GSM, CDMA, WCDMA, CDMA2000, TD-SCDMA, LTE, 4G and WiFi standards.
Robot as above, is characterized in that:
The processing module comprises one or more of a CPU, DSP, ARM or PPC chip and processes in real time the data delivered to the processing and control system by the functional subsystems.
As above arbitrary described robot, is characterized in that:
The control module issues control instructions based on the processing module's processing of the data and on a comprehensive analysis of the processing results.
As above arbitrary described robot, is characterized in that:
The processing and control system processes the video images collected by the vision module, intelligently recognizes and judges events in the surrounding environment, and issues corresponding action instructions based on the judgement.
Robot as above, is characterized in that:
After processing the video images, the processing and control system performs intelligent recognition and judgement including one or more of illegal-intrusion detection and recognition, physical-conflict detection and recognition, gesture detection and recognition, fall detection and judgement, flame detection, and smoke detection.
Robot as above, is characterized in that:
The illegal-intrusion detection and recognition comprises the following steps:
1. Human body capture: video human detection based on machine learning is adopted, with HOG features as the shape description of the human body; SVMs are selected by the AdaBoost algorithm, and the video image is traversed with a sliding window to capture the human body.
2. Human body tracking: a human-body tracking model is established, similarity is measured with an LBP function and computed with the Bhattacharyya distance, and the human body is tracked based on mean shift.
3. Illegal-intrusion detection: the person's position in the video image is obtained from the human body capture, and the human body tracking determines whether the person enters the warning region.
Once the warning region is entered, the processing and control system identifies the person in the video image; if the person is judged to be unregistered, the control system confirms an illegal intrusion and issues an alarm instruction (an illustrative sketch of this decision logic follows).
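As an illustrative, non-limiting sketch of step 3's decision logic only: it assumes a tracker that supplies a bounding box and a face crop, a warning-zone polygon in image coordinates, and a separately trained face recognizer; the helper `recognise_face` is hypothetical and stands in for whatever face/voiceprint identification the system uses.

```python
import numpy as np
import cv2

def in_warning_zone(foot_point, zone_polygon):
    # zone_polygon: float32 array of (x, y) vertices describing the warning region
    return cv2.pointPolygonTest(zone_polygon, foot_point, False) >= 0

def check_intrusion(bbox, face_crop, zone_polygon, recognise_face):
    """bbox = (x, y, w, h) from the tracker; recognise_face returns a registered
    identity or None. An alarm is warranted only for an unregistered person
    inside the warning region."""
    x, y, w, h = bbox
    foot = (float(x + w / 2.0), float(y + h))        # bottom-centre of the box
    if not in_warning_zone(foot, zone_polygon):
        return False
    return recognise_face(face_crop) is None         # unregistered -> intrusion alarm

# Example warning zone: a quadrilateral in image coordinates (illustrative values)
zone = np.array([[100, 300], [500, 300], [500, 470], [100, 470]],
                dtype=np.float32).reshape(-1, 1, 2)
```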
Robot as above, is characterized in that:
The physical-conflict detection and recognition comprises the following steps:
1. Human body capture: video human detection based on machine learning is adopted, with HOG features as the shape description of the human body; SVMs are selected by the AdaBoost algorithm, and the video image is traversed with a sliding window to capture the human body.
2. Human body tracking: a human-body tracking model is established, similarity is measured with an LBP function and computed with the Bhattacharyya distance, and the human body is tracked based on mean shift.
3. Physical-conflict detection: the person's position in the video image is obtained from the human body capture and the region where the person is located is obtained from the human body tracking; KLT (Kanade-Lucas-Tomasi) feature-point optical flow is used to compute the optical-flow vectors in that region, and the region entropy characterizes violent random motion; when the region entropy exceeds a threshold, the control system confirms that a physical conflict is occurring and issues an alarm instruction.
Robot as above, is characterized in that:
Identifying the person in the video image comprises performing face recognition on the person in the video image and/or performing voiceprint recognition on the person's voice.
Robot as above, is characterized in that:
The physical-conflict detection step further comprises audio analysis of the audio information obtained by the sound pickup of the voice module;
if, when the region entropy computed from the optical-flow vectors exceeds its threshold, the audio information also exceeds a decibel threshold, the control system confirms that a physical conflict is occurring and issues an alarm instruction.
Robot as above, is characterized in that:
The voiceprint recognition comprises off-line model training and on-line voiceprint recognition;
the off-line model training obtains the voiceprint feature model of a particular person;
the on-line voiceprint recognition matches the captured voice against the voiceprint feature model, thereby realizing voiceprint recognition.
Robot as above, is characterized in that:
The voiceprint recognition comprises the following steps:
1. The voice is sampled to obtain a speech signal.
2. Features are extracted from the speech signal to obtain a normalized power spectrum.
3. A GMM model is used to train the voiceprint recognition model, which is stored in a database to complete voiceprint registration.
4. The voice is matched against the registered voice models in the database to realize voiceprint recognition (an illustrative sketch of these steps follows).
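As an illustrative, non-limiting sketch of the four steps under stated assumptions: the "normalized power spectrum" is approximated here by a per-frame normalized log power spectrum, and scikit-learn's `GaussianMixture` stands in for the GMM training; the frame, FFT and mixture sizes are illustrative, not taken from this disclosure.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def power_spectrum_features(signal, sr, frame_len=0.025, hop=0.010, n_fft=512):
    """Frame the waveform and return a normalized log-power spectrum per frame."""
    step, size = int(hop * sr), int(frame_len * sr)
    feats = []
    for i in range(0, len(signal) - size, step):
        frame = signal[i:i + size] * np.hamming(size)
        spec = np.log(np.abs(np.fft.rfft(frame, n_fft)) ** 2 + 1e-10)
        feats.append((spec - spec.mean()) / (spec.std() + 1e-10))   # normalization
    return np.array(feats)

def enrol(signal, sr, n_components=16):
    """Off-line registration: train one GMM per registered family member."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    gmm.fit(power_spectrum_features(signal, sr))
    return gmm

def identify(signal, sr, enrolled):
    """On-line recognition: pick the enrolled model with the highest likelihood.
    enrolled: dict mapping a person's name to their trained GMM."""
    feats = power_spectrum_features(signal, sr)
    scores = {name: gmm.score(feats) for name, gmm in enrolled.items()}
    return max(scores, key=scores.get)
```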
Robot as above, is characterized in that:
The gesture detection and recognition comprises four steps: image preprocessing, gesture tracking, feature extraction and gesture recognition.
Robot as above, is characterized in that:
The image preprocessing applies median filtering to the gesture image in the video image for smoothing and denoising, and segments the gesture image to obtain a binary gesture image;
the gesture tracking uses the CAMShift tracking algorithm in HSV space to convert the binary gesture image into a colour probability distribution map and, through repeated iteration, tracks the characteristically coloured object;
the feature extraction extracts the Hu invariant-moment features of the gesture from the gesture contour in the binary image, these features being invariant to translation, rotation and scale, and sends the features to a BP neural network for training;
the gesture recognition sends the extracted features to the trained BP neural network for matching and recognition, thereby completing the gesture recognition.
Robot as above, is characterized in that:
The fall detection and judgement comprises human-contour point extraction, inter-frame point matching and deformation analysis.
Robot as above, is characterized in that:
The human-contour point extraction establishes a single-Gaussian background model using edge information, extracts the human contour by background subtraction, and sparsifies the contour points by sampling them at equal intervals;
the inter-frame point matching uses Shape Context (SC) features to match the points; the shape context describes the spatial relationship between a feature point and its neighbouring points, any point of the previous frame may match any point of the next frame, and the optimal matching arrangement is selected by a bidirectional matching algorithm;
the deformation analysis quantifies the deformation by the average matching cost of the optimal matching points, defined as

$$\bar{C} = \frac{1}{N}\sum_{n=1}^{N} C(n)$$

where $C(n)$ is the matching cost of the n-th optimal matching point and $N$ is the number of optimal matching points; when the average matching cost $\bar{C}$ exceeds a set threshold, a fall is judged to have occurred.
Robot as above, is characterized in that:
The judgement that a fall has occurred when the average matching cost exceeds the set threshold further includes a time-domain analysis of the average matching cost:
normally, the average matching cost is very large at the moment of falling, and because the person remains still or moves only slightly for a short time after falling, the average matching cost is very small afterwards;
a dynamic time warping (DTW) algorithm is therefore used to perform dynamic-pattern matching on the average matching cost over the short periods before and after the threshold is exceeded, achieving accurate detection and judgement of the falling behaviour (an illustrative sketch follows).
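As an illustrative, non-limiting sketch of the deformation measure and the time-domain check: it assumes the per-frame optimal shape-context matching costs have already been computed, and the fall template and both thresholds are placeholders that would be learned from training footage.

```python
import numpy as np

def mean_matching_cost(costs_of_best_matches):
    """C_bar = (1/N) * sum of C(n) over the N optimal contour-point matches."""
    return float(np.mean(costs_of_best_matches))

def dtw_distance(seq_a, seq_b):
    """Plain dynamic-time-warping distance between two short 1-D sequences of
    per-frame mean matching costs (the window around the threshold crossing)."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(seq_a[i - 1] - seq_b[j - 1])
            D[i, j] = d + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def is_fall(cost_window, fall_template, cost_threshold, dtw_threshold):
    """Report a fall only when the mean matching cost spikes above the threshold
    AND the cost curve around the spike matches a reference
    'large-deformation-then-still' template under DTW."""
    return (max(cost_window) > cost_threshold and
            dtw_distance(cost_window, fall_template) < dtw_threshold)
```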
Robot as above, is characterized in that:
The flame detection analyses the temporal and spatial variation of image regions in order to extract the flame areas of the image, thereby detecting flames in the video image.
The detailed process comprises:
(1) hidden Markov models of flame pixels and non-flame pixels with respect to temporal and spatial variation are established by off-line learning;
(2) during actual detection, colour and spatial models are first used to determine the potential flame regions;
(3) the hidden Markov models are used to analyse whether the pixels in the potential regions conform to the hidden Markov model of flame pixels;
(4) the flame pixels are clustered to obtain the flame region (an illustrative sketch of this process follows).
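As an illustrative, non-limiting sketch of steps (1)-(4), using `hmmlearn`'s `GaussianHMM` as the hidden Markov model: the colour pre-filter thresholds and the use of the red channel as the per-pixel temporal observation are assumptions of this sketch, and the off-line training sequences are assumed to have been prepared separately.

```python
import numpy as np
import cv2
from hmmlearn.hmm import GaussianHMM

def train_pixel_hmms(flame_seqs, flame_lens, other_seqs, other_lens, states=3):
    """(1) Off-line learning of flame-pixel and non-flame-pixel HMMs from stacked
    per-pixel temporal observation sequences (shape: (n_samples, 1))."""
    flame_hmm = GaussianHMM(n_components=states).fit(flame_seqs, flame_lens)
    other_hmm = GaussianHMM(n_components=states).fit(other_seqs, other_lens)
    return flame_hmm, other_hmm

def detect_flame(frames_bgr, flame_hmm, other_hmm):
    """frames_bgr: consecutive frames of one scene. Returns a flame mask and the
    number of clustered flame regions."""
    last = frames_bgr[-1].astype(np.int32)
    b, g, r = last[..., 0], last[..., 1], last[..., 2]
    candidate = (r > 180) & (r >= g) & (g >= b)         # (2) colour/spatial pre-filter
    reds = np.stack([f[..., 2] for f in frames_bgr]).astype(np.float64)
    mask = np.zeros(candidate.shape, np.uint8)
    for y, x in zip(*np.nonzero(candidate)):
        seq = reds[:, y, x].reshape(-1, 1)              # temporal flicker sequence
        # (3) keep the pixel if the flame model explains its variation better
        if flame_hmm.score(seq) > other_hmm.score(seq):
            mask[y, x] = 255
    n_labels, _ = cv2.connectedComponents(mask)         # (4) cluster into flame regions
    return mask, n_labels - 1
```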
Robot as above, is characterized in that:
The smoke detection process is as follows:
1) the video image is divided into non-overlapping sub-blocks, each sub-block is tracked and its direction of motion judged, and the sub-blocks conforming to the smoke motion direction are clustered to obtain potential smoke regions;
2) a two-dimensional discrete wavelet transform is applied to each potential smoke region to obtain the high-frequency part of the image; if its high-frequency energy is lower than that of the corresponding background, the region is further confirmed as a smoke region (an illustrative sketch of this check follows).
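As an illustrative, non-limiting sketch of the wavelet check in step 2), using PyWavelets: the wavelet family and the energy ratio are assumptions of this sketch, and the block tracking and clustering of step 1) are taken as already done.

```python
import numpy as np
import pywt

def high_freq_energy(gray_block):
    """Energy of the detail (high-frequency) sub-bands of a 2-D DWT."""
    _, (ch, cv, cd) = pywt.dwt2(gray_block.astype(np.float64), "db2")
    return float((ch ** 2).sum() + (cv ** 2).sum() + (cd ** 2).sum())

def looks_like_smoke(candidate_block, background_block, ratio=0.7):
    """Smoke blurs edges, so the candidate block's high-frequency energy drops
    well below that of the corresponding background block."""
    return high_freq_energy(candidate_block) < ratio * high_freq_energy(background_block)
```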
As above arbitrary described robot, is characterized in that:
The data collection module interacts with the smart devices in the surrounding environment and reads the data in them;
the data are uploaded to the processing module in real time;
the processing module processes the data according to predetermined algorithms, and the control module issues corresponding prompt instructions based on the processing result.
Robot as above, is characterized in that:
The processing module simultaneously receives the video images and the audio information in real time and processes, analyses and recognizes them according to predetermined algorithms;
the control module issues corresponding prompt instructions based on the processing, analysis and recognition results, taken together with the results of processing the other data.
Robot as above, is characterized in that:
After a fall is confirmed, the processing and control system extracts the human-body monitoring data received from the wearable intelligent medical equipment worn by the fallen person, and the control system judges the person's current physical condition based on the monitoring data and issues a health prompt instruction.
Robot as above, is characterized in that:
After flame or smoke is detected in the video image, the processing and control system extracts the monitoring data of the gas sensors arranged in the surrounding environment, and the control system judges the current fire situation based on the monitoring data and issues a corresponding fire-situation prompt instruction.
As above arbitrary described robot, is characterized in that:
The robot is provided with an object stage for carrying articles and facilitating their transport.
Robot as above, is characterized in that:
Described objective table is arranged on robot top.
According to the multifunctional intelligent robot of the invention, firstly, in a real domestic environment, various environmental information — including video images, audio information and data from various sensors — is processed and comprehensively analysed, so that the robot can intelligently recognize the events occurring in the current environment and give corresponding decision instructions. In other words, the robot organically fuses the various sensors and makes integrated use of their data.
Secondly, the robot of the invention can recognize people's voices, images and actions, so tasks can easily be assigned to it and it can assist its owner in completing some work.
Thirdly, it can obtain various information from the surrounding environment — for example the approach or intrusion of strangers, the care of the elderly and children, and the acquisition of data from the wearable devices they carry — and, by reading and comprehensively using this information together with the data of surrounding intelligent appliances and sensors, it substantially eliminates false alarms and improves the accuracy of situation assessment, so that people can feel at ease and trust it as a true family-life assistant.
Finally, because the robot of the invention adopts a movement system, it can recognize its own state and position and therefore locate itself; combined with its power supply system, it can move to the charging position by itself and charge itself when its battery is low.
In addition, in order to resolve the conflict between security and privacy protection, the camera device of the robot of the invention is mounted on a pan-tilt head capable of omnidirectional rotation and works together with the drive motor, so that the shooting angle of the camera device can be set freely. For example, under normal circumstances the shooting angle is confined to a height range within 30 cm, or 20 cm, of the ground; when the robot judges that something has happened, it automatically adjusts the shooting angle and performs omnidirectional shooting, thereby resolving the conflict between protection and privacy.
In summary, the robot of the invention can serve as the little housekeeper at the centre of the home, always standing by; users can issue various instructions to the robot through rich voice commands and local or remote guidance commands. When no one is at home, a timed patrol mode can be set; if an abnormal situation is found — such as a gas leak, a fire, a stranger breaking in, an abduction, a violent conflict, theft or an elderly person falling — the robot immediately raises a local alarm at home and a remote alarm on the owner's preset intelligent terminal.
Brief description of the drawings
Fig. 1 is a block diagram of a prior-art robot;
Fig. 2 is a block diagram of the relative positions (top view) of the system modules of the robot of the invention;
Fig. 3 is a block diagram of the circuit connections of the system modules of the robot of the invention;
Fig. 4 is a functional block diagram of the robot of the invention;
Fig. 5 is a system flow chart of the robot of the invention;
Fig. 6 is a flow chart of the voiceprint recognition.
Embodiment
To describe the present invention in more detail, the robot of the invention is described below with reference to a specific embodiment.
In this embodiment, the robot relies on three symmetrically arranged wheels at its bottom, cooperating with distance-measuring sensors distributed over its surface, to move autonomously indoors.
The system module block diagram of the robot is shown in Fig. 2. Three video cameras are mounted on the robot surface in an isosceles triangle, each with one light-compensating lamp; there are two symmetrically arranged sound pickups, two loudspeakers, one high-definition display and one sensing panel, together with multiple integrated sensors. Through communication modes such as Bluetooth, WiFi and ZigBee, the robot realizes short-range communication with pattern-recognition-based pyroelectric infrared alarms, smoke detectors, domestic intelligent medical equipment such as pulse oximeters and blood pressure monitors, intelligent health wearables, and household appliances (lamps, televisions, refrigerators, home theatre, etc.). This information is processed efficiently in real time by the on-board high-performance CPU, PPC, DSP and ARM processors, and real-time interaction with various intelligent terminals such as smartphones is realized through wireless mobile communication, typified by 3G/4G, or through a wired network, thereby realizing the various powerful and intelligent design functions of the robot.
A circular platform is provided at the top of the robot and can carry various small articles.
Divided by functional interface, the above hardware comprises eight functional modules; the modules and their circuit connections are shown in Figs. 3 and 4.
They are: human-machine interface, core control board, wireless monitoring module, robot locating and driving, vision, voice, wireless data transmission and power management. Specifically:
Human-machine interface module: comprises a touch display screen, indicator lamps and an infrared remote controller; the robot's working parameters are set and its working state is displayed through the touch screen.
Core control board module, i.e. the processing and control system: consists mainly of embedded processor chips — processors such as CPU, PPC, DSP and ARM — storage systems such as an eMMC file system and a TF memory card, and peripheral interfaces. As the brain of the robot, the present embodiment adopts a Freescale quad-core 1 GHz processor to comprehensively analyse and process the various information it receives and to command the machine to take corresponding actions. This core control module adopts an embedded operating system and runs the robot-control application software, including autonomous walking, article-transport control, task setting, acquisition of various signals, audio-video acquisition, alarm communication, face recognition, speech recognition and analysis, human behaviour analysis and target tracking algorithms.
Wireless monitoring module, i.e. the data acquisition module that interacts with the various surrounding smart devices: using Bluetooth, ZigBee or WiFi as the communication mode, it receives the wireless signal data of household appliances and other equipment in real time and sends them to the core control module for processing. Two kinds of devices are mainly supported:
1) intelligent sensors, such as smoke sensors, gas leakage detectors, glass-break detectors and various other special gas sensors, including carbon dioxide, carbon monoxide, formaldehyde and air-quality sensors;
2) intelligent medical devices, such as sphygmomanometers, pulse oximeters, blood glucose meters and other intelligent medical or wearable equipment.
In addition, the wireless monitoring module is also provided with a wireless remote-control unit comprising a wireless receiving unit and a wireless transmitting unit. The receiving unit automatically receives and stores the remote-control signals of each household appliance; when the robot needs to control a household appliance, it sends the corresponding control instruction to the wireless remote-control terminal, and the terminal transmits a simulated remote-control instruction for that appliance, thereby achieving control of the household appliance.
Locating and driving module, i.e. the movement system of the robot: it consists of a robot locating module and a driving module. The locating module transfers the robot's position and external information to the core control module for processing, completing the recognition and location of the robot's environment, and the motor driving module drives the robot to walk under the control of a high-precision motion-control device, the servo.
The robot locating module consists of a self-state sensing unit and an environment sensing unit. The self-state sensing unit comprises an acceleration sensor, an electronic compass and a cliff sensor, for determining the robot's own state; the environment sensing unit consists of a distance-measuring sensor, an obstacle-avoidance sensor and an intelligent positioning system, for determining the external state of the robot.
Vision module, i.e. the video acquisition module: composed of three miniature cameras distributed at different positions, a pan-tilt mechanism and a light-compensating device. The light-compensating device may be a near-infrared supplementary lighting system and enables the module to provide video data 24 hours a day for face recognition, identity recognition, behaviour analysis, indoor video recording and environmental monitoring. The three miniature cameras adopt CMOS video sensors and respectively complete the acquisition of face-recognition video, target-tracking video and ambient video.
The pan-tilt mechanism that controls the camera shooting angle comprises a control motor and a corresponding gear train and rotates in the horizontal and vertical directions, so that the cameras can collect images of every corner. The control motor can receive instructions from the processing and control system and adjust the shooting angle of the cameras in real time.
The near-infrared supplementary lighting system consists of near-infrared light-compensating lamps surrounding the cameras and ensures that the cameras obtain stable, clear images under all illumination conditions.
Voice module: comprises two sound pickups and two loudspeakers. The sound pickups receive various voice inputs, such as the owner's voice instructions and abnormal indoor sounds; the loudspeakers output the robot's voice. During audio acquisition, the widely used audio echo-cancellation algorithms and existing encoding and decoding processes can be adopted to obtain high-resolution audio input and output.
Wireless data transmission module, i.e. the communication system: adopts current mobile communication, covering the various communication standards including 2G, 3G/4G and WiFi, to realize communication with intelligent terminals (such as mobile phones) and other servers.
The data involved in the robot's monitoring alarms — text, pictures, video and audio — are pushed to the background server and to the application platform (APP) of the intelligent terminal (such as a mobile phone), and the operational instructions sent by the background server and the intelligent terminal APP are received.
Power management module: the intelligent power management module detects the robot's battery level in real time; when the level falls below the minimum requirement, the robot automatically moves to the charging position and charges itself.
Charging may adopt contact charging, or the latest contactless charging mode may be adopted. In contactless charging, the electromagnetic induction principle is used, similar to a transformer: the transmitting end and the receiving end each have a coil, the transmitting coil is connected to wired power, and the receiving coil induces the electromagnetic signal of the transmitting end to generate a current that charges the battery. This charging mode reduces manual intervention: when the robot is low on power, it automatically finds the charging pile through its sensors and charges, realizing self-management of the robot.
Besides the above hardware, in order to comprehensively process and analyse the various data collected by the above sensors, the robot of the invention is provided with a processing and control system that processes these data and makes the corresponding judgements.
The information acquisition and processing flow of the robot is described below with reference to Fig. 5.
After the system is powered on, the initialization of the master chip is completed first; next the peripheral acquisition devices and communication devices are initialized and the main process of the system software is started. The signal acquisition thread, the intelligent audio-video analysis thread, the service and control worker thread, and the communication thread are then created.
Signal acquisition thread: monitors and collects peripheral device signals, preprocesses them, and sends them to the corresponding signal-processing threads. For example, after the cameras and sound pickups have collected audio-video data, the audio-video signals are first preprocessed and then sent to the intelligent audio-video analysis thread for processing.
Intelligent audio-video analysis thread: after receiving the audio-video data, this thread analyses the audio/video frame data and starts the detection algorithms for falls, gestures, personnel abduction, personnel intrusion, violent conflicts and so on. If an intrusion behaviour lasting longer than a set threshold is detected, identification is attempted by methods such as face recognition and speech recognition; if identification fails, an alarm signal is transmitted to the alarm thread. The thread also completes the encoding, compression and storage of the audio-video data.
Robot service and control thread: handles all kinds of command responses, including robot walking control.
Communication thread: establishes connections and communicates with remote equipment; for example, after receiving an alarm signal it organizes the relevant alarm information, such as pre-recorded audio, video and pictures, and sends it to the predetermined mobile phone terminal.
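As an illustrative, non-limiting sketch of this thread layout only: the helpers `grab_preprocessed_frame`, `run_detectors` and `push_alarm_to_terminal` are hypothetical stand-ins for the real acquisition, analysis and 3G/4G/WiFi push code described above.

```python
import queue
import threading

frame_q, alarm_q = queue.Queue(), queue.Queue()

def acquisition_thread():
    while True:
        frame = grab_preprocessed_frame()      # hypothetical camera / sound-pickup read
        frame_q.put(frame)

def analysis_thread():
    while True:
        frame = frame_q.get()
        for event in run_detectors(frame):     # fall / gesture / intrusion / conflict
            alarm_q.put(event)                 # hand confirmed events to the alarm path

def communication_thread():
    while True:
        event = alarm_q.get()
        push_alarm_to_terminal(event)          # hypothetical push to server / phone APP

for worker in (acquisition_thread, analysis_thread, communication_thread):
    threading.Thread(target=worker, daemon=True).start()
```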
The intelligent video analysis technology and the speech recognition technology adopted in the processing and control system are described below.
Intelligent video analysis and speech recognition are adopted mainly because the audio and video information collected by the robot is too voluminous: the mass of data in the video pictures must first be processed and screened at high speed, filtering out the information the user does not care about and providing the user with only the useful key information.
Intelligent video surveillance technology mainly analyses video automatically, extracts key information from it, and discovers and recognizes abnormal events of interest, so that it can replace or assist manual monitoring. Video analysis and recognition involve complex software algorithms that can be programmed to recognize strange and abnormal behaviour; video content analysis and recognition software can detect and recognize suspicious activities, events or behaviour patterns by analysing live or recorded video streams. The intelligence of a video monitoring system means that, without human intervention, the system can automatically detect and recognize abnormal situations in the monitored picture, analyse video quality, and give timely warnings or alarms.
The intelligent recognition of video images mainly comprises the following aspects.
One, human detection and tracking
1. human detection
Video human detection based on machine learning is adopted. Histograms of Oriented Gradients (HOG) features are used as the shape description of the human body; HOG features have a degree of invariance to small local variations. The HOG feature divides the picture into N units, called "cells", and adjacent cells form several blocks, which may or may not overlap. The HOG feature block is extracted by counting the gradient-direction distribution of each cell within a block; by changing the way the cells are divided, a large number of HOG features can be produced quickly. A classifier is learned from a series of training image data by machine learning, and this classifier is used to detect the human body. A linear Support Vector Machine (SVM) is selected as the classifier, and the classification performance of the linear SVM is boosted by the adaptive boosting learning algorithm (AdaBoost). The flow of the AdaBoost algorithm is as follows:
Given the training samples and initial weights,
screen for T rounds, selecting the T optimal weak classifiers:
normalize the sample weights;
train one weak classifier for each feature;
select the optimal weak classifier $f_i$;
update the sample weights;
finally, obtain the strong classifier.
Here each weak classifier is a linear SVM, and each feature is a HOG feature block of a different size over the full range of sizes. A human-body recognition classifier is obtained by training with the above method, and the human body is detected by sliding the classifier's detection window over the video image.
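For the detection side, an illustrative, non-limiting sketch using OpenCV's HOG descriptor with its stock linear-SVM people detector; this stands in for the AdaBoost-selected HOG/SVM classifier described above (the sliding-window traversal over the image and its pyramid is performed inside `detectMultiScale`).

```python
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_people(frame_bgr):
    """Sliding-window HOG + linear SVM detection over the frame."""
    rects, weights = hog.detectMultiScale(frame_bgr, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    return [tuple(r) for r in rects]       # (x, y, w, h) boxes around detected persons

# Example usage:
# frame = cv2.imread("frame.jpg")
# for (x, y, w, h) in detect_people(frame):
#     cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```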
2. human body tracking
(1) Establishing the human-body tracking model and measuring similarity
The Local Binary Pattern (LBP) feature is selected as the texture feature. LBP is an effective texture description operator with strong texture recognition ability and insensitivity to brightness changes, and is defined as

$$LBP_{P,R}(x_c, y_c) = \sum_{p=0}^{P-1} s(g_p - g_c)\times 2^{p}, \qquad s(x)=\begin{cases}1 & x \ge 0\\ 0 & x < 0\end{cases}$$

where R is the distance between the centre pixel and its neighbourhood pixels, P is the number of neighbourhood pixels, and $g_p$ is the grey value of the p-th equally spaced point on the circle of radius R centred on $g_c$. Here P = 8 and R = 1, i.e. the 8 neighbourhood pixels are considered. The LBP values of all pixels in the region form an LBP histogram, which is quantized here to 32 levels.
The H and V components are selected as the colour features: the H component reflects the colour of the target and the V component reflects its brightness; the colour components are quantized to 32 levels.
The final target feature is expressed as a three-dimensional feature histogram comprising a two-dimensional colour feature and a one-dimensional texture feature, the quantization order of each dimension being 32.
A weighted feature histogram is selected as the target model; it reflects the statistical characteristics of the target region. The Epanechnikov kernel is selected as the kernel function:

$$K_E(x) = \begin{cases} c\,(1 - \|x\|^2) & \|x\| \le 1 \\ 0 & \text{otherwise} \end{cases} \tag{23}$$

The target model is established from formula (23) as

$$p_{x_0}(n) = \frac{1}{C} \sum_{i=1}^{N} k\!\left(\left\|\frac{x_i - x_0}{h}\right\|\right) \delta[h(x_i) - n]$$

where C is a normalization coefficient, $p_{x_0}(n)$ is the weight of the n-th histogram bin centred on $x_0$, N is the number of pixels in the region, $k(\cdot)$ is the Epanechnikov kernel, $x_i$ is any point in the region, $\|x_i - x_0\|$ is the distance from $x_i$ to $x_0$, $\delta[\cdot]$ is the unit impulse function, and $h(x_i)$ is the bin of $x_i$ in the three-dimensional feature histogram.
For similarity measurement, the commonly used Bhattacharyya distance is selected to compute the similarity:

$$\rho_{x_0}[p, q] = \sum_{n=1}^{m} \sqrt{p_{x_0}(n)\, q(n)}$$

which expresses the degree of similarity between the weighted feature histogram established at $x_0$ as the target model and the template $q(n)$ established in advance; the larger ρ is, the higher the similarity, and m is the order of the histogram.
(2) Human body tracking based on mean shift
Target tracking comprises three parts: position prediction, mean-shift search and feature update.
The position of the target is predicted by grey-scale template matching, which finds the approximate position of the target in the current frame; the exact position is then obtained by mean-shift search, as given by formula (26):

$$\hat{y}_1 = \frac{\displaystyle\sum_{i=1}^{W \times H} x_i\, \omega_i\, g\!\left(\left\|\frac{\hat{y}_0 - x_i}{h}\right\|^2\right)}{\displaystyle\sum_{i=1}^{W \times H} \omega_i\, g\!\left(\left\|\frac{\hat{y}_0 - x_i}{h}\right\|^2\right)}, \qquad \omega_i = \sum_{u=1}^{m} \delta[h(x_i) - u] \sqrt{\frac{q_u}{p_u(y)}} \tag{26}$$

where W and H are the width and height of the target template, $\hat{y}_0$ is the geometric centre of the current target, $x_i$ is a sample point, $g(\cdot)$ is the derivative of the kernel function, h is the kernel bandwidth, and $\omega_i$ is a weighting coefficient.
Updating the target model is necessary for stable and accurate target tracking. Blind updating may blend external interference into the model so that it can no longer fully describe the target's characteristics; over time the model would depart further and further from the target's real situation and the tracking accuracy would decline.
The model update strategy is:

$$\text{if } |\rho_k - \rho_{k-1}| > \rho_{k-1} \times 0.9 \ \text{ and } \ \rho_k > 0.9:\qquad q_i = q_{i-1} \times 0.95 + p_k \times (1 - 0.95)$$

where $\rho_k$ is the Bhattacharyya distance at the optimal position of frame k, $q_i$ is the target colour model after the i-th update, and $p_k$ is the model of the target collected in frame k.
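As an illustrative, non-limiting sketch of the target model, the similarity measure and the update rule under stated assumptions: the joint H/V/LBP weighted histogram is approximated by concatenating three 32-bin histograms rather than building the full three-dimensional histogram, and the update condition is copied verbatim from the text above.

```python
import numpy as np
import cv2

def epanechnikov_weights(h, w):
    """Kernel weights over an h-by-w patch, centred on the patch centre."""
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r2 = ((ys - cy) / (h / 2.0)) ** 2 + ((xs - cx) / (w / 2.0)) ** 2
    k = np.clip(1.0 - r2, 0.0, None)    # c(1 - ||x||^2) inside the unit ball, 0 outside
    return k / k.sum()

def target_model(patch_bgr, bins=32):
    """Weighted H, V and LBP(P=8, R=1) histograms of a patch, concatenated."""
    hsv = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(patch_bgr, cv2.COLOR_BGR2GRAY)
    lbp = np.zeros_like(gray)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    for i, (dy, dx) in enumerate(offsets):
        lbp |= ((np.roll(np.roll(gray, dy, 0), dx, 1) >= gray).astype(np.uint8) << i)
    k = epanechnikov_weights(*gray.shape)
    hists = []
    for chan, top in ((hsv[..., 0], 180), (hsv[..., 2], 256), (lbp, 256)):
        idx = np.clip(chan.astype(np.float32) * bins / top, 0, bins - 1).astype(int)
        hists.append(np.bincount(idx.ravel(), weights=k.ravel(), minlength=bins))
    q = np.concatenate(hists)
    return q / q.sum()

def bhattacharyya(p, q):
    """rho = sum_n sqrt(p(n) q(n)); larger means more similar."""
    return float(np.sum(np.sqrt(p * q)))

def update_model(q_prev, p_k, rho_k, rho_prev):
    """Conservative model update, following the condition stated in the text."""
    if abs(rho_k - rho_prev) > rho_prev * 0.9 and rho_k > 0.9:
        return q_prev * 0.95 + p_k * 0.05
    return q_prev
```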
Two, behavioural analysis
1. Illegal-intrusion detection
The position of the person in the image is obtained by human detection and tracking. When human-body tracking judges that a person has entered the warning region, face detection is started; if a face is detected, face recognition is performed; if face recognition judges the person to be unregistered, an alarm is triggered and a frontal snapshot of the intruder is captured at the same time.
2. Physical-conflict detection
When a physical conflict breaks out it is accompanied by violent random motion and loud screaming, so physical conflict can be detected by optical-flow vector analysis and audio analysis; when both methods detect a physical conflict, a conflict alarm is triggered and a scene photograph is captured at the same time.
3. Optical-flow vector analysis
The region where the target is located is obtained by target tracking. KLT (Kanade-Lucas-Tomasi) feature-point optical flow is used to compute the optical-flow vectors $V = \{v_1, v_2, \dots, v_n\}$ in the target region, and an amplitude-weighted histogram $H_p = \{h_j\}_{j=1,2,\dots,n}$ is adopted for the statistical analysis of the region's optical-flow vectors:

$$h_j = C_h \sum_{i=1}^{k} A_{v_i}\, \delta(b(v_i) - j)$$

where $h_j$ is the j-th histogram bin (the order is taken as 12 here), $C_h$ is a normalization parameter, $A_{v_i}$ is the normalized amplitude of the optical-flow vector, $b(v_i)$ is the histogram bin corresponding to the optical-flow vector $v_i$, determined by the orientation of the vector, and $\delta(\cdot)$ is the Kronecker delta function.
The region entropy $E_H$ is adopted to measure violent random motion:

$$E_H = -\sum_{j=1}^{n} h_j \log h_j$$

where $h_j$ is the j-th bin of the amplitude-weighted histogram. The larger $E_H$ is, the more violent and random the motion in the region is judged to be; a threshold T is set, and when $E_H > T$ a physical conflict is judged to have broken out in the region.
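As an illustrative, non-limiting sketch of the region-entropy measure on KLT feature-point flow, using OpenCV's pyramidal Lucas-Kanade tracker: the corner-detection parameters are assumptions of this sketch, the 12-bin orientation histogram follows the order of 12 given above, and the threshold value is only illustrative.

```python
import numpy as np
import cv2

def region_entropy(prev_gray, cur_gray, bins=12):
    """Entropy E_H of the amplitude-weighted orientation histogram of KLT flow
    vectors; large values indicate violent, disordered motion."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=5)
    if pts is None:
        return 0.0
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, pts, None)
    good = status.ravel() == 1
    flow = (nxt[good] - pts[good]).reshape(-1, 2)
    if len(flow) == 0:
        return 0.0
    mag = np.linalg.norm(flow, axis=1)
    ang = np.arctan2(flow[:, 1], flow[:, 0])                  # vector orientation -> bin
    idx = np.clip(((ang + np.pi) / (2 * np.pi) * bins).astype(int), 0, bins - 1)
    hist = np.bincount(idx, weights=mag, minlength=bins)
    if hist.sum() == 0:
        return 0.0
    h = hist / hist.sum()                                     # C_h normalization
    nz = h[h > 0]
    return float(-np.sum(nz * np.log(nz)))

# If region_entropy(prev_roi, cur_roi) > T, a physical conflict is suspected;
# T is chosen from training footage and is not specified in the text.
```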
On the basis of the above video analysis, the accuracy of recognition can be further improved by also analysing the audio collected at the same time. The audio analysis reflects the fact that a physical conflict is accompanied by fierce speech and loud screaming, so specific-sound detection based on audio analysis can be adopted to determine whether a physical conflict exists.
Three, gesture identification
The vision-based gesture recognition system mainly comprises four parts: image preprocessing, gesture tracking, feature extraction and gesture recognition.
1. image pre-service
First carry out median filter smoothness of image denoising to gathering image, under hsv color space, adopt following formula here
To gesture Image Segmentation Using, obtain gesture bianry image:
f ( x , y ) = 255 H &Element; { 5,25 } &cup; R > G > B 0 else
The image coordinate that wherein f (x, y) is picture element; R, G, the color component that B is rgb space, H is the color component in HSV space;
After binaryzation, by morphologic filtering, remove cavity and the rough edge cut apart in image, obtain sealing complete images of gestures; Next by 8 Neighbor searchs, the profile of gesture bianry image is extracted, obtain the gesture profile with chain representation;
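Purely as an illustrative sketch of these preprocessing steps (OpenCV 4.x is assumed, its findContours stands in for the 8-neighborhood search, the union in the rule above is read literally, and the kernel size is an assumption):

```python
import cv2
import numpy as np

def preprocess_gesture(bgr):
    """Median filtering, HSV/RGB skin rule, morphological clean-up and contour extraction."""
    img = cv2.medianBlur(bgr, 5)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    h = hsv[:, :, 0]
    b, g, r = cv2.split(img)
    # binarization rule from the text: H in [5, 25] (OpenCV hue scale 0-179) or R > G > B
    mask = np.where(((h >= 5) & (h <= 25)) | ((r > g) & (g > b)), 255, 0).astype(np.uint8)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)   # fill holes
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)    # remove ragged speckle
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    largest = max(contours, key=cv2.contourArea) if contours else None
    return mask, largest
```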
2. Gesture tracking
The CamShift (Continuously Adaptive Mean-Shift) tracking algorithm, operating in HSV space, is used to track the gesture. The algorithm models the target with a color histogram, converts the image into a color probability distribution map, and by repeated iteration shifts the search window toward the centroid until convergence, thereby tracking the characteristically colored object;
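A minimal sketch of this tracking step using OpenCV's built-in CamShift; the initial window, hue-histogram size and termination criteria are assumptions:

```python
import cv2

def track_gesture(frames, init_window):
    """CamShift tracking of a skin-colored region; init_window = (x, y, w, h) in frames[0]."""
    x, y, w, h = init_window
    roi_hsv = cv2.cvtColor(frames[0][y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([roi_hsv], [0], None, [180], [0, 180])   # hue histogram model
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
    window, tracks = init_window, []
    for frame in frames[1:]:
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)  # color probability map
        rot_rect, window = cv2.CamShift(back_proj, window, crit)        # shift window to centroid
        tracks.append(rot_rect)
    return tracks
```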
3. Feature extraction
The gesture contour image is obtained by image preprocessing, and the Hu invariant moment features of the gesture are extracted here; these features are invariant to translation, rotation and scale. The seven Hu moment invariants are defined as follows:
$\phi_1 = \eta_{20} + \eta_{02}$
$\phi_2 = (\eta_{20} - \eta_{02})^2 + 4\eta_{11}^2$
$\phi_3 = (\eta_{30} - 3\eta_{12})^2 + (3\eta_{21} - \eta_{03})^2$
$\phi_4 = (\eta_{30} + \eta_{12})^2 + (\eta_{21} + \eta_{03})^2$
$\phi_5 = (\eta_{30} - 3\eta_{12})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] + (3\eta_{21} - \eta_{03})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]$
$\phi_6 = (\eta_{20} - \eta_{02})[(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2] + 4\eta_{11}(\eta_{30} + \eta_{12})(\eta_{21} + \eta_{03})$
$\phi_7 = (3\eta_{21} - \eta_{03})(\eta_{30} + \eta_{12})[(\eta_{30} + \eta_{12})^2 - 3(\eta_{21} + \eta_{03})^2] + (3\eta_{12} - \eta_{30})(\eta_{21} + \eta_{03})[3(\eta_{30} + \eta_{12})^2 - (\eta_{21} + \eta_{03})^2]$
where η_pq are the normalized central moments, defined as:
$\eta_{pq} = \frac{\mu_{pq}}{\mu_{00}^{r}}, \qquad r = \frac{p+q+2}{2}, \qquad p+q = 2, 3, \ldots$
where μ_pq are the central moments, defined as:
$\mu_{pq} = \int_{-\infty}^{+\infty}\int_{-\infty}^{+\infty} (x - \bar{x})^{p} (y - \bar{y})^{q} f(x,y)\,dx\,dy$
where $\bar{x}$ and $\bar{y}$ are the coordinates of the target centroid;
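For illustration only, OpenCV already provides these seven invariants; a short sketch (the log-scaling at the end is a common extra step, not something stated in the text):

```python
import cv2
import numpy as np

def hu_features(binary_mask):
    """Seven Hu invariant moments of a binary gesture image (log-scaled for stable ranges)."""
    m = cv2.moments(binary_mask, binaryImage=True)    # raw, central and normalized moments
    hu = cv2.HuMoments(m).flatten()                   # phi_1 ... phi_7
    return -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
```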
4. Gesture recognition
A BP (Back Propagation) neural network is used for gesture recognition. A BP network is a multilayer feed-forward network whose training consists of forward propagation and back propagation. During forward propagation the input is processed layer by layer from the input layer through the hidden layers and passed on to the output layer, the neurons of each layer affecting only those of the next layer. If the output layer does not produce the desired output, the network switches to back propagation: the error signal is propagated back along the connections, and the weights of every layer are adjusted so as to minimize the error. The error back-propagation algorithm thus uses the difference between the actual output and the desired output to correct the connection weights of the network layer by layer, from the output backwards;
When training the network in practice, the number of output nodes is first set according to the number of gesture classes and the number of input nodes to the number of features; one or two hidden layers are generally used, and the number of hidden nodes is determined from the training results. A large number of gesture pictures with various meanings is then collected, preprocessed and their features extracted; the features are fed into the network for training, and the number of hidden nodes is adjusted during training to reach the highest classification accuracy;
After the network has been trained, the extracted gesture features are fed into it, and the class corresponding to the output node with the largest response is the recognized gesture.
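As a hedged sketch of such a classifier, scikit-learn's MLPClassifier is used here in place of a hand-written BP network; the hidden-layer size and learning parameters are assumptions:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_gesture_net(X, y, hidden=(16,)):
    """X: (num_samples, 7) Hu-moment feature vectors, y: gesture class labels.
    One hidden layer; the output size follows from the number of classes."""
    net = MLPClassifier(hidden_layer_sizes=hidden, activation='logistic',
                        solver='sgd', learning_rate_init=0.01, max_iter=2000)
    net.fit(X, y)
    return net

def recognize(net, feature):
    """The class with the largest output response is the recognized gesture."""
    return net.predict(np.asarray(feature).reshape(1, -1))[0]
```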
Four, Fall detection
From a machine-vision point of view, a fall is usually accompanied by large motion and a drastic change of the body contour, whereas during normal activity the human contour changes only slowly; a fall can therefore be detected by judging the change of the human contour. The overall fall-detection procedure is as follows:
1. Human contour point extraction
A single-Gaussian background model is first built from edge information; compared with a background model built from color information, edge information is insensitive to illumination changes and shadows, which ensures the accuracy of the extracted contour. The human contour is then extracted by background subtraction; because the extracted contour points are highly redundant, they are sparsified by sampling at equal intervals;
2. Contour point matching between consecutive frames
Shape Context (SC) features are used to match the contour points. The shape context describes the spatial relationship between a feature point and its neighboring points: given n edge points x_1, x_2, ..., x_n, the shape context of any point x_i is the log-polar histogram h_i describing the spatial relationship between x_i and the remaining n−1 points;
$h_i(k) = \#\{x_j \neq x_i : (x_j - x_i) \in \mathrm{bin}(k)\}$
where the log-polar histogram is computed by centering the coordinate system on each feature point in turn;
The matching cost of two contour points is denoted C_ij and is expressed with a χ² statistic:
$C_{ij} = \frac{1}{2}\sum_{k=1}^{K}\frac{[h_i(k) - h_j(k)]^2}{h_i(k) + h_j(k)}$
Any contour point of the previous frame may in principle match any point of the next frame; to obtain a good contour match, a permutation π(i) must be found that minimizes the total matching cost:
$H(\pi) = \sum_i C(p_i, q_{\pi(i)})$
A bidirectional matching algorithm is used here to select the optimal permutation;
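A small sketch of the χ² matching cost, with a simple mutual-nearest-neighbor pairing standing in for the bidirectional matching algorithm mentioned above (the pairing strategy and names are assumptions); it also returns the average matching cost used in the deformation analysis below:

```python
import numpy as np

def chi2_cost(h1, h2):
    """Chi-square matching cost between two shape-context histograms."""
    h1, h2 = np.asarray(h1, dtype=float), np.asarray(h2, dtype=float)
    denom = h1 + h2
    denom[denom == 0] = 1.0
    return 0.5 * float(np.sum((h1 - h2) ** 2 / denom))

def mutual_matches(H_prev, H_next):
    """Greedy two-way matching: keep only the point pairs that choose each other."""
    C = np.array([[chi2_cost(a, b) for b in H_next] for a in H_prev])
    fwd = C.argmin(axis=1)                      # best next-frame point for each previous point
    bwd = C.argmin(axis=0)                      # best previous-frame point for each next point
    pairs = [(i, j) for i, j in enumerate(fwd) if bwd[j] == i]
    avg_cost = float(np.mean([C[i, j] for i, j in pairs])) if pairs else float('inf')
    return pairs, avg_cost                      # matches and average cost (C-bar)
```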
3. Deformation analysis
The average matching cost of the optimal matching points, $\bar{C}$, is used to quantify the deformation; it is defined as:
$\bar{C} = \frac{1}{N}\sum_{n=1}^{N} C(n)$
where C(n) is the matching cost of the n-th optimal matching point and N is the number of optimal matching points. When the average matching cost exceeds a set threshold, it is judged that a fall may have occurred;
4. Time-domain analysis
Normally, $\bar{C}$ takes a large value at the moment a person falls, while in the short period after the fall (for example 5 seconds) the person keeps still or moves only slightly, so during that period $\bar{C}$ is a small value. The dynamic time warping (DTW) algorithm is used here to match this dynamic pattern and decide whether a fall has occurred. DTW can match patterns of unequal length: it searches, over all grid points, for the optimal path from the start point to the end point that minimizes the total distortion summed along the path.
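A classic textbook DTW implementation is sketched below for illustration; comparing the per-frame $\bar{C}$ sequence around a suspected fall with a reference fall template is an assumed usage, not a detail given in the text:

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic-time-warping distance between two 1-D sequences of possibly different lengths."""
    n, m = len(seq_a), len(seq_b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])            # local distortion
            D[i, j] = cost + min(D[i - 1, j],                  # best path into this cell
                                 D[i, j - 1],
                                 D[i - 1, j - 1])
    return D[n, m]                                             # total distortion of optimal path
```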
Five, Fire and smoke detection
1. Flame detection
The flame region of the image is extracted by analysing the temporal and spatial variation of image regions; the detailed procedure comprises:
(1) Hidden Markov Models of the temporal and spatial variation of flame pixels and non-flame pixels are built by off-line learning;
(2) during actual detection, color and spatial models are first used to determine the potential flame region;
(3) the Hidden Markov Models are used to analyse whether the pixels in the potential region conform to the flame-pixel Hidden Markov Model;
(4) the flame pixels are clustered to obtain the flame region;
Six, Smoke detection
Smoke has two distinctive characteristics:
(1) driven by heat, smoke usually moves from bottom to top;
(2) smoke blurs what it occludes: when smoke covers an object, the edges of the object become blurred. A smoke detection procedure designed around these characteristics is as follows (see the sketch after this list):
1) the image is first divided into non-overlapping sub-blocks; each sub-block is tracked to determine its motion direction, and the sub-blocks whose motion matches the smoke movement direction are clustered to obtain potential smoke regions;
2) a two-dimensional discrete wavelet transform is applied to each potential smoke region to obtain the high-frequency part of the image; if the high-frequency energy is lower than that of the corresponding background, the region is further confirmed as a smoke region.
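A hedged sketch of the high-frequency energy check in step 2), using PyWavelets for the single-level 2-D discrete wavelet transform; the 'db1' wavelet and the energy ratio are assumptions, not values given in the text:

```python
import numpy as np
import pywt

def high_freq_energy(gray_block):
    """Energy of the detail sub-bands of a single-level 2-D discrete wavelet transform."""
    _, (cH, cV, cD) = pywt.dwt2(gray_block.astype(float), 'db1')
    return float((cH ** 2).sum() + (cV ** 2).sum() + (cD ** 2).sum())

def looks_like_smoke(candidate_block, background_block, ratio=0.7):
    """A candidate block is confirmed as smoke when its high-frequency energy drops
    well below that of the corresponding background block (edges get blurred)."""
    return high_freq_energy(candidate_block) < ratio * high_freq_energy(background_block)
```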
Voiceprint recognition mainly comprises two parts: off-line model training and on-line voiceprint recognition.
Off-line model training yields a feature model for each specific person; during actual detection, the extracted audio features are evaluated against the different feature models, and the model with the highest similarity is selected as the final match, completing the identification.
The voiceprint recognition process comprises the following steps:
1. Speech signal preprocessing
Assume the sampling rate of the audio signal X(t) is f_s (8 kHz here). X(t) is passed successively through pre-emphasis, framing and windowing, with a Hanning window as the window function. Before the signal is processed its mean is removed, so that the DC component does not affect the spectral lines near ω = 0;
2. Feature extraction
Mel-frequency cepstral coefficients (MFCC): the periodogram method of classical spectrum estimation, implemented with the Fast Fourier Transform (FFT), is used to obtain the normalized power spectrum X(f_n). The Mel filter bank consists of a set of triangular band-pass filters distributed along the Mel scale; 24 filters are used here. The power spectrum X(f_n) is passed through the Mel filter bank, the logarithm is taken, and a discrete cosine transform then yields the MFCC coefficients;
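A self-contained sketch of this MFCC pipeline; only the 8 kHz sampling rate, Hanning window, mean removal and 24 Mel filters come from the text, while the frame length, FFT size and number of output coefficients are assumptions:

```python
import numpy as np
from scipy.fftpack import dct

def mfcc(signal, fs=8000, frame_len=0.025, frame_step=0.010,
         n_filters=24, n_coeffs=13, pre_emph=0.97, nfft=512):
    """Pre-emphasis, framing, Hanning windowing, FFT power spectrum,
    24-filter Mel filter bank, log, DCT -> MFCC matrix (frames x n_coeffs)."""
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                                          # remove DC component
    x = np.append(x[0], x[1:] - pre_emph * x[:-1])            # pre-emphasis
    flen, fstep = int(frame_len * fs), int(frame_step * fs)
    n_frames = 1 + max(0, len(x) - flen) // fstep
    idx = np.arange(flen)[None, :] + fstep * np.arange(n_frames)[:, None]
    frames = x[idx] * np.hanning(flen)                        # framing + Hanning window
    power = (np.abs(np.fft.rfft(frames, nfft)) ** 2) / nfft   # periodogram power spectrum

    mel = lambda f: 2595.0 * np.log10(1.0 + f / 700.0)        # Hz -> Mel
    inv_mel = lambda m: 700.0 * (10.0 ** (m / 2595.0) - 1.0)  # Mel -> Hz
    pts = inv_mel(np.linspace(mel(0.0), mel(fs / 2.0), n_filters + 2))
    bins = np.floor((nfft + 1) * pts / fs).astype(int)
    fbank = np.zeros((n_filters, nfft // 2 + 1))
    for i in range(1, n_filters + 1):                         # triangular band-pass filters
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        fbank[i - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fbank[i - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)

    feat = np.log(power @ fbank.T + 1e-12)                    # log Mel-filter energies
    return dct(feat, type=2, axis=1, norm='ortho')[:, :n_coeffs]
```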
MPEG-7 low-level audio descriptors: MPEG-7 provides a rich set of feature descriptors for audio data, such as spectral contour, audio objects, timbre, harmonicity, frequency features, amplitude envelope and temporal structure (including rhythm). The spectral envelope, spectral centroid and spectral spread from the basic spectral descriptors are selected here. The spectral envelope describes the short-time power spectrum of the audio, with frequency expressed on a logarithmic scale; the spectral centroid describes the center of gravity of the power spectrum and indicates whether the low-frequency or the high-frequency components account for the larger proportion of the signal; the spectral spread describes how the spectral energy is distributed, i.e. whether it is concentrated near the centroid or spread evenly;
3. Gaussian mixture model (GMM) voiceprint model training
A GMM can approximate an arbitrary distribution well by a linear combination of several Gaussian distributions; given a training set of audio features of a specific person, a voiceprint GMM of that person can be trained. The Expectation-Maximization (EM) algorithm is used here to estimate the GMM;
Given a training sample set X = {x_1, x_2, ..., x_n}, the likelihood function of the GMM is
$p(X \mid \lambda) = \prod_{i=1}^{n} p(x_i \mid \lambda)$
where the model parameters comprise the weight p_i of each Gaussian component together with the mean vector and the covariance matrix Σ_i of that component;
The EM algorithm comprises two steps: the E step computes the expectation, i.e. the auxiliary function Q(λ, λ̂), and the M step maximizes that expectation; the E and M steps are iterated until the algorithm converges;
$Q(\lambda,\hat{\lambda}) = \sum_{y} P(Y = y \mid X = x, \lambda)\,\log P(Y = y, X = x \mid \hat{\lambda})$
where X is the observation and Y is the hidden state;
When the expectation values of two successive iterations differ only slightly, the algorithm is considered to have converged and the iteration stops, as expressed by:
$Q_{t}(\lambda,\hat{\lambda}) - Q_{t-1}(\lambda,\hat{\lambda}) < \varepsilon$
where t is the iteration index and ε is a small positive number;
4. Identification with the trained models
A GMM is trained for every person and the resulting model parameters λ are stored in a database, completing voiceprint enrollment. During on-line detection, features are extracted from the speech segment and its similarity to every GMM model in the database is computed; the registered person corresponding to the GMM with the highest similarity is identified as the speaker.
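A hedged sketch of this enroll-then-identify flow using scikit-learn's GaussianMixture, which performs the EM fitting internally; the number of mixture components and the diagonal covariance are assumptions:

```python
from sklearn.mixture import GaussianMixture

def enroll(features_per_person, n_components=16):
    """Train one GMM per registered person from that person's MFCC feature matrix."""
    return {name: GaussianMixture(n_components=n_components, covariance_type='diag',
                                  max_iter=200).fit(feats)
            for name, feats in features_per_person.items()}

def identify(models, test_features):
    """Score the test utterance against every enrolled model; the model with the
    highest average log-likelihood identifies the speaker."""
    scores = {name: gmm.score(test_features) for name, gmm in models.items()}
    return max(scores, key=scores.get)
```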
The speech recognition technology mentioned above comprises natural-speech recognition and characteristic-sound recognition.
Command-oriented speech recognition is used for human-machine interaction; characteristic-sound recognition makes the robot more sensitive to sounds such as a door lock being opened or glass breaking; voiceprint recognition can be used under particular conditions, for example when face recognition fails, to distinguish family members from strangers.
In addition, the robot's processing and control system is provided with sound source localization, so that in its security role the robot can discover suspicious situations on its own: based on sound source localization it judges the direction from which a sound comes, turns its head so that the camera aims at that direction, and, combined with video image recognition, makes a further identification and judgement.
In summary, because the robot of the present invention adopts video image analysis together with voiceprint recognition and sound source localization, its processing and control system can fuse the data of the various sensors it carries, automatically recognize the current environment and react accordingly, so that it becomes a highly intelligent multifunctional robot. Its multiple functions are mainly reflected in the following aspects:
1. according to sensor signals such as its own state and the surrounding conditions, it stops or changes its current direction and state of motion;
2. through the sound pickup it senses and captures speech in real time, intelligently distinguishing voice commands from sounds such as a door lock opening or glass breaking; when an abnormal sound such as breaking glass occurs, it automatically judges the direction of the sound and aims the camera at that direction for further identification and judgement; when face recognition fails, it distinguishes family members from strangers by voiceprint recognition;
3. when the robot receives an instruction to carry an object, it automatically moves to the target area and gives a voice prompt on arrival;
4. it monitors in real time whether the monitored person, for example an elderly person, has fallen, and raises an alarm immediately if so;
5. it senses in real time whether a violent fight, a kidnapping alarm, a gesture alarm or similar behavior occurs in the indoor guarded region, and raises an alarm immediately if so;
6. it senses in real time whether an intrusion occurs in the indoor guarded region and verifies the person's identity by a face recognition algorithm or a voiceprint recognition algorithm; in case of illegal intrusion it raises an alarm immediately;
7. it detects and analyses in real time whether security incidents such as fire, gas leakage or explosion occur indoors, and raises an alarm immediately if such an event occurs;
8. when any of the alarm events of items 4-7 occurs, the robot starts its own audio and video recording and pushes the alarm information through WIFI plus a wired external network or 3G/4G to intelligent terminals such as the owner's mobile phone or the property-management command center;
9. medical and health data forwarding: when a home health-care device for measuring blood pressure, blood oxygen, heartbeat, pulse or heart rate is switched on, the robot automatically establishes a communication link with it, announces the measurement result by voice to the person measured, and, according to the user's instruction, pushes the measurement or diagnostic data through WIFI plus a wired external network or 3G/4G to an intelligent terminal such as the owner's mobile phone or to another telemedicine system;
10. it automatically learns and stores the codes of various household remote controls or the remote-control information sent by the central control terminal; when the robot receives a household-appliance control instruction, such as switching on the television, it sends the instruction to the appliance through its wireless transmitting unit, thereby controlling the appliance; the robot can also perceive changes of the home environment through its own temperature, light and humidity sensors and automatically switch on the air conditioner, open the curtains and perform similar operations;
11. when the danger button is pressed, an alarm distress signal can be sent to a designated intelligent terminal such as a mobile phone or another terminal;
12. when the robot receives an audio/video preview or playback command from a remote terminal such as a mobile phone, it starts the preview or playback software and streams the live feed to the remote terminal in real time;
13. unattended mode: when nobody is at home this mode can be enabled, and the robot patrols the house on a schedule or keeps watch on a designated room or region;
14. the robot can be operated from a remote mobile terminal such as a mobile phone, for example to issue control commands or modify the robot's operating parameters;
15. the touch screen dynamically displays the robot's working status and accepts touch commands;
16. automatic recharging: when the robot's battery is low or it is idle, it finds the charger and recharges by itself;
17. in addition, the robot can keep company with patients, elderly people and children staying at home, provide timed reminders and assist study; it can also offer functions such as notepad reminders, a night light and conversational communication.
It should be noted that the above description is intended only to illustrate, not to limit, the technical solution of the present invention. Although the present invention has been described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that the invention may still be modified or equivalently substituted, and any modification or partial substitution that does not depart from the spirit and scope of the present invention shall be covered by the claims of the present invention.

Claims (10)

1. A multifunctional domestic intelligent robot, comprising a processing and control system and a plurality of functional subsystems, the functional subsystems comprising a movement system, a data acquisition system and a communication system, characterized in that:
the movement system comprises a positioning module and a driving module, the positioning module recognizes and locates the environment in which the robot is situated, and the driving module controls the movement of the robot;
the data acquisition system comprises a vision module, a voice module and a data collection module; the vision module comprises a camera and collects video images through the camera; the voice module comprises a sound pickup, and the sound pickup collects audio information;
the communication system comprises a wireless communication module, which realizes the remote communication of the robot;
the processing and control system comprises a processing module and a control module; the processing module receives the data of the plurality of functional subsystems in real time and processes the data according to a predetermined algorithm, and, based on the processing result, the control module issues control instructions to each functional subsystem to control the actions of the robot.
2. The robot as claimed in claim 1, characterized in that:
the processing and control system processes the video images collected by the vision module, carries out intelligent recognition and judgement of events in the surrounding environment, and issues corresponding action instructions based on the judgement.
3. The robot as claimed in claim 2, characterized in that:
the intelligent recognition and judgement carried out by the processing and control system after processing the video images comprises one or more of illegal-intrusion detection and recognition, physical-fight detection and recognition, gesture detection and recognition, fall detection and judgement, flame detection, and smoke detection.
4. The robot as claimed in claim 3, characterized in that the illegal-intrusion detection and recognition comprises the steps of:
one, human body capture: machine-learning-based video human detection is adopted, with the HOG feature as the shape descriptor of the human body and an SVM classifier selected by the adaboost algorithm, and the video images are traversed with a sliding window to capture the human body;
two, human body tracking: a human tracking model is established, similarity is measured with an LBP function and computed with the Bhattacharyya distance, and the human body is tracked based on mean shift;
three, illegal-intrusion detection: the position of the person in the video image is obtained from the human body capture, and it is determined from the human body tracking that the person has entered the warning region;
once the warning region is entered, the processing and control system performs recognition on the person in the video image, and if the recognition judges the person to be unregistered, the control system confirms an illegal intrusion and issues an alarm instruction.
5. The robot as claimed in claim 3, characterized in that the physical-fight detection and recognition comprises the steps of:
one, human body capture: machine-learning-based video human detection is adopted, with the HOG feature as the shape descriptor of the human body and an SVM classifier selected by the adaboost algorithm, and the video images are traversed with a sliding window to capture the human body;
two, human body tracking: a human tracking model is established, similarity is measured with an LBP function and computed with the Bhattacharyya distance, and the human body is tracked based on mean shift;
three, physical-fight detection: the position of the person in the video image is obtained from the human body capture and the region where the person is located is obtained from the human body tracking; the optical-flow vectors in the region are computed with the KLT (Kanade-Lucas-Tomasi) feature-point optical-flow method, and the region entropy is used to characterize violent random motion; when the region entropy exceeds a threshold, the control system confirms that a physical fight has occurred and issues an alarm instruction.
6. The robot as claimed in claim 4, characterized in that:
the recognition performed by the processing and control system on the person in the video image comprises face recognition of the person in the video image and/or voiceprint recognition of the voice of the person in the video image.
7. The robot as claimed in claim 5, characterized in that:
the physical-fight detection step further comprises audio analysis of the audio information obtained by the sound pickup of the voice module;
if, when the region entropy obtained from the optical-flow computation exceeds its threshold, the audio information also exceeds a decibel threshold, the control system confirms that a physical fight has occurred and issues an alarm instruction.
8. The robot as claimed in claim 6, characterized in that:
the voiceprint recognition comprises off-line model training and on-line voiceprint recognition;
the off-line model training obtains a voiceprint feature model of a specific person;
the on-line voiceprint recognition matches the voice against the voiceprint feature models, thereby realizing voiceprint recognition.
9. The robot as claimed in claim 8, characterized in that the voiceprint recognition comprises the steps of:
one, sampling the voice to obtain a speech signal;
two, performing feature extraction on the speech signal to obtain a normalized power spectrum;
three, training the voiceprint model with a GMM and storing it in a database, completing voiceprint enrollment;
four, matching against the voice models enrolled in the database to realize voiceprint recognition.
10. The robot as claimed in claim 3, characterized in that:
the gesture detection and recognition comprises four steps: image preprocessing, gesture tracking, feature extraction and gesture recognition;
the image preprocessing applies median filtering to smooth and denoise the gesture image in the video image and segments the gesture image to obtain a binary gesture image;
the gesture tracking adopts the CamShift tracking algorithm in HSV space to convert the binary gesture image into a color probability distribution map, and tracks the characteristically colored object through repeated iteration;
the feature extraction extracts the Hu invariant moment features of the gesture from the gesture contour in the binary gesture image, the features being invariant to translation, rotation and scale, and the features are sent to a BP neural network for training;
the gesture recognition sends the extracted features to the trained BP neural network for matching and recognition, thereby completing the gesture recognition.
CN108874142A (en) * 2018-06-26 2018-11-23 哈尔滨拓博科技有限公司 A kind of Wireless intelligent control device and its control method based on gesture
CN108806142A (en) * 2018-06-29 2018-11-13 炬大科技有限公司 A kind of unmanned security system, method and sweeping robot
CN109003262A (en) * 2018-06-29 2018-12-14 炬大科技有限公司 Stain clean method and device
CN108898108B (en) * 2018-06-29 2022-04-26 炬大科技有限公司 User abnormal behavior monitoring system and method based on sweeping robot
CN109003262B (en) * 2018-06-29 2022-06-21 炬大科技有限公司 Stubborn stain cleaning method and device
CN108898108A (en) * 2018-06-29 2018-11-27 炬大科技有限公司 A kind of user's abnormal behaviour monitoring system and method based on sweeping robot
CN108921218A (en) * 2018-06-29 2018-11-30 炬大科技有限公司 A kind of target object detection method and device
CN108574804A (en) * 2018-07-04 2018-09-25 珠海市微半导体有限公司 A kind of Light Source Compensation system and method for vision robot
CN109190456A (en) * 2018-07-19 2019-01-11 中国人民解放军战略支援部队信息工程大学 Pedestrian detection method is overlooked based on the multiple features fusion of converging channels feature and gray level co-occurrence matrixes
CN109190456B (en) * 2018-07-19 2020-11-20 中国人民解放军战略支援部队信息工程大学 Multi-feature fusion overlook pedestrian detection method based on aggregated channel features and gray level co-occurrence matrix
CN109118703A (en) * 2018-07-19 2019-01-01 苏州菲丽丝智能科技有限公司 A kind of intelligent household security system and its working method
CN109005432A (en) * 2018-07-24 2018-12-14 上海常仁信息科技有限公司 A kind of network television system based on healthy robot
CN108919809A (en) * 2018-07-25 2018-11-30 智慧式控股有限公司 Wisdom formula safety protection robot and business model
CN109117055A (en) * 2018-07-26 2019-01-01 深圳市商汤科技有限公司 Intelligent terminal and control method
CN109257563A (en) * 2018-08-30 2019-01-22 浙江祥生建设工程有限公司 Building site remote monitoring system
CN109191768A (en) * 2018-09-10 2019-01-11 天津大学 A kind of kinsfolk's security risk monitoring method based on deep learning
CN109445427A (en) * 2018-09-26 2019-03-08 北京洪泰同创信息技术有限公司 Intelligentized Furniture, furniture positioning device and furniture positioning system
CN109147277A (en) * 2018-09-30 2019-01-04 桂林海威科技股份有限公司 A kind of old man care system and method
CN109333548A (en) * 2018-10-18 2019-02-15 何勇 A kind of intelligent Service chat robots with intelligence training function
CN109634129B (en) * 2018-11-02 2022-07-01 深圳慧安康科技有限公司 Method, system and device for realizing active care
CN109634129A (en) * 2018-11-02 2019-04-16 深圳慧安康科技有限公司 Implementation method, system and the device actively shown loving care for
CN109691090A (en) * 2018-12-05 2019-04-26 珊口(深圳)智能科技有限公司 Monitoring method, device, monitoring system and the mobile robot of mobile target
US10970859B2 (en) 2018-12-05 2021-04-06 Ankobot (Shenzhen) Smart Technologies Co., Ltd. Monitoring method and device for mobile target, monitoring system and mobile robot
CN109739097A (en) * 2018-12-14 2019-05-10 武汉城市职业学院 A kind of smart home robot and application thereof based on embedded type WEB
CN109740461B (en) * 2018-12-21 2020-12-25 北京智行者科技有限公司 Object and subsequent processing method
CN109740461A (en) * 2018-12-21 2019-05-10 北京智行者科技有限公司 Target is with subsequent processing method
CN109547771A (en) * 2019-01-07 2019-03-29 中国人民大学 A kind of household intelligent robot having bore hole 3D display device
CN109800802A (en) * 2019-01-10 2019-05-24 深圳绿米联创科技有限公司 Visual sensor and object detecting method and device applied to visual sensor
CN112888902A (en) * 2019-01-28 2021-06-01 株式会社日立制作所 Movable air purifier
CN110164538A (en) * 2019-01-29 2019-08-23 浙江瑞华康源科技有限公司 A kind of medical logistics system and method
CN109887515B (en) * 2019-01-29 2021-07-09 北京市商汤科技开发有限公司 Audio processing method and device, electronic equipment and storage medium
CN109887515A (en) * 2019-01-29 2019-06-14 北京市商汤科技开发有限公司 Audio-frequency processing method and device, electronic equipment and storage medium
CN109917666A (en) * 2019-03-28 2019-06-21 深圳慧安康科技有限公司 The implementation method and intelligent apparatus of wisdom family
CN109917666B (en) * 2019-03-28 2023-03-24 深圳慧安康科技有限公司 Intelligent household realization method and intelligent device
CN109993945A (en) * 2019-04-04 2019-07-09 清华大学 For gradually freezing the alarm system and alarm method of disease patient monitoring
CN110161903B (en) * 2019-05-05 2022-02-22 宁波财经学院 Intelligent household robot and control method thereof
CN110161903A (en) * 2019-05-05 2019-08-23 宁波财经学院 A kind of control method of smart home robot and smart home robot
CN110162044A (en) * 2019-05-18 2019-08-23 珠海格力电器股份有限公司 A kind of automated wireless charging unit and charging method
CN110209483A (en) * 2019-05-28 2019-09-06 福州瑞芯微电子股份有限公司 Machine control system of sweeping the floor and control method, storage medium and controlling terminal
CN110765895A (en) * 2019-09-30 2020-02-07 北京鲲鹏神通科技有限公司 Method for distinguishing object by robot
CN110891352A (en) * 2019-11-26 2020-03-17 珠海格力电器股份有限公司 Control method and control system for intelligent lamp
CN111491004A (en) * 2019-11-28 2020-08-04 赵丽侠 Information updating method based on cloud storage
CN111464776A (en) * 2020-01-19 2020-07-28 浙江工贸职业技术学院 Internet of things safety alarm equipment and assessment method
CN111322718A (en) * 2020-03-16 2020-06-23 北京云迹科技有限公司 Data processing method and delivery robot
CN111300429A (en) * 2020-03-25 2020-06-19 深圳市天博智科技有限公司 Robot control system, method and readable storage medium
CN111428666A (en) * 2020-03-31 2020-07-17 齐鲁工业大学 Intelligent family accompanying robot system and method based on rapid face detection
CN111508184A (en) * 2020-04-10 2020-08-07 扬州大学 Intelligent fire protection system in building
CN113571054B (en) * 2020-04-28 2023-08-15 中国移动通信集团浙江有限公司 Speech recognition signal preprocessing method, device, equipment and computer storage medium
CN113571054A (en) * 2020-04-28 2021-10-29 中国移动通信集团浙江有限公司 Speech recognition signal preprocessing method, device, equipment and computer storage medium
CN111611904A (en) * 2020-05-15 2020-09-01 新石器慧通(北京)科技有限公司 Dynamic target identification method based on unmanned vehicle driving process
CN111611904B (en) * 2020-05-15 2023-12-01 新石器慧通(北京)科技有限公司 Dynamic target identification method based on unmanned vehicle driving process
CN111618856A (en) * 2020-05-27 2020-09-04 山东交通学院 Robot control method and system based on visual excitation points and robot
CN111618856B (en) * 2020-05-27 2021-11-05 山东交通学院 Robot control method and system based on visual excitation points and robot
CN113822095B (en) * 2020-06-02 2024-01-12 苏州科瓴精密机械科技有限公司 Method, system, robot and storage medium for identifying working position based on image
CN113822095A (en) * 2020-06-02 2021-12-21 苏州科瓴精密机械科技有限公司 Method, system, robot and storage medium for identifying working position based on image
CN111862524A (en) * 2020-07-10 2020-10-30 广州博冠智能科技有限公司 Monitoring alarm method and device based on intelligent home system
CN111898524A (en) * 2020-07-29 2020-11-06 江苏艾什顿科技有限公司 5G edge computing gateway and application thereof
CN111915851A (en) * 2020-08-11 2020-11-10 山西应用科技学院 Gas intelligence switch
CN112101145B (en) * 2020-08-28 2022-05-17 西北工业大学 SVM classifier based pose estimation method for mobile robot
CN111964154A (en) * 2020-08-28 2020-11-20 邯郸美的制冷设备有限公司 Air conditioner indoor unit, control method, operation control device and air conditioner
CN111964154B (en) * 2020-08-28 2021-09-21 邯郸美的制冷设备有限公司 Air conditioner indoor unit, control method, operation control device and air conditioner
CN112101145A (en) * 2020-08-28 2020-12-18 西北工业大学 SVM classifier based pose estimation method for mobile robot
CN113035374B (en) * 2021-03-16 2024-03-12 深圳市南山区慢性病防治院 Comprehensive management system and management method for tuberculosis
CN113035374A (en) * 2021-03-16 2021-06-25 深圳市南山区慢性病防治院 Tuberculosis comprehensive management system and management method
CN113119118A (en) * 2021-03-24 2021-07-16 智能移动机器人(中山)研究院 Intelligent indoor inspection robot system
CN113143165A (en) * 2021-04-26 2021-07-23 上海甄徽网络科技发展有限公司 Intelligent security household robot with disinfection function
CN113177972A (en) * 2021-05-20 2021-07-27 杭州华橙软件技术有限公司 Object tracking method and device, storage medium and electronic device
CN113362563A (en) * 2021-06-03 2021-09-07 国网北京市电力公司 Method and device for determining abnormal condition of power tunnel
CN113341812A (en) * 2021-06-11 2021-09-03 深圳风角智能科技有限公司 Environment-friendly electric storage type energy consumption and power saving management system and method for terminal of Internet of things
CN114018253A (en) * 2021-10-25 2022-02-08 珠海一微半导体股份有限公司 Robot with visual positioning function and positioning method
CN114018253B (en) * 2021-10-25 2024-05-03 珠海一微半导体股份有限公司 Robot with vision positioning function and positioning method
WO2023217193A1 (en) * 2022-05-10 2023-11-16 神顶科技(南京)有限公司 Robot and method for robot to recognise fall

Also Published As

Publication number Publication date
WO2015172445A1 (en) 2015-11-19

Similar Documents

Publication Publication Date Title
CN103984315A (en) Domestic multifunctional intelligent robot
CN103839373B (en) A kind of unexpected abnormality event Intelligent Recognition alarm device and warning system
US11978256B2 (en) Face concealment detection
CN103839346B (en) A kind of intelligent door and window anti-intrusion device and system, intelligent access control system
US11741766B2 (en) Garage security and convenience features
US10943113B2 (en) Drone pre-surveillance
US20220351598A1 (en) Enhanced audiovisual analytics
CN110291489A (en) Efficient human identification intelligent assistant computer in computing
US11200786B1 (en) Canine assisted home monitoring
US11349707B1 (en) Implementing security system devices as network nodes
US11935297B2 (en) Item monitoring for doorbell cameras
US20230196106A1 (en) Training image classifiers
US10943442B1 (en) Customized notifications based on device characteristics
AU2019333044B2 (en) Assisted creation of video rules via scene analysis
US11032128B2 (en) Using a local hub device as a substitute for an unavailable backend device
US11501618B1 (en) Security device with user-configurable motion detection settings
US11550276B1 (en) Activity classification based on multi-sensor input
US11544505B1 (en) Semi-supervised learning based on clustering objects in video from a property
US20240046485A1 (en) Real-motion prediction
US20240071083A1 (en) Using implicit event ground truth for video cameras

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20140813