CN208438358U - Robot anti-collision device and robot using the device - Google Patents
- Publication number: CN208438358U (application CN201820601577.1U)
- Authority
- CN
- China
- Prior art keywords
- robot
- human body
- matrix
- collision
- model
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Abstract
A robot anti-collision device, configured on a robot and communicating with the robot to assist its collision avoidance, comprises: a thermal sensing unit for detecting a heat source produced by an object; a memory connected to the thermal sensing unit; a transmission unit connected to the memory; and a control unit connected to the thermal sensing unit, the memory, and the transmission unit. The control unit directs the thermal sensing unit to store a thermal-image matrix signal produced by the heat source in the memory, and processes that signal to judge whether the object is a human body. When the object is judged to be a human body, the control unit calculates the relative distance between the human body and the robot and sends it to the robot via the transmission unit, so that the robot can prepare to avoid a collision. A robot using the device is also disclosed.
Description
Technical field
The utility model relates to robots, and in particular to a robot anti-collision device and a robot equipped with the device.
Background art
Robots today fall broadly into two classes: service robots and industrial robots. Both are developing rapidly. Industrial robots are also known as industrial robotic arms. Because industrial robots shoulder the demands of high-speed production, most industrial robotic arms are currently isolated behind rigid safety fences; with no concern about staff entering, the arms can run at high speed, and many machines are also expected to carry heavy payloads. Isolating the arms, however, forces the factory to sacrifice a great deal of floor space to exclusion zones, and also limits the possibility of human-machine cooperation.
The trend toward customized manufacturing means that formerly standardized production processes must now be adjusted frequently, so production workflows must allow flexible adjustment; yet the safety-fence design prevents factory workflows from accommodating such customized operating conditions. The International Organization for Standardization therefore published the ISO/TS 15066 specification, which supplements ISO 10218 "Safety requirements for industrial robots" to make collaborative robots safer and help improve production-line efficiency. For example, ISO 10218-2 clause 5.11.3 requires, for collaborative robots: "in regions with less than 500 mm of clearance, additional safeguards shall be provided to prevent exposure to entrapment or crushing hazards." ISO 10218-1 requires of collaborative robots, in clause 5.5: "every robot shall have a protective stop function and an independent emergency stop function, both connectable to external protective devices." Clause 5.5.3 requires: "the robot shall have one or more protective stop functions, and its design shall allow connection of external protective devices." Clause 5.10.2 requires: "the robot stops when personnel are within the collaborative workspace." Clause 5.10.4 requires: "the robot maintains the set speed and separation distance."
Since the trend toward collaboration between industrial robotic arms and humans is irresistible, anti-collision techniques for industrial robots are advancing by the day. To meet the above requirements of ISO 10218-1 and ISO 10218-2, today's mainstream anti-collision techniques mostly rely on many added accessories. For example, the 500 mm clearance requirement of ISO 10218-2 clause 5.11.3 can currently be met only by planning exclusion zones, so light-curtain (safety grating) technology has become the preferred solution for such zones. Likewise, to provide the protective stop and emergency stop functions of ISO 10218-1 clause 5.5, the mainstream approach is to add a protective shroud to the robot arm and realize a protective stop on collision through the shroud's pressure detectors, touch monitors, capacitive detectors, and the like. Strictly speaking, however, such techniques do not prevent collisions; they merely reduce the injury a collision causes.
A true anti-collision solution should avoid the collision altogether, and a few specific solutions exist today. For example, Taiwan patent I 564128 discloses "a collision-avoidance detecting device, corresponding control method, and applicable mechanical arm", which uses a projected capacitive sensor with a degree of spatial sensing to achieve active collision avoidance while the mechanical arm moves along its preset path. This technique overcomes the drawback of earlier robot arms, which could only mitigate injury after a collision, and realizes an active collision-prevention function. However, it still cannot satisfy the 500 mm clearance requirement of ISO 10218-2 clause 5.11.3, so with this technique a light-curtain solution must still be used alongside it. Moreover, a projected capacitive sensor cannot sense beyond roughly 500 mm, which also prevents the technique from being applied in working environments that must meet the 500 mm clearance requirement of ISO 10218-2 clause 5.11.3.
Taiwan patent I 608894 discloses "an intelligent anti-collision safety system and a machine tool using it", which uses a three-dimensional image acquisition unit mounted at a remote position, together with three-dimensional model information, to compute the relative distance between the processing head and other objects, and issues a warning when that distance falls below a preset alert distance. A remotely mounted image acquisition unit, however, has many blind spots; it likewise cannot delimit the 500 mm clearance region of ISO 10218-2 clause 5.11.3 and must still be paired with light curtains. Nor can it realize the protective stop and emergency stop functions of ISO 10218-1 clause 5.5; it can only monitor for dangerous situations.
Chinese patent publication CN103192414A discloses a machine-vision-based robot collision protection device and method: a camera photographs the site, the photo is compared with a reference photo, and differences in various regions trigger different warnings, such as a warning lamp, a buzzer, or stopping motion. Like Taiwan patent I 608894, this solution only achieves remote danger monitoring, and likewise fails both the 500 mm clearance requirement of ISO 10218-2 clause 5.11.3 and the protective stop and emergency stop requirements of ISO 10218-1 clause 5.5.
In summary, no complete anti-collision solution yet satisfies all of the above ISO 10218-1 and ISO 10218-2 collision-safety requirements at once. Meeting them requires three capabilities: recognizing a human body; actively detecting the relative, or even absolute, distance to that human body; and then avoiding the collision directly, in a non-contact manner. What is needed, therefore, is a solution that can both recognize a human body and obtain the relative distance to it, so that a robot arm can truly and actively achieve its anti-collision goal. Service robots have the same anti-collision requirement; indeed, a service robot may be beside a human body at any time and may even touch it directly. Deepening robot anti-collision technology into active collision avoidance is therefore an important milestone for the future development of the robot industry.
Utility model content
To achieve the above objects, the utility model provides a robot anti-collision device and a robot equipped with it, which uses a thermal image sensing unit to sense an object's surface temperature and judge whether the object is a person, and further judges the relative distance between the object and the robot. This provides the robot with the basic information it needs for collision avoidance, giving it the abilities of vision and judgment and enabling active, intelligent collision avoidance.
The utility model provides a robot anti-collision device, configured on a robot and communicating with the robot to assist its collision avoidance, comprising: a thermal sensing unit for detecting a heat source produced by an object; a memory connected to the thermal sensing unit; a transmission unit connected to the memory; and a control unit connected to the thermal sensing unit, the memory, and the transmission unit. The control unit directs the thermal sensing unit to store a thermal-image matrix signal produced by the heat source in the memory, and processes that signal to judge whether the object is a human body; when it is, the control unit calculates the relative distance between the human body and the robot and sends it to the robot via the transmission unit, so that the robot can prepare to avoid a collision.
The utility model also provides a robot equipped with the robot anti-collision device of the utility model, comprising: a robot processor connected to the transmission unit and the control unit via a connecting interface, which receives the relative distance produced by the control unit in order to prepare to avoid a collision.
The utility model also provides a robot anti-collision device comprising a housing and a circuit board. The housing is mounted on a robot arm and has at least one aperture; at least one thermal sensing matrix is mounted on the circuit board, with at least one lens set on the thermal sensing matrix, each lens aligned with the position of an aperture; and a connector of the circuit board is connected by a cable to a control circuit board of the robot arm.
Preferably, a gyroscope, an accelerometer, or a Hall sensor is also mounted on the circuit board of the robot anti-collision device.
Preferably, an image capture matrix or an ultrasonic image capture matrix is also mounted on the circuit board of the robot anti-collision device, with a corresponding second lens on the circuit board and a second aperture in the housing.
The utility model also provides a robot anti-collision device comprising a housing and a circuit board. The housing is mounted on a robot arm and has two apertures; two thermal sensing matrices are mounted on the circuit board, with two lenses set on the thermal sensing matrices, each lens aligned with the position of an aperture; and a connector of the circuit board is connected by a cable to a control circuit board of the robot arm.
Preferably, a gyroscope, an accelerometer, or a Hall sensor is also mounted on the circuit board of the robot anti-collision device.
Preferably, an image capture matrix or an ultrasonic image capture matrix is also mounted on the circuit board of the robot anti-collision device, with a corresponding second lens on the circuit board and a second aperture in the housing.
The utility model also provides a robot with a robot anti-collision device configured on its arm. The device comprises a housing and a circuit board; the housing has at least one aperture; at least one thermal sensing matrix is mounted on the circuit board, with at least one lens set on the thermal sensing matrix, each lens aligned with the position of an aperture; and a connector of the circuit board is connected by a cable to a control circuit board of the robot arm.
Preferably, a gyroscope, an accelerometer, or a Hall sensor is also mounted on the circuit board of the robot's anti-collision device.
Preferably, an image capture matrix or an ultrasonic image capture matrix is also mounted on the circuit board of the robot's anti-collision device, with a corresponding second lens on the circuit board and a second aperture in the housing.
To make the above and other objects, features, and advantages of the utility model clearer, several preferred embodiments are described in detail below with reference to the accompanying drawings.
Detailed description of the invention
Figures 1A-1D: perspective and top views illustrating the motion of the robot anti-collision device of the utility model and the robot carrying it.
Figures 2A-2B: functional block diagrams of two specific embodiments of the robot anti-collision device of the utility model.
Figures 3A-3C: functional block diagrams of three further specific embodiments of the robot anti-collision device of the utility model.
Figures 4A-4E: functional block diagrams of another five specific embodiments of the robot anti-collision device of the utility model.
Figure 5: schematic of the sensing spaces of robot anti-collision devices configured at fixed positions on the robot.
Figure 6: schematic of the sensing spaces of robot anti-collision devices configured on the robot's arm.
Figure 7: schematic of the active anti-collision motion of robot anti-collision devices configured on the robot's arm.
Figure 8: schematic of relative-distance calculation by the area-ratio method applied to the thermal image model.
Figures 9-15: flow charts of multiple specific embodiments of the robot collision-avoidance method of the utility model.
Symbol description:
10: robot
11: pedestal
12: fixing seat
13: the first rotating parts
14: the first rotary shaft fixed parts
15: the first arms
16: the second arms
20, 20-1, 20-2, 20-3, 20-4, 20-5, 20-6, 20-7: robot anti-collision device
21: control unit
22A-1,22A-2: thermal imagery matrix
22B-1,22B-2: image sensing matrix
22C-1, 22C-2: ultrasonic image sensing matrix
23A-1,23A-2: the first noise processing circuit
23B-1,23B-2: the second noise processing circuit
23C-1,23C-2: third noise processing circuit
24A-1,24A-2: lens group
24B-1,24B-2: lens group
24C-1,24C-2: lens group
25: transmission unit
26: memory
27: gyroscope
28A-1,28A-2: the first analog-digital converter
28B-1,28B-2: the second analog-digital converter
28C-1,28C-2: third analog-digital converter
101,102,103,104,105,106,107: angular field of view
99: human body
D1, D2, D3: distance.
Specific embodiment
According to multiple embodiments, the utility model provides a robot anti-collision device and a robot equipped with it. One or more thermal image matrices capture a thermal image of the environment, and a thermal-image model comparison technique distinguishes human bodies from non-human objects. Beyond identifying a human body, comparing the instantaneous thermal image model against pre-stored human body image models also yields the relative distance between the human body and the anti-collision device. Furthermore, an image acquisition unit can be added to obtain a second sensing-range spatial model, or to obtain relative-velocity parameters, to further assist the device in judging its relative distance from the human body. With the human-body judgment and the relative distance in hand, the utility model can perform a series of precise robot anti-collision actions, and even cooperative actions, realizing active collision avoidance and making the robot more visual and intelligent.
Please refer to Figures 1A-1D, perspective and top views illustrating the motion of the robot anti-collision devices of the utility model and the robot carrying them. The robot 10 of the utility model has a pedestal 11, a fixing seat 12, a first rotating part 13, a first rotary-shaft fixed part 14, a first arm 15, and a second arm 16. Four robot anti-collision devices 20-1, 20-2, 20-3, 20-4 are configured on the fixing seat 12 of the robot 10, and three robot anti-collision devices 20-5, 20-6, 20-7 are configured on the second arm 16. As Figures 1B-1D show, the four devices 20-1, 20-2, 20-3, 20-4 on the fixing seat 12 do not change their viewing angles when the first rotating part 13 of the robot 10 rotates or the first arm 15 and second arm 16 move; their fields of view are shown in Figure 5 as angular fields of view 101, 102, 103, 104. When a human body 99 enters angular field of view 101, it is detected by robot anti-collision device 20-1. The three devices 20-5, 20-6, 20-7 on the second arm 16, by contrast, do change their viewing angles as the first rotating part 13 rotates or the first arm 15 and second arm 16 move. Their fields of view are shown in Figure 6: when the human body 99 enters angular field of view 105, it is detected by device 20-5, while angular field of view 106 of device 20-6 and angular field of view 107 of device 20-7 cannot see the human body 99. In this embodiment, as the human body 99 moves, it is detected by devices 20-1 and 20-5 simultaneously. Because the two devices detect it at once, devices 20-1 and 20-5 produce a thermal-image difference, and since the distance between the two devices is known, the utility model can judge the relative distance between the human body 99 and devices 20-1 and 20-5 accordingly. In other words, the utility model can calculate an object's relative distance using two robot anti-collision devices.
The utility model realizes human-body identification and relative-distance calculation with one thermal image matrix, or with two or more thermal image matrices, in the thermal sensing unit. The functional blocks of the utility model are illustrated below through several enumerated embodiments.
Please refer to Figures 2A-2B, functional block diagrams of two specific embodiments of the robot anti-collision device of the utility model. First, in the embodiment of Figure 2A, the robot anti-collision device 20A of the utility model is configured on a robot and communicates with it to assist its collision avoidance. It comprises: a control unit 21; a thermal sensing unit using thermal image matrix 22A-1, lens group 24A-1, first analog-to-digital converter 28A-1, and first noise processing circuit 23A-1; a transmission unit 25; a memory 26; and a gyroscope 27. The thermal image matrix 22A-1 detects the heat source produced by an object. The memory 26 is connected to the thermal image matrix 22A-1, the first analog-to-digital converter 28A-1, the first noise processing circuit 23A-1, the transmission unit 25, and the control unit 21. The control unit 21 is in turn connected to the first noise processing circuit 23A-1, the first analog-to-digital converter 28A-1, the thermal image matrix 22A-1, the memory 26, the transmission unit 25, and the gyroscope 27. For brevity, some of these connections are not drawn in the figures.
Under the control of the control unit 21, the thermal image matrix 22A-1, first analog-to-digital converter 28A-1, and first noise processing circuit 23A-1 scan and detect the environment. When an object (a heat source) is scanned, the thermal image matrix 22A-1 generates a thermal-image matrix signal that is immediately stored in the memory 26. After this signal is processed by the first analog-to-digital converter 28A-1 and the first noise processing circuit 23A-1, its noise is filtered out and it is converted into a digital signal. The control unit 21 can then analyze, judge, and perform calculations on this clean digital signal to determine whether the object is a human body. When the object is judged to be a human body, the control unit calculates the relative distance between the human body and the robot, then sends two basic messages to the robot via the transmission unit 25 — that a human body has appeared, and the human body's relative distance — so the robot can prepare to avoid a collision. From the robot's point of view, the robot defines its own absolute coordinates for its various precise-positioning tasks. The relative distance meant by the utility model is the relative distance between the human body and the robot as defined above; for the robot, this relative distance can be converted into absolute coordinates. In other words, once the relative distance is obtained, it is converted into the robot's own absolute coordinates (with a precision that varies across the different embodiments of the utility model), that is, the absolute coordinates of the object or human body, which become a distance parameter the robot can use.
After the thermal-image matrix signal is obtained, the control unit 21 judges whether the object is a human body as follows: the thermal-image matrix signal corresponding to the heat source sensed by thermal sensing matrix 22A-1 is built into a sensing-range spatial model and compared against several human spatial models pre-stored in the memory 26. When the freshly built sensing-range spatial model matches one of those human spatial models, the object is judged to be a human body. In the utility model, the memory 26 can be any of several memory types, such as dynamic random access memory (DRAM), flash memory, a solid-state drive (SSD), or static random access memory (SRAM). In addition, the utility model can pre-plan the storage space in the memory 26 so that it works more efficiently, for example with separate blocks for raw thermal image data, processed thermal image data (sensing-range spatial models), reference thermal image data (human spatial models), and the simplified thermal image data kept after comparison.
The relative distance between the human body and the robot can be calculated by comparing the sensing-range spatial model against the human spatial model over several sensing cycles, and estimating the relative distance from the scale and relative position of the two models. In addition, elements such as the gyroscope 27 can assist the calculation of the relative distance.
The embodiment of Figure 2A is configured with a gyroscope 27. This element is a speed-parameter acquisition unit: its function is to acquire speed parameters so as to produce the third-axis parameter data the utility model needs, that is, parameter data for the axis beyond the two planar axes of the thermal image matrix 22A-1. With this third-axis parameter data, the utility model can combine the variation of the thermal-image matrix signal captured by thermal image matrix 22A-1 (the variation of the sensing-range spatial model) with the variation of the third-axis parameter to jointly calculate the relative velocity of the object and the robot, which in turn helps compute a more accurate relative distance between the robot and the object.
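A minimal sketch of how a third-axis parameter might assist the velocity estimate, assuming two consecutive distance readings and an arm axial speed derived from the gyroscope. The combination rule here is an illustrative assumption, not a formula stated in the patent:

```python
def relative_velocity(d0, d1, dt, arm_axial_speed):
    """Estimate approach speed along the sensor axis.

    d0, d1: relative distances (m) from two consecutive thermal frames
    dt:     time between the frames (s)
    arm_axial_speed: the arm's own speed along the same axis (m/s),
        e.g. derived from the gyroscope/accelerometer or servo currents
    Returns (closing_speed, object_own_speed).
    """
    closing = (d0 - d1) / dt          # positive when the gap is shrinking
    object_speed = closing - arm_axial_speed
    return closing, object_speed

closing, obj = relative_velocity(d0=1.0, d1=0.9, dt=0.1, arm_axial_speed=0.4)
# closing is ~1.0 m/s; the object itself contributes ~0.6 m/s
```

Separating the arm's own motion from the object's motion is what lets the device decide whether the human is approaching or the arm is merely sweeping past.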
Besides the gyroscope 27, the speed-parameter acquisition unit may also be an accelerometer, a Hall sensor, or an element sensing the robot arm's angular velocity or acceleration; the purpose in every case is to produce an angular-velocity or acceleration value to serve as the third-axis parameter. A Hall sensor, configured at the two ends of a robot-arm joint, learns the velocity change by sensing the change in the opening rate of the angle between the upper and lower arm segments. The arm's angular-velocity and acceleration sensing can also use the robot's own servo motor currents: the current value read by each servo motor's current-sensing element during operation reveals parameters such as the speed and angular velocity of the current arm motion.
If no speed-parameter acquisition element such as the gyroscope 27 is used, the utility model offers another way to calculate the relative distance: the area-ratio method. Please refer to Figure 8, a schematic of relative-distance calculation by the area-ratio method applied to the thermal image model. Here the relative distance between the human body and the robot is calculated by comparing the sensing-range spatial model against the human spatial model over several sensing cycles and estimating the relative distance from the proportion of the thermal sensing matrix the sensing-range spatial model occupies. This method follows empirical rules: for each human spatial model in the memory 26, a comparison table can be prepared relating its share of thermal sensing matrix 22A-1 to relative distance. The control unit 21 can then look up the distance relationship for different human spatial models and matrix shares, and thereby infer, for a sensing-range spatial model that matches a specific human spatial model, the relative distance corresponding to its share of thermal image matrix 22A-1.
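The comparison table described above could be sketched like this. The table values are invented placeholders, since the patent says the mapping is empirical and prepared per human spatial model:

```python
# Assumed calibration table for one human spatial model: minimum fraction of
# the thermal matrix occupied by the matched model -> relative distance (m).
AREA_RATIO_TO_DISTANCE = [
    (0.50, 0.5),
    (0.25, 1.0),
    (0.10, 2.0),
    (0.05, 3.0),
]

def distance_from_area_ratio(human_pixels, total_pixels):
    """Look up the relative distance from the share of the sensing matrix
    that the matched human spatial model occupies."""
    ratio = human_pixels / total_pixels
    for min_ratio, dist in AREA_RATIO_TO_DISTANCE:
        if ratio >= min_ratio:
            return dist
    return float("inf")  # too small / too far to resolve

print(distance_from_area_ratio(human_pixels=16, total_pixels=64))  # -> 1.0
```

In practice each stored human model would carry its own table, and interpolation between table rows would give a smoother estimate than the stepped lookup shown here.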
Next, please refer to FIG. 2B. The robot anti-collision device 20B of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, a thermal sensing unit using the thermal image matrix 22A-1 and a thermal image matrix 22A-2, the lens group 24A-1 and a lens group 24A-2, the first analog-to-digital converter 28A-1 and a first analog-to-digital converter 28A-2, the first noise processing circuit 23A-1 and a first noise processing circuit 23A-2, the transmission unit 25, the memory 26, the gyroscope 27, and so on. The embodiment of FIG. 2B uses two thermal image matrices, 22A-1 and 22A-2. With the absolute distance between the two thermal image matrices (their separation on the circuit board) known, and two instantaneous sensing range spatial models generated simultaneously, the difference between the two models can be compared to calculate the relative distance between the robot and the human body accurately. Concretely, the two sensing range spatial models are overlapped; from the overlapped sensing range spatial model, the pixel difference between the two individual models can be obtained, and, aided by the absolute distance between the two thermal image matrices, triangulation yields the relative distance between the human body and the "eyes" of the robot (namely the robot anti-collision device 20B). The rest is identical to the description of the embodiment of FIG. 2A and is not repeated.
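The triangulation step can be sketched with the standard stereo range equation, distance = focal length × baseline / disparity. The focal length in pixel units is an assumed calibration value the text does not specify; the baseline is the on-board separation of the two thermal image matrices.

```python
def triangulate_distance(baseline_m, focal_px, disparity_px):
    """Stereo triangulation sketch for the two-matrix embodiment of FIG. 2B.

    baseline_m   -- absolute distance between the two thermal image
                    matrices on the circuit board (known at manufacture)
    focal_px     -- lens focal length in pixel units (assumed calibration)
    disparity_px -- pixel difference of the same heat source between the
                    two overlapped sensing range spatial models
    """
    if disparity_px <= 0:
        raise ValueError("heat source must appear shifted between the matrices")
    return focal_px * baseline_m / disparity_px

# e.g. a 0.05 m baseline, a 100 px focal length and a 5 px disparity
# place the human body at roughly 1 m from the device.
```

The closer the human body, the larger the disparity, so near-range accuracy is naturally better than far-range accuracy, which suits an anti-collision application.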
Besides identifying and judging the human body through the thermal image matrix, the utility model can also capture images through an image acquisition unit, to help judge whether an object is a human body and to calculate the relative distance between the human body and the robot.
Please refer to FIGS. 3A-3C, functional block diagrams of three further specific embodiments of the robot anti-collision device of the utility model.
Referring to FIG. 3A, the robot anti-collision device 20C of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, an image sensing matrix 22B-1, a lens group 24B-1, a second analog-to-digital converter 28B-1, a second noise processing circuit 23B-1, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the embodiment of FIG. 2A, FIG. 3A uses the image sensing matrix 22B-1 with its corresponding lens group 24B-1, second analog-to-digital converter 28B-1 and second noise processing circuit 23B-1. The image sensing matrix 22B-1 is connected to the control unit 21 and generates image data, so that the control unit 21 can convert the continuously computed image data into multiple second sensing range spatial models. After a second sensing range spatial model is compared with the human space model of the image, the relative distance between the human body and the robot can be calculated.
Then, please refer to FIG. 3B. The robot anti-collision device 20D of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, image sensing matrices 22B-1 and 22B-2, lens groups 24B-1 and 24B-2, second analog-to-digital converters 28B-1 and 28B-2, second noise processing circuits 23B-1 and 23B-2, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the embodiment of FIG. 2B, FIG. 3B uses two image sensing matrices 22B-1 and 22B-2. From the disparity between the two different sets of image data, the utility model can effectively calculate the relative distance between the human body and the robot. The specific calculation method is the same as described for FIG. 2B and is not repeated.
Then, please refer to FIG. 3C. The robot anti-collision device 20E of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, the thermal image matrix 22A-1, the image sensing matrix 22B-1, the lens groups 24A-1 and 24B-1, the first analog-to-digital converter 28A-1, the second analog-to-digital converter 28B-1, the first noise processing circuit 23A-1, the second noise processing circuit 23B-1, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the other embodiments, FIG. 3C uses one image sensing matrix 22B-1 together with one thermal image matrix 22A-1. By comparing the first sensing range spatial model established from the thermal image matrix signal with the second sensing range spatial model established from the image matrix signal, the utility model achieves effective human-body judgement and calculation of the relative distance between the human body and the robot.
The image sensing matrices 22B-1 and 22B-2 of FIGS. 3A-3C may be CMOS image acquisition units or CCD image acquisition units.
Please refer to FIGS. 4A-4E, functional block diagrams of another five specific embodiments of the robot anti-collision device of the utility model.
Referring to FIG. 4A, the robot anti-collision device 20F of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, an ultrasound image sensing matrix 22C-1, a lens group 24C-1, a third analog-to-digital converter 28C-1, a third noise processing circuit 23C-1, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the embodiments of FIGS. 2A and 3A, the embodiment of FIG. 4A uses the ultrasound image sensing matrix 22C-1 with its corresponding lens group 24C-1, third analog-to-digital converter 28C-1 and third noise processing circuit 23C-1. The ultrasound image sensing matrix 22C-1 is connected to the control unit 21 and generates ultrasound image matrix signals, so that the control unit 21 can convert the continuously computed signals into multiple second sensing range spatial models. After a second sensing range spatial model is compared with the human space model of the image, the relative distance between the human body and the robot can be calculated.
Then, please refer to FIG. 4B. The robot anti-collision device 20G of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, ultrasound image sensing matrices 22C-1 and 22C-2, lens groups 24C-1 and 24C-2, third analog-to-digital converters 28C-1 and 28C-2, third noise processing circuits 23C-1 and 23C-2, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the embodiments of FIGS. 2B and 3B, FIG. 4B uses two ultrasound image sensing matrices 22C-1 and 22C-2. From the disparity between the two different ultrasound image matrix signals, the utility model can effectively calculate the relative distance between the human body and the robot. Moreover, the depth information computed from the ultrasound image matrix signals allows a direct judgement of whether the object is a human body.
Then, please refer to FIG. 4C. The robot anti-collision device 20H of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, the thermal image matrix 22A-1, the ultrasound image sensing matrix 22C-1, the lens groups 24A-1 and 24C-1, the first analog-to-digital converter 28A-1, the third analog-to-digital converter 28C-1, the first noise processing circuit 23A-1, the third noise processing circuit 23C-1, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the other embodiments, FIG. 4C uses one ultrasound image sensing matrix 22C-1 together with one thermal image matrix 22A-1. By comparing the first sensing range spatial model established from the thermal image matrix signal with the second sensing range spatial model established from the ultrasound image matrix signal, the utility model achieves effective human-body judgement and calculation of the relative distance between the human body and the robot.
Then, please refer to FIG. 4D. The robot anti-collision device 20I of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, the image sensing matrix 22B-1, the ultrasound image sensing matrix 22C-1, the lens groups 24B-1 and 24C-1, the second analog-to-digital converter 28B-1, the third analog-to-digital converter 28C-1, the second noise processing circuit 23B-1, the third noise processing circuit 23C-1, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the other embodiments, FIG. 4D uses one ultrasound image sensing matrix 22C-1 together with one image sensing matrix 22B-1. By comparing the different sensing range spatial models established from the image sensing matrix signal and from the ultrasound image sensing matrix, the utility model achieves effective human-body judgement and calculation of the relative distance between the human body and the robot.
Then, please refer to FIG. 4E. The robot anti-collision device 20J of the utility model is likewise arranged on a robot and communicates with the robot to assist its anti-collision work. It comprises: the control unit 21, the thermal image matrix 22A-1, the image sensing matrix 22B-1, the ultrasound image sensing matrix 22C-1, the lens groups 24A-1, 24B-1 and 24C-1, the first analog-to-digital converter 28A-1, the second analog-to-digital converter 28B-1, the third analog-to-digital converter 28C-1, the first noise processing circuit 23A-1, the second noise processing circuit 23B-1, the third noise processing circuit 23C-1, the transmission unit 25, the memory 26, the gyroscope 27, and so on. Unlike the other embodiments, FIG. 4E uses one thermal sensing matrix 22A-1, one ultrasound image sensing matrix 22C-1 and one image sensing matrix 22B-1. By comparing the first sensing range spatial model established from the thermal image matrix signal with the two second sensing range spatial models established from the image sensing matrix and the ultrasound image matrix signal, the utility model achieves more accurate and effective human-body judgement and calculation of the relative distance between the human body and the robot.
In the above, the image sensing matrices, ultrasound matrices and the like are collectively referred to as image acquisition units.
As the embodiments of FIGS. 1A-4E show, the utility model can, through various configurations of robot anti-collision devices placed at different locations on the robot, give the robot multiple "different eyes". Through these different eyes the robot sees different object sizes; with the absolute distances between the eyes grasped in advance (the spatial-configuration distances of each robot anti-collision device on the robot) and the velocities of the eyes and the object, it can accurately determine whether an object is a human body, and calculate the relative distance, or even the absolute distance, between the object and each component of the robot. Once the relative or absolute distance between each robot component and the human body is grasped, various robot anti-collision controls can be carried out.
Next, please refer to FIG. 7, a schematic diagram of the active anti-collision action of the utility model with the robot anti-collision device arranged at the arm of the robot. In this embodiment, a human body 99 has entered the "visual range" of the robot anti-collision device 20-5 of a robot 10 and keeps moving toward the robot. The utility model defines several preset distances: distance D3 is the robot detection range, distance D2 is the warning distance, and distance D1 is the danger distance. Since the robot anti-collision device 20-5 of the utility model provides the relative distance between the arm and the human body 99, when the human body 99 gradually approaches the warning distance D2, the robot 10 already knows that the human body 99 has entered within D2. When the human body 99 passes the warning distance D2 and reaches the danger distance D1, the robot 10 can control a first rotating part 13 to rotate and dodge the human body, so that the human body never comes closer than the danger distance D1. In this way the purpose of active anti-collision is achieved. Whether the original arm motion of the robot 10 is toward or away from the human body, the robot dodges once the danger distance is reached; the dodging motion may be, for example, up, down, left, right, a rotation, or a step back.
Since the utility model can grasp the relative distance between the robot and the human body, the robot anti-collision device of the utility model can perform the following actions to assist the robot, or issue the corresponding action commands to the robot directly.
When the control unit 21 judges that the object is a human body, it generates a human-body entering alarm signal. After the control unit 21 generates this signal and the robot 10 receives it, the robot enters a human-machine collaboration mode or a personnel-entering guard mode. After calculating the relative distance, when the relative distance reaches a warning distance, the control unit 21 generates a human-body entering warning-range signal; after the robot 10 receives it, the robot enters a robot stop-action mode, or the control unit 21 generates a robot stop-action-mode instruction to the robot 10. When the relative distance reaches a danger distance, the control unit 21 generates a human-body entering danger-range signal and issues an avoidance-action-mode instruction to the robot 10; after the robot 10 receives the danger-range signal, it enters an avoidance action mode.
Beyond the above actions, still more robot cooperation modes can be executed, because the utility model supplies the robot with "vision", lets the robot grasp its relative distance to the human body, and enables it to actively execute the various anti-collision functions.
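The distance-to-signal mapping just described can be sketched as a simple decision function. The threshold values are illustrative placeholders for the warning distance D2 and danger distance D1 of FIG. 7, which the text leaves installation-specific.

```python
# Placeholder thresholds for D2 (warning) and D1 (danger) of FIG. 7;
# the real values are configured per installation.
WARNING_DISTANCE_D2 = 1.0  # metres (assumed)
DANGER_DISTANCE_D1 = 0.5   # metres (assumed)

def anticollision_signals(is_human, distance_m):
    """Signals the control unit would emit for one observation."""
    signals = []
    if not is_human:
        return signals
    signals.append("human_enter_alarm")           # -> collaboration / guard mode
    if distance_m <= WARNING_DISTANCE_D2:
        signals.append("human_in_warning_range")  # -> robot stop-action mode
    if distance_m <= DANGER_DISTANCE_D1:
        signals.append("human_in_danger_range")   # -> robot avoidance mode
    return signals
```

Emitting the signals as an ordered list mirrors the text: a human body in the danger range has necessarily also triggered the entering alarm and the warning-range signal.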
Next, the motion flows of the various robot anti-collision methods of the utility model are described.
Please refer to FIG. 9, a flow chart of a specific embodiment of the robot anti-collision method of the utility model; this embodiment describes the operating flow of the embodiment of FIG. 2A. The robot anti-collision method of the utility model is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S101: capture a thermal image matrix signal.
Step S102: establish a sensing range spatial model of a heat source in the thermal image matrix signal, and judge whether the sensing range spatial model meets one of several preset human space models.
Step S103: judge whether the heat source of the thermal image matrix falls within a human body temperature range. In general, the human body temperature range here is the surface (shell) temperature of the human body rather than the core temperature, roughly 28 to 34 degrees Celsius.
Step S104: when the heat source falls within the human body temperature range and meets a preset human space model, calculate the relative distance to the heat source. A robotized plant contains many heat sources that may reach the human body temperature range; by requiring that both the human body temperature range and a human space model be satisfied before identifying a human body, the utility model avoids the misjudgments that using the temperature range alone could cause.
Step S105: when the heat source falls within the human body temperature range and meets a preset human space model, generate a human-body entering signal.
Step S106: when the relative distance reaches a warning distance, generate a human-body entering warning-range signal.
Step S107: when the relative distance reaches a danger distance, generate a human-body entering danger-range signal.
Step S108: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S109: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method additionally comprises the following steps: after the robot receives the human-body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body entering warning-range signal, it enters a robot stop-action mode; after the robot receives a human-body entering danger-range signal, it enters an avoidance action mode, where the avoidance action mode comprises an evasive action, a reverse motion or a rotating motion.
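The double criterion of steps S102-S104 (temperature range AND shape model) can be sketched as follows. The shape-match flag stands in for the device's actual model comparison, which the text does not spell out.

```python
# Body-surface temperature range from the text (roughly 28-34 deg C).
BODY_TEMP_RANGE = (28.0, 34.0)

def is_human(source_temp_c, matches_human_model):
    """A heat source counts as a human only if BOTH tests pass.
    Temperature alone would misjudge warm machinery in a robotized
    plant; shape alone would misjudge a human-shaped cold object."""
    in_range = BODY_TEMP_RANGE[0] <= source_temp_c <= BODY_TEMP_RANGE[1]
    return in_range and matches_human_model
```

A motor housing idling at 31 degrees fails the shape test, and a mannequin at room temperature fails the temperature test, so neither triggers a human-body entering signal.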
Then, please refer to FIG. 10, a flow chart of still another specific embodiment of the robot anti-collision method of the utility model; this embodiment describes another operating flow of the embodiment of FIG. 2A. The robot anti-collision method is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S111: capture a thermal image matrix signal.
Step S112: establish a sensing range spatial model of a heat source in the thermal image matrix signal, and judge whether the sensing range spatial model meets one of several preset human space models.
Step S113: judge whether the heat source of the thermal image matrix falls within a human body temperature range.
Step S114: when the heat source falls within the human body temperature range and meets a preset human space model, calculate the thermal image accounting (the proportion of the thermal image matrix occupied).
Step S115: when the heat source falls within the human body temperature range and meets a preset human space model, generate a human-body entering signal.
Step S116: when the thermal image accounting reaches a warning accounting, generate a human-body entering warning-range signal.
Step S117: when the thermal image accounting reaches a danger accounting, generate a human-body entering danger-range signal.
Step S118: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S119: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method additionally comprises the following steps: after the robot receives the human-body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body entering warning-range signal, it enters a robot stop-action mode; after the robot receives a human-body entering danger-range signal, it enters an avoidance action mode, where the avoidance action mode comprises an evasive action, a reverse motion or a rotating motion.
Then, please refer to FIG. 11, a flow chart of still another specific embodiment of the robot anti-collision method of the utility model; this embodiment describes the operating flow of the embodiments of FIGS. 3C and 4C. The robot anti-collision method is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S121: capture a thermal image matrix signal.
Step S122: establish a sensing range spatial model of a heat source in the thermal image matrix signal, and judge whether the sensing range spatial model meets one of several preset human space models.
Step S123: judge whether the heat source of the thermal image matrix falls within a human body temperature range.
Step S124: when the heat source falls within the human body temperature range and meets a preset human space model, generate a human-body entering signal.
Step S125: start a second sensor to capture a second video signal, the second sensor being selected from a CMOS image acquisition unit, a CCD image acquisition unit and an ultrasound matrix image acquisition unit.
Step S126: according to the second video signal, establish a second sensing range spatial model to be fitted to the first sensing range spatial model.
Step S127: when the heat source falls within the human body temperature range and meets a preset human space model, calculate and generate the relative distance from the first sensing range spatial model and the second sensing range spatial model. The relative distance can likewise be produced from the disparity data of the overlapped sensing range spatial model obtained by fitting the first sensing range spatial model to the second, computed jointly with the absolute distance between the thermal image sensing matrix and the second sensor.
Step S128: when the relative distance reaches a warning distance, generate a human-body entering warning-range signal.
Step S129: when the relative distance reaches a danger distance, generate a human-body entering danger-range signal.
Step S130: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S131: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method additionally comprises the following steps: after the robot receives the human-body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body entering warning-range signal, it enters a robot stop-action mode; after the robot receives a human-body entering danger-range signal, it enters an avoidance action mode, where the avoidance action mode comprises an evasive action, a reverse motion or a rotating motion.
Then, please refer to FIG. 12, a flow chart of another specific embodiment of the robot anti-collision method of the utility model; this embodiment describes the operating flow of another embodiment of FIG. 2A. The robot anti-collision method is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S141: capture a thermal image matrix signal.
Step S142: establish a sensing range spatial model of a heat source in the thermal image matrix signal, and judge whether the sensing range spatial model meets one of several preset human space models.
Step S143: judge whether the heat source of the thermal image matrix falls within a human body temperature range.
Step S144: when the heat source falls within the human body temperature range and meets a preset human space model, generate a human-body entering signal.
Step S145: capture a velocity parameter to establish a third-axis spatial parameter, the velocity parameter being acquired by a gyroscope, an accelerometer, a Hall sensor, or the angular-velocity and acceleration sensing elements of the robot arm, producing an angular-velocity value or an acceleration value to serve as the third-axis spatial parameter.
Step S146: when the heat source falls within the human body temperature range and meets a preset human space model, calculate the relative motion velocity of the human body corresponding to the heat source.
Step S147: according to the first sensing range spatial model and the relative motion velocity, calculate the relative distance for the heat source falling within the human body temperature range.
Step S148: when the relative distance reaches a warning distance, generate a human-body entering warning-range signal.
Step S149: when the relative distance reaches a danger distance, generate a human-body entering danger-range signal.
Step S150: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S151: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method additionally comprises the following steps: after the robot receives the human-body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body entering warning-range signal, it enters a robot stop-action mode; after the robot receives a human-body entering danger-range signal, it enters an avoidance action mode, where the avoidance action mode comprises an evasive action, a reverse motion or a rotating motion.
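Steps S145-S147 do not spell out the formula that turns the velocity parameter into a distance. One plausible realization, offered purely as an assumption, is motion parallax: when the arm-mounted sensor itself moves at a known speed, the distance to a momentarily stationary heat source follows from how far the source drifts across the thermal matrix between frames.

```python
def motion_parallax_distance(arm_speed_mps, dt_s, pixel_shift_px, focal_px):
    """Hypothetical motion-parallax range estimate (NOT the patent's own
    formula): the sensor travels arm_speed_mps * dt_s metres between two
    frames, and a heat source at distance Z shifts by
    baseline * focal_px / Z pixels, so Z = baseline * focal_px / shift.

    arm_speed_mps  -- arm speed from the servo current / Hall sensor / gyro
    dt_s           -- time between the two thermal frames
    pixel_shift_px -- observed drift of the heat source on the matrix
    focal_px       -- lens focal length in pixel units (assumed calibration)
    """
    if pixel_shift_px <= 0:
        raise ValueError("heat source must drift between frames")
    baseline_m = arm_speed_mps * dt_s
    return baseline_m * focal_px / pixel_shift_px
```

This is the same triangulation geometry as the two-matrix embodiment, with the baseline created by the arm's own motion instead of a second sensor; it degrades when the human body moves during dt_s, which is presumably why the relative motion velocity of step S146 is also computed.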
Then, please refer to FIG. 13, a flow chart of another specific embodiment of the robot anti-collision method of the utility model; this embodiment describes the operating flow of another embodiment of FIGS. 3C and 4C. The robot anti-collision method is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S201: capture a thermal image matrix signal.
Step S202: establish a sensing range spatial model of a heat source in the thermal image matrix signal, and judge whether the sensing range spatial model meets one of several preset human space models.
Step S203: judge whether the heat source of the thermal image matrix falls within a human body temperature range.
Step S204: when the heat source falls within the human body temperature range and meets a preset human space model, generate a human-body entering signal.
Step S205: capture a velocity parameter to establish a third-axis spatial parameter.
Step S206: when the heat source falls within the human body temperature range and meets a preset human space model, calculate the relative motion velocity of the human body corresponding to the heat source.
Step S207: start a second sensor to capture a second video signal.
Step S208: according to the second video signal, establish a second sensing range spatial model to be fitted to the first sensing range spatial model.
Step S209: according to the first sensing range spatial model, the second sensing range spatial model and the relative motion velocity, calculate and generate a relative distance.
Step S210: when the relative distance reaches a warning distance, generate a human-body entering warning-range signal.
Step S211: when the relative distance reaches a danger distance, generate a human-body entering danger-range signal.
Step S212: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S213: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method additionally comprises the following steps: after the robot receives the human-body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body entering warning-range signal, it enters a robot stop-action mode; after the robot receives a human-body entering danger-range signal, it enters an avoidance action mode, where the avoidance action mode comprises an evasive action, a reverse motion or a rotating motion.
Then, please refer to FIG. 14, a flow chart of another specific embodiment of the robot anti-collision method of the utility model; this embodiment describes the operating flow of the embodiment of FIG. 2B. The robot anti-collision method is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S301: capture a first thermal image matrix signal and a second thermal image matrix signal.
Step S302: establish two sensing range spatial models for the heat source in the first and second thermal image matrix signals, and judge whether either sensing range spatial model meets one of several preset human space models. This step mainly judges whether someone is present; in practice, as long as one of the two sensing range spatial models meets one of the human space models, the object can be judged to be a person. Another situation is that several people enter the first and second thermal image matrices at the same time; if the preset human space models include such multi-person models, then likewise, as long as either of the two sensing range spatial models meets a human space model, it can be judged that people have entered.
Step S303: judge whether the first heat source of the first thermal image matrix and the second heat source of the second thermal image matrix fall within a human body temperature range.
Step S304: when the first and second heat sources fall within the human body temperature range and meet a preset human space model, calculate the relative distance. In this step, the control unit can rebuild the two sensing range spatial models into one overlapped sensing range spatial model, and then calculate, from this overlapped model and the distance between the two thermal image matrices on the circuit board, the relative distance between the human body and the circuit board (namely the robot anti-collision device of the utility model).
Step S305: when the first and second heat sources fall within the human body temperature range and meet a preset human space model, generate a human-body entering signal.
Step S306: when the relative distance reaches a warning distance, generate a human-body entering warning-range signal.
Step S307: when the relative distance reaches a danger distance, generate a human-body entering danger-range signal.
Step S308: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S309: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method additionally comprises the following steps: after the robot receives the human-body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body entering warning-range signal, it enters a robot stop-action mode; after the robot receives a human-body entering danger-range signal, it enters an avoidance action mode, where the avoidance action mode comprises an evasive action, a reverse motion or a rotating motion.
Then, please refer to FIG. 15, a flow chart of another specific embodiment of the robot anti-collision method of the utility model; this embodiment describes another operating flow of the embodiment of FIG. 2B. The robot anti-collision method is applied on a robot to assist the anti-collision work of the robot, and comprises:
Step S311: capture a first thermal image matrix signal and a second thermal image matrix signal.
Step S312: establish two sensing range spatial models for each heat source in the first and second thermal image matrix signals, and judge whether the two sensing range spatial models meet one of several preset human space models.
Step S313: judge whether the first heat source of the first thermal image matrix and the second heat source of the second thermal image matrix fall within a human body temperature range.
Step S314: when the first and second heat sources fall within the human body temperature range and meet a preset human space model, calculate the first thermal image accounting and the second thermal image accounting of the thermal image matrices.
Step S315: when the first and second heat sources fall within the human body temperature range and meet a preset human space model, generate a human-body entering signal.
Step S316: when the thermal image accounting reaches a warning accounting, generate a human-body entering warning-range signal.
Step S317: when the thermal image accounting reaches a danger accounting, generate a human-body entering danger-range signal.
Step S318: upon receiving the human-body entering warning-range signal, generate a robot stop-action command.
Step S319: upon receiving the human-body entering danger-range signal, generate a robot move-away action command.
The method further comprises the following steps: after the robot receives the human body entering signal, it enters a human-machine collaboration mode or a personnel-entering guard mode; after the robot receives the human-body-entering-warning-range signal, it enters a robot stop-action mode; and after the robot receives the human-body-entering-danger-range signal, it enters an avoidance action mode, wherein the avoidance action mode includes an evasive action, a reverse motion, or a spinning movement.
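Steps S311–S319 can be read as a pipeline: threshold the thermal matrix to find cells within human body temperature, compute the fraction of the matrix those cells occupy (the "proportion" of steps S314–S317), then compare that fraction against warning and danger proportions. A hedged Python sketch; the temperature range and both proportion thresholds are assumptions chosen purely for illustration:

```python
# Sketch of steps S311-S319. All threshold constants are illustrative
# assumptions; the patent does not specify numeric values.
HUMAN_TEMP_RANGE = (30.0, 40.0)   # deg C, assumed body-surface range
WARNING_PROPORTION = 0.10         # heat source fills 10% of the frame
DANGER_PROPORTION = 0.30          # heat source fills 30% of the frame


def thermal_proportion(matrix, temp_range=HUMAN_TEMP_RANGE):
    """Fraction of matrix cells in the human temperature range (step S314)."""
    lo, hi = temp_range
    cells = [t for row in matrix for t in row]
    hits = sum(1 for t in cells if lo <= t <= hi)
    return hits / len(cells)


def classify_thermal(matrix):
    """Steps S315-S317: map the heat-source proportion to a signal.

    A proportion at or above the danger threshold triggers the move-away
    command (S319); the warning threshold triggers the stop command (S318).
    """
    p = thermal_proportion(matrix)
    if p >= DANGER_PROPORTION:
        return "HUMAN_IN_DANGER_RANGE"    # step S317
    if p >= WARNING_PROPORTION:
        return "HUMAN_IN_WARNING_RANGE"   # step S316
    if p > 0:
        return "HUMAN_ENTERING"           # step S315
    return "CLEAR"
```

In the two-matrix arrangement of the embodiment, this classification would run on each thermal sensing matrix and the stricter of the two signals would govern the robot's response.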
Although the technical contents of the utility model have been disclosed above by way of preferred embodiments, they are not intended to limit the utility model. Any minor changes and modifications made by those skilled in the art without departing from the spirit of the utility model shall fall within the scope of the utility model; the protection scope of the utility model is therefore defined by the appended claims.
Claims (9)
1. A robot anti-collision device, characterized in that it comprises a housing and a circuit board, the housing being mounted on a robot arm; at least one aperture is formed in the housing; at least one thermal sensing matrix is mounted on the circuit board; at least one lens is arranged on the thermal sensing matrix, each lens corresponding to the position of one aperture; and a connector of the circuit board is connected by a connecting line to a control circuit board of the robot arm.
2. The robot anti-collision device as described in claim 1, characterized in that a gyroscope, an accelerometer, or a Hall sensor is further mounted on the circuit board.
3. The robot anti-collision device as described in claim 1, characterized in that an image capture matrix or an ultrasonic image capture matrix is further mounted on the circuit board, the image capture matrix or the ultrasonic image capture matrix being respectively provided with a second lens on the circuit board and a second aperture in the housing.
4. A robot anti-collision device, characterized in that it comprises a housing and a circuit board, the housing being mounted on a robot arm; two apertures are formed in the housing; two thermal sensing matrices are mounted on the circuit board; two lenses are arranged on the thermal sensing matrices, each lens corresponding to the position of one aperture; and a connector of the circuit board is connected by a connecting line to a control circuit board of the robot arm.
5. The robot anti-collision device as claimed in claim 4, characterized in that a gyroscope, an accelerometer, or a Hall sensor is further mounted on the circuit board.
6. The robot anti-collision device as claimed in claim 4, characterized in that an image capture matrix or an ultrasonic image capture matrix is further mounted on the circuit board, the image capture matrix or the ultrasonic image capture matrix being respectively provided with a second lens on the circuit board and a second aperture in the housing.
7. A robot, characterized in that a robot anti-collision device is configured on the robot arm; the robot anti-collision device comprises a housing and a circuit board; at least one aperture is formed in the housing; at least one thermal sensing matrix is mounted on the circuit board; at least one lens is arranged on the thermal sensing matrix, each lens corresponding to the position of one aperture; and a connector of the circuit board is connected by a connecting line to a control circuit board of the robot arm.
8. The robot as claimed in claim 7, characterized in that a gyroscope, an accelerometer, or a Hall sensor is further mounted on the circuit board of the robot anti-collision device.
9. The robot as claimed in claim 7, characterized in that an image capture matrix or an ultrasonic image capture matrix is further mounted on the circuit board of the robot anti-collision device, the image capture matrix or the ultrasonic image capture matrix being respectively provided with a second lens on the circuit board and a second aperture in the housing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201820601577.1U CN208438358U (en) | 2018-04-25 | 2018-04-25 | Robot anticollision device, collision-prevention device and the robot for using the device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN208438358U true CN208438358U (en) | 2019-01-29 |
Family
ID=65093399
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201820601577.1U Expired - Fee Related CN208438358U (en) | 2018-04-25 | 2018-04-25 | Robot anticollision device, collision-prevention device and the robot for using the device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN208438358U (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110026718A (en) * | 2019-05-21 | 2019-07-19 | 郑万众 | A kind of Intelligent welding robot arm with warning function |
CN110026718B (en) * | 2019-05-21 | 2021-04-02 | 郑万众 | Intelligent welding robot arm with early warning function |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109141373A (en) | For protecting the sensor of machine | |
US10195741B2 (en) | Controlling a robot in the presence of a moving object | |
JP6491640B2 (en) | Apparatus and method for protecting automatically operating machines and computer program data storage medium | |
JP2009545457A (en) | Monitoring method and apparatus using camera for preventing collision of machine | |
US20140135984A1 (en) | Robot system | |
KR20220012921A (en) | Robot configuration with 3D lidar | |
CN108274469B (en) | Detection method of vacuum manipulator anti-collision detection system based on multi-dimensional visual sensor | |
US11510740B2 (en) | Systems and methods for tracking objects | |
CN108089553B (en) | Method and device for starting a multiaxial system | |
CN107077729B (en) | Method and device for recognizing structural elements of a projected structural pattern in a camera image | |
CN104007715A (en) | Recognition-based industrial automation control with position and derivative decision reference | |
EP2772812B1 (en) | Recognition-based industrial automation control with redundant system input support | |
CN208438358U (en) | Robot anticollision device, collision-prevention device and the robot for using the device | |
CN108536142A (en) | Industrial robot anti-collision early warning system based on digital fringe projection and method | |
CN108081267B (en) | Method and device for starting a multiaxial system | |
JP6906821B2 (en) | Mobile robot | |
CN109834710A (en) | Robot and robot system | |
CN113664832A (en) | Robot collision prediction method, computer storage medium and electronic device | |
US20200342237A1 (en) | Method for the Emergency Shutdown of Hand-Guided Tools, and Hand-Guided Tool | |
KR102228835B1 (en) | Industrial robot measuring system and method | |
JP2019069486A (en) | Processor | |
CN102528811A (en) | Mechanical arm positioning and obstacle avoiding system in Tokamak cavity | |
CN110394829A (en) | Robot anticollision device, collision-prevention device, robot and robot avoiding collision | |
KR101444270B1 (en) | Unmanned mobile monitoring system | |
TWM567156U (en) | Device for robot collision-avoidance and applicable robot thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||
TR01 | Transfer of patent right |
Effective date of registration: 20200916 Address after: 6 / F, 1-1 Nanjing West Road, Zhongshan District, Taipei, Taiwan, China Patentee after: Fengtai Instrument Co.,Ltd. Address before: Innovation and development center R229, Central University, 300 Zhongda Road, Zhongli District, Taoyuan City, Taiwan, China Patentee before: Foresee Technology Corp. |
|
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20190129 |