EP1934870A2 - System and method for image mapping and visual attention - Google Patents
System and method for image mapping and visual attention
- Publication number
- EP1934870A2 (Application EP06816851A)
- Authority
- EP
- European Patent Office
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0246—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0268—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
- G05D1/0274—Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means using mapping information stored in a memory device
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/35—Nc in input of data, input till input file format
- G05B2219/35144—Egosphere: spherical shell 2-5-D around robot, objects are projected on it
Definitions
- An allocentric world model places objects in a coordinate grid that does not change with the robot's position.
- An ego-centric model is always centered on the present position of the robot.
- An ego-centric model is described in Albus, J. S., "Outline for a theory of intelligence", IEEE Trans. Syst. Man, and Cybern., vol. 21, no. 3, 1991.
- Albus describes an Ego-Sphere wherein the robot's environment is projected onto a spherical surface centered on the robot's current position.
- The Ego-Sphere is a dense representation of the world in the sense that all sensory information is projected onto the Ego-Sphere.
- Albus' Ego-Sphere is also continuous because the projection is affine.
- The frequency is determined by the number of vertices connecting the center of one pentagon to the center of an adjacent pentagon, all pentagons being distributed evenly on the dome.
- The SES has a tessellation frequency of 14 and, therefore, 1962 nodes.
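The node count follows directly from the standard vertex formula for an icosahedron-based geodesic tessellation. A minimal sketch (the function name is an illustration, not the patent's API):

```python
def geodesic_node_count(frequency):
    """Vertex count of an icosahedron-based geodesic dome.

    A frequency-f subdivision has 10 * f**2 + 2 vertices: the 12
    original icosahedron vertices (which become the pentagon centers)
    plus the vertices introduced by subdividing each face.
    """
    return 10 * frequency ** 2 + 2

# Frequency 14, as used by the SES:
print(geodesic_node_count(14))  # 1962
```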
- The SES facilitates the detection of events in the environment that simultaneously stimulate multiple sensors.
- Each sensor on the robot sends information to one or more sensory processing modules (SPMs) designed to extract specific information from the data stream associated with that sensor.
- SPMs are independent of each other and run continuously and concurrently, preferably on different processors.
- Each SPM sends information messages to an SES manager agent which stores the data, including directional sensory information if available, in the SES.
- An object 350 is projected onto the SES by a ray 355 connecting the center 301 to the object 350.
- Ray 355 intersects a face 360 at a point 357 defined by azimuthal angle θs and elevation (or polar) angle φs.
- Information about the object 350, such as θs and φs, is stored at the vertex 370 that is closest to point 357.
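The projection and nearest-vertex lookup described above can be sketched as follows. This is an illustrative implementation under assumed conventions (azimuth measured in the horizontal plane, elevation from it; function and variable names are not from the patent):

```python
import math

def project_to_angles(obj, center):
    """Project a 3-D point onto the SES: return the (azimuth, elevation)
    of the ray from the SES center to the object."""
    dx, dy, dz = (obj[i] - center[i] for i in range(3))
    azimuth = math.atan2(dy, dx)
    elevation = math.atan2(dz, math.hypot(dx, dy))
    return azimuth, elevation

def nearest_vertex(azimuth, elevation, vertices):
    """Return the tessellation vertex (given as (azimuth, elevation)
    pairs) with the smallest great-circle distance to the ray's
    intersection point."""
    def ang_dist(v):
        va, ve = v
        cos_d = (math.sin(ve) * math.sin(elevation)
                 + math.cos(ve) * math.cos(elevation) * math.cos(va - azimuth))
        return math.acos(min(1.0, max(-1.0, cos_d)))  # clamp rounding error
    return min(vertices, key=ang_dist)
```

Object data would then be stored in a dictionary keyed by the vertex returned from `nearest_vertex`.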
- A vertex may also contain links to behaviors stored in the DBAM.
- Landmark mapping agents may also write to the SES, storing a pointer to an object descriptor at the vertex where the object is expected.
- Objects may be tracked during robot movement on the SES using transformations such as those described in Peters, R. A. II, K. E. Hambuchen, K. Kawamura, and D. M. Wilkes, "The Sensory Ego-Sphere as a Short-Term Memory for Humanoids", Proc. IEEE-RAS Int'l. Conf. on Humanoid Robots, pp. 451-459, Waseda University, Tokyo, Japan, Nov. 22-24, 2001, herein incorporated by reference in its entirety.
- Each of the behaviors linked to the current behavior computes the vector-space distance between the current state and its own pre-condition state.
- Each behavior propagates an inhibitory signal (by adding a negative number to the activation term) that is inversely proportional to the computed distance to the other linked behaviors.
- the propagation of the inhibitory signal between the linked behaviors has the effect that, in most instances, the behavior with the highest activation term is also the behavior whose pre-condition state most closely matches the current state of the robot.
- The links between behaviors are created by the SAN agent during task planning but may also be created by a dream agent during the dream state. The links are task dependent and different behaviors may be linked together depending on the assigned goal.
- The spreading activation network (SAN) agent constructs a sequence of behaviors that will take the robot from its current state to the goal state (active map) in the DBAM by back-propagating from the goal state to the current state. For each behavior added to the active map, the SAN agent performs a search for behaviors that have a pre-condition state close to the post-condition state of the added behavior and adds a link connecting the close behavior to the added behavior. An activation term characterizing the link and based on the inverse vector-space distance between the linked behaviors is also added to the added behavior. The SAN agent may create several paths connecting the current state to the goal state.
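The back-propagating construction of the active map can be sketched as a breadth-first search from the goal. A hypothetical implementation (names, the dictionary layout, and the `link_radius` threshold are assumptions, not the patent's API):

```python
import math

def build_active_map(behaviors, goal_state, link_radius=1.0):
    """behaviors: list of {'name', 'pre', 'post'} dicts with state vectors.

    Start from behaviors whose post-condition is near the goal, then
    repeatedly link any behavior whose post-condition lies close to the
    pre-condition of a behavior already in the active map. Each link
    carries an activation term based on the inverse distance between
    the linked states.
    """
    d = math.dist
    frontier = [b for b in behaviors if d(b['post'], goal_state) <= link_radius]
    active_map, links = list(frontier), []
    while frontier:
        nxt = []
        for tail in frontier:
            for b in behaviors:
                if b not in active_map and d(b['post'], tail['pre']) <= link_radius:
                    links.append((b['name'], tail['name'],
                                  1.0 / (d(b['post'], tail['pre']) + 1e-6)))
                    active_map.append(b)
                    nxt.append(b)
        frontier = nxt
    return active_map, links
```

Because every behavior within the radius is linked, the search naturally yields several alternative paths from the current state to the goal when they exist.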
- The problem of attention arises once the SES is populated with dense information. Because of limited computational resources, only regions of interest (determined by safety, opportunity, and the task) can be attended to if the robot is to interact with a human-centered environment in real time. The problem lies in how to perform attention processing given a populated SES and an image input stream. There are at least two possibilities. One is to perform visual attention processing on the entire SES. The other is to detect points of interest within the individual images and combine them with the imagery that is already present.
- The summed activation image (paragraph 88) appears better suited for attention deployment on the SES. Processing the entire reconstructed scene image makes less information available than the summed activation image, since a single image then determines the most salient locations in the scene, as opposed to a sequence of overlapping images. Moreover, updating the salience distribution on the SES as new information arrives is straightforward if the summed activation image is implemented: new images are processed and the new attention points found are combined with the attentional points already present. The activation at each node could be weighted by the age of each attentional point, giving more weight to newer points.
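The age-weighted combination at a node can be sketched as below. Exponential decay is an assumption for illustration; the text only requires some monotone weighting that favors newer points:

```python
import math

def node_salience(attentional_points, now, decay=0.1):
    """Combine the attentional points stored at one SES node into a
    single salience value, weighting each point by its age so that
    newer points contribute more.

    attentional_points: list of (salience, timestamp) pairs.
    """
    return sum(s * math.exp(-decay * (now - t)) for s, t in attentional_points)
```

Merging newly detected attention points then amounts to appending them to the node's list and recomputing the weighted sum.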
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Aviation & Aerospace Engineering (AREA)
- Remote Sensing (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Multimedia (AREA)
- Electromagnetism (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US72603305P | 2005-10-11 | 2005-10-11 | |
PCT/US2006/040040 WO2007044891A2 (en) | 2005-10-11 | 2006-10-11 | System and method for image mapping and visual attention |
Publications (2)
Publication Number | Publication Date |
---|---|
EP1934870A2 (de) | 2008-06-25 |
EP1934870A4 EP1934870A4 (de) | 2010-03-24 |
Family
ID=37943550
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP06816851A Withdrawn EP1934870A4 (de) | System and method for image mapping and visual attention |
Country Status (4)
Country | Link |
---|---|
EP (1) | EP1934870A4 (de) |
JP (1) | JP2009517225A (de) |
CA (2) | CA2625805C (de) |
WO (1) | WO2007044891A2 (de) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002081156A2 (en) * | 2001-04-06 | 2002-10-17 | Vanderbilt University | Architecture for robot intelligence |
US20050149227A1 (en) * | 2003-12-31 | 2005-07-07 | Peters Richard A.Ii | Architecture for robot intelligence |
US20050223176A1 (en) * | 2003-12-30 | 2005-10-06 | Peters Richard A Ii | Sensory ego-sphere: a mediating interface between sensors and cognition |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2648071B1 (fr) * | 1989-06-07 | 1995-05-19 | Onet | Procede et appareil autonomes de nettoyage automatique de sol par execution de missions programmees |
US5548511A (en) * | 1992-10-29 | 1996-08-20 | White Consolidated Industries, Inc. | Method for controlling self-running cleaning apparatus |
US5995884A (en) * | 1997-03-07 | 1999-11-30 | Allen; Timothy P. | Computer peripheral floor cleaning system and navigation method |
JP4409035B2 (ja) * | 2000-03-22 | 2010-02-03 | 本田技研工業株式会社 | 画像処理装置、特異箇所検出方法、及び特異箇所検出プログラムを記録した記録媒体 |
JP2002006784A (ja) * | 2000-06-20 | 2002-01-11 | Mitsubishi Electric Corp | 浮遊型ロボット |
JP2004086401A (ja) * | 2002-08-26 | 2004-03-18 | Sony Corp | 対象物認識装置および方法 |
- 2006
- 2006-10-11 CA CA2625805A patent/CA2625805C/en not_active Expired - Fee Related
- 2006-10-11 CA CA2868135A patent/CA2868135A1/en not_active Abandoned
- 2006-10-11 WO PCT/US2006/040040 patent/WO2007044891A2/en active Application Filing
- 2006-10-11 JP JP2008535701A patent/JP2009517225A/ja active Pending
- 2006-10-11 EP EP06816851A patent/EP1934870A4/de not_active Withdrawn
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002081156A2 (en) * | 2001-04-06 | 2002-10-17 | Vanderbilt University | Architecture for robot intelligence |
US20020169733A1 (en) * | 2001-04-06 | 2002-11-14 | Peters Richard Alan | Architecture for robot intelligence |
US20050223176A1 (en) * | 2003-12-30 | 2005-10-06 | Peters Richard A Ii | Sensory ego-sphere: a mediating interface between sensors and cognition |
US20050149227A1 (en) * | 2003-12-31 | 2005-07-07 | Peters Richard A.Ii | Architecture for robot intelligence |
Non-Patent Citations (8)
Also Published As
Publication number | Publication date |
---|---|
EP1934870A4 (de) | 2010-03-24 |
JP2009517225A (ja) | 2009-04-30 |
WO2007044891A2 (en) | 2007-04-19 |
CA2625805A1 (en) | 2007-04-19 |
CA2868135A1 (en) | 2007-04-19 |
WO2007044891A3 (en) | 2007-07-12 |
CA2625805C (en) | 2014-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7835820B2 (en) | System and method for image mapping and visual attention | |
US7328196B2 (en) | Architecture for multiple interacting robot intelligences | |
US6697707B2 (en) | Architecture for robot intelligence | |
JP4609584B2 (ja) | ロボット装置、顔認識方法及び顔認識装置 | |
US10131052B1 (en) | Persistent predictor apparatus and methods for task switching | |
Shabbir et al. | A survey of deep learning techniques for mobile robot applications | |
US20230154015A1 (en) | Virtual teach and repeat mobile manipulation system | |
US11887363B2 (en) | Training a deep neural network model to generate rich object-centric embeddings of robotic vision data | |
CA2625805C (en) | System and method for image mapping and visual attention | |
Peters et al. | System and method for image mapping and visual attention | |
Peters et al. | Apparatus for multiprocessor-based control of a multiagent robot | |
Peters et al. | Architecture for robot intelligence | |
AU2002258757A1 (en) | Architecture for robot intelligence | |
WO2023100282A1 (ja) | データ生成システム、モデル生成システム、推定システム、学習済みモデルの製造方法、ロボット制御システム、データ生成方法、およびデータ生成プログラム | |
Joshi | Antipodal Robotic Grasping using Deep Learning | |
Bianco et al. | Biologically-inspired visual landmark learning for mobile robots | |
Hambuchen | Multi-modal attention and event binding in humanoid robot using a sensory ego-sphere | |
Blackburn et al. | Robotic Sensor-Motor Transformations | |
Liu et al. | Basic behavior acquisition based on multisensor integration of a robot head | |
Pence | Autonomous Mobility and Manipulation of a 9-DoF WMRA |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
2008-04-21 | 17P | Request for examination filed | |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC NL PL PT RO SE SI SK TR |
2010-02-19 | A4 | Supplementary search report drawn up and despatched | |
2010-06-09 | 17Q | First examination report despatched | |
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
2010-12-21 | 18D | Application deemed to be withdrawn | |