WO1997018523B1 - Computer stereo vision system and method - Google Patents

Computer stereo vision system and method

Info

Publication number
WO1997018523B1
Authority
WO
WIPO (PCT)
Prior art keywords
cameras
area
distance
areas
pictures
Prior art date
1995-11-14
Application number
PCT/IL1996/000145
Other languages
French (fr)
Other versions
WO1997018523A2 (en)
WO1997018523A3 (en)
Filing date
1996-11-12
Publication date
1997-08-21
Priority claimed from IL11597195A (IL115971A)
Application filed
Priority to CA 2237886 (CA2237886A1)
Priority to KR1019980703244A (KR19990067273A)
Priority to AU73316/96A (AU738534B2)
Priority to EP96935318A (EP0861415A4)
Priority to JP9518719A (JP2000500236A)
Priority to BR9611710-9A (BR9611710A)
Publication of WO1997018523A2
Publication of WO1997018523A3
Publication of WO1997018523B1


Abstract

The system and the method are intended to enable a robot (20) to see the environment in which it operates by means of a pair of identical cameras (3). The pictures from the cameras (3) are processed by a coordinator/translator (59). The system includes a memory register (51) for identifying movement at time intervals and a basic-shapes registry (52) for identifying shapes.
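As an illustration only (the abstract describes these components functionally, not as code), the two registers it names could be sketched in Python roughly as follows: a movement-identification register (51) that keeps one color-filtered picture per fixed time interval, and a basic-shapes registry (52) matched against a detected area's frame. All class names, thresholds and the aspect-ratio matching rule below are assumptions of this sketch, not taken from the patent.

# Hypothetical sketch of the movement register (51) and basic-shapes registry (52);
# names, thresholds and the matching rule are assumptions, not from the patent.
import time

class MovementRegister:
    """Stores one picture per fixed time interval for movement identification."""
    def __init__(self, interval_s=0.5):
        self.interval_s = interval_s
        self.samples = []                      # list of (timestamp, frame) pairs

    def update(self, frame):
        now = time.time()
        if not self.samples or now - self.samples[-1][0] >= self.interval_s:
            self.samples.append((now, frame))

    def area_changed(self, pixels_then, pixels_now, tolerance=0.1):
        """Crude movement test: the area's pixel count changed noticeably
        between two stored samples."""
        if pixels_then == 0:
            return pixels_now > 0
        return abs(pixels_now - pixels_then) / pixels_then > tolerance

class BasicShapesRegistry:
    """Matches a detected frame/contour against a few stored basic shapes."""
    def __init__(self):
        # shape name -> expected width/height ratio (illustrative values only)
        self.shapes = {"square": 1.0, "tall rectangle": 0.5, "wide rectangle": 2.0}

    def match(self, width, height, tolerance=0.15):
        ratio = width / height
        for name, expected in self.shapes.items():
            if abs(ratio - expected) <= tolerance * expected:
                return name
        return None

# Example: a detected frame of 80 x 82 pixels is reported as a "square".
print(BasicShapesRegistry().match(80, 82))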

Claims

AMENDED CLAIMS
[received by the International Bureau on 2 July 1997 (02.07.97); original claims 1 and 9 amended, remaining claims unchanged (3 pages)]
1. A stereo computer vision system including:
a. A pair of identical, aligned, coordinated cameras at fixed distance M, with coordinated photographing/filming angle and enlargement/reduction, creating for the photographing cameras optical parallel fields of sight, coordinated, aligned, with identical field-of-sight line in both cameras from (0:0) up to (Y:X), at a fixed distance M at any photographing distance, and the pictures of said cameras are received by means of input-memory devices in the computer, translated to computer language;
b. Based on step A, the pixels of the received pictures are matched by the available means, at photographing rate and/or at any other rate, to each of the previous picture(s) existing in the system's spatial memory registry according to the coordinates and space-counters, and when no matching is found there, the system performs updating and proceeds with matching of the two pictures that have been input;
c. Based on steps A and B, with regard to unidentified areas or areas in which any movement takes place and which are seen simultaneously in both cameras, the system looks for various features of the picture, such as color, and based on the difference in the count of pixels in each picture taken by a camera from the beginning of a line up to that matching point, the distance and the size of a point in space, represented by a pixel, are calculated, and based on this, also the size of the area, the distances, etc.;
d. Based on steps B and C, in matching points of unidentified areas with stored pictures, there will be points around the area that will not match due to a change in the distance between the area and the surrounding environment and the distance between the two cameras, these depth dots creating a contour line and/or dividing line in addition to the dots at approximately similar distance to the area, and with regard to areas at similar distance, changes in color or shade between areas and/or area motion/movement will allow, by means of
information; said means for receiving, transmitting and drawing of data; said means including power supply, data protection, protection of said means, the system and the like.
8. A computer vision system as in any of claims 1 to 7, wherein the received and stored information that has been collected before, during and after identification, including calculations, known data, features and definitions that have been collected, such as with regard to areas and shapes, is immediately or after a short while transmitted forward in the form of information, in a regular manner and/or by stereo in the form of multimedia and/or in 3D form, and which the system preserves and/or provides and/or which are accessible upon request and/or automatically to the user, such as a robot, a device or a blind person, by means of adequate interfacing means.
9. A stereo computer vision method including:
a. A pair of identical, aligned, coordinated cameras at fixed distance M, with coordinated photographing/filming angle and enlargement/reduction, creating for the photographing cameras optical parallel fields of sight, coordinated, aligned, with identical field-of-sight line in both cameras from (0:0) up to (Y:X), at a fixed distance M at any photographing distance, and the pictures of said cameras are received by means of input-memory devices in the computer, translated to computer language;
b. Based on step A, the pixels of the received pictures are matched by the available means, at photographing rate and/or at any other rate, to each of the previous picture(s) existing in the system's spatial memory registry according to the coordinates and space-counters, and when no matching is found there, the system performs updating and proceeds with matching of the two pictures that have been input;
c. Based on steps A and B, with regard to unidentified areas or areas in which any movement takes place and which are seen simultaneously in both cameras, the system looks for various features of the picture, such as color, and based on the difference in the count of pixels in each picture taken by a camera from the beginning of a line up to that matching point, the distance and the size of a point in space, represented by a pixel, are calculated, and based on this, also the size of the area, the distances, etc.;
d. Based on steps B and C, in matching points of unidentified areas with stored pictures, there will be points around the area that will not match due to a change in the distance between the area and the surrounding environment and the distance between the two cameras, these depth dots creating a contour line and/or dividing line in addition to the dots at approximately similar distance to the area, and with regard to areas at similar distance, changes in color or shade between areas and/or area motion/movement will allow, by means of the available means, to detect and define a frame or contour for each and every area;
e. For the purpose of collecting additional data that enable the available means to perform calculations, matching, setting of definitions and drawing of conclusions, the picture received from one of the cameras undergoes color filtering and, after being duplicated, is entered into the movement-identification register at time intervals and is then examined at fixed time intervals; based on steps B to D, the frame or contour of the area that has been detected is matched in size with the basic-shapes register; the data from these registers enable calculation of, for example, the size of an area/shape and the speed of movement/motion in time, and additional data; part of the data that have been input and part of the data that have been calculated/detected previously allow matching against a register of table(s), such as "true", for setting definitions and drawing conclusions with regard to other key-elements, such as: inanimate object, fluttering, animate, for example, if the size of the area goes from size to size and there is movement and/or motion in the envelope and/or inside it, etc.;
f. Said key-elements drawn from data, features, definitions and conclusions, ordered in a particular order, allow, through the means provided, to detect, match and identify areas against a register of stored shapes;
g. The unidentified areas will be stored under temporary names and the identified shapes will be stored under their own name and with reference to the coordinates data in each scanning of the first dot and
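To make step (c) of claims 1 and 9 concrete: for identical, parallel cameras at fixed separation M, the difference between the pixel counts from the beginning of the scanline to the matched point in the two pictures is the disparity, and the point's distance follows from the standard parallel-camera relation Z = f·M/d. The sketch below is illustrative only; the color-based matching criterion and the focal length f expressed in pixels are assumptions of this sketch, not details given in the claims.

# Illustrative sketch of the distance calculation in step (c) of claims 1 and 9.
# Assumptions (not in the claims): matching is a simple color comparison on the
# same scanline, and the focal length f is known in pixel units.

def match_on_scanline(left_row, right_row, x_left, tolerance=10):
    """Return the column in the right scanline whose color is close to the
    pixel at column x_left of the left scanline (the claims suggest matching
    by features such as color)."""
    target = left_row[x_left]
    for x_right, pixel in enumerate(right_row):
        if all(abs(a - b) <= tolerance for a, b in zip(target, pixel)):
            return x_right
    return None

def distance_from_disparity(x_left, x_right, baseline_m, focal_px):
    """Distance of the imaged point, computed from the difference in pixel
    counts along the line (the disparity) and the fixed camera separation M."""
    disparity = x_left - x_right              # pixel-count difference
    if disparity <= 0:
        return None                           # point at infinity or a bad match
    return focal_px * baseline_m / disparity  # Z = f * M / d

# Example: with M = 0.12 m and f = 800 px, matched columns 340 (left) and
# 325 (right) give a disparity of 15 px and a distance of 800*0.12/15 = 6.4 m.
print(distance_from_disparity(340, 325, baseline_m=0.12, focal_px=800))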
PCT/IL1996/000145 1995-11-14 1996-11-12 Computer stereo vision system and method WO1997018523A2 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CA 2237886 CA2237886A1 (en) 1995-11-14 1996-11-12 Computer stereo vision system and method
KR1019980703244A KR19990067273A (en) 1995-11-14 1996-11-12 Computer stereoscopic observation system and method thereof
AU73316/96A AU738534B2 (en) 1995-11-14 1996-11-12 Computer stereo vision system and method
EP96935318A EP0861415A4 (en) 1995-11-14 1996-11-12 Computer stereo vision system and method
JP9518719A JP2000500236A (en) 1995-11-14 1996-11-12 Computer stereo vision system and method
BR9611710-9A BR9611710A (en) 1995-11-14 1996-11-12 Stereo computer vision system and stereo computer vision method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL115971 1995-11-14
IL11597195A IL115971A (en) 1995-11-14 1995-11-14 Computer stereo vision system and method

Publications (3)

Publication Number Publication Date
WO1997018523A2 WO1997018523A2 (en) 1997-05-22
WO1997018523A3 WO1997018523A3 (en) 1997-07-24
WO1997018523B1 true WO1997018523B1 (en) 1997-08-21

Family

ID=11068178

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL1996/000145 WO1997018523A2 (en) 1995-11-14 1996-11-12 Computer stereo vision system and method

Country Status (8)

Country Link
EP (1) EP0861415A4 (en)
JP (1) JP2000500236A (en)
KR (1) KR19990067273A (en)
CN (1) CN1202239A (en)
AU (1) AU738534B2 (en)
BR (1) BR9611710A (en)
IL (1) IL115971A (en)
WO (1) WO1997018523A2 (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE29918341U1 (en) * 1999-10-18 2001-03-01 Tassakos Charalambos Device for determining the position of measuring points of a measuring object relative to a reference system
KR100374408B1 (en) * 2000-04-24 2003-03-04 (주) 케이앤아이테크놀로지 3D Scanner and 3D Image Apparatus using thereof
DK1287502T3 (en) * 2000-05-23 2009-02-02 Munroe Chirnomas Method and apparatus for storing hoses in a device for handling articles
CN1292941C (en) * 2004-05-24 2007-01-03 刘新颜 Rear-view device of automobile
CN100447820C (en) * 2005-08-04 2008-12-31 浙江大学 Bus passenger traffic statistical method based on stereoscopic vision and system therefor
TWI327536B (en) 2007-05-16 2010-07-21 Univ Nat Defense Device and method for detecting obstacle by stereo computer vision
WO2013067513A1 (en) 2011-11-04 2013-05-10 Massachusetts Eye & Ear Infirmary Contextual image stabilization
CN102592121B (en) * 2011-12-28 2013-12-04 方正国际软件有限公司 Method and system for judging leakage recognition based on OCR (Optical Character Recognition)
CN102799183B (en) * 2012-08-21 2015-03-25 上海港吉电气有限公司 Mobile machinery vision anti-collision protection system for bulk yard and anti-collision method
CN103679742B (en) * 2012-09-06 2016-08-03 株式会社理光 Method for tracing object and device
CN102937811A (en) * 2012-10-22 2013-02-20 西北工业大学 Monocular vision and binocular vision switching device for small robot
US20180099846A1 (en) 2015-03-06 2018-04-12 Wal-Mart Stores, Inc. Method and apparatus for transporting a plurality of stacked motorized transport units
US10239740B2 (en) 2015-03-06 2019-03-26 Walmart Apollo, Llc Shopping facility assistance system and method having a motorized transport unit that selectively leads or follows a user within a shopping facility
WO2016142794A1 (en) 2015-03-06 2016-09-15 Wal-Mart Stores, Inc Item monitoring system and method
JP7037876B2 (en) 2015-06-26 2022-03-17 コグネックス・コーポレイション Use of 3D vision in automated industrial inspection
US11158039B2 (en) 2015-06-26 2021-10-26 Cognex Corporation Using 3D vision for automated industrial inspection
CN106610522A (en) * 2015-10-26 2017-05-03 南京理工大学 Three-dimensional microscopic imaging device and method
CA2961938A1 (en) 2016-04-01 2017-10-01 Wal-Mart Stores, Inc. Systems and methods for moving pallets via unmanned motorized unit-guided forklifts
JP2018041247A (en) * 2016-09-07 2018-03-15 ファナック株式会社 Server, method, program, and system for recognizing individual identification information of machine
CN107145823A (en) * 2017-03-29 2017-09-08 深圳市元征科技股份有限公司 A kind of image-recognizing method, pattern recognition device and server
CN106940807A (en) * 2017-04-19 2017-07-11 深圳市元征科技股份有限公司 A kind of processing method and processing device based on mirror device of looking in the distance
CN114543684B (en) * 2022-04-26 2022-07-12 中国地质大学(北京) Structural displacement measuring method

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4601053A (en) * 1983-11-21 1986-07-15 Grumman Aerospace Corporation Automatic TV ranging system
JPS60200103A (en) * 1984-03-26 1985-10-09 Hitachi Ltd Light cutting-plate line extraction circuit
JPH07109625B2 (en) * 1985-04-17 1995-11-22 株式会社日立製作所 3D stereoscopic method
US4924506A (en) * 1986-07-22 1990-05-08 Schlumberger Systems & Services, Inc. Method for directly measuring area and volume using binocular stereo vision
JPS63288683A (en) * 1987-05-21 1988-11-25 株式会社東芝 Assembling robot
US4982438A (en) * 1987-06-02 1991-01-01 Hitachi, Ltd. Apparatus and method for recognizing three-dimensional shape of object
US4900128A (en) * 1988-11-01 1990-02-13 Grumman Aerospace Corporation Three dimensional binocular correlator
JPH04207866A (en) * 1990-11-30 1992-07-29 Toshiba Corp Image processor
US5179441A (en) * 1991-12-18 1993-01-12 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Near real-time stereo vision system
US5309522A (en) * 1992-06-30 1994-05-03 Environmental Research Institute Of Michigan Stereoscopic determination of terrain elevation

Similar Documents

Publication Publication Date Title
WO1997018523B1 (en) Computer stereo vision system and method
CN102214000B (en) Hybrid registration method and system for target objects of mobile augmented reality (MAR) system
JP4198054B2 (en) 3D video conferencing system
US6570566B1 (en) Image processing apparatus, image processing method, and program providing medium
US20170310946A1 (en) Three-dimensional depth perception apparatus and method
CN103207664A (en) Image processing method and equipment
WO1999006950A3 (en) Scanning apparatus and methods
CN202362833U (en) Binocular stereo vision-based three-dimensional reconstruction device of moving vehicle
JP2001008235A (en) Image input method for reconfiguring three-dimensional data and multiple-lens data input device
EP0877274A3 (en) Image tracking system and method and observer tracking autostereoscopic display
EP1130504A3 (en) Three-dimensional check image viewer and a method of handling check images in an image-based check processing system
JP3383228B2 (en) Head mounted display device
CN106028136A (en) Image processing method and device
CN107972585A (en) Scene rebuilding System and method for is looked around with reference to the adaptive 3 D of radar information
JPH07296185A (en) Three-dimensional image display device
JP2000500236A (en) Computer stereo vision system and method
WO2018222122A1 (en) Methods for perspective correction, computer program products and systems
CN110099268A (en) The blind area perspectiveization display methods of color Natural matching and viewing area natural fusion
CN106846243A (en) The method and device of three dimensional top panorama sketch is obtained in equipment moving process
Mukai et al. The recovery of object shape and camera motion using a sensing system with a video camera and a gyro sensor
KR100824744B1 (en) Localization System and Method for Mobile Robot Using Corner's Type
RU2735066C1 (en) Method for displaying augmented reality wide-format object
CN113401058A (en) Real-time display method and system for automobile A column blind area based on three-dimensional coordinates of human eyes
CN115514885B (en) Remote augmented reality follow-up sensing system and method based on monocular and binocular fusion
CN106864372A (en) Outdoor scene internet is called a taxi accessory system and method