DE102011118611A1 - Apparatus and method for a semi-automatic testing station - Google Patents

Apparatus and method for a semi-automatic testing station

Info

Publication number
DE102011118611A1
Authority
DE
Germany
Prior art keywords
device
object
preferably
data
characterized
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
DE102011118611A
Other languages
German (de)
Inventor
Franz Mathi
Peter Stelzer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Knapp AG
Original Assignee
Knapp AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Knapp AG filed Critical Knapp AG
Priority to DE102011118611A priority Critical patent/DE102011118611A1/en
Publication of DE102011118611A1 publication Critical patent/DE102011118611A1/en
Ceased legal-status Critical Current

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B07 — SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C — POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C7/00 — Sorting by hand only, e.g. of mail
    • B07C7/005 — Computer-assisted manual sorting, e.g. for mail
    • B07C2501/00 — Sorting according to a characteristic or feature of the articles or material to be sorted
    • B07C2501/0045 — Return vending of articles, e.g. bottles
    • B07C2501/0054 — Sorting of waste or refuse

Abstract

Device for the optosensory detection, computer-aided identification and optical highlighting of objects. The invention relates to a device (1) for a semi-automatic test station comprising: • at least one device (10) for detecting at least one object (3) or object feature (3') by optical and photosensory means, • a device (20) for the computer-assisted identification of such detected objects (3) or features (3') based on comparing the optically recorded data (11) with predetermined stored data (21) or selected properties (22) by means of a computer program (23), • and a device (30) for highlighting or marking at least one object (3) or object feature (3') by optical means. The device (1) is used to assist manual sorting operations. Different objects on a translucent work surface (2) are detected, for example from below via a camera, identified by means of a data-processing system and, depending on the determined identity of the object, optically highlighted with a projector, e.g. in a given color.

Description

  • The present invention relates to an apparatus for the optosensory detection, computer-aided identification and optical highlighting of objects. It belongs to the field of automation technology and partly to measurement technology, and concerns an automation-supporting device for a semi-automatic test station. Devices of this type are used in order picking, in goods inspection (quality assurance), in production plants, in packaging or repackaging plants, in customs, in mail order, in merchandise management, in waste-sorting plants and in many other applications, in particular wherever objects that resemble one another are difficult to differentiate with the eye alone.
  • The apparatus combines known devices such as optical detection systems, image-processing systems or optical readers with computer-controlled marking devices such as light spots, in order to give the user real-time feedback on an automated, computer-based analysis. As a result, the user is able to handle (grasp, sort, select) the correct or incorrect objects faster than previously possible.
  • State of the art
  • To support picking operations, a variety of new devices have been developed in recent years. Utility model AT 010520 U2, for example, describes an apparatus and method for the visual support of picking operations, comprising an optical detection device and a display device. Via augmented-reality functions, hint information such as arrows and the like is superimposed, e.g., on a camera image. Such devices are also built into data glasses, which bring additional information from the virtual world into the field of view or video image of the real world. Other aids are known in the literature as pick-to-light techniques, in which light signaling devices on cabinets or trays support the user in selecting an object from a material store. A disadvantage of systems with data glasses or video glasses is that users are reluctant to wear them, whether because of the restricted field of view, the weight of such glasses, vanity, or the care needed to always keep a clear view. With monitor-based recognition systems, on the other hand, it is cumbersome to locate a product highlighted by the recognition software on the monitor once again on the real work surface, where the highlighting is missing. Comparing target data on a monitor with actual data on the real work surface increases the probability that the user picks up a wrong object despite the decision aid on the monitor. In addition, the speed at which such selections can be performed is limited.
  • Pick-to-light devices are limited to very few objects. For example, if a magazine consists of five signatures on five trays, there is a risk that wrong products are sorted into a bin because video analysis is missing.
  • Object of the invention
  • The object of the invention is to eliminate the aforementioned disadvantages of the known devices and methods for the user by means of a simple semi-automatic test station that makes it possible to quickly identify one or more objects from a set of objects that are difficult to differentiate technically. The user should be informed immediately about the right object. It should be possible to detect false or incorrectly placed objects faster, or to pick the right objects out of a set of wrong or faulty ones more quickly. It should be possible to sort different products of similar appearance faster than is possible manually today. The aids should be of an optical nature in order to avoid any risk of injury during use, for example by machinery. An augmented-reality-like solution was sought that is technically simple and inexpensive. In addition, adaptability to a variety of applications should be possible: a quick reconfiguration or reprogramming of the system components for each new application situation should be achievable via simple interfaces or teach-in facilities. Scalability is also an objective of the invention; the device must be able to grow with the growing challenges of the tasks.
  • Solution of the task
  • According to the invention, the task is solved by a device for a semi-automatic test station which is equipped with the following components:
    First, at least one device is required for detecting at least one object or object feature by optical and photosensory means. In addition, a device is needed for the computer-aided evaluation of the optosensorily detected objects or features, by comparing the acquired data, or the data evaluated by means of image-comparison and/or character-recognition algorithms, against stored target data. For example, barcodes or alphanumeric characters can be recognized and compared with records in memory. An optical property could also be determined by suitable image-analysis programs. Possible properties are dimensions, such as length or width; a particular shape, such as round, quadrangular, hexagonal, octagonal, oval or heart-shaped; a particular color, such as red, yellow or blue; or certain brightness or spectral values of the reflected components of white light. Such a property can be determined absolutely, or as a ratio, e.g. length to width, or as a contrast. The stored data sets can advantageously be created by a training algorithm, for which purpose the device presented here can itself be used. For predefined properties, e.g. a certain size, a certain shape or a specific barcode, the detected and evaluated information can be compared with stored reference values within a certain tolerance band and evaluated by a computer program (a minimal sketch of such a comparison is given below). This image-data processing is not new per se, but it is required for the proposed device.
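  • The following minimal sketch (Python with OpenCV 4 and pyzbar; the records in REFERENCE_DB, the function names and all numeric values are illustrative assumptions, not part of the disclosure) shows how such a comparison of acquired data against stored reference values within a tolerance band might look:

```python
import cv2
from pyzbar.pyzbar import decode as decode_barcodes

# Hypothetical reference records (21'): barcode -> expected dimensions (22')
REFERENCE_DB = {
    "4006381333931": {"length_mm": 120.0, "width_mm": 60.0, "tolerance": 0.10},
}

def identify_object(crop_bgr, mm_per_px):
    """Try barcode reading first, then fall back to simple geometric properties."""
    gray = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2GRAY)

    # 1. Barcode / character recognition on the acquired image data (11)
    for symbol in decode_barcodes(gray):
        code = symbol.data.decode("utf-8")
        if code in REFERENCE_DB:
            return {"id": code, "method": "barcode", "match": True}

    # 2. Optically determined properties (22): bounding-box dimensions
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return {"id": None, "method": "none", "match": False}
    _, _, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    length_mm, width_mm = max(w, h) * mm_per_px, min(w, h) * mm_per_px

    # 3. Compare against stored reference values (21'/22') within a tolerance band
    for code, ref in REFERENCE_DB.items():
        tol = ref["tolerance"]
        if (abs(length_mm - ref["length_mm"]) <= tol * ref["length_mm"]
                and abs(width_mm - ref["width_mm"]) <= tol * ref["width_mm"]):
            return {"id": code, "method": "dimensions", "match": True}
    return {"id": None, "method": "dimensions", "match": False}
```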
  • According to the invention, a device for directly highlighting or marking at least one object or object feature, as a function of the result of the object or feature comparison, is integrated into the system. The object to be marked lies on a support of defined size within an active work area. A user of the device places objects on the support, or receives them on the support via devices not described here. It is advantageous for the marking device to have optical means that can project light with at least one specific property from the group intensity, shape and wavelength onto parts of the work area or onto the objects located there. These parts of the work area are selected in a targeted way using a drive device, which is controlled by the device for computer-aided evaluation; for this purpose the programming and the results of the automated analysis of that device are used. In order to assign the location for the marking unambiguously, at least one device or additional device is provided which, preferably by calibration, correctly allocates the optically sensed data to actual position data, so that the position and orientation in the work area can then be determined with computer assistance and used (a sketch of such a calibration appears below).
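  • One possible form of the calibration mentioned above, sketched here under the assumption that a planar homography between camera pixels and projector pixels is sufficient (the disclosure does not prescribe a particular method; the point values and names are illustrative):

```python
import numpy as np
import cv2

# Four or more corresponding points, e.g. measured once in a teach-in step:
# where a known projector pixel actually appears in the camera image.
camera_pts = np.array([[102, 88], [1180, 95], [1175, 702], [98, 690]], np.float32)
projector_pts = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], np.float32)

H, _ = cv2.findHomography(camera_pts, projector_pts)

def camera_to_projector(pt_xy):
    """Map an object position found in the camera image into projector pixels."""
    src = np.array([[pt_xy]], dtype=np.float32)   # shape (1, 1, 2)
    dst = cv2.perspectiveTransform(src, H)
    return tuple(dst[0, 0])                        # (u, v) in projector pixels

# Example: highlight an object whose centroid was detected at camera pixel (640, 360)
u, v = camera_to_projector((640, 360))
```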
  • Advantageously, an image projector can be used as the device for marking or highlighting, mounted at a distance above or obliquely above the support. If the image projector is in a different position, the support can also be reached as a projection surface via deflecting mirrors. Any reflection at the target positions must be taken into account.
  • Alternatively, the marking or highlighting device may also be built from at least one light source that projects a light beam of a particular size, shape, intensity, pattern and/or color as a focused spot onto positions of the projection surface given by the dimensions of the support. In this case at least one detail of the beam quality, for example the light color (spectral composition) or spot pattern, and the target direction of the spot can be determined by the drive device. The irradiated location can thus be selected either by a selection switch choosing one light source from a group of light sources with different beam alignments, or by servomotors that change the position and/or direction of a directional radiator (a sketch of such a driver follows below).
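  • One possible reading of such a drive device, sketched with illustrative names and an assumed flat-support geometry (neither the SpotDriver interface nor the mounting height is taken from the disclosure):

```python
import math

class SpotDriver:
    def __init__(self, fixed_sources=None, mounting_height_m=1.2):
        self.fixed_sources = fixed_sources or {}   # name -> (x, y) aim point on the support
        self.mounting_height_m = mounting_height_m  # assumed height of the radiator

    def aim_at(self, target_xy, color="green"):
        # Variant 1: selection switch - pick the pre-aimed source closest to the target
        if self.fixed_sources:
            name = min(self.fixed_sources,
                       key=lambda n: math.dist(self.fixed_sources[n], target_xy))
            return ("select", name, color)
        # Variant 2: servomotors - compute pan/tilt angles for a single directional
        # radiator mounted above the support origin (simple flat-support geometry)
        x, y = target_xy
        pan = math.degrees(math.atan2(y, x))
        tilt = math.degrees(math.atan2(math.hypot(x, y), self.mounting_height_m))
        return ("servo", (pan, tilt), color)
```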
  • As a device for detecting an object or an object feature, an electronic area camera, preferably a CCD or CMOS camera, preferably with a lens objective, is particularly suitable; a line camera with a movable, preferably oscillating or rotating, mirror is also possible. The use of a laser scanner or the like for object detection is likewise conceivable.
  • For detection with a camera, an illumination device for the objects may be useful, whereby the side of the object facing the camera should be illuminated.
  • At least one of the devices for object detection or object-position detection can advantageously be arranged below the support for the objects, since it then does not obstruct the work area and the user's hands do not block the camera's field of view. For this, the support should consist of transparent or partially translucent material; glass or plastic are suitable. The support can also be designed as a grid or mesh for this purpose.
  • To connect the individual devices for object detection, data processing and marking, and to connect the control device of the marking device, cable connections, radio links or optical-fiber links are advantageously provided for data transmission.
  • It is useful if the predetermined, stored data also include object-linked information beyond the optical properties, such as batch number, serial number, article number, expiry date or supplier information (a sketch of such a record is given below).
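  • Purely as an illustration, a stored record carrying such object-linked information could be structured as follows (the field names are assumptions, not prescribed by the disclosure):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class StoredObjectRecord:
    # object-linked information beyond the optical properties
    article_number: str
    barcode: str
    batch_number: str
    serial_number: str
    expiry_date: date
    supplier: str
    # optical reference properties (22') with a tolerance band
    length_mm: float
    width_mm: float
    color_rgb: tuple
    tolerance: float = 0.05
```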
  • Further advantages arise if a device is provided for additional acoustic signaling of a defined object or object-feature recognition, whose signal quality, preferably pitch or modulation, signals different recognition events.
  • In addition to the device, the method it makes possible is also the subject of the invention. According to the invention, this method allows at least one object to be recognized and identified with the aid of a detection device and an evaluation device. At least one previously selected and defined object feature, preferably a specific barcode of an object to be examined, is either not identified, uniquely identified, or wrongly identified via this or another detection device. Depending on this result, the device for highlighting and marking marks the object "visually" in a qualitatively different way. This is advantageously done with differently colored light, in particular with green light for "right object - correctly recognized", with red light for "possibly right object - not correctly recognized" and with blue light for "wrong object - correctly recognized". Other color assignments are possible, as is the use of certain projection patterns (a sketch of such a color assignment follows below).
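  • The color assignment described above can be sketched as a simple lookup (the RGB values and outcome names are illustrative; only the green/red/blue roles follow from the text):

```python
from enum import Enum

class Outcome(Enum):
    CORRECT_RECOGNIZED = "right object - correctly recognized"
    NOT_RECOGNIZED = "possibly right object - not correctly recognized"
    WRONG_RECOGNIZED = "wrong object - correctly recognized"

# Recognition outcome -> marking color projected onto the object
MARKING_COLORS = {
    Outcome.CORRECT_RECOGNIZED: (0, 255, 0),   # green
    Outcome.NOT_RECOGNIZED: (255, 0, 0),       # red
    Outcome.WRONG_RECOGNIZED: (0, 0, 255),     # blue
}

def marking_for(outcome: Outcome) -> tuple:
    return MARKING_COLORS[outcome]
```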
  • The invention is explained with reference to the following embodiment. The figure shows:
  • 1: a sketch of an example arrangement of the device according to the invention.
  • The example described here shows one possible application of the device and the associated method; various other objects or object features are possible. The example has been chosen deliberately simple for ease of understanding; real applications involve far more complex situations, most of which can also be supported by this device. Suppose a set of memory cards is spread out on a glass plate. In this well-known card game, the task is to collect as many identical image pairs as possible from a set of image pairs with as few grasping attempts as possible. These memory cards are the objects 3 to be identified, with certain object features 3', namely the picture motifs printed on one of the two sides of the cards. Initially, in a learning mode, the back of the cards is captured via the image-capture device 10 and its characteristics are stored as a record 21', so that two backs are not recognized and identified by the device 1 as an identical pair of cards. Using a first algorithm within an evaluation device 20 in the form of a computer, a first card on the glass plate 2 is then identified on the basis of the optosensorily acquired data 11 from a device 10, for example a digital camera, and the evaluated data 21 are compared with the stored data 21'. If they match the backside information in the data memory, the evaluation of the optosensory data continues with the next card detected on the glass support 2. For the purpose of card or image capture, the camera is mounted below the support 2 and directed at the transparent glass plate. As long as no motif but only backs are captured, the process repeats. The position and orientation of the card are also detected optically, for example via the device 10 or the position-detecting device 10'. If the card is unknown, it is illuminated from above via a beam device 30, for example with blue light, and the new image content is saved as a new data record. This procedure is repeated for all cards on the glass plate. Each card lying with its back down on the glass plate is illuminated from above, for example in red. This signals to the user that the red card must be turned over, since otherwise it cannot be recognized. As soon as a stored image motif is recognized a second time, the two identical or similar cards are selectively illuminated with green light on the basis of the determined position data; in addition, a signal can be emitted as a request to remove the image pair irradiated with green light. The user must remove the recognized image pairs each time the signal sounds. He also has to turn over all cards illuminated in blue so that they lie face down. This method allows the user, little by little, to remove or collect all image pairs from the glass plate using the green light marks, and takes only a fraction of the time that this logistical task would require without the device. The signal tone can be emitted with a time delay by a corresponding acoustic signaling device; the user then has the opportunity to take a recognized image pair off the glass plate before the sound is emitted and thereby avoid the beep. This additionally makes it possible to distinguish slow users from fast ones by the number of acoustic messages, so that an indirect motivation of the user takes place. A sketch of the pair-matching logic follows below.
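  • The decision logic of this embodiment can be sketched as follows (card detection, the motif signature, e.g. a perceptual hash, and the projector/beeper interfaces are assumed to exist elsewhere; all names are illustrative):

```python
def classify_cards(cards, known_motifs):
    """cards: list of (position, signature), where signature is None for a back side.
    Returns marking commands (position, color) and the motifs whose pairs were found."""
    commands, removals = [], []
    for position, signature in cards:
        if signature is None:                       # back side facing the camera
            commands.append((position, "red"))      # ask the user to turn it over
        elif signature in known_motifs and known_motifs[signature] != position:
            commands.append((known_motifs[signature], "green"))
            commands.append((position, "green"))    # matching pair found
            removals.append(signature)              # trigger (delayed) beep / removal request
            del known_motifs[signature]
        elif signature not in known_motifs:
            commands.append((position, "blue"))     # new, not yet stored motif
            known_motifs[signature] = position
    return commands, removals

# Example: two cards showing the same motif "A7" and one card lying back-down
cmds, rm = classify_cards([((1, 2), "A7"), ((5, 6), None), ((9, 3), "A7")], {})
# cmds -> [((1, 2), 'blue'), ((5, 6), 'red'), ((1, 2), 'green'), ((9, 3), 'green')]
```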
  • In Fig. 1, a user 4 stands in front of the work surface 2. On this surface there are two objects 3 with the object features 3'. Four detection devices 10, 10' are arranged so that they "look" at the work surface from different directions; here, two devices 10 are located below the support, which is transparent in this example. A device for detecting objects or object features and a device for determining the position of the object are arranged above the work surface. The mounting, management and control devices are omitted for clarity, and the connections are only indicated. The devices 10, 10' can be digital cameras here. The acquired data 11, 11' of these devices are transmitted to the device 20, here a desktop computer, and evaluated accordingly by it, yielding the data 21 and 24, which are used for comparison and position assignment. The comparison of the data 21, or of the determined properties 22, with the setpoints 21', 22' is performed by the computer program 23 in the device 20. Using the position data 24 and the identification algorithms in the device 20, the device 30, here for example a projector, is controlled or positioned, directly or via a control device 31, to produce the marking by means of a corresponding light irradiation. As an example, two different light cones mark two different objects in different colors G, B.
  • LIST OF REFERENCE NUMBERS
    1 identification device
    10 device for detecting an object or an object feature
    10' additional device for determining the position of the object in the detection area
    11 optosensorily recorded data of the objects
    11' the local position of the same
    12 optical means for optical detection (here: lens/aperture system)
    13 photosensory means (here: camera modules, sensor array; alternatively e.g. barcode scanner)
    2 support (working surface under the work area, e.g. glass plate)
    20 device for computer-aided evaluation of the data acquired by the device 10
    21 data evaluated from the sensor data
    21' stored data (e.g. from teach-in mode)
    22 properties determined from the sensor data (size, color, brightness, shape, etc.)
    22' predefined tolerance values for the properties
    23 computer program
    24 position data (control data or calibrated assignment data based on the position data and identification results of the computer evaluation 24')
    3 object
    3' object feature
    30 device for direct highlighting or marking
    31 device for controlling the device 30 using the data from device 20
    4 user
    G first marking type (e.g. green light spot)
    B second marking type (e.g. blue light spot)
  • CITATIONS CONTAINED IN THE DESCRIPTION
  • This list of the documents listed by the applicant has been generated automatically and is included solely for the better information of the reader. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • AT 010520 U2 [0003]

Claims (10)

  1. Device (1) for a semi-automatic test station, with at least one device (10) for detecting at least one object (3) or object feature (3') using optical means (12) and photosensory means (13), and with a device (20) for the computer-assisted evaluation of such detected objects (3) or features (3') based on comparing the optosensorily acquired data (11), preferably data (21) evaluated using image-comparison and/or character-recognition algorithms, preferably barcodes or alphanumeric characters, and/or algorithms for detecting at least one optically determined property (22) from the group dimension, shape detail, color, brightness (in absolute or relative values), on the one hand, with predetermined, stored data (21'), preferably data created by a training algorithm, and/or with predefined properties (22') within predefined tolerance ranges, by a computer program (23), on the other hand, characterized in that the device (1) comprises • a device (30) for directly highlighting or marking at least one object (3) or object feature (3') depending on the result of the object or feature comparison, and • a support (2) whose dimensions define a working area, wherein the device (30) comprises optical means projecting light with at least one particular property from the group intensity, shape and wavelength onto parts of the working area or onto objects located thereon, said parts of the working area being selected and controlled in a well-defined manner, via a drive device (31), by the programming and the results of the automated analysis of the device (20) for computer-assisted evaluation, and • wherein at least one device (10) or an additional device (10') is provided which, indirectly, preferably by calibration of the optosensorily acquired data with actual position data, supplies the data for the computer-assisted determination of the position and orientation in the working area.
  2. Device according to claim 1, characterized in that the device (30) is an image projector which is mounted at a distance above or obliquely above the support, or a mirror is mounted at that position and the image projector is directed at this mirror from another position, so that in both cases the resulting projection surface covers the support (2).
  3. Device according to claim 1, characterized in that the device (30) is constructed from at least one light source which projects a beam of light of particular size, shape, intensity, pattern and/or color as a focused spot onto positions of the projection surface given by the dimensions of the support, wherein at least one detail of the beam quality and the aiming direction of the spot can be determined by the drive device (31), preferably by a selection switch for a light source from a group of light sources with different beam alignments, or by servomotors for changing the position and/or direction of a directional radiator.
  4. Device according to claim 1, characterized in that the device (10) for detecting an object (3) or an object feature (3') is an electronic area camera, preferably a CCD or CMOS camera, preferably with a lens objective, or a line camera with a movable, preferably oscillating or rotating, mirror, or a laser scanner or the like.
  5. Device according to claim 4, characterized in that an illumination device is provided for illuminating at least that side of the objects (3) facing the device (10).
  6. Device according to claim 1, characterized in that at least one of the devices (10, 10') is arranged below the support, wherein the support consists of transparent or partially translucent material, preferably glass or plastic, and/or is formed as a grid or mesh.
  7. Device according to claim 1, characterized in that cable connections, radio links or optical-fiber links are provided for data transmission between the devices (10, 20, 30 and 31, respectively).
  8. Device according to claim 1, characterized in that the predetermined, stored data (21') include object-linked information beyond the optical properties.
  9. Device according to claim 1, characterized in that a device for additional acoustic signaling of a defined object or object-feature recognition is provided, whose signal quality, preferably pitch or modulation, signals different recognition events.
  10. Method using one of the devices of claims 1 to 9, characterized in that at least one object (3) which is recognized using a detection device (10, 10') and an evaluation device (20), but whose at least one previously selected and defined object feature (3'), preferably a specific barcode of the object to be examined, is not clearly identified via this or a further detection device (10), is marked in a qualitatively different manner by means of a device for highlighting and marking than, on the one hand, a recognized object (3) whose previously selected and defined object feature (3') is uniquely identified with this device, and, on the other hand, than a recognized object with an object feature known to the evaluation device but not previously selected, preferably a wrong or different barcode; preferably with differently colored light, in particular with green light for "right object - correctly recognized", with red light for "possibly right object - not correctly recognized" and with blue light for "wrong object - correctly recognized".
DE102011118611A 2011-11-16 2011-11-16 Apparatus and method for a semi-automatic testing station Ceased DE102011118611A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102011118611A DE102011118611A1 (en) 2011-11-16 2011-11-16 Apparatus and method for a semi-automatic testing station

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102011118611A DE102011118611A1 (en) 2011-11-16 2011-11-16 Apparatus and method for a semi-automatic testing station
PCT/EP2012/004587 WO2013072020A2 (en) 2011-11-16 2012-11-03 Device and method for a semiautomatic testing station preferably in an order-picking system

Publications (1)

Publication Number Publication Date
DE102011118611A1 true DE102011118611A1 (en) 2013-05-16

Family

ID=47263221

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102011118611A Ceased DE102011118611A1 (en) 2011-11-16 2011-11-16 Apparatus and method for a semi-automatic testing station

Country Status (2)

Country Link
DE (1) DE102011118611A1 (en)
WO (1) WO2013072020A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013003661A1 (en) * 2013-03-02 2014-09-04 Audi Ag Device for checking an automated optical assembly of pre-mounted components for motor car, has computing unit connected with superordinate production planning and/or control system for the regard of individual defaults of component
WO2018031956A3 (en) * 2016-08-12 2018-03-15 Amazon Technologies, Inc. Object sensing and handling system and associated methods
EP3434626A4 (en) * 2016-03-23 2019-05-08 Panasonic Intellectual Property Management Co., Ltd. Projection instruction device, parcel sorting system, and projection instruction method

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9658310B2 (en) * 2015-06-16 2017-05-23 United Parcel Service Of America, Inc. Concepts for identifying an asset sort location
US10495723B2 (en) 2015-06-16 2019-12-03 United Parcel Service Of America, Inc. Identifying an asset sort location

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AT10520U2 (en) 2008-09-05 2009-05-15 Knapp Systemintegration Gmbh Device and method for the visual support of picking processes

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7090134B2 (en) * 2003-03-04 2006-08-15 United Parcel Service Of America, Inc. System for projecting a handling instruction onto a moving item or parcel
US8825200B2 (en) * 2007-11-07 2014-09-02 Siemens Industry, Inc. Method and system for tracking of items
FR2949893A1 (en) * 2009-09-09 2011-03-11 Sidel Participations Method for aiding the identification of non-compliant products manually sorted and installation for its implementation

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AT10520U2 (en) 2008-09-05 2009-05-15 Knapp Systemintegration Gmbh Device and method for the visual support of picking processes

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102013003661A1 (en) * 2013-03-02 2014-09-04 Audi Ag Device for checking an automated optical assembly of pre-mounted components for motor car, has computing unit connected with superordinate production planning and/or control system for the regard of individual defaults of component
DE102013003661B4 (en) 2013-03-02 2018-12-20 Audi Ag Device for automated optical assembly inspection of preassembled assemblies for motor vehicles
EP3434626A4 (en) * 2016-03-23 2019-05-08 Panasonic Intellectual Property Management Co., Ltd. Projection instruction device, parcel sorting system, and projection instruction method
WO2018031956A3 (en) * 2016-08-12 2018-03-15 Amazon Technologies, Inc. Object sensing and handling system and associated methods

Also Published As

Publication number Publication date
WO2013072020A3 (en) 2013-07-11
WO2013072020A2 (en) 2013-05-23

Similar Documents

Publication Publication Date Title
US9058518B2 (en) Seed classification using spectral analysis to determine existence of a seed structure
US8625930B2 (en) Digital microscope slide scanning system and methods
US9798910B2 (en) Mobile hand held machine vision method and apparatus using data from multiple images to perform processes
KR101928111B1 (en) A Checkout Counter
JP5989161B2 (en) Method and system for imaging cut stone
AU2013343537B2 (en) Bio-imaging method
TWI290363B (en) Method and system for marking a workpiece such as a semiconductor wafer and laser marker for use therein
JP4309439B2 (en) Object take-out device
JP4465719B2 (en) Impersonation detection device and impersonation detection method
CN104101608B (en) Intelligent detecting device for detecting defects of multi-type irregularly shaped product
CN102792124B (en) The formation method strengthened and device
CN102369539B (en) Exposure control for multi-imaging scanner
DE69838714T2 (en) Optical screening device and image reader for image reading and decoding of optical information with one- and two-dimensional symbols with changing depth
CN103170459B (en) Spectacle lens flaw detection system
US6592033B2 (en) Item recognition method and apparatus
US5095204A (en) Machine vision inspection system and method for transparent containers
US7295948B2 (en) Laser system for marking tires
US5335288A (en) Apparatus and method for biometric identification
TWI313576B (en) Board inspecting apparatus, its parameter setting method and parameter setting apparatus
DE112010002174B4 (en) Method and device for a practical 3D vision system
JP3790638B2 (en) Medical training device and evaluation method of medical training result
CN105817430B (en) Product inspection method based on machine vision
EP1761738B1 (en) Measuring apparatus and method for range inspection
EP0696236B1 (en) Process and device for sorting materials
JP5414685B2 (en) System and method for reading a pattern using a plurality of image frames

Legal Events

Date Code Title Description
R012 Request for examination validly filed
R002 Refusal decision in examination/registration proceedings
R003 Refusal decision now final