CN108745942B - Product appearance detection method and device - Google Patents

Product appearance detection method and device Download PDF

Info

Publication number
CN108745942B
CN108745942B
Authority
CN
China
Prior art keywords
product
material receiving
controlling
detection
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810394348.1A
Other languages
Chinese (zh)
Other versions
CN108745942A (en)
Inventor
刘少林
赵伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chengdu Ck Technology Co ltd
Original Assignee
Chengdu Ck Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chengdu Ck Technology Co ltd filed Critical Chengdu Ck Technology Co ltd
Priority to CN201810394348.1A
Publication of CN108745942A
Application granted
Publication of CN108745942B

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B07 - SEPARATING SOLIDS FROM SOLIDS; SORTING
    • B07C - POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
    • B07C5/00 - Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
    • B07C5/34 - Sorting according to other particular properties
    • B07C5/36 - Sorting apparatus characterised by the means used for distribution
    • B07C5/361 - Processing or control devices therefor, e.g. escort memory
    • B07C5/38 - Collecting or arranging articles in groups

Abstract

The invention discloses a product appearance detection method and device. The method comprises the following steps: controlling a positioning module to position a product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product; and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result. The invention solves the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, and achieves the technical effects of improving detection accuracy and detection efficiency and detecting the product appearance comprehensively.

Description

Product appearance detection method and device
Technical Field
The invention relates to the technical field of manufacturing, in particular to a method and a device for detecting product appearance.
Background
With the development of manufacturing technology, China has become a major manufacturing country, producing products as large as airplanes and ships and as small as electronic chips. Manufactured products often need to pass through multiple inspection procedures before they can leave the factory as qualified products. Among these, one important inspection item is the inspection of product appearance.
At present, the state of product appearance inspection is as follows:
(1) The criteria for judging appearance defects are difficult to standardize, and the judgment is usually made manually, which introduces many unstable factors. The staff performing this work must be trained, very familiar with the product and experienced, and the manual approach still suffers from low detection accuracy, low detection efficiency and high detection cost.
(2) Appearance defects may occur on any surface of a product, yet most existing automatic inspection solutions are special-purpose equipment that inspects only one particular surface; they cannot inspect all surfaces of the product, so the product cannot be inspected comprehensively.
In summary, most of the existing detection methods for product appearance have the technical problems of low detection accuracy, low detection efficiency and incomplete detection.
Disclosure of Invention
The embodiments of the present application provide a product appearance detection method and device, which solve the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, improve detection accuracy and detection efficiency, and achieve the technical effect of comprehensively detecting the product appearance.
In a first aspect, the present application provides the following technical solutions through an embodiment of the present application:
a method for detecting the appearance of a product, comprising:
controlling a positioning module to position a product to obtain position information of the product;
controlling a robot to pick up the product based on the position information and move the product to be within the detection range of a detection module;
controlling the detection module to detect a plurality of surfaces of the product;
and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result.
Preferably, the controlling and positioning module positions a product to obtain the position information of the product, including:
and controlling a first image acquisition unit arranged on the positioning module to acquire images of the products in the feeding tray, positioning the products based on the acquired image information, and acquiring the position information of the products.
Preferably, the control robot picks up the product, including:
sucking the product through a suction nozzle arranged on the robot; or
And grabbing the product through a clamping jaw arranged on the robot.
Preferably, the moving the product into the detection range of the detection module includes:
controlling the robot to move the product into a detection range of a first detection module, wherein the first detection module is used for detecting one or more first surfaces of the product; and
and controlling the robot to move the product to a detection range of a second detection module, wherein the second detection module is used for detecting other one or more second sides of the product.
Preferably, the controlling the detection module to detect a plurality of surfaces of the product includes:
controlling the first detection module to detect one or more first sides of the product;
and if the first detection module detects that each surface of the product is qualified, controlling the second detection module to detect other one or more second surfaces of the product.
Preferably, the controlling the robot to place the product into a corresponding receiving tray based on the obtained detection result includes:
if the first detection module detects that each surface of the product is qualified, controlling the robot to place the product into a first material receiving tray; if the first detection module detects that at least one face of the product is unqualified, controlling the robot to place the product into a second material receiving tray;
if the second detection module detects that each surface of the product is qualified, the product is left in the first material receiving tray; and if the second detection module detects that at least one surface of the product is unqualified, controlling the robot to transfer the product from the first material receiving tray to a third material receiving tray.
Preferably, the method further comprises:
when the first material receiving tray is full of products, the material receiving module is controlled to move the first material receiving tray downwards, and the material receiving tray management module is controlled to convey empty material receiving trays to the material receiving module so as to supplement the first material receiving tray;
and when the third material receiving tray is full of products, the material receiving module is controlled to move the third material receiving tray downwards, and the material receiving tray management module is controlled to convey empty material receiving trays to the material receiving module so as to supplement the third material receiving tray.
Preferably, the method further comprises:
when the number of first material receiving trays filled with products is greater than a first value, controlling the output module to output first prompt information to prompt a worker to take away the first material receiving trays and the products therein;
when the second material receiving tray is full of products, controlling the output module to output second prompt information to prompt a worker to take away the second material receiving tray and the products therein;
when the number of third material receiving trays filled with products is greater than a third value, controlling the output module to output third prompt information to prompt a worker to take away the third material receiving trays and the products therein.
Preferably, the method further comprises:
and controlling an air purification module to purify the air of the environment where the product is located.
In a second aspect, the present application provides the following technical solutions through an embodiment of the present application:
a detection device for the appearance of a product, comprising:
the first control unit is used for controlling the positioning module to position the product and obtaining the position information of the product;
the second control unit is used for controlling the robot to pick up the product and move the product to the detection range of the detection module based on the position information;
a third control unit for controlling the detection module to detect a plurality of surfaces of the product;
and the fourth control unit is used for controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result.
Preferably, the first control unit is specifically configured to:
and controlling an image acquisition unit arranged on the positioning module to acquire images of the products in the feeding tray, positioning the products based on the acquired image information, and acquiring the position information of the products.
Preferably, the second control unit is specifically configured to:
sucking the product through a suction nozzle arranged on the robot; or
And grabbing the product through a clamping jaw arranged on the robot.
Preferably, the second control unit is specifically configured to:
controlling the robot to move the product into a detection range of a first detection module, wherein the first detection module is used for detecting one or more first surfaces of the product; and controlling the robot to move the product to a detection range of a second detection module, wherein the second detection module is used for detecting other one or more second sides of the product.
Preferably, the third control unit is specifically configured to:
controlling the first detection module to detect one or more first sides of the product; and if the first detection module detects that each surface of the product is qualified, controlling the second detection module to detect other one or more second surfaces of the product.
Preferably, the fourth control unit is specifically configured to:
if the first detection module detects that each surface of the product is qualified, controlling the robot to place the product into a first material receiving tray; if the first detection module detects that at least one face of the product is unqualified, controlling the robot to place the product into a second material receiving disc;
if the second detection module detects that each surface of the product is qualified, the product is left in the first material receiving tray; and if the second detection module detects that at least one surface of the product is unqualified, controlling the robot to transfer the product from the first material receiving tray to a third material receiving tray.
Preferably, the apparatus further comprises:
the fifth control unit is used for controlling the material receiving module to move the first material receiving tray downwards when the first material receiving tray is filled with products, and controlling the material receiving tray management module to convey empty material receiving trays to the material receiving module so as to supplement the first material receiving tray; and when the third material receiving tray is full of products, the material receiving module is controlled to move the third material receiving tray downwards, and the material receiving tray management module is controlled to convey empty material receiving trays to the material receiving module so as to supplement the third material receiving tray.
Preferably, the apparatus further comprises:
the sixth control unit is used for controlling the output module to output first prompt information when the number of first material receiving trays filled with products is greater than a first value, so as to prompt a worker to take away the first material receiving trays and the products therein; controlling the output module to output second prompt information when the second material receiving tray is full of products, so as to prompt a worker to take away the second material receiving tray and the products therein; and controlling the output module to output third prompt information when the number of third material receiving trays filled with products is greater than a third value, so as to prompt a worker to take away the third material receiving trays and the products therein.
Preferably, the apparatus further comprises:
and the seventh control unit is used for controlling the air purification module to purify the air of the environment where the product is located.
In a third aspect, the present application provides the following technical solutions through an embodiment of the present application:
a detection apparatus for product appearance, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the following steps when executing the computer program:
controlling a positioning module to position a product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product; and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result.
In a fourth aspect, the present application provides the following technical solutions through an embodiment of the present application:
a computer-readable storage medium, on which a computer program is stored which, when executed by a processor, carries out the steps of:
controlling a positioning module to position a product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product; and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result.
One or more technical solutions provided in the embodiments of the present application have at least the following technical effects or advantages:
in an embodiment of the present application, a product appearance detection method is disclosed, including: controlling a positioning module to position a product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product; and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result. Because the positioning module, the robot and the detection module are used to detect the product appearance, this embodiment can improve detection accuracy and detection efficiency compared with the traditional manual inspection method; at the same time, a plurality of surfaces of the product can be detected, so the product is inspected comprehensively. The method therefore effectively solves the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, and achieves the technical effects of improving detection accuracy and detection efficiency and comprehensively detecting the product appearance.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a perspective view of a detecting device for product appearance according to an embodiment of the present application;
fig. 2 is an exploded view of a detection device for product appearance (with the security door frame 20 removed) according to an embodiment of the present invention;
FIG. 3 is a flow chart of a method for detecting the appearance of a product according to an embodiment of the present application;
FIG. 4 is a block diagram of a detecting device for detecting the appearance of a product according to an embodiment of the present application;
FIG. 5 is a block diagram of a detecting device for detecting the appearance of a product according to an embodiment of the present application;
fig. 6 is a block diagram of a computer-readable storage medium according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a product appearance detection method and device, which solve the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, improve detection accuracy and detection efficiency, and achieve the technical effect of comprehensively detecting the product appearance.
In order to solve the technical problems, the general idea of the embodiment of the application is as follows:
a method for detecting the appearance of a product, comprising: controlling a positioning module to position a product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product; and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result. Because the positioning module, the robot and the detection module are used to detect the product appearance, this embodiment can improve detection accuracy and detection efficiency compared with the traditional manual inspection method; at the same time, a plurality of surfaces of the product can be detected, so the product is inspected comprehensively. The method therefore effectively solves the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, and achieves the technical effects of improving detection accuracy and detection efficiency and comprehensively detecting the product appearance.
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
Example one
This embodiment provides a detection method for product appearance, which can be applied to a detection device for product appearance. As shown in fig. 1 and fig. 2, the device includes a cabinet 10, a positioning module 21 and a detection module (comprising a first detection module 24 and a second detection module 23) disposed in the cabinet 10 (i.e., in the space enclosed by the cabinet 10 and its security door frame 20), and a robot 22 disposed in the cabinet 10. The device further includes an industrial personal computer 40 connected to the positioning module 21, the detection module and the robot 22 (the industrial personal computer 40 is disposed on a side surface of the cabinet), and the detection method for product appearance may specifically be executed by the industrial personal computer 40.
As shown in fig. 3, the method for detecting the appearance of a product includes:
step S301: the control positioning module 21 positions the product to obtain the position information of the product;
step S302: controlling the robot 22 to pick up the product and move the product into the detection range of the detection module based on the position information;
step S303: controlling a detection module to detect a plurality of surfaces of a product;
step S304: based on the obtained detection results, the robot 22 is controlled to place the product into the corresponding take-up tray.
As an alternative embodiment, step S301 includes:
and controlling a first image acquisition unit 211 arranged on the positioning module 21 to acquire images of the products in the batch pan, positioning the products based on the acquired image information, and acquiring position information of the products.
In a specific implementation process, the method is used for detecting the appearance of a product, where the product may be, for example, a camera module.
In the specific implementation process, a feeding tray management module 25 may be arranged in the cabinet 10. The feeding tray management module 25 comprises a tray dividing mechanism and a conveying mechanism; a plurality of feeding trays for holding the products to be detected can be stored in the tray dividing mechanism (each feeding tray can hold one or more products), and the feeding trays are stacked from top to bottom in sequence, which saves space. When the appearance of a product is to be detected, the feeding tray management module 25 may be started, the tray dividing mechanism is controlled to extract one feeding tray from the bottom layer, and the conveying mechanism is controlled to convey the feeding tray into the positioning range of the positioning module 21 (for example, the area directly facing the first image acquisition unit 211), so that the positioning module 21 can position the products in the feeding tray. When the conveying mechanism conveys the feeding tray, the feeding tray can first be carried by a conveyor belt to the vicinity of the positioning module 21, then clamped by an air cylinder and, driven by a servo motor, dragged into the positioning range of the positioning module 21.
In the specific implementation process, a first image acquisition unit 211 (e.g., a monocular camera) is disposed on the positioning module 21. After the feeding tray has been conveyed into the working range of the positioning module 21, the first image acquisition unit 211 may be controlled to take one or more pictures of the products in the feeding tray; image recognition is then performed on the obtained pictures to recognize the placement positions of one or more products and obtain the position information of the one or more products in the feeding tray. Furthermore, since the products in the feeding tray are taken away one by one by the robot 22, the positions of the remaining products may change; therefore, the positioning module 21 is preferably controlled to reposition the products in the feeding tray each time the robot 22 takes a product away.
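As a rough illustration of the image-based positioning just described, the following sketch locates product-sized blobs in a picture of the feeding tray and returns their pixel coordinates. The OpenCV thresholding-and-contour approach and the min_area value are assumptions chosen for the example; the embodiment only states that image recognition is performed on the pictures taken by the first image acquisition unit 211 and does not fix the algorithm.

```python
# Illustrative positioning sketch: locate products in a picture of the feeding tray.
# The OpenCV thresholding-and-contour approach and the min_area value are assumptions;
# the embodiment does not fix the recognition algorithm.
import cv2
import numpy as np


def locate_products(tray_image: np.ndarray, min_area: float = 500.0):
    """Return the (x, y) pixel centres of product-sized blobs in the tray image."""
    gray = cv2.cvtColor(tray_image, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centres = []
    for contour in contours:
        if cv2.contourArea(contour) < min_area:       # ignore noise and tray texture
            continue
        x, y, w, h = cv2.boundingRect(contour)
        centres.append((x + w / 2.0, y + h / 2.0))    # centre of the product's bounding box
    return centres
```

In practice the pixel coordinates returned here would still have to be converted into the robot's coordinate system, for example through a hand-eye calibration, and the tray would be re-imaged after every pick because the remaining products may have shifted.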
As an alternative embodiment, the control robot 22 picks up the product, including: the product is sucked up by a suction nozzle provided on the robot 22.
In the specific implementation process, the robot 22 has a freely movable mechanical arm with a suction nozzle disposed at its front end, so that the robot 22 can be controlled to suck up any product in the feeding tray through the suction nozzle and move the product. Of course, other ways of picking up the product (e.g., grasping the product with clamping jaws) are also feasible, and this embodiment does not specifically limit them.
As an alternative embodiment, the moving of the product into the detection range of the detection module includes:
controlling the robot 22 to move the product into a detection range of a first detection module 24, wherein the first detection module 24 is used for detecting one or more surfaces (i.e. a first surface) of the product; and controlling the robot 22 to move the product into the detection range of the second detection module 23, wherein the second detection module 23 is used for detecting one or more other surfaces (i.e. the second surface) of the product.
In the following description, the one or more surfaces detected by the first detection module 24 are taken to be the side surfaces of the product, and the second detection module 23 detects the front surface of the product.
In a specific implementation process, the first detection module 24 is fixed in position and includes a second image acquisition unit 241 (e.g., a monocular camera) for capturing images of one or more side surfaces of the product. When the side surfaces of the product are detected, the robot 22 needs to be controlled to move the product to the focal position of the second image acquisition unit 241 (this focal position is the detection range of the first detection module 24).
In the specific implementation process, one or more side surfaces of the product are detected by the first detection module 24; if every side surface is qualified, the robot 22 is controlled to place the product into the first material receiving tray 261 (the inside of the first material receiving tray 261 is within the detection range of the second detection module 23) to wait for the front detection.
As an alternative embodiment, step S303 includes:
first, the first detection module 24 is controlled to detect one or more side surfaces of the product; and if each side surface of the product is qualified, controlling the second detection module 23 to detect one or more front surfaces of the product.
In a specific implementation process, when the side surface of the product is detected, the robot 22 may be controlled to rotate the product for multiple times according to the shape of the product, so that each side surface can be captured by the second image capturing unit 241 of the first detecting module 24.
In the specific implementation process, one or more side surfaces of the product are detected by the first detection module 24; if every side surface is qualified, the robot 22 is controlled to place the product into the first material receiving tray 261, and the second detection module 23 is controlled to detect the front surface of the product. The front surface of the product is the surface through which the robot 22 sucks up the product (the suction surface for short), and when the robot 22 places the product into the first material receiving tray 261, the suction surface faces upward. Meanwhile, the second detection module 23 is located above the first material receiving tray 261 and has a third image acquisition unit 231 and an XYZ three-axis servo motor; the XYZ three-axis servo motor can drive the third image acquisition unit 231 to move freely in three-dimensional space so as to be aligned with the front surface of the product.
In a specific implementation process, when the side surfaces (or the front surface) of the product are detected based on the image information obtained by the first detection module 24 (or by the second detection module 23), a combination of machine learning and machine vision algorithms can be adopted. Compared with the traditional manual inspection method, the detection method in this embodiment has a consistent detection standard, higher detection accuracy, higher detection speed and higher detection efficiency.
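The inspection flow described above (rotate the product so each side face comes in front of the second image acquisition unit 241, capture an image, and judge it with a mix of machine vision and machine learning) can be sketched as a per-product loop. The number of sides, the rotation step and the defect_classifier callback below are assumptions made for illustration; the embodiment does not prescribe a specific classifier.

```python
# Sketch of the side-surface inspection loop: rotate, capture, classify.
# capture_image, rotate_product and defect_classifier stand in for the real camera,
# robot and trained model; they are assumptions, not interfaces from the embodiment.
from typing import Callable
import numpy as np


def inspect_side_surfaces(capture_image: Callable[[], np.ndarray],
                          rotate_product: Callable[[float], None],
                          defect_classifier: Callable[[np.ndarray], bool],
                          num_sides: int = 4) -> bool:
    """Return True only if every side surface is judged defect-free."""
    step_angle = 360.0 / num_sides
    for _ in range(num_sides):
        image = capture_image()          # taken by the second image acquisition unit 241
        if defect_classifier(image):     # True means a defect was found on this side
            return False                 # at least one side surface is unqualified
        rotate_product(step_angle)       # the robot 22 turns the product to the next side
    return True
```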
In this embodiment, based on the second detection module 23 and the first detection module 24, one or more front surfaces and one or more side surfaces of the product can be detected, so that a technical effect of comprehensively detecting the product is achieved, and the technical problem that detection is incomplete in a detection method for product appearance in the prior art is solved.
In the specific implementation process, after the robot 22 places a product whose side-surface inspection has finished into the first material receiving tray 261 (or the second material receiving tray 262), the robot 22 is controlled to pick up the next product from the feeding tray and move it into the detection range of the first detection module 24 to continue the side-surface detection.
As an alternative embodiment, step S304 includes:
if the first detection module 24 detects that each side face of the product is qualified, the robot 22 is controlled to place the product into the first material receiving tray 261; if the first detection module 24 detects that at least one side face of the product is unqualified, controlling the robot 22 to place the product into the second material receiving tray 262; if the front side of the product is qualified, the product is left in the first material receiving disc 261; if the front detection of the product is not qualified, the robot 22 is controlled to transfer the product from the first receiving tray 261 to the third receiving tray 263.
In the implementation process, as described above, the side surfaces of the product are detected first, and if all the side surfaces are detected to be qualified, the robot 22 is controlled to place the product into the first material receiving tray 261, and if at least one side surface is detected to be unqualified, the robot 22 is controlled to place the product into the second material receiving tray 262. Wherein the products in the first receiving tray 261 need to be further front inspected, while the products in the second receiving tray 262 are typically non-repairable failed products.
In the specific implementation process, when the front detection is performed on a product in the first material receiving tray 261, the product is left in the first material receiving tray 261 if its front surface is qualified; if the front detection is not qualified, the robot 22 is controlled to transfer the product to the third material receiving tray 263, and the products in the third material receiving tray 263 are generally repairable products. Moreover, the robot 22 can be controlled to transfer a product whose front detection failed into the third material receiving tray 263 right after placing the next product that passed the side-surface inspection into the first material receiving tray 261, so that the robot 22 works more efficiently.
In the specific implementation process, the first material receiving tray 261, the second material receiving tray 262 and the third material receiving tray 263 have the same shape and size; they differ only in their working positions and are used for holding products with different detection results.
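The sorting rule of step S304 (a side failure goes to the second tray, a side pass goes to the first tray, a later front failure moves the product to the third tray, and a front pass leaves it where it is) reduces to a small decision function. The sketch below is illustrative only; the tray identifiers are placeholders for the material receiving trays 261, 262 and 263.

```python
# Decision sketch for step S304: map the inspection results to a receiving tray.
# The tray identifiers are placeholders for the material receiving trays 261/262/263.
from enum import Enum
from typing import Optional


class Tray(Enum):
    FIRST = "first_material_receiving_tray_261"
    SECOND = "second_material_receiving_tray_262"
    THIRD = "third_material_receiving_tray_263"


def choose_tray(side_ok: bool, front_ok: Optional[bool]) -> Tray:
    """front_ok is None while the front surface has not yet been inspected."""
    if not side_ok:
        return Tray.SECOND    # side-surface defect: usually a non-repairable reject
    if front_ok is None or front_ok:
        return Tray.FIRST     # side surfaces qualified: the product goes to (or stays in) tray 261
    return Tray.THIRD         # front-surface defect: usually repairable, moved to tray 263
```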
In the embodiment, the positioning module 21, the robot 22 and the detection modules (i.e., the first detection module 24 and the second detection module 23) are used for detecting the appearance of the product, so that compared with the conventional manual detection method, the embodiment can improve the detection accuracy and improve the detection efficiency.
In addition, in the embodiment, the positioning module 21, the detecting module (including the first detecting module 24 and the second detecting module 23), and the robot 22 can work simultaneously without interfering with each other, so that the detecting efficiency of the product appearance can be improved.
As an optional embodiment, the method for detecting the appearance of a product further includes:
when the first material receiving tray 261 is full of products, the material receiving module 26 is controlled to move the first material receiving tray 261 downwards, and the material receiving tray management module 27 is controlled to convey an empty material receiving tray to the material receiving module 26 so as to replenish the first material receiving tray 261;
when the third receiving tray 263 is full of products, the receiving module 26 is controlled to move the third receiving tray 263 downward, and the receiving tray management module 27 is controlled to transfer an empty receiving tray to the receiving module 26 to replenish the third receiving tray 263.
In the specific implementation process, the first receiving tray 261, the third receiving tray 263 and the second receiving tray 262 are placed on the receiving module 26 side by side, and when the currently used first receiving tray 261 is full of products, the first receiving tray 261 can be controlled to move downwards for a certain distance so as to leave an upper space for placing the empty first receiving tray 261 supplemented by the receiving tray management module 27, and similarly, when the currently used third receiving tray 263 is full of products, the third receiving tray 263 can be controlled to move downwards for a certain distance so as to leave an upper space for placing the empty third receiving tray 263 supplemented by the receiving tray management module 27.
In the specific implementation process, when the robot 22 puts the products into the first receiving tray 261, the second receiving tray 262 and the third receiving tray 263, the number of the products in each receiving tray may be counted, and when the number of the products in a certain receiving tray reaches a preset value, the receiving tray is considered to be full of the products.
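Since the device judges a tray to be full by counting the products the robot has placed into it, the bookkeeping can be as simple as the counter sketched below. The capacity of 20 products per tray is an assumed example value; the embodiment only speaks of a preset value.

```python
# Tray-fullness bookkeeping sketch: count placements and flag a tray that has just
# become full.  The capacity of 20 products per tray is an assumed example value;
# the embodiment only speaks of a preset value.
from collections import defaultdict


class TrayCounter:
    def __init__(self, capacity: int = 20):
        self.capacity = capacity
        self.counts = defaultdict(int)

    def record_placement(self, tray_id: str) -> bool:
        """Increment the tray's product count; return True when the tray becomes full."""
        self.counts[tray_id] += 1
        return self.counts[tray_id] >= self.capacity

    def reset(self, tray_id: str) -> None:
        """Called after a full tray has been replaced by an empty one."""
        self.counts[tray_id] = 0
```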
In the specific implementation process, because the reject ratio of the side-surface detection is low (generally below 2%), only one second material receiving tray 262 may be provided, and when the second material receiving tray 262 is full of products, a worker replaces it with an empty one. Of course, if the reject ratio of the side-surface detection is high, the material receiving tray management module 27 may also be controlled to replenish empty second material receiving trays 262 to the material receiving module 26, in the same manner as for the first material receiving tray 261 (or the third material receiving tray 263).
In the specific implementation process, a material receiving tray management module 27 is further disposed in the cabinet 10, in which a plurality of empty material receiving trays are stored; the material receiving tray management module 27 is configured to convey the empty material receiving trays to the material receiving module 26, so as to replenish the first material receiving tray 261 and the third material receiving tray 263 for the material receiving module 26.
In the specific implementation process, the material receiving tray management module 27 comprises a tray dividing mechanism and a conveying mechanism, a plurality of empty material receiving trays are stored in the tray dividing mechanism, and the material receiving trays are stacked from top to bottom in sequence, so that the space can be saved. When the first material receiving tray 261 (or the third material receiving tray 263) in the material receiving module 26 is full of products, the material receiving tray management module 27 is started, the tray dividing mechanism is controlled to extract an empty material receiving tray from the bottommost layer, and the conveying mechanism (and under the cooperation of the second detection module 23) is controlled to convey the empty material receiving tray to the working position of the first material receiving tray 261 (or the third material receiving tray 263) of the material receiving module 26.
For example, when it is detected that the uppermost first material receiving tray 261 (or third material receiving tray 263) of the material receiving module 26 is full of products, the material receiving module 26 is controlled to move the first material receiving tray 261 (or third material receiving tray 263) downward, and the material receiving tray management module 27 is controlled to convey an empty material receiving tray to the material receiving module 26 to replenish the first material receiving tray 261 (or third material receiving tray 263). When the material receiving tray management module 27 is controlled to convey the empty material receiving tray to the material receiving module 26, the empty tray can first be carried by the conveyor belt of the conveying mechanism to the vicinity of the material receiving module 26; then the second detection module 23 (which is provided with a suction nozzle) is controlled to suck up the empty tray, the XYZ three-axis servo motor of the second detection module 23 is controlled to move, and the empty tray is placed at the working position of the first material receiving tray 261 (or third material receiving tray 263) of the material receiving module 26, directly above the original first material receiving tray 261 (or third material receiving tray 263).
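The replenishment sequence just described (index the full tray downward, dispense and convey an empty tray, then let the second detection module 23 lift it into place with its suction nozzle and XYZ servo) can be summarised as a short procedure. Every call in the sketch below is a placeholder standing in for the real motion controllers; none of these names come from the embodiment itself.

```python
# Sketch of the tray replenishment sequence.  Every call below is a placeholder
# standing in for the real motion controllers described in the text; none of these
# names come from the embodiment itself.
def replenish_receiving_tray(receiving_module, tray_manager, second_detection_module,
                             tray_position: str) -> None:
    receiving_module.lower_stack(tray_position)           # move the full tray downward
    tray_manager.dispense_empty_tray()                     # tray dividing mechanism extracts one
    tray_manager.convey_to_receiving_module()              # conveyor belt brings it nearby
    second_detection_module.pick_up_empty_tray()           # suction nozzle on module 23 lifts it
    second_detection_module.place_tray_at(tray_position)   # XYZ servo sets it on top of the stack
```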
As an optional embodiment, the method for detecting the appearance of a product further includes:
when the number of first material receiving trays 261 filled with products is greater than a first value, the output module is controlled to output first prompt information to prompt a worker to take away the first material receiving trays 261 and the products therein;
when the second material receiving tray 262 is full of products, the output module is controlled to output second prompt information to prompt a worker to take away the second material receiving tray 262 and the products therein;
when the number of third material receiving trays 263 filled with products is greater than a third value, the output module is controlled to output third prompt information to prompt a worker to take away the third material receiving trays 263 and the products therein.
In a specific implementation process, the output module may be a display, a speaker or the like provided on the industrial personal computer 40 for outputting the above prompt information.
For example, if the material receiving module 26 can store at most 10 first material receiving trays 261, then when the number of first material receiving trays 261 in the material receiving module 26 reaches 10, the output module is controlled to output the first prompt information to prompt the worker to manually take away the first material receiving trays 261 and the products therein.
For example, if the material receiving module 26 can store at most 10 third material receiving trays 263, then when the number of third material receiving trays 263 in the material receiving module 26 reaches 10, the output module is controlled to output the third prompt information to prompt the worker to manually take away the third material receiving trays 263 and the products therein.
For example, if the second material receiving tray 262 is full of products, the output module is controlled to output the second prompt information to prompt the worker to manually take away the second material receiving tray 262 and the products therein and replace it with an empty second material receiving tray 262.
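These three prompting rules are simple threshold checks. In the sketch below the limits of 10 trays and the message texts are assumed examples; the output device is whatever display or speaker the industrial personal computer 40 provides.

```python
# Prompt-rule sketch: decide which operator prompts (if any) to output.
# The limits of 10 trays and the message texts are assumed examples.
from typing import List


def prompts_for_operator(full_first_trays: int, second_tray_full: bool,
                         full_third_trays: int,
                         first_limit: int = 10, third_limit: int = 10) -> List[str]:
    prompts: List[str] = []
    if full_first_trays > first_limit:
        prompts.append("Remove the full first material receiving trays (qualified products).")
    if second_tray_full:
        prompts.append("Remove and replace the second material receiving tray (side-surface rejects).")
    if full_third_trays > third_limit:
        prompts.append("Remove the full third material receiving trays (repairable products).")
    return prompts
```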
As an optional embodiment, the method for detecting the appearance of a product further includes: the air purification module 30 is controlled to purify the air of the environment in which the product is located.
In the specific implementation process, the air purification module 30 is arranged in the cabinet 10, specifically at the top of the cabinet 10. During the detection of the product appearance, the air purification module 30 can be turned on to purify the air of the environment where the product is located, thereby providing a class-1000 dust-free environment, protecting the detected products and helping to ensure the accuracy of the detection results.
The technical scheme in the embodiment of the application at least has the following technical effects or advantages:
in an embodiment of the present application, a product appearance detection method is disclosed, including: controlling a positioning module to position a product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product; and controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result. Because the positioning module, the robot and the detection module are used to detect the product appearance, this embodiment can improve detection accuracy and detection efficiency compared with the traditional manual inspection method; at the same time, a plurality of surfaces of the product can be detected, so the product is inspected comprehensively. The method therefore effectively solves the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, and achieves the technical effects of improving detection accuracy and detection efficiency and comprehensively detecting the product appearance.
Example two
Based on the same inventive concept, as shown in fig. 4, the present embodiment provides a detection apparatus 400 for product appearance, comprising:
the first control unit 401 is configured to control the positioning module 21 to position a product, so as to obtain position information of the product;
a second control unit 402 for controlling the robot 22 to pick up the product and move the product into the detection range of the detection module based on the position information;
a third control unit 403, configured to control the detection module to detect multiple surfaces of the product;
a fourth control unit 404, configured to control the robot 22 to place the product into a corresponding material receiving tray based on the obtained detection result.
As an optional embodiment, the first control unit 401 is specifically configured to:
the first image acquisition unit 211 arranged on the positioning module 21 is controlled to acquire images of the products in the batch pan, and the products are positioned based on the acquired image information to acquire the position information of the products.
As an optional embodiment, the second control unit 402 is specifically configured to:
sucking the product through a suction nozzle provided on the robot 22; or
The product is gripped by gripping jaws provided on the robot 22.
As an optional embodiment, the second control unit 402 is specifically configured to:
controlling the robot 22 to move the product into a detection range of a first detection module 24, wherein the first detection module 24 is used for detecting one or more first surfaces of the product; and controlling the robot 22 to move the product to a detection range of a second detection module 23, wherein the second detection module 23 is used for detecting other one or more second sides of the product.
As an optional embodiment, the third control unit 403 is specifically configured to:
controlling the first detection module 24 to detect one or more first sides of the product; and if the first detection module detects that each side of the product is qualified, controlling the second detection module 23 to detect other one or more second sides of the product.
As an optional embodiment, the fourth control unit 404 is specifically configured to:
if the first detection module detects that each surface of the product is qualified, the robot 22 is controlled to place the product into a first material receiving tray 261; if the first detection module detects that at least one surface of the product is unqualified, the robot 22 is controlled to place the product into a second material receiving tray 262;
if the second detection module detects that each surface of the product is qualified, the product is left in the first material receiving tray 261; if the second detection module detects that at least one of the faces of the product is not qualified, the robot 22 is controlled to transfer the product from the first receiving tray 261 to a third receiving tray 263.
As an alternative embodiment, the detection apparatus 400 for product appearance further includes:
the fifth control unit is used for controlling the material receiving module to move the first material receiving tray downwards when the first material receiving tray is filled with products, and controlling the material receiving tray management module to convey empty material receiving trays to the material receiving module so as to supplement the first material receiving tray; and when the third material receiving tray is full of products, the material receiving module is controlled to move the third material receiving tray downwards, and the material receiving tray management module is controlled to convey empty material receiving trays to the material receiving module so as to supplement the third material receiving tray.
As an alternative embodiment, the detection apparatus 400 for product appearance further includes:
the sixth control unit is used for controlling the output module to output first prompt information when the number of first material receiving trays filled with products is greater than a first value, so as to prompt a worker to take away the first material receiving trays and the products therein; controlling the output module to output second prompt information when the second material receiving tray is full of products, so as to prompt a worker to take away the second material receiving tray and the products therein; and controlling the output module to output third prompt information when the number of third material receiving trays filled with products is greater than a third value, so as to prompt a worker to take away the third material receiving trays and the products therein.
As an alternative embodiment, the detection apparatus 400 for product appearance further includes:
and a seventh control unit, configured to control the air purification module 30 to perform air purification on the environment where the product is located.
Since the detection apparatus for product appearance described in this embodiment is the apparatus used for implementing the detection method for product appearance described in the first embodiment, based on that method, a person skilled in the art can understand the specific implementation of the detection apparatus of this embodiment and its various variations, so a detailed description of how the apparatus implements the method is omitted here. Any apparatus used by a person skilled in the art to implement the detection method for product appearance in the embodiments of the present application falls within the scope of protection of the present application.
The technical scheme in the embodiment of the application at least has the following technical effects or advantages:
in an embodiment of the present application, a detection apparatus for product appearance is disclosed, including: a first control unit for controlling a positioning module to position a product to obtain position information of the product; a second control unit for controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; a third control unit for controlling the detection module to detect a plurality of surfaces of the product; and a fourth control unit for controlling the robot to place the product into the corresponding material receiving tray based on the obtained detection result. Because the positioning module, the robot and the detection module are used to detect the product appearance, this embodiment can improve detection accuracy and detection efficiency compared with the traditional manual inspection method; at the same time, a plurality of surfaces of the product can be detected, so the product is inspected comprehensively. The apparatus therefore effectively solves the technical problems of low detection accuracy, low detection efficiency and incomplete detection in existing product appearance detection methods, and achieves the technical effects of improving detection accuracy and detection efficiency and comprehensively detecting the product appearance.
EXAMPLE III
Based on the same inventive concept, as shown in fig. 5, the present embodiment provides a detection apparatus 500 for product appearance, which includes a memory 510, a processor 520, and a computer program 511 stored in the memory 510 and capable of running on the processor 520, wherein the processor 520 executes the computer program 511 to implement the following steps:
controlling a positioning module 21 to position a product to obtain position information of the product; controlling the robot 22 to pick up the product and move the product into the detection range of the detection module based on the position information; controlling the detection module to detect a plurality of surfaces of the product; based on the obtained detection result, the robot 22 is controlled to place the product into the corresponding material receiving tray.
In a specific implementation process, when the processor 520 executes the computer program 511, any implementation manner in the first embodiment may be implemented, which is not described herein again.
Example four
Based on the same inventive concept, as shown in fig. 6, the present embodiment provides a computer-readable storage medium 600, on which a computer program 611 is stored, wherein the computer program 611, when executed by a processor (or an industrial personal computer), implements the following steps:
controlling a positioning module 21 to position a product to obtain position information of the product; controlling the robot 22 to pick up the product and move the product into the detection range of the detection module based on the position information; controlling the detection module to detect a plurality of surfaces of the product; based on the obtained detection result, the robot 22 is controlled to place the product into the corresponding material receiving tray.
In a specific implementation process, when being executed by a processor (or an industrial personal computer), the computer program 611 may implement any implementation manner of the first embodiment, which is not described herein again.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (9)

1. A detection method for product appearance, characterized in that the method is applied to detection equipment, the detection equipment comprises a cabinet in which an air purification module, a positioning module, a robot and a detection module are arranged, and the detection module comprises a first detection module and a second detection module; the method comprises the following steps:
controlling an air purification module to purify air of an environment where a product is located, wherein the product is a camera module;
controlling a positioning module to position a product to obtain position information of the product;
controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module;
controlling the detection module to detect a plurality of surfaces of the product, including: controlling the first detection module to detect a plurality of first surfaces of the product, and if the first detection module detects that each surface of the product is qualified, controlling the second detection module to detect a second surface of the product, wherein the first surfaces are side surfaces, the second surface is a front surface, and the front surface is the surface by which the robot picks up the product;
based on the obtained detection result, controlling the robot to place the product into a corresponding material receiving tray, which comprises: if the first detection module detects that each surface of the product is qualified, controlling the robot to place the product into a first material receiving tray; if the first detection module detects that at least one surface of the product is unqualified, controlling the robot to place the product into a second material receiving tray; if the second detection module detects that each surface of the product is qualified, leaving the product in the first material receiving tray; and if the second detection module detects that at least one surface of the product is unqualified, controlling the robot to transfer the product from the first material receiving tray to a third material receiving tray.
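The routing among the three material receiving trays defined in claim 1 amounts to a short decision procedure, sketched below. The first_module, second_module and robot objects and their methods are assumed for illustration only; the sketch mirrors the claimed branching but is not the claimed implementation itself.

def route_product(first_module, second_module, robot, product,
                  first_tray, second_tray, third_tray):
    """Sort one product according to the side-surface and front-surface checks."""
    # First detection module: inspect the plurality of first (side) surfaces.
    if not first_module.all_side_surfaces_qualified(product):
        # At least one side surface is unqualified: second material receiving tray.
        robot.place(product, second_tray)
        return second_tray

    # All side surfaces qualified: place into the first material receiving tray,
    # then inspect the second (front) surface, i.e. the pick-up surface.
    robot.place(product, first_tray)
    if second_module.front_surface_qualified(product):
        return first_tray  # a fully qualified product stays in the first tray

    # Front surface unqualified: transfer from the first tray to the third tray.
    robot.transfer(product, first_tray, third_tray)
    return third_tray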
2. The detection method for product appearance according to claim 1, wherein the controlling a positioning module to position a product to obtain position information of the product comprises:
controlling a first image acquisition unit arranged on the positioning module to acquire an image of the product in a feeding tray, and positioning the product based on the acquired image information to obtain the position information of the product.
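Claim 2 leaves the image analysis itself open. As one possible realization, assumed here purely for illustration, the sketch below locates a product in a feeding-tray image with OpenCV template matching; the 0.8 score threshold and the file paths are arbitrary assumptions, not values taken from this disclosure.

import cv2


def locate_product(tray_image_path, template_path, min_score=0.8):
    """Return the pixel centre of the best template match, or None if the match is weak."""
    tray = cv2.imread(tray_image_path, cv2.IMREAD_GRAYSCALE)
    template = cv2.imread(template_path, cv2.IMREAD_GRAYSCALE)

    # Slide the product template over the tray image and score every position.
    scores = cv2.matchTemplate(tray, template, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)

    if max_score < min_score:
        return None  # no confident match in the acquired image

    h, w = template.shape
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)  # (x, y) centre of the match

In practice the pixel position would still have to be converted into robot coordinates, for example through a hand-eye calibration, which the claim does not detail.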
3. The detection method for product appearance according to claim 1, wherein the controlling a robot to pick up the product comprises:
sucking up the product through a suction nozzle arranged on the robot; or
grabbing the product through a clamping jaw arranged on the robot.
4. The detection method for product appearance according to claim 1, wherein the moving the product into the detection range of a detection module comprises:
controlling the robot to move the product into a detection range of a first detection module, wherein the first detection module is used for detecting one or more first surfaces of the product; and
controlling the robot to move the product into a detection range of a second detection module, wherein the second detection module is used for detecting one or more other second surfaces of the product.
5. The detection method for product appearance according to claim 1, wherein the method further comprises:
when the first material receiving tray is full of products, controlling a material receiving module to move the first material receiving tray downwards, and controlling a material receiving tray management module to convey an empty material receiving tray to the material receiving module so as to supplement the first material receiving tray; and when the third material receiving tray is full of products, controlling the material receiving module to move the third material receiving tray downwards, and controlling the material receiving tray management module to convey an empty material receiving tray to the material receiving module so as to supplement the third material receiving tray.
6. The detection method for product appearance according to claim 5, further comprising:
when the number of first material receiving trays filled with products is larger than a first numerical value, controlling an output module to output first prompt information to prompt a worker to take away the full first material receiving trays and the products therein;
when the second material receiving tray is full of products, controlling the output module to output second prompt information to prompt the worker to take away the second material receiving tray and the products therein; and
when the number of third material receiving trays filled with products is larger than a third numerical value, controlling the output module to output third prompt information to prompt the worker to take away the full third material receiving trays and the products therein.
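Claims 5 and 6 together describe a small tray-management routine: full first and third material receiving trays are lowered and replaced with empty ones, and prompt information is output when accumulated counts (or, for the second tray, a single full tray) require a worker to intervene. The sketch below illustrates that logic; the receiving_module, tray_manager and output_module interfaces and both thresholds are assumptions made for the example.

def on_tray_full(tray_id, receiving_module, tray_manager, output_module,
                 full_counts, first_threshold, third_threshold):
    """Handle a material receiving tray that has just been filled with products."""
    if tray_id in ("first", "third"):
        # Claim 5: move the full tray downwards and supply an empty replacement.
        receiving_module.move_tray_down(tray_id)
        tray_manager.convey_empty_tray_to(receiving_module, tray_id)
        full_counts[tray_id] += 1

        # Claim 6: prompt the worker once enough full trays have accumulated.
        if tray_id == "first" and full_counts["first"] > first_threshold:
            output_module.output_prompt("Take away the full first material receiving trays")
        elif tray_id == "third" and full_counts["third"] > third_threshold:
            output_module.output_prompt("Take away the full third material receiving trays")
    elif tray_id == "second":
        # Claim 6: the second tray triggers a prompt as soon as it is full.
        output_module.output_prompt("Take away the full second material receiving tray")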
7. A detection device for product appearance, characterized in that the device is applied to detection equipment, the detection equipment comprises a cabinet in which an air purification module, a positioning module, a robot and a detection module are arranged, and the detection module comprises a first detection module and a second detection module; the device comprises:
a seventh control unit, used for controlling the air purification module to purify air of an environment where a product is located;
a first control unit, used for controlling the positioning module to position the product to obtain position information of the product;
a second control unit, used for controlling the robot to pick up the product and move the product into the detection range of the detection module based on the position information;
a third control unit, used for controlling the detection module to detect a plurality of surfaces of the product, including: controlling the first detection module to detect a plurality of first surfaces of the product, and if the first detection module detects that each surface of the product is qualified, controlling the second detection module to detect a second surface of the product, wherein the first surfaces are side surfaces, the second surface is a front surface, and the front surface is the surface by which the robot picks up the product; and
a fourth control unit, used for controlling the robot to place the product into a corresponding material receiving tray based on the obtained detection result;
wherein the fourth control unit is specifically configured to: if the first detection module detects that each surface of the product is qualified, control the robot to place the product into a first material receiving tray; if the first detection module detects that at least one surface of the product is unqualified, control the robot to place the product into a second material receiving tray; if the second detection module detects that each surface of the product is qualified, leave the product in the first material receiving tray; and if the second detection module detects that at least one surface of the product is unqualified, control the robot to transfer the product from the first material receiving tray to a third material receiving tray.
8. A detection apparatus for product appearance, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the following steps when executing the computer program:
controlling an air purification module to purify air of an environment where a product is located, wherein the product is a camera module; controlling a positioning module to position the product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product, including: controlling a first detection module to detect a plurality of first surfaces of the product, and if the first detection module detects that each surface of the product is qualified, controlling a second detection module to detect a second surface of the product, wherein the first surfaces are side surfaces, the second surface is a front surface, and the front surface is the surface by which the robot picks up the product; and controlling the robot to place the product into a corresponding material receiving tray based on the obtained detection result;
wherein the controlling the robot to place the product into a corresponding material receiving tray based on the obtained detection result comprises: if the first detection module detects that each surface of the product is qualified, controlling the robot to place the product into a first material receiving tray; if the first detection module detects that at least one surface of the product is unqualified, controlling the robot to place the product into a second material receiving tray; if the second detection module detects that each surface of the product is qualified, leaving the product in the first material receiving tray; and if the second detection module detects that at least one surface of the product is unqualified, controlling the robot to transfer the product from the first material receiving tray to a third material receiving tray.
9. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of:
controlling an air purification module to purify air of an environment where a product is located, wherein the product is a camera module; controlling a positioning module to position the product to obtain position information of the product; controlling a robot to pick up the product based on the position information and move the product into the detection range of a detection module; controlling the detection module to detect a plurality of surfaces of the product, including: controlling a first detection module to detect a plurality of first surfaces of the product, and if the first detection module detects that each surface of the product is qualified, controlling a second detection module to detect a second surface of the product, wherein the first surfaces are side surfaces, the second surface is a front surface, and the front surface is the surface by which the robot picks up the product; and controlling the robot to place the product into a corresponding material receiving tray based on the obtained detection result;
wherein the controlling the robot to place the product into a corresponding material receiving tray based on the obtained detection result comprises: if the first detection module detects that each surface of the product is qualified, controlling the robot to place the product into a first material receiving tray; if the first detection module detects that at least one surface of the product is unqualified, controlling the robot to place the product into a second material receiving tray; if the second detection module detects that each surface of the product is qualified, leaving the product in the first material receiving tray; and if the second detection module detects that at least one surface of the product is unqualified, controlling the robot to transfer the product from the first material receiving tray to a third material receiving tray.
CN201810394348.1A 2018-04-27 2018-04-27 Product appearance detection method and device Active CN108745942B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810394348.1A CN108745942B (en) 2018-04-27 2018-04-27 Product appearance detection method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810394348.1A CN108745942B (en) 2018-04-27 2018-04-27 Product appearance detection method and device

Publications (2)

Publication Number Publication Date
CN108745942A CN108745942A (en) 2018-11-06
CN108745942B true CN108745942B (en) 2021-04-02

Family

ID=64012274

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810394348.1A Active CN108745942B (en) 2018-04-27 2018-04-27 Product appearance detection method and device

Country Status (1)

Country Link
CN (1) CN108745942B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109697730B (en) * 2018-11-26 2021-02-09 深圳市德富莱智能科技股份有限公司 IC chip processing method, system and storage medium based on optical identification
CN109590233A (en) * 2018-11-28 2019-04-09 正大天晴药业集团股份有限公司 A kind of automatic rejection control method and device of capsule wrapping machine
CN110171708B (en) * 2019-05-10 2021-09-03 惠州市德赛电池有限公司 Automatic control method for high-precision material taking and discharging
CN114119570A (en) * 2021-11-30 2022-03-01 广东利元亨智能装备股份有限公司 Automatic model changing method, device, controller and storage medium
CN115069571A (en) * 2022-06-10 2022-09-20 龙旗电子(惠州)有限公司 Outward appearance detection device and detection letter sorting system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104528038A (en) * 2014-12-26 2015-04-22 昆山精讯电子技术有限公司 Liquid crystal module classified packaging device
CN106216268A (en) * 2016-09-13 2016-12-14 浙江舜宇光学有限公司 For detecting equipment and the method for detection camera module thereof of camera module

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007039142A (en) * 2005-07-29 2007-02-15 Murata Mfg Co Ltd Conveying device and appearance inspection device
CN102218406B (en) * 2011-01-04 2013-06-12 华南理工大学 Intelligent detection device of defects of mobile phone outer shell based on machine vision
CN104698004A (en) * 2015-02-06 2015-06-10 太仓天衡电子科技有限公司 Intelligent appearance quality detector and operation method thereof
CN205675773U (en) * 2016-06-06 2016-11-09 江苏瑞莱克斯自动化科技有限公司 A kind of Product checking sorting system
CN106362959A (en) * 2016-10-13 2017-02-01 天津恺丰义科技有限公司 Multi-functional discharging device
CN207076685U (en) * 2017-07-13 2018-03-09 广州市赛康尼机械设备有限公司 Waste material rejecting mechanism and waste material device for eliminating

Also Published As

Publication number Publication date
CN108745942A (en) 2018-11-06

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant