CN112304217A - Dimension measurement scoring device and scoring method based on machine vision - Google Patents


Info

Publication number
CN112304217A
CN112304217A (application CN202011104917.8A)
Authority
CN
China
Prior art keywords
examination
information
measurement
reading
test
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202011104917.8A
Other languages
Chinese (zh)
Other versions
CN112304217B (en)
Inventor
潘凌锋
陈浙泊
林建宇
余建安
陈一信
陈镇元
陈龙威
吴荻苇
颜文俊
林斌
郑军
叶雪旺
洪徐健
杨从新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Institute of Zhejiang University Taizhou
Original Assignee
Research Institute of Zhejiang University Taizhou
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Institute of Zhejiang University Taizhou filed Critical Research Institute of Zhejiang University Taizhou
Priority to CN202011104917.8A priority Critical patent/CN112304217B/en
Priority to CN202210187234.6A priority patent/CN114674223B/en
Publication of CN112304217A publication Critical patent/CN112304217A/en
Application granted granted Critical
Publication of CN112304217B publication Critical patent/CN112304217B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30 Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A dimension measurement scoring device and scoring method based on machine vision are disclosed. The device comprises an operation table, a detection table, a light source and a camera. A through hole is formed in the middle of the detection table, and a transparent plate is arranged in the hollowed-out portion; the light source is arranged below the detection table, opposite the transparent plate; the camera is fixedly mounted directly above the transparent plate; the operation table is electrically connected with the light source and the camera and controls their operation; a laser ranging device is arranged between the camera and the detection table; and the operation table comprises a display module. By managing separate permissions for student, teacher and tourist users and providing a test question making function under the teacher permission, the invention realizes the editing and determination of examination questions.

Description

Dimension measurement scoring device and scoring method based on machine vision
Technical Field
The invention relates to the field of image recognition, and in particular to a dimension measurement scoring device and scoring method based on machine vision.
Background
Domestic colleges and universities currently offer machining training courses for certain majors, and students are generally examined during or at the end of the training. One widely used examination task requires students to design and manufacture a mechanical workpiece according to the question requirements; the workpiece's dimensions are then measured and compared against a standard part or against the stated requirements on indexes such as nominal values and tolerances in order to score the student's work. At present, schools still rely on manual inspection with traditional measuring tools such as vernier calipers and micrometers to obtain the dimensions of the mechanical workpiece.
Conventional measuring tools have the following drawbacks: 1. positioning the measurement object is time-consuming; 2. the more positions to be measured on a single object, the longer the measurement takes; 3. prolonged measurement places burdens such as eye fatigue on the measuring staff; 4. the measurement position is judged by the measurer, so results vary from person to person; 5. human error also affects the measurement readings; 6. the measured data must be entered and tallied manually by the measuring staff, which is slow, inefficient and error-prone.
On the other hand, current assessment methods for machining training courses lack the following capabilities: 1. obtaining examination questions intelligently from a server to guarantee their randomness; 2. binding the dimension measurement results of the part manufactured by an examinee to that examinee's identity information during the examination, ensuring the accuracy of the examination score; 3. evaluating the measurement results in real time against the examination requirements, so that teachers no longer enter scores manually, which improves efficiency and reduces mistakes; 4. handling the specific examination scenario of machining courses in colleges and universities, where only the front and side faces of the same part need to be measured and the results are uploaded to the server once measurement is complete, yielding the examination score accurately and efficiently.
Therefore, an efficient and intelligent device and method are needed for inspecting and scoring the machined parts manufactured by students.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a vision-based measurement scoring device and system that grant separate permissions to teacher, student and tourist users, accurately read the dimensional parameters of objects in images, and are simple in structure and convenient to use.
A dimension measurement scoring device based on machine vision comprises an operation table, a detection table, a light source and a camera; a through hole is formed in the middle of the detection table, and a transparent plate is arranged at the hollow part; the light source is arranged below the detection platform and corresponds to the transparent plate; the camera is arranged right above the transparent plate and is fixedly arranged; the operation table is electrically connected with the light source and the camera and can control the actions of the light source and the camera; a laser ranging device is arranged between the camera and the detection platform; the console includes a display module.
A dimension measurement scoring method based on machine vision comprises the following steps:
step 1: the operation desk senses the operator's actions, opens the software accordingly and automatically executes the software-opening initialization operation;
step 2: after the software-opening initialization is completed, the display module on the operation desk automatically displays a user login interface; the initial user login interface is provided with a 'tourist measurement' button and an 'exit system' button;
step 3: user login is performed according to the operator's actions; two login modes are provided: a tourist logs in through the 'tourist measurement' button and enters the tourist measurement flow, or a teacher or student logs in through identity card identification and enters the corresponding teacher operation flow or student examination flow;
step 4: the operation desk completes the user login; for the tourist measurement flow and the student examination flow, the size measurement interface is entered automatically, parameters are set, the size measurement is completed and the flow ends; for the teacher operation flow, a teacher operation panel is displayed on the user login interface, with buttons for downloading test questions, uploading test questions and making test questions;
step 5: the operation desk selects the teacher operation according to the operator's actions; the 'download test questions' button enters the test question downloading flow, which ends after the download is completed; the 'upload test questions' button enters the test question uploading flow, which ends after the upload is completed; the 'make test questions' button enters the test question making flow and the size measurement interface, and the flow ends after the test questions are made.
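The following is a minimal, runnable Python sketch of the top-level flow in steps 1 to 5; the function names, role strings and hard-coded return values are illustrative assumptions, not identifiers from the patent.

```python
def initialize_software() -> None:
    # step 1: read the examination folder, initialize the hardware and related variables
    print("software initialized")


def log_in_user() -> str:
    # step 3: a tourist logs in via the 'tourist measurement' button, or a teacher /
    # student logs in via identity card identification; hard-coded here for illustration
    return "teacher"


def enter_measurement_interface(role: str) -> None:
    # step 4 (tourist / student): open the size measurement interface and start measuring
    print(f"size measurement interface opened for {role}")


def teacher_flow(action: str) -> None:
    # step 5: dispatch the teacher operation panel buttons to their flows
    flows = {
        "download": lambda: print("downloading test questions from the server"),
        "upload": lambda: print("uploading made test questions to the server"),
        "make": lambda: print("making test questions in the size measurement interface"),
    }
    flows[action]()


if __name__ == "__main__":
    initialize_software()
    role = log_in_user()
    if role in ("tourist", "student"):
        enter_measurement_interface(role)
    else:
        teacher_flow("download")
```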
Further, in step 1, when the software on the console is opened, the software-opening initialization operation is executed automatically; it includes reading the examination folder, hardware initialization and other initialization; the software of the operation desk is provided with two folders, one being the examination folder and the other being the test question making folder generated when a teacher makes test questions.
Further, the reading of the examination folder comprises the following steps:
step 1.1: judging whether the examination file name recording file exists; if it exists, the information in the file, namely the name of the examination folder, is read; if it does not exist, the user is prompted to download test questions from the server, and otherwise the size measurement operation cannot be performed;
step 1.2: judging whether the examination folder exists according to the folder name read from the examination file name recording file; if it exists, the files required for the examination are present; if it does not exist, the user is prompted to download the test questions from the server before performing the size measurement operation;
all parameters and tools required by size measurement are stored in the examination folder, wherein the parameters and the tools comprise a front measurement parameter, a side measurement parameter, a front calibration parameter, a side calibration parameter and a test question making tool;
the hardware initialization comprises opening the camera; if the camera is opened successfully, the camera image acquisition function is started when the size measurement interface is entered and real-time image acquisition is carried out; if opening the camera fails, the user is prompted that the camera was not opened successfully and the reason for the failure is given, step 1 ends and the flow ends;
the other initialization comprises initialization of the interface and initialization of related variables; the initialization of related variables covers the following two flag bits:
1. the flag bit indicating that the system reads the examination file size measurement parameters for the first time;
2. the flag bit indicating that the system obtains the examination information for the first time;
the first flag bit is used to confirm whether the size measurement parameters still need to be read, and the second flag bit is used to confirm whether the examination information still needs to be read.
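Below is a minimal, runnable Python sketch of the start-up checks in steps 1.1 and 1.2 together with the two flag bits described above. The record file name follows the description; all other names and messages are illustrative assumptions.

```python
from pathlib import Path
from typing import Optional

# The record file stores the randomly generated name of the examination folder;
# "ExamiationFileName.txt" is the file name used in the description.
RECORD_FILE = Path("ExamiationFileName.txt")

# The two flag bits: both start as True so that the size measurement parameters
# and the examination information are read on first use and skipped afterwards.
first_read_measurement_params = True
first_read_exam_info = True


def locate_exam_folder() -> Optional[Path]:
    # step 1.1: the record file must exist, otherwise prompt for a download
    if not RECORD_FILE.exists():
        print("Please download test questions from the server first.")
        return None
    folder = Path(RECORD_FILE.read_text(encoding="utf-8").strip())
    # step 1.2: the folder named in the record file must also exist
    if not folder.is_dir():
        print("Examination folder missing; download the test questions from the server.")
        return None
    return folder


if __name__ == "__main__":
    print("examination folder:", locate_exam_folder())
```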
Furthermore, during the user login of step 3, the identity card information is read at regular intervals when the user first enters the system, when a teacher login succeeds, and when a login times out, is abnormal or uses a wrong account; the reading of the identity card information is realized through an external identity card reader;
when the system is entered for the first time and an identity card is inserted, a teacher login is first attempted with the identity card information read by the identity card reader; if the teacher login fails, a student login is attempted; if the student login also fails, related information is prompted and the identity card information is again read at regular intervals; if reading the identity card information fails, including when no identity card is inserted, the identity card cannot be identified, or the identity card information cannot complete a teacher login or a student login, only a tourist can log in and enter the tourist measurement flow;
the steps of reading the identity card information at regular time are as follows:
step 2.1: initializing connection of an identity card reader; if the initialization is successful, carrying out the next operation; if the initialization fails, prompting the user to confirm whether the connection of the ID card reader is normal, and simultaneously finishing the timing reading operation;
step 2.2: card authentication operation between the identity card reader and the identity card; if the card authentication is successful, the next operation is carried out; if the card authentication fails, prompting the user that the identity card authentication fails, closing the connection of an identity card reader and finishing the timing reading operation;
step 2.3: reading the identity card information; if the reading is successful, filling the identity card information into an interface for display, disabling a 'tourist measurement' button, and starting a login thread; if the reading fails, prompting the user that the reading of the identity card information fails, closing the connection of the identity card reader and finishing the timing reading operation.
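The following is a minimal, runnable Python sketch of the timed reading in steps 2.1 to 2.3, using a simulated reader; the real device would be driven through its vendor's SDK, which is not specified here, so the class, method and field names are assumptions.

```python
import threading
import time


class SimulatedIdCardReader:
    """Stand-in for the external identity card reader (vendor SDK not specified)."""

    def connect(self) -> bool:        # step 2.1: initialize the reader connection
        return True

    def authenticate(self) -> bool:   # step 2.2: card authentication between reader and card
        return True

    def read_info(self):              # step 2.3: read the identity card information
        return {"name": "example user", "id_number": "placeholder"}

    def close(self) -> None:
        pass


def poll_id_card(reader: SimulatedIdCardReader, interval_s: float = 0.5) -> None:
    # re-arm the timer so the card is polled periodically (500 ms in the embodiment)
    timer = threading.Timer(interval_s, poll_id_card, args=(reader, interval_s))
    timer.daemon = True
    timer.start()

    if not reader.connect():          # step 2.1 failure: stop the timed reading
        print("Confirm that the identity card reader is connected."); timer.cancel(); return
    if not reader.authenticate():     # step 2.2 failure: close the reader, stop reading
        print("Identity card authentication failed."); reader.close(); timer.cancel(); return
    info = reader.read_info()
    if info is None:                  # step 2.3 failure: close the reader, stop reading
        print("Reading the identity card information failed."); reader.close(); timer.cancel(); return
    # success: fill the info into the interface, disable 'tourist measurement', start the login thread
    print("identity card read:", info)


if __name__ == "__main__":
    poll_id_card(SimulatedIdCardReader())
    time.sleep(1.2)                   # let the timer fire a couple of times
```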
Further, the guest measurement process includes entering a size measurement interface, and the following steps are performed before entering the size measurement interface:
step 3.1: judging whether the examination folder exists or not; if the test questions exist, the next operation is carried out, and if the test questions do not exist, the user is prompted to download the test questions first;
step 3.2: judging whether the examination file information is read for the first time; if so, reading the information parameters of the examination file and then carrying out the next operation, and if not, directly carrying out the next operation;
step 3.3: setting the operation authority as the tourist authority; the tourist authority can only measure the size of the part to be detected and cannot carry out data uploading to a server and test question making operation;
step 3.4: entering a dimension measuring interface, and setting a camera to start to acquire images.
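A minimal, runnable Python sketch of the tourist-measurement gate in steps 3.1 to 3.4 follows; the folder path, flag variable and messages are illustrative assumptions.

```python
from pathlib import Path

first_read_exam_file_info = True      # flag bit: examination file info not read yet


def enter_tourist_measurement(exam_folder: Path) -> bool:
    global first_read_exam_file_info
    # step 3.1: the examination folder must exist, otherwise ask for a download first
    if not exam_folder.is_dir():
        print("Please download the test questions first.")
        return False
    # step 3.2: read the examination file information parameters only on first entry
    if first_read_exam_file_info:
        print("reading examination file information parameters ...")
        first_read_exam_file_info = False
    # step 3.3: tourist permission: measurement only, no upload and no test question making
    permission = "tourist"
    # step 3.4: enter the size measurement interface and start camera acquisition
    print(f"size measurement interface opened with {permission} permission; camera acquiring")
    return True


if __name__ == "__main__":
    enter_tourist_measurement(Path("example_exam_folder"))
```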
Further, after the teacher login operation is completed, a login process timer is started to count the login time; if the teacher login times out or the network is abnormal during login, the tourist login button is enabled and the identity card information is read again;
after the teacher logs in, displaying a teacher operation panel on a user login interface;
the examination information is obtained before the test question downloading flow; if the examination information is obtained successfully, the test question download file name and the examination file name are set according to the examination information, and the test questions are downloaded according to the compressed file name in the examination information; if obtaining the examination information fails, times out or is abnormal, corresponding information is prompted, the identity card information display is cleared, and the identity card information is again read at regular intervals;
the test question uploading flow first judges whether the test question making folder exists; if it exists, the made test question folder is compressed with the compression progress displayed, the compressed file is uploaded to the server after compression is finished, and the serial numbers of the dimensions to be measured and the judgment basis for a qualified measured size are extracted from the size measurement information in the made test question file and uploaded to the server; if it does not exist, the user is prompted; the serial numbers of the dimensions to be measured and the judgment basis for qualification are extracted after editing in the test question making editor, in which the dimensions to be measured and the upper and lower tolerance limits of a qualified dimension are set;
the test question making flow first judges whether the examination file information for the test questions made by the teacher has been read; if it has not been read before, the examination file information is read, the size measurement interface is entered and the camera is set to acquire images; otherwise the size measurement interface is entered directly and the camera is set to acquire images.
Further, after the student login operation is completed, a login process timer is automatically started to count the login time; the student examination flow first judges whether the examination information is being read for the first time since start-up and whether the examination information is empty; if either is true, the examination information is read; if neither is true, the examination information does not need to be read and the examination question information in the examination folder can be read directly.
Further, the reading test information includes the following steps: if the examination information is successfully read, judging whether the examination file name read by starting is the same as the examination file name in the examination information; if the names are different, updating the test file name, and storing the test file name into a test file name recording file; if the names are the same, the examination information reading is completed; and if the examination information reading fails, including overtime examination information reading or abnormal reading process, prompting the user of corresponding information, clearing the interface identity information, and reading the identity card information again at regular time.
Further, if the examination information exists or is read successfully, whether an examination file exists or not is judged; if the examination file does not exist, prompting the user that the examination questions do not exist, and reading the information of the identity card at regular time again; if the examination file exists, judging whether the examination question information in the examination file is read for the first time; if the reading is the initial reading, reading relevant parameters of size measurement, including camera configuration parameters, camera calibration parameters, relevant parameters of calibration results and template information, and setting operation permission as student permission; if the reading is not the initial reading, the relevant parameters of the size measurement do not need to be read; and entering a size measurement interface to start image acquisition after finishing the judgment of whether the examination question information in the examination file is read for the first time.
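Below is a minimal, runnable Python sketch of the student-exam bookkeeping described above: synchronizing the locally recorded examination folder name with the name in the examination information, and loading the measurement parameters only on the first read. The record file name follows the description; everything else is an assumption for illustration.

```python
from pathlib import Path

RECORD_FILE = Path("ExamiationFileName.txt")
first_read_exam_questions = True      # flag bit: examination question info not read yet


def sync_exam_folder_name(name_at_startup: str, name_in_exam_info: str) -> str:
    # If the examination information names a different folder than the one read at
    # start-up, update the examination file name recording file accordingly.
    if name_at_startup != name_in_exam_info:
        RECORD_FILE.write_text(name_in_exam_info, encoding="utf-8")
        return name_in_exam_info
    return name_at_startup


def enter_student_exam(exam_folder: Path) -> None:
    global first_read_exam_questions
    if not exam_folder.is_dir():
        print("The examination questions do not exist; waiting for the next identity card read.")
        return
    if first_read_exam_questions:
        # camera configuration, camera calibration parameters, calibration results
        # and template information are read once and then cached
        print("reading size measurement parameters and template information")
        first_read_exam_questions = False
    print("student permission set; size measurement interface opened, camera acquiring")


if __name__ == "__main__":
    folder_name = sync_exam_folder_name("exam_folder_old", "exam_folder_new")
    enter_student_exam(Path(folder_name))
```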
The invention has the beneficial effects that:
by managing separate permissions for students, teachers and tourists and providing a test question making function under the teacher permission, the invention realizes the editing and determination of examination questions and the image acquisition of the placed parts, thereby completing the measurement;
by reading the identity card information at regular intervals, a change of user is confirmed in time;
by setting a flag bit for the first reading of the examination file size measurement parameters and a flag bit for the first acquisition of the examination information, the system confirms whether the size measurement parameters and the examination information have been read.
Drawings
FIG. 1 is a general structural diagram of a first embodiment of the present invention;
FIG. 2 is a front view of a first embodiment of the present invention;
FIG. 3 is a schematic view of a main body of the first embodiment of the present invention;
FIG. 4 is a front view of the body portion of the first embodiment of the present invention;
FIG. 5 is a schematic view of a testing platform according to a first embodiment of the present invention;
FIG. 6 is a flowchart of the adjustment steps according to the first embodiment of the present invention;
FIG. 7 is a detailed flow chart of the first embodiment of the present invention;
FIG. 8 is a simplified flow chart of a first embodiment of the present invention;
FIG. 9 is a flowchart illustrating a first embodiment of extracting feature information of a source map of a template;
FIG. 10 is a flowchart illustrating a method for extracting a measurement result of a source graph circle measurement type of a template according to a first embodiment of the present invention;
FIG. 11 is a flowchart of extracting a measurement result of a template source graph measurement type according to a first embodiment of the present invention;
FIG. 12 is a flowchart illustrating a method for extracting a measurement result of a template source angle measurement type according to a first embodiment of the present invention;
FIG. 13 is a flowchart of an algorithm for obtaining distortion parameters according to a first embodiment of the present invention;
FIG. 14 is a flowchart of a magnification power acquisition algorithm according to a first embodiment of the present invention;
FIG. 15 is a flowchart of an out-of-bounds detection algorithm for a part to be detected according to a first embodiment of the present invention;
FIG. 16 is a general flow chart of a measurement algorithm for measuring the dimensions of a part to be inspected according to a first embodiment of the present invention;
FIG. 17 is a flowchart of an object search and matching algorithm according to a first embodiment of the present invention;
FIG. 18 is a flowchart of a circle measurement algorithm for a part to be detected according to a first embodiment of the present invention;
FIG. 19 is a flowchart of a line measurement algorithm for a part to be measured according to a first embodiment of the present invention;
FIG. 20 is a flowchart of an angle measurement algorithm for a part to be detected according to a first embodiment of the present invention;
FIG. 21 is an exemplary mask diagram of four measurement types, circle, line, arc and angle, in accordance with a first embodiment of the present invention;
FIG. 22 is a diagram of the image phase and the schematic diagram in step 4.7.11 according to the first embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
The first embodiment is as follows:
as shown in fig. 1, an adjustable dimension measuring apparatus based on machine vision includes an operation table 2, an inspection table 3, a light source 4 and a camera 5. A through hole is formed in the middle of the detection table 3, an object carrying plate 32 is arranged at the through hole, and the object carrying plate 32 is made of transparent materials; the light source 4 is arranged below the carrying plate 32 and corresponds to the carrying plate 32; the camera 5 is arranged right above the carrying plate 32, and the camera 5 is positioned on the camera fixing plate 11; the console 2 is electrically connected to the light source 4 and the camera 5, and the console 2 can control the operation of the light source 4 and the camera 5.
As shown in fig. 2, the detection table 3, the light source 4 and the camera 5 are arranged on the fixed frame 1. The fixed frame 1 is a straight quadrangular prism as a whole and hollow inside, and is arranged above the optical shock isolation table 12, which lowers the center of gravity and isolates external vibration. Casters are provided at the four corners of the bottom of the optical shock isolation table 12.
A telecentric coaxial lens 51 is arranged at the lens position of the camera 5, and a He-Ne laser 52 is arranged at the side surface of the telecentric coaxial lens 51, so that the camera 5, the standard flat crystal 65 and the object carrying plate 32 can be conveniently adjusted to be in a set state.
As shown in fig. 3 and 4, a standard flat crystal 65 is arranged between the camera 5 and the detection table 3 and mounted on the slide rail adjusting device 6; a set included angle is kept between the upper and lower surfaces of the standard flat crystal 65, and in this embodiment a transmission-enhancing film is provided on its lower surface to ensure sufficient transmitted light intensity, while a reflection-enhancing film is provided on the upper surface of the object carrying plate 32. The slide rail adjusting device 6 is located to the side of the camera 5 and the detection table 3 and comprises a vertical plate 61 and a horizontal plate 62. The vertical plate 61 passes through the detection table 3; two parallel vertical rails are provided on the side of the vertical plate 61 close to the detection table 3, and the horizontal plate 62 is arranged on these vertical rails. A horizontal rail is provided on the side of the horizontal plate 62 close to the detection table 3, the standard flat crystal 65 is arranged on this horizontal rail, and its position is adjusted by moving the standard flat crystal 65 along the horizontal rail and the horizontal plate 62 along the vertical rails. The flatness of the part to be detected can be checked by acquiring an image through the standard flat crystal 65. A camera support plate 63 is provided at the top of the vertical plate 61 and a light source support plate 64 at its bottom; the camera support plate 63 and the light source support plate 64 are parallel to each other and horizontal, are made integrally with the vertical plate 61, and a supporting structure is provided at the included angle between the light source support plate 64 and the vertical plate 61. The camera support plate 63 is fixedly connected with the camera 5, and the light source support plate 64 is fixedly connected with the light source 4. The camera support plate 63 also abuts the camera fixing plate 11 and lies against its lower surface; the camera fixing plate 11 is also used for fixing the camera 5 and is arranged on the fixed frame 1.
The detection table 3 is provided with a bracket 34 located on the object carrying plate 32; the bracket 34 is made of a transparent material and is used to fix the part to be detected and prevent it from shifting. The bracket 34 is a straight quadrangular prism as a whole, with a groove formed in its middle. On top of the bracket 34 the part to be detected lies horizontally, so that the camera 5 acquires a front image of the part; in the groove of the bracket 34 the part to be detected stands vertically, so that the camera 5 acquires a side image of the part. The depth of the groove is determined by the difference between the width and the thickness of the part to be detected; with this bracket and its groove, the acquired images are at similar heights whether the front or the side of the part faces upward, so that the image magnification of the camera stays consistent.
The detection table 3 is arranged on the fixed frame 1, and vertical high-precision adjusting slide rails 35 are provided between the detection table 3 and the fixed frame 1 at the four corners of the detection table 3. By adjusting the vertical high-precision adjusting slide rails 35, the overall height and inclination angle of the detection table 3 can be adjusted.
A laser ranging device 31 is arranged between the camera fixing plate 11 and the detection table 3, and the distance between them can be measured by the laser ranging device 31. The laser ranging device 31 comprises laser ranging sensor emitting heads and laser ranging receivers; the laser ranging receivers are arranged at the four corners of the detection table 3, the laser ranging sensor emitting heads are arranged at the four corners of the lower surface of the camera fixing plate 11, and each emitting head faces its corresponding laser ranging receiver.
A micro-adjusting knob 33 is arranged between the object carrying plate 32 and the detection platform 3, and the height and the inclination angle of the object carrying plate 32 can be adjusted by adjusting the micro-adjusting knob 33.
As shown in fig. 5, a transparent checkerboard 36 is further disposed on the inspection stage 3, and the transparent checkerboard 36 is disposed adjacent to the object plate 32. An accurate focusing of the camera 5 can be achieved by the transparent checkerboard 36. An identity card reader 37 is also arranged on the detection table 3.
The console 2 is disposed above the camera fixing plate 11, and the console 2 includes a display module 21 capable of displaying a detection result and a detection process.
In the implementation process, the vertical high-precision adjusting slide rail 35 is adjusted through the laser ranging device 31, so that the camera fixing plate 11 is parallel to the detection table 3; a laser beam emitted by a He-Ne laser 52 penetrates through a telecentric coaxial lens 51, enters the upper surface of a standard flat crystal 65, is projected from the lower surface of the standard flat crystal 65 and reaches a carrying plate 32, wherein interference fringes with alternate light and shade are formed on the lower surface of the standard flat crystal 65 and the upper surface of the carrying plate 32, and the micro-adjusting knob 33 is adjusted according to the fringe pattern to realize the parallelism between the lower surface of the standard flat crystal 65 and the carrying plate 32. And acquiring front and side images of the part to be detected after the adjustment is finished, and uploading the images to the operating table 2 so as to finish the detection of the part to be detected.
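The following is a minimal, runnable Python sketch of the parallelism check implied by the four-corner laser ranging: the camera fixing plate is treated as parallel to the detection table when the four corner distances agree within a tolerance. Sensor access, the example readings and the tolerance value are assumptions for illustration.

```python
def plate_is_parallel(corner_distances_mm: list, tolerance_mm: float = 0.02) -> bool:
    # The camera fixing plate is treated as parallel to the detection table when the
    # spread of the four corner distances stays within the tolerance.
    return max(corner_distances_mm) - min(corner_distances_mm) <= tolerance_mm


if __name__ == "__main__":
    readings_mm = [152.031, 152.028, 152.040, 152.033]   # four corner readings
    print("parallel:", plate_is_parallel(readings_mm))
```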
As shown in fig. 6, an adjustment measurement method of a dimension measurement scoring device includes the following steps:
step one: the camera and the light source are fixedly mounted on the camera fixing plate and the light source support plate, respectively; a bracket and a part to be detected are placed on the object carrying plate; the light source is turned on, and the four vertical high-precision adjusting slide rails are adjusted according to the sharpness of the images continuously acquired by the camera until the images are clear;
step two: according to four groups of data obtained by four laser sensors in the laser ranging device, continuously adjusting the four vertical high-precision adjusting slide rails to enable the four groups of data to be equal, and enabling a camera fixing plate to be parallel to a detection platform;
step three: taking down the bracket and the part to be detected from the detection table; adjusting the horizontal and vertical adjusting slide rails on the slide rail adjusting device to enable the standard flat crystal to face the object carrying plate at intervals, closing the light source, and turning on the He-Ne laser to adjust the lower surface of the standard flat crystal to be parallel to the upper surface of the object carrying plate;
step four: turning off the He-Ne laser, turning on the light source, and removing the standard flat crystal; arranging a support at a set position above a carrying plate, arranging a transparent checkerboard above the support, collecting images, calibrating a flat field and calculating the image magnification;
step five: taking down the transparent checkerboard, placing the part to be detected, enabling the front face of the part to be detected to be upward, and enabling a camera to acquire an image to complete size measurement of the front face length and width and the front face inner items;
step six: and (5) placing the part to be detected into the groove of the bracket with the side surface facing upwards, acquiring an image by using a camera, completing the measurement of the height information of the side surface, and ending the step.
In step three, to adjust the lower surface of the standard flat crystal parallel to the upper surface of the object carrying plate, the light emitted by the He-Ne laser must first be observed through the camera to obtain interference fringes; after the fringes are processed, the difference information of adjacent interference fringes is obtained, and the four micro-adjusting knobs are continuously adjusted according to this difference until it falls below a set value; the interference fringes are then approximately parallel and equidistant, and the lower surface of the standard flat crystal is approximately parallel to the upper surface of the object carrying plate.
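Below is a minimal, runnable Python sketch of one possible fringe-based criterion for this adjustment, assuming the fringe centre positions have already been extracted from the camera image (the fringe detection itself is omitted); the example positions and the set value are assumptions, and interpreting the "difference information" as the spread of adjacent fringe spacings is our reading of the description.

```python
def fringe_spacing_variation(fringe_positions_px: list) -> float:
    # spacings between adjacent fringe centres; their spread is the difference
    # value that the knob adjustment tries to reduce
    spacings = [b - a for a, b in zip(fringe_positions_px, fringe_positions_px[1:])]
    return max(spacings) - min(spacings)


def surfaces_parallel(fringe_positions_px: list, set_value_px: float = 1.5) -> bool:
    # when the variation drops below the set value the fringes are treated as
    # approximately parallel and equidistant
    return fringe_spacing_variation(fringe_positions_px) <= set_value_px


if __name__ == "__main__":
    detected_centres_px = [102.0, 141.5, 181.2, 220.8, 260.1]
    print("spacing variation:", round(fringe_spacing_variation(detected_centres_px), 2))
    print("parallel enough:", surfaces_parallel(detected_centres_px))
```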
As shown in fig. 7 and 8, a dimension measurement scoring method based on machine vision includes the following steps:
step 1: the operation desk senses the operator's actions, opens the software accordingly and automatically executes the software-opening initialization operation;
step 2: after the software-opening initialization is completed, the display module on the operation desk automatically displays a user login interface; the initial user login interface is provided with a 'tourist measurement' button and an 'exit system' button;
step 3: user login is performed according to the operator's actions; two login modes are provided: clicking the 'tourist measurement' button completes the tourist login and enters the tourist measurement flow, or a teacher or student logs in through identity card identification and enters the corresponding teacher operation flow or student examination flow;
step 4: the operation desk completes the user login; for the tourist measurement flow and the student examination flow, the size measurement interface is entered automatically, parameters are set, the size measurement is completed and the flow ends; for the teacher operation flow, a teacher operation panel is displayed on the user login interface, with buttons for downloading test questions, uploading test questions and making test questions;
step 5: the operation desk selects the teacher operation according to the operator's actions; clicking the 'download test questions' button enters the test question downloading flow, which ends after the download is completed; clicking the 'upload test questions' button enters the test question uploading flow, which ends after the upload is completed; clicking the 'make test questions' button enters the test question making flow and the size measurement interface, and the flow ends after the test questions are made.
In step 1, when the software on the console is opened, the software-opening initialization operation is executed automatically; it includes reading the examination folder, hardware initialization and other initialization. Note that the software of the console provides two folders related to this embodiment: one is the examination folder, and the other is the test question making folder generated when the teacher makes test questions; in this embodiment the test question making folder is named "extrinsic files", and this name is kept unchanged. After making the test questions, the teacher can test them and upload them.
The reading of the examination folder comprises the following steps:
step 1.1: judging whether the examination file name recording file ExamiationFileName.txt exists. If the ExamiationFileName.txt file exists, the information in the file is read; the read information is the name of the examination folder. If the ExamiationFileName.txt file does not exist, the user is prompted to download the examination questions from the server, and otherwise the size measurement operation cannot be performed.
Step 1.2: judging whether the examination folder exists according to the folder name read from ExamiationFileName.txt; if the examination folder exists, the files required for the examination are present; if it does not exist, the user is prompted to download the examination questions from the server, and otherwise the size measurement operation cannot be performed.
Because the name of the examination folder is randomly generated, the name of the examination folder needs to be recorded through an examination file name recording file for searching whether the examination folder exists or not; on the other hand, the examination file name recording file does not exist in the examination folder, and the examination file name recording file is only used for searching the examination folder. All parameters and tools required by size measurement are stored in the examination folder, and the parameters and the tools comprise front measurement parameters, side measurement parameters, front calibration parameters, side calibration parameters and test question making tools. It should be noted that reading the test folder is pre-reading, and the purpose is to quickly load the test folder during subsequent size measurement, so as to improve efficiency, and further perform reading of the information stored in the test folder in combination with the authority of the user during actual measurement.
The hardware initialization comprises opening the camera. If the camera is opened successfully, the camera image acquisition function is started when the size measurement interface is entered and real-time image acquisition is carried out; if opening the camera fails, the user is prompted that the camera was not opened successfully and the reason for the failure is given, step 1 ends and the flow ends; the user can proceed to the subsequent step 2 after the camera failure has been resolved.
The other initialization comprises initialization of a login interface and initialization of related variables, and the initialization of the login interface comprises the following aspects:
I. Teacher operation panel visibility: invisible;
II. Tourist login and exit-system operation panel visibility: visible;
III. Test question download and upload progress display panel visibility: invisible;
IV. Operation process prompt box visibility: visible;
V. The timer for regularly reading identity information is enabled;
VI. The operation timeout timer is enabled.
The initialization of the relevant variables includes the following two aspects:
1. The flag bit for the system's first reading of the examination file size measurement parameters is initialized; in this embodiment it is set to true.
2. The flag bit for the system's first acquisition of the examination information is initialized; in this embodiment it is set to true.
The flag bit for the first reading of the examination file size measurement parameters confirms whether the size measurement parameters still need to be read; once they have been read, the flag is set to false, and the user no longer needs to read the parameters after logging in, only to download and update the size measurement parameters with each set of test questions. The flag bit for the first acquisition of the examination information confirms whether the examination information needs to be read: for example, when a user logs in for the first time and the examination information has not been read before, this flag bit is true, so the examination information is read while the test questions are downloaded, and the flag bit is set to false once the test questions and examination information have been downloaded. The examination information is a file recording the examination folder information and includes the examination file name recording file.
In step 3, during user login, the identity card information is read at regular intervals when the user first enters the system, when a teacher login succeeds, and when a login times out, is abnormal or uses a wrong account. The reading of the identity card information is realized by an external identity card reader, and the interval is set to 500 milliseconds in this embodiment. The purpose of reading the identity card information at regular intervals is to react to new identity card information in time.
When the system is entered for the first time and an identity card is inserted, a teacher login is first attempted with the identity card information read by the identity card reader; if the teacher login fails, a student login is attempted; if the student login also fails, related information is prompted and the identity card information is again read at regular intervals. If reading the identity card information fails when the system is first entered, including when no identity card is inserted, the identity card cannot be identified, or the identity card information cannot complete a teacher login or a student login, only a tourist can log in, and the tourist measurement flow is started. It should be noted that after a user has completed a student or teacher login, a later failure to read the identity card information, including when no identity card is inserted, the identity card cannot be identified, or the identity card information cannot complete a teacher or student login, does not change the current login state or login permission; only when the regularly read identity card information can complete another teacher or student login is the system switched to the teacher or student login corresponding to that identity card information.
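A minimal, runnable Python sketch of this login order follows: teacher login is attempted first, then student login, and the tourist role is the fallback. The credential check is simulated with placeholder card identifiers; the real system would verify the identity card information against the server.

```python
from typing import Optional

TEACHER_CARDS = {"teacher-card-01"}   # placeholder card identifiers for illustration
STUDENT_CARDS = {"student-card-01"}


def login_with_card(card_id: Optional[str]) -> str:
    if card_id is None:                # no card inserted or card unreadable
        return "tourist"
    if card_id in TEACHER_CARDS:       # teacher login is attempted first
        return "teacher"
    if card_id in STUDENT_CARDS:       # then student login is attempted
        return "student"
    return "tourist"                   # otherwise only tourist measurement is possible


if __name__ == "__main__":
    print(login_with_card(None))               # -> tourist
    print(login_with_card("student-card-01"))  # -> student
```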
The steps of reading the identity card information at regular time are as follows:
step 2.1: initializing connection of an identity card reader; if the initialization is successful, carrying out the next operation; if the initialization fails, prompting the user to confirm whether the connection of the ID card reader is normal, and simultaneously finishing the timing reading operation.
Step 2.2: card authentication operation between the identity card reader and the identity card; if the card authentication is successful, the next operation is carried out; if the card authentication fails, prompting the user that the identity card authentication fails, closing the connection of the identity card reader and finishing the timing reading operation.
Step 2.3: reading the identity card information; if the reading is successful, the ID card information is filled into an interface for display, and a 'tourist measurement' button is disabled, so that a login thread is automatically started; if the reading fails, prompting the user that the reading of the identity card information fails, closing the connection of the identity card reader and finishing the timing reading operation.
In the step 4, the guest measurement process needs to enter a size measurement interface, wherein the following steps are performed before entering the size measurement interface:
step 3.1: judging whether the examination folder exists or not; if the test questions exist, the next operation is carried out, and if the test questions do not exist, the user is prompted to download the test questions first;
step 3.2: judging whether the examination file information is read for the first time; in this embodiment this means checking whether the flag bit for the system's first acquisition of the examination information is true; if it is, the examination file information parameters are read before the next operation; if not, the next operation is carried out directly;
step 3.3: setting the operation authority as the tourist authority; the tourist authority can only measure the size of the part to be detected and cannot carry out data uploading to a server and test question making operation;
step 3.4: entering a dimension measuring interface, and setting a camera to start to acquire images.
In step 4, the student enters the student examination flow after logging in; a login process timer is first started automatically to count the login time. The student examination flow then judges whether the examination information is being read for the first time since start-up and whether the examination information is empty; if either is true, the examination information is read; if neither is true, the examination information does not need to be read and the examination question information in the examination folder can be read directly.
The examination information reading method comprises the following steps: if the examination information is successfully read, judging whether the examination file name read by starting is the same as the examination file name in the examination information; if the names are different, updating the test file name, and storing the test file name into a test file name recording file; if the names are the same, the examination information reading is completed; and if the examination information reading fails, including overtime examination information reading or abnormal reading process, prompting the user of corresponding information, clearing the interface identity information, and reading the identity card information again at regular time.
And when the test information exists or is read successfully, judging whether the test file exists or not. If the examination file does not exist, prompting the user that the examination questions do not exist, and reading the information of the identity card at regular time again; if the examination file exists, whether the examination question information in the examination file is read for the first time is judged. If the reading is the initial reading, reading relevant parameters of size measurement, including camera configuration parameters, camera calibration parameters, relevant parameters of calibration results and template information, and setting operation permission as student permission; if not, then the measurement related parameters need not be read. And entering a size measurement interface to start image acquisition after judging whether the examination question information in the examination file is read for the first time. The test files in this embodiment comprise test folders.
If the student login operation is not completed, including when the login times out or is abnormal, related information is prompted and the identity card information is read again at regular intervals.
When the operation authority is the student authority, the function of submitting answers and returning to the user login interface is provided, and when the user authority is the teacher or the tourist, the function of returning to the user login interface is provided.
In step 5, after the teacher login operation is completed, a login process timer is started to count the login time. If the teacher login times out or the network is abnormal during login, the tourist login button is enabled and the identity card information is read again.
The teacher operation comprises a test question downloading flow, a test question uploading flow and a test question making flow. In this embodiment, after the teacher logs in, the teacher operation panel is displayed on the user login interface instead of jumping to the size measurement interface immediately, and buttons of "download test questions", "upload test questions", and "make test questions" are provided on the teacher operation panel and correspond to the test question download flow, the test question upload flow, and the test question making flow, respectively.
If the user clicks the 'download test questions' button, the 'download test questions', 'upload test questions', 'make test questions' and 'tourist measurement' buttons are disabled, the display module of the operation desk pops up a prompt box displaying "Downloading test questions, please wait!", the test question downloading thread is started and the download time is counted. Note that the examination information is obtained before the test questions are downloaded. If the examination information is obtained successfully, the test question download file name and the examination file name are set according to the examination information, and the test questions are downloaded according to the compressed file name in the examination information; if obtaining the examination information fails, times out or is abnormal, corresponding information is prompted, the identity card information display is cleared, and the identity card information is read again at regular intervals. The download progress is displayed during downloading; after the test questions have been downloaded, the examination file is decompressed automatically, the examination file information is read, and the user is notified that the download succeeded (a sketch of this flow is given after the list below). The read examination file information includes:
a. camera configuration parameters and camera calibration parameters;
b. calibration result related parameters;
c. template information.
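Below is a minimal, runnable Python sketch of the download flow just described, with the network access simulated locally: the examination information is obtained, the archive and folder names are derived from it, the archive is "downloaded" and decompressed, and the examination folder name is recorded. The field names and paths are illustrative assumptions.

```python
import zipfile
from pathlib import Path


def fetch_exam_info() -> dict:
    # simulated server response; the real flow would request this over the network
    return {"exam_folder": "exam_folder_example", "archive": "exam_folder_example.zip"}


def download_and_unpack(exam_info: dict, work_dir: Path) -> Path:
    archive_path = work_dir / exam_info["archive"]
    # simulate the download by creating an empty archive; a real client would stream
    # the compressed file from the server and report the download progress
    with zipfile.ZipFile(archive_path, "w"):
        pass
    exam_folder = work_dir / exam_info["exam_folder"]
    exam_folder.mkdir(exist_ok=True)
    with zipfile.ZipFile(archive_path) as zf:
        zf.extractall(exam_folder)                       # automatic decompression
    # record the examination folder name so later start-ups can find it
    (work_dir / "ExamiationFileName.txt").write_text(exam_info["exam_folder"], encoding="utf-8")
    print("Test questions downloaded successfully.")
    return exam_folder


if __name__ == "__main__":
    print("examination folder ready:", download_and_unpack(fetch_exam_info(), Path(".")))
```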
If the user clicks the 'upload test questions' button, the test question uploading flow is entered. This flow first judges whether the test question making folder exists. If it exists, the test question folder is compressed with the compression progress displayed, and the compressed file is uploaded to the server after compression is finished; at the same time, the serial numbers of the dimensions to be measured and the judgment basis for a qualified measurement are extracted from the size measurement information in the test question file and uploaded to the server. If it does not exist, the user is prompted "The test questions do not exist, please make the test questions first!". The serial numbers of the dimensions to be measured and the judgment basis for qualification are extracted after editing in the test question making editor, in which the dimensions to be measured and the upper and lower tolerance limits of a qualified dimension are set.
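The following is a minimal, runnable Python sketch of the qualification data uploaded in this flow: for each dimension to be measured, its serial number and the tolerance limits set in the editor form the judgment basis. The data layout and the qualification rule shown here are assumptions for illustration.

```python
from dataclasses import dataclass


@dataclass
class MeasuredDimension:
    number: int          # serial number of the dimension to be measured
    nominal: float       # qualified size set in the test question making editor
    upper_tol: float     # upper tolerance limit
    lower_tol: float     # lower tolerance limit

    def is_qualified(self, measured: float) -> bool:
        # judgment basis: the measured value must lie inside the tolerance band
        return self.nominal + self.lower_tol <= measured <= self.nominal + self.upper_tol


if __name__ == "__main__":
    dims = [MeasuredDimension(1, 20.00, +0.05, -0.05),
            MeasuredDimension(2, 35.50, +0.10, -0.02)]
    payload = [(d.number, d.nominal, d.upper_tol, d.lower_tol) for d in dims]
    print("judgment basis uploaded to the server:", payload)
    print("dimension 1 at 20.03 qualified:", dims[0].is_qualified(20.03))
```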
If the user clicks the 'make test questions' button and the examination file information for the test questions made by the teacher has not been read before, the examination file information is read and the operation permission is set to the teacher permission; the teacher permission allows making test questions and measuring the dimensions of the parts used for making them. The dimension measurement interface is then opened, the camera is set to start acquiring images, the parameters are set and the test question making is completed. It should be noted that the dimension measurement interface entered under the teacher permission measures parts using the information stored in the test question making folder; the other permissions, including the student and tourist permissions, enter the dimension measurement interface and measure using the information stored in the examination folder.
The operation of exiting the system can be performed in both the user login interface and the size measurement interface; after the 'exit system' button in either interface is clicked, the user is asked to confirm exiting the system and, if confirmed, the system exits.
A dimension measurement scoring system based on machine vision comprises a dimension measurement interface provided with a system setting button, a front measurement button and a side measurement button. The dimension measurement interface covers the functions of parameter configuration, test question making, template calibration and dimension measurement. Parameter configuration, test question making and template calibration are arranged under the directory corresponding to the system setting button; a password must be entered to open the system settings, and parameter configuration, test question making and template calibration can be performed only after the correct password is entered, while the test question making function can be operated only with teacher authority. Clicking the 'front measurement' or 'side measurement' button starts the dimension measurement process. The dimension measurement interface also has an 'exit system' button; after it is clicked, the user is asked to confirm exiting the system, and if confirmed, the system exits. The dimension measurement interface that students are allowed to enter additionally has a 'submit answer' button; after it is clicked, the user is asked to confirm submitting the answer, and if confirmed, the answer, which includes the uploaded pictures, scores and the like, is submitted and the student exits the system.
After the 'parameter configuration' button is clicked, a parameter setting dialog box pops up, which provides camera parameter configuration, calibration parameter configuration and data statistics functions. The camera parameter configuration includes pixel binning, acquisition frame rate, processing frame rate and exposure settings. The calibration parameter configuration includes the settings of the number of transverse points, the number of longitudinal points and the unit interval. The data statistics include the settings of the number of pictures and the filter coefficient.
After the 'make test question' button is clicked, a test question making dialog box pops up, in which front test question making and side test question making can be completed; front test question making corresponds to the front of the part to be measured and side test question making corresponds to its side. Front and side test question making correspond to the 'front' and 'side' buttons arranged in the dialog box. Clicking the 'front' or 'side' button in the test question making dialog box enters the test question making process, which comprises the following steps:
step 4.1: acquiring a real-time image acquired by a current camera, opening a test question making dialog box, and transmitting the real-time image;
step 4.2: making the test questions, which includes task editing for the various dimension measurement types; the upper and lower tolerance limits of each measurement can be set;
step 4.3: when exiting from the test question making dialog box, extracting the test question template information, and simultaneously measuring the size of the template source diagram; the template source diagram represents an image of the standard component when the standard component is randomly placed in the field of view of the camera;
step 4.4: and storing the related measurement results for use in real-time dimension measurement.
It should be noted that before the camera collects the real-time image, the system guides the placement of the part to be measured. The guiding process is as follows: the image display window of the dimension measurement interface displays a guide map generated from the template source image, guiding the user to place the corresponding measurement surface of the object facing upward within an appropriate area. Guiding the placement of the part prevents measuring the wrong surface and also benefits the next measurement. The guide map is produced as follows:
step 4.1.1: taking the gray value of each pixel point in the template source image as 80% of the original value;
step 4.1.2: taking the gray value of each pixel point in the blank image (the gray value of each pixel is 255) with the same size to be 20% of the original value;
step 4.1.3: superposing the two images according to the gray value to obtain a guide image; wherein, the pixel point with the gray value of 0 displays pure black, and the pixel point with the gray value of 255 displays pure white.
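For illustration, the blending of steps 4.1.1 to 4.1.3 can be sketched in a few lines of numpy; the function name make_guide_image and the 8-bit grayscale assumption are illustrative and not part of the patent.

```python
import numpy as np

def make_guide_image(template_gray: np.ndarray) -> np.ndarray:
    """Blend the template source image (80%) with a blank white image (20%).

    template_gray: 8-bit grayscale template source image.
    Returns an 8-bit guide image where 0 is pure black and 255 pure white.
    """
    template = template_gray.astype(np.float32)
    blank = np.full_like(template, 255.0)      # blank image, every pixel 255
    guide = 0.8 * template + 0.2 * blank       # superpose by gray value
    return np.clip(guide, 0, 255).astype(np.uint8)
```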
As shown in fig. 21, the task editing for the various dimension measurement types in step 4.2 includes editing for the measurement types of the four basic elements, i.e., circle, line, arc and angle, to obtain the corresponding mask feature information, i.e., a mask image. The line widths of the circle, line, arc and angle element measurement types can be adjusted and are displayed as white areas in the mask image; the corresponding feature information is output after the line width is adjusted.
The circle measurement type task editing comprises the following steps:
step 4.2.1: determining a measurement type name;
step 4.2.2: determining a measurement switch value, a circle radius accurate value, a tolerance upper limit and a tolerance lower limit; the above parameters are input by a user;
step 4.2.3: determining center coordinates, width, height and radius; the parameters are calculated according to vector information corresponding to the minimum line width drawn measurement type graph during editing.
The line measurement type task editing comprises the following steps:
step 4.3.1: determining a measurement type name;
step 4.3.2: determining a measurement switch value, the accurate (nominal) value of the dimension, a tolerance upper limit and a tolerance lower limit; the above parameters are input by a user;
step 4.3.3: determining an inclination angle, a length of a straight line segment and coordinates of two end points; the parameters are calculated according to vector information corresponding to the minimum line width drawn measurement type graph during editing.
The arc measurement type task editing comprises the following steps:
step 4.4.1: determining a measurement type name;
step 4.4.2: determining a measurement switch value, the accurate (nominal) value of the dimension, a tolerance upper limit and a tolerance lower limit; the above parameters are input by a user;
step 4.4.3: determining the radius of the arc, the angle of the arc and the coordinates of three points, two of which are located at the start and end points of the arc and the third elsewhere on the arc; the parameters are calculated according to vector information corresponding to the minimum line width drawn measurement type graph during editing.
The angular measurement type task editing comprises the following steps:
step 4.5.1: determining a measurement type name;
step 4.5.2: determining a measurement switch value, the accurate (nominal) value of the dimension, a tolerance upper limit and a tolerance lower limit; the above parameters are input by a user;
step 4.5.3: determining the included angle and three point coordinates forming the included angle, wherein one point is positioned at the vertex of the included angle; the parameters are calculated according to vector information corresponding to the minimum line width drawn measurement type graph during editing.
Combined measurement types, such as the distance between two points, the distance between a point and a line, and the distance between two straight line segments, can be obtained through combinations of the four basic elements (circle, line, arc and angle). The distance between two points includes: the distance between two circle centers, between a circle center and an arc center, between a circle center and the vertex of an included angle, between an arc center and the vertex of an included angle, between the vertices of two included angles, and the like. The point-to-line distance includes: the distance from a circle center to a straight line, from an arc center to a straight line, from the vertex of an included angle to a straight line, and the like.
The two-point distance measurement algorithm first calculates the coordinates of the two points and then the distance between them. The point-to-line distance measurement algorithm first calculates the point coordinates and the coordinates of the two end points of the straight line segment, and then the distance from the point to the straight line segment. The distance measurement algorithm for two straight line segments first calculates the end point coordinates of both segments, then calculates the distances from the two end points of one segment to the other segment, and takes the average of the two distances as the distance between the two straight line segments.
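As a rough illustration of these combined measurements, the following Python sketch computes the two-point distance, the point-to-segment distance and the averaged segment-to-segment distance described above; the function names and the clamping of the projection onto the segment are assumptions.

```python
import math

def point_distance(p, q):
    """Distance between two points, e.g. two circle centres."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def point_to_segment_distance(p, a, b):
    """Distance from point p to the straight line segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return point_distance(p, a)
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))                  # clamp onto the segment
    return point_distance(p, (ax + t * dx, ay + t * dy))

def segment_to_segment_distance(a1, a2, b1, b2):
    """Average of the distances from the two end points of one segment
    to the other segment, as described above."""
    d1 = point_to_segment_distance(a1, b1, b2)
    d2 = point_to_segment_distance(a2, b1, b2)
    return (d1 + d2) / 2.0
```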
In this embodiment, the measurement data of four basic elements, i.e., a circle, a line, an arc, and an angle, are displayed in real time, wherein the circle measurement type displays a center coordinate and a radius value in real time, the straight line measurement type displays an inclination angle and a length value in real time, the arc measurement type displays an arc angle in real time, and the angle measurement type displays an included angle degree in real time. And finishing the circle, line, arc and angle measurement type task editing to obtain corresponding mask characteristic information.
As shown in fig. 9, the extraction of the test question template information in step 4.3 includes the following steps:
step 4.6.1: carrying out mean value filtering processing on the template source image; the size of the average filtering window of the embodiment is 5 × 5; the template source image is an image of the front surface and the side surface of the standard part obtained by a camera;
step 4.6.2: performing thresholding; pixels whose gray value is greater than the set threshold are set to 0, and the rest are set to 255; in this embodiment, the threshold value is 100;
step 4.6.3: extracting outer contour points of the standard part in the template source diagram;
step 4.6.4: solving the minimum circumscribed circle of the outer contour to obtain the coordinate and the radius of the circle center;
step 4.6.5: extracting ROI according to the coordinate and radius of the circle center; the ROI is a rectangle, the side length of the ROI is the diameter of the minimum circumscribed circle of the standard part in the template source image, the center of the ROI is the center of the minimum circumscribed circle of the standard part, and the rotation angle of the ROI is zero; ROI represents a region of interest
step 4.6.6: extracting hierarchical contour information from the ROI, which includes outer contour information and inner contour information; the outer contour and the inner contour satisfy a parent-child hierarchical relationship, where the outer contour is the parent contour and the inner contour is the child contour;
step 4.6.7: solving a minimum external rectangle with a zero rotation angle of the external outline to obtain the length and the width of the rectangle, and judging whether the length or the width of the rectangle is larger than a set value or not; in this embodiment, whether the length is greater than the row number of the ROI minus 2 and the width is greater than the column number of the ROI minus 2;
step 4.6.8: if the length or the width of the rectangle is larger than the set value, the standard component is out of bounds; jump to step 4.6.16;
step 4.6.9: if the length or the width of the rectangle is not larger than a set value, solving the mass center and the minimum external rectangle of the outer contour to obtain the center coordinate, the rotation angle, the length and the width and the area of the rectangle; then judging whether an inner contour exists or not;
step 4.6.10: if no inner contour exists, go to step 4.6.16;
step 4.6.11: if the inner contour exists, judging whether only one effective inner contour exists;
step 4.6.12: if only one effective inner contour exists, calculating its centroid and the center coordinates, rotation angle, length, width and area of its minimum circumscribed rectangle, then jumping to step 4.6.16;
step 4.6.13: if a plurality of effective inner contours exist, traversing all the effective inner contours, finding the maximum and minimum of the minimum circumscribed rectangle areas of the inner contours, and checking whether the difference between the maximum and the minimum is larger than a set value; in this embodiment, the set value is the square of 10 pixels;
step 4.6.14: if the difference between the maximum and minimum of the minimum circumscribed rectangle areas is larger than the set value, there are a maximum inner contour and a minimum inner contour; for each of them, the centroid and the center coordinates, rotation angle, length, width and area of the minimum circumscribed rectangle are calculated, then jump to step 4.6.16;
step 4.6.15: if the difference between the maximum and minimum of the minimum circumscribed rectangle areas is less than or equal to the set value, the standard component has a plurality of effective maximum inner contours; one of them is selected at random, its centroid and the center coordinates, rotation angle, length, width and area of its minimum circumscribed rectangle are calculated, then jump to step 4.6.16;
step 4.6.16: according to the task editing information of the circle, line, arc and angle measurement types, extracting characteristic information of the circle, line, arc and angle measurement types;
step 4.6.17: and ending the extraction process of the test question template information.
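A minimal OpenCV-style sketch of steps 4.6.1 to 4.6.5 (mean filtering, thresholding, outer contour extraction and square ROI cut-out) is given below; the helper name and the clipping at the image border are assumptions, while the thresholding direction (greater than the threshold becomes 0, otherwise 255) follows step 4.6.2.

```python
import cv2
import numpy as np

def extract_template_roi(template_gray: np.ndarray, blur_ksize=5, threshold=100):
    """Mean filter, threshold, find the outer contour of the standard part and
    cut out a square ROI centred on its minimum enclosing circle."""
    blurred = cv2.blur(template_gray, (blur_ksize, blur_ksize))   # mean filtering
    # pixels brighter than the threshold become 0, the rest 255
    _, binary = cv2.threshold(blurred, threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    outer = max(contours, key=cv2.contourArea)                    # outer contour of the part
    (cx, cy), radius = cv2.minEnclosingCircle(outer)
    side = int(round(2 * radius))                                 # ROI side = circle diameter
    x0 = max(int(round(cx - radius)), 0)
    y0 = max(int(round(cy - radius)), 0)
    roi = template_gray[y0:y0 + side, x0:x0 + side]
    return roi, (cx, cy), radius
```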
As shown in fig. 10, in the step 4.6.16, the process of extracting the circle measurement type information in the template source diagram includes the following steps:
step 4.7.1: extracting a circle measurement type ROI from a template source graph according to corresponding circle measurement type mask feature information obtained by task editing;
step 4.7.2: filtering the ROI grayscale image, wherein the filtering is Gaussian filtering; in this embodiment, the size of the filtering window of the gaussian filtering process is 5 × 5, and the standard deviation is 2;
step 4.7.3: carrying out Hough circle finding processing on the filtered image to obtain a plurality of circles;
step 4.7.4: comparing the circles obtained in the step 4.7.3 with the centers of the selected mask circles respectively, and judging whether the center offset is smaller than a set value; in the embodiment, the set value is 2 mm;
step 4.7.5: if the circle center offset is greater than or equal to the set value, reducing the Hough circle-finding threshold parameter and jumping to step 4.7.3; changing the Hough circle-finding parameter threshold here means reducing it, and the same applies in the subsequent processes and steps; the parameter threshold is the accumulator threshold for circle centers in the Hough gradient circle-finding method: the smaller the threshold, the more incomplete circles can be detected, and the larger the threshold, the closer the detected circles are to perfect circles; in this embodiment the threshold ranges from 1/5 of the corresponding mask circle circumference down to 5 pixels, decreasing by 5 pixels per execution;
step 4.7.6: if the circle center offset is smaller than the set value, screening out a Hough fitting circle with the minimum absolute difference value between the template source image and the radius of the mask circle, and entering step 4.7.7;
step 4.7.7: comparing the absolute difference value with the radius of the mask circle, and judging whether the absolute difference value of the radii is smaller than a set value; in the embodiment, the set value is 2 mm;
step 4.7.8: if the absolute difference value of the radii is larger than or equal to the set value, changing the parameter threshold value of the Hough circle finding threshold value, and jumping to step 4.7.3;
step 4.7.9: if the absolute difference value of the radii is smaller than a set value, finding a proper Hough fitting circle;
step 4.7.10: performing edge detection on the ROI gray level image by applying a Canny operator;
step 4.7.11: performing a bitwise AND of the Canny-processed ROI image with the mask ROI image; the AND operation means that the edge points obtained by Canny edge detection in the ROI image that fall inside the white ring drawn in the mask ROI image are retained, and the remaining points are removed, as shown in FIG. 22;
step 4.7.12: extracting the edge contour points to be detected from the ANDed image;
step 4.7.13: screening out proper contour points according to the distance from the contour points to the Hough fitting circle to form a new contour point set;
step 4.7.14: fitting a circle by using a least square method on the new contour point set to obtain a circle center and a radius;
step 4.7.15: the flow is ended.
It should be noted that the process of extracting the arc measurement type information in the template source image is the same as the process of extracting the circle measurement type information; the only difference is the extracted ROI.
It should be noted that there may be mask feature information of a plurality of circle measurement types, and the above measurement procedure is performed separately for the mask feature information of each circle measurement type; the measurement process is also performed separately for the arc, line and angle measurement type mask feature information.
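Step 4.7.14 fits a circle to the screened contour points by least squares; the patent does not specify the formulation, so the sketch below uses the common algebraic (Kasa) least-squares circle fit as one plausible reading.

```python
import numpy as np

def fit_circle_least_squares(points):
    """Algebraic least-squares circle fit for an (N, 2) array of contour points;
    returns the centre (cx, cy) and the radius."""
    points = np.asarray(points, dtype=float)
    x, y = points[:, 0], points[:, 1]
    # Solve  a*x + b*y + c = -(x^2 + y^2)  in the least-squares sense,
    # where centre = (-a/2, -b/2) and radius^2 = cx^2 + cy^2 - c.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    (a_coef, b_coef, c_coef), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -a_coef / 2.0, -b_coef / 2.0
    radius = np.sqrt(cx ** 2 + cy ** 2 - c_coef)
    return (cx, cy), radius
```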
As shown in fig. 11, in the step 4.6.16, the process of extracting the line measurement type information in the template source graph includes the following steps:
step 4.8.1: extracting a linear measurement type ROI from a template source graph according to corresponding line measurement type mask characteristic information obtained by task editing;
step 4.8.2: performing edge detection on the ROI gray level image by applying a Canny operator;
step 4.8.3: carrying out Hough line finding processing on the image subjected to Canny processing to obtain a plurality of straight line segments;
step 4.8.4: comparing the straight line segment obtained in the step 4.8.3 with the inclined angle of the straight line segment of the mask, and judging whether the inclined angle deviation is smaller than a set value; in this embodiment, the set value is 7.5 degrees;
step 4.8.5: if the inclination angle deviation is greater than or equal to the set value, changing the Hough line-finding parameters and jumping to step 4.8.3; the Hough line-finding parameters include the threshold parameter of the accumulation plane, the minimum line segment length and the maximum line gap; the threshold parameter of the accumulation plane represents the value that must be reached in the accumulation plane for a part of the image to be identified as a straight line; the maximum line gap represents the maximum distance allowed when connecting points on the same line into one segment; in this embodiment, the threshold parameter of the accumulation plane is 160 - 2 × M, the minimum line segment length is 80 - M, and the maximum line gap is 36 - 2 × Q, where 0 ≤ M < 71 and 0 ≤ Q ≤ 16, the initial values of M and Q are 0, and each time step 4.8.3 is re-executed, M increases by 5 and Q increases by 1;
step 4.8.6: if the deviation of the inclined angle is smaller than a set value, screening out the straight line segment with the longest length from the straight line segments meeting the deviation of the inclined angle;
step 4.8.7: performing a bitwise AND of the Canny-processed ROI image with the mask ROI image;
step 4.8.8: extracting the edge contour points to be detected from the ANDed image;
step 4.8.9: screening out proper contour points according to the distance from the contour points to the Hough fitting straight line segment to form a new contour point set;
step 4.8.10: fitting a straight line to the new contour point set by using a least square method, and solving an inclination angle and coordinates of two end points of the straight line section;
step 4.8.11: the flow is ended.
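Step 4.8.10 fits a straight line to the new contour point set by least squares and derives the inclination angle and the two end points; the sketch below uses a total least-squares (principal direction) fit, which is one plausible reading of that step, with the end points taken as the extreme projections of the contour points onto the fitted line.

```python
import numpy as np

def fit_line_least_squares(points):
    """Total least-squares line fit for an (N, 2) array of contour points.
    Returns the inclination angle in degrees and two end point coordinates."""
    points = np.asarray(points, dtype=float)
    mean = points.mean(axis=0)
    # Direction of the line = principal singular vector of the centred points.
    _, _, vt = np.linalg.svd(points - mean)
    direction = vt[0]                                   # unit direction vector
    angle = np.degrees(np.arctan2(direction[1], direction[0]))
    # Project every point onto the line, keep the two extreme projections.
    t = (points - mean) @ direction
    p1 = mean + t.min() * direction
    p2 = mean + t.max() * direction
    return angle, p1, p2
```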
As shown in fig. 12, in the step 4.6.16, the process of extracting the angular measurement type information in the template source map includes the following steps:
step 4.9.1: extracting an angle measurement type ROI from a template source graph according to mask feature information corresponding to the angle measurement type obtained by task editing;
step 4.9.2: performing edge detection on the ROI gray level image in the template source image by using a Canny operator;
step 4.9.3: carrying out Hough line finding processing on the image subjected to Canny processing to obtain a plurality of straight line segments in a template source image;
step 4.9.4: screening two groups of straight line segments from the straight line segments obtained in step 4.9.3 according to the oblique angles of the two straight line segments forming the included angle in the mask, namely a first group and a second group: the absolute value of the difference between the oblique angle of a straight line segment in the first group and the oblique angle of the first straight line segment of the mask angle is smaller than a set value (7.5 degrees in this embodiment), and the absolute value of the difference between the oblique angle of a straight line segment in the second group and the oblique angle of the second straight line segment of the mask angle is also smaller than the set value; if each screened group contains at least one straight line segment, the screening is successful, otherwise the screening fails;
step 4.9.5: if the step 4.9.4 is not successfully screened, changing Hough line finding parameters, and jumping to the step 4.9.3 for execution;
step 4.9.6: if the screening in step 4.9.4 is successful, calculating, for each straight line segment of the mask angle, the distance from its end point at the end far away from the included angle to the other straight line segment, obtaining two distance values and taking the smaller one, D; then traversing the first group of straight line segments obtained in step 4.9.4 and screening out the segments whose distance to a set point is smaller than a set value to form a new first group, where the set point is the end point of the first straight line segment in the mask feature information at the end away from the included angle; in this embodiment the set value is 70 pixels if D/5 is greater than 70 pixels, and D/5 otherwise; similarly, traversing the second group of straight line segments obtained in step 4.9.4 and screening out the segments whose distance to the set point is smaller than the set value to form a new second group, where the set point is the end point of the second straight line segment in the mask feature information at the end away from the included angle and the set value is the same as for the first group; if the new first group and the new second group each contain at least one straight line segment, the screening is successful, otherwise the screening fails;
step 4.9.7: if the screening is not successful in the step 4.9.6, changing Hough straight line finding parameters, and jumping to the step 4.9.3 for execution;
step 4.9.8: if the screening in the step 4.9.6 is successful, traversing a random group of straight-line segments in the new first group of straight-line segments and the new second group of straight-line segments, and screening out the straight-line segments with the longest length;
step 4.9.9: calculating the included angle between the longest straight-line segment obtained in the step 4.9.8 and each straight-line segment in the other group of straight-line segments;
step 4.9.10: comparing the included angle obtained in the step 4.9.9 with the mask included angle, screening out the straight line segments meeting the conditions in another group of straight line segments according to the fact that the difference value of the included angle obtained in the step 4.9.9 and the mask angle degree is smaller than a set value, and judging whether the screening is successful or not; if at least one straight line segment is screened out from the other group of straight line segments, the screening is successful;
step 4.9.11: if the screening is not successful in the step 4.9.10, changing Hough straight line finding parameters, and jumping to the step 4.9.3 for execution;
step 4.9.12: if the screening in step 4.9.10 is successful, screening out the straight-line segment with the longest length from the other group of straight-line segments after the screening in step 4.9.10 is completed;
step 4.9.13: obtaining two straight line segments through the step 4.9.8 and the step 4.9.12, calculating an included angle between the two straight line segments, comparing the included angle with the included angle of the mask, and judging whether the deviation is smaller than a set value; in this embodiment, the set value is 10 degrees;
step 4.9.14: if the deviation is greater than or equal to the set value, the extraction of the angle measurement type feature information fails; jump to step 4.9.19;
step 4.9.15: if the deviation is smaller than a set value, finding an outer contour of the image processed by the Canny;
step 4.9.16: screening two groups of contour point sets according to the distance between the contour points and the two obtained straight line segments;
step 4.9.17: respectively fitting straight lines to the two groups of contour point sets by using a least square method to obtain coordinates and an oblique angle of the two end points;
step 4.9.18: further solving the included angle and the vertex coordinate of the two straight line segments;
step 4.9.19: the flow is ended.
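Steps 4.9.17 and 4.9.18 reduce to computing the included angle and the vertex (intersection) of the two fitted straight lines; a small sketch, assuming each line is given by two end points, is shown below. The orientation convention (which end point comes first) determines whether the reported angle or its supplement is returned.

```python
import numpy as np

def included_angle_and_vertex(p1, p2, q1, q2):
    """Included angle (degrees) and vertex of two straight lines, each line
    given by two end points (x, y). Raises if the lines are parallel."""
    p1, p2, q1, q2 = (np.asarray(p, dtype=float) for p in (p1, p2, q1, q2))
    d1, d2 = p2 - p1, q2 - q1
    cos_t = np.dot(d1, d2) / (np.linalg.norm(d1) * np.linalg.norm(d2))
    angle = np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0)))   # in [0, 180]
    # Vertex: solve p1 + s*d1 = q1 + t*d2 for (s, t).
    A = np.column_stack([d1, -d2])
    s, _t = np.linalg.solve(A, q1 - p1)
    vertex = p1 + s * d1
    return angle, vertex
```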
After the 'template calibration' button is clicked, the template calibration interface opens, in which distortion correction and magnification calculation can be performed for the current measurement; the distortion parameters and the magnification obtained by calibration are used for dimension measurement during real-time measurement. Clicking the 'front' or 'side' button arranged in the template calibration interface enters the template calibration process, which comprises the following steps:
step 5.1: placing calibration plates at different positions in a view field, and respectively collecting calibration plate images;
step 5.2: after the calibration board image acquisition is finished, the camera acquisition is closed, and a calibration algorithm is called to perform image calibration processing;
step 5.3: after the calibration processing is finished, updating the calibration parameters to the latest calibration parameters;
step 5.4: and exiting the template calibration interface.
As shown in fig. 13 and 14, the image calibration processing in step 5.2 includes obtaining the distortion parameters and obtaining the magnification. The distortion parameters are obtained by the Zhang Zhengyou calibration (distortion correction) algorithm, which comprises the following steps:
step 5.1.1: reading the calibration image data of the acquired images and the calibration parameters, where the number of transverse points in the calibration parameters refers to the number of rows of the calibration plate checkerboard, the number of longitudinal points refers to the number of columns of the checkerboard, and the unit interval refers to the real physical size of each checkerboard square;
step 5.1.2: extracting angular point information from each frame of calibration image;
step 5.1.3: further extracting sub-pixel corner information by using the extracted corner information;
step 5.1.4: initializing a space three-dimensional coordinate system of an angular point on a calibration plate;
step 5.1.5: calibrating the camera using the extracted sub-pixel corner information and the spatial three-dimensional coordinate information of the corners on the calibration plate, to obtain the camera intrinsic parameters and distortion parameters as well as the rotation vector and translation vector of each frame of image;
step 5.1.6: evaluating the calibration result; the distortion parameters obtained by the camera calibration are first used to re-project the spatial three-dimensional coordinate points of each frame of image to obtain new projection points, and the error between the new projection points and the original image points is calculated; if the error is less than the set value of 0.15 pixel, the requirement is met, the calibration result and the distortion parameters are saved, and the process ends; if the error is greater than or equal to the set value of 0.15 pixel, the calibration is not qualified, the process ends and the user is prompted to collect the calibration images again.
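A compact OpenCV sketch of steps 5.1.1 to 5.1.6 (corner extraction, sub-pixel refinement, calibration and the 0.15-pixel reprojection check) is given below; cv2.findChessboardCorners, cv2.cornerSubPix, cv2.calibrateCamera and cv2.projectPoints are standard OpenCV calls, while the helper name, the corner-count convention and the termination criteria are assumptions.

```python
import cv2
import numpy as np

def calibrate_from_images(images, cols, rows, unit_mm, max_error=0.15):
    """Chessboard calibration sketch: extract sub-pixel corners, run
    cv2.calibrateCamera, then check the mean reprojection error.
    cols/rows are the numbers of inner corners per row/column, unit_mm the
    physical size of one checkerboard square."""
    objp = np.zeros((cols * rows, 3), np.float32)            # corner grid, Z = 0
    objp[:, :2] = np.mgrid[0:cols, 0:rows].T.reshape(-1, 2) * unit_mm
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)
    obj_points, img_points = [], []
    for gray in images:                                      # grayscale frames
        found, corners = cv2.findChessboardCorners(gray, (cols, rows))
        if not found:
            continue
        corners = cv2.cornerSubPix(gray, corners, (5, 5), (-1, -1), criteria)
        obj_points.append(objp)
        img_points.append(corners)
    if not obj_points:
        raise ValueError("no chessboard corners were found")
    _, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, images[0].shape[::-1], None, None)
    # Mean reprojection error: project the board corners back and compare.
    total_err, total_pts = 0.0, 0
    for objp_i, imgp_i, rvec, tvec in zip(obj_points, img_points, rvecs, tvecs):
        proj, _ = cv2.projectPoints(objp_i, rvec, tvec, K, dist)
        total_err += float(np.sum(np.linalg.norm(imgp_i - proj, axis=2)))
        total_pts += len(objp_i)
    mean_err = total_err / total_pts
    return (K, dist) if mean_err < max_error else None
```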
The process of obtaining the magnification includes the steps of:
step 5.2.1: correcting a certain collected frame of calibration image by using a calibration result;
step 5.2.2: extracting corner information from the corrected image;
step 5.2.3: extracting sub-pixel angular point information;
step 5.2.4: traversing the columns of corner points in the corrected image and, for each column, calculating and storing the distance from the corner in the first row to the corner in the last row;
step 5.2.5: sorting the distances stored in each column;
step 5.2.6: selecting a plurality of columns with the middle column as the center, and accumulating the saved distances of the selected columns;
step 5.2.7: according to the accumulated value, calculating an average value;
step 5.2.8: solving the magnification according to the mean value, the number of columns and the physical size, wherein the calculation formula of the magnification is mean value/(number of columns-2)/physical size;
step 5.2.9: the flow is ended.
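The magnification computation of steps 5.2.4 to 5.2.8 can be sketched as follows, keeping the formula exactly as stated above (mean distance / (point count - 2) / physical size); the corner-array layout, the width of the band of columns around the middle column and the interpretation of the point count are assumptions, since the text does not fix them.

```python
import numpy as np

def magnification_from_corners(corners, point_count, unit_mm, band=5):
    """Magnification sketch.

    corners:     sub-pixel corner array shaped (rows, cols, 2) from the corrected image.
    point_count: checkerboard point number from the calibration parameters
                 (the formula above divides by point_count - 2).
    unit_mm:     real physical size of one checkerboard square.
    band:        number of columns around the middle column to accumulate.
    """
    rows, cols, _ = corners.shape
    # Distance from the first-row corner to the last-row corner of every column.
    dists = np.linalg.norm(corners[-1] - corners[0], axis=1)       # shape (cols,)
    mid = cols // 2
    sel = dists[max(mid - band // 2, 0): mid + band // 2 + 1]
    mean_dist = sel.mean()                                         # pixels
    return mean_dist / (point_count - 2) / unit_mm                 # pixels per mm
```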
The size measurement process comprises the following steps:
step 6.1: detecting the out-of-bounds of the part according to the current real-time image, and judging whether the detected part is out of bounds or not; if the part is out of bounds, popping up a prompt box to prompt a user that the part to be detected is out of bounds, and carrying out next operation after the user closes the prompt box; if the boundary is not out of range, directly carrying out the next operation;
step 6.2: judging whether a front template or a side template exists, if so, carrying out the next operation, and if not, prompting to manufacture the template first and then carry out measurement, and ending the process;
step 6.3: judging whether an image is acquired; if the image is collected, starting a front/side dimension measurement processing thread and operating a dimension measurement algorithm; if the image is not acquired, prompting the user that the image is not acquired, and ending the process;
step 6.4: judging whether the number of processed pictures has reached the processing threshold; if the threshold has not been reached, prompting that the camera has been started to continue collecting n images, where n is the number of missing images, and ending the process; if the threshold has been reached, performing front/side data processing; the data processing calculates the standard deviation of the per-picture processing results, rejects outliers according to the standard deviation, and averages the remaining data;
step 6.5: and after the data processing is finished, displaying the measurement result of each size on the interface.
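The front/side data processing of step 6.4 (standard deviation, outlier rejection, averaging of the remaining data) can be sketched as below; the coefficient k is assumed to correspond to the filter coefficient from the data statistics settings, which the patent does not state explicitly.

```python
import numpy as np

def robust_average(values, k=1.0):
    """Average the per-picture results after discarding values that deviate
    from the mean by more than k standard deviations."""
    values = np.asarray(values, dtype=float)
    mean, std = values.mean(), values.std()
    if std == 0:
        return mean
    kept = values[np.abs(values - mean) <= k * std]
    return kept.mean() if kept.size else mean
```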
As shown in fig. 15, the part out-of-bounds detection in step 6.1 includes the following steps:
step 6.1.1: reading a to-be-tested image and a template source image of a part to be tested;
step 6.1.2: carrying out differential processing on a to-be-detected image and a template source image of a part to be detected, and judging whether the two frames of images are consistent; if the shape and the displacement of the part to be detected are consistent, indicating that the shape and the displacement of the part to be detected are unchanged, setting the Flag bit to be 0, and if the shape and the displacement of the part to be detected are inconsistent, indicating that the shape or the displacement of the part to be detected are changed, and setting the Flag bit to be 1;
step 6.1.3: filtering the graph to be tested; removing high-frequency noise points through median filtering processing, and keeping outline edge information, wherein a median filtering window in the embodiment is 9 pixels by 9 pixels;
step 6.1.4: carrying out grey threshold value binarization processing on the filtered image; wherein the gray value of the pixel greater than the set threshold is set to 255, otherwise, it is set to 0; in the embodiment, the set threshold value is 180;
step 6.1.5: searching all closed-loop contours in the image; the closed-loop contour means that the distance between any two adjacent contour points in the contour is smaller than a set value, and the set value of the embodiment is 2 pixels;
step 6.1.6: calculating the maximum closed-loop contour perimeter and judging whether the perimeter meets the set condition; the set condition is that the maximum closed-loop contour perimeter is not less than 0.99 times and not more than 1.01 times the image perimeter; if the perimeter does not meet the condition, check the Flag bit and end the process: if Flag equals 0, the detection result is that the part to be detected is out of bounds but its shape and displacement are unchanged, and if Flag equals 1, the detection result is that the part to be detected is out of bounds and its shape or displacement has changed; if the perimeter meets the condition, go to step 6.1.7;
step 6.1.7: calculating the centroid of the maximum closed-loop contour and judging whether the centroid meets the set condition;
the set condition is that the distance between the centroid transverse coordinate (X-axis coordinate) and the transverse coordinate of the image center is not larger than a set value (5 pixels in this embodiment), and the distance between the centroid longitudinal coordinate (Y-axis coordinate) and the longitudinal coordinate of the image center is not larger than the set value (5 pixels in this embodiment). If the centroid does not meet the set condition, check the Flag bit: if Flag equals 0, the detection result is that the part to be detected is out of bounds but its shape and displacement are unchanged; if Flag equals 1, the part to be detected is out of bounds and its shape or displacement has changed. If the centroid meets the set condition, judge whether the total number of closed-loop contours in the image is 1;
if the total number of closed-loop contours in the image is 1, check the Flag bit: if Flag equals 0, the detection result is that the part to be detected is out of bounds but its shape and displacement are unchanged, and if Flag equals 1, the part to be detected is out of bounds and its shape or displacement has changed; if the total number of closed-loop contours in the image is greater than 1, check the Flag bit: if Flag equals 0, the part to be detected is not out of bounds and its shape and displacement are unchanged, and if Flag equals 1, the part to be detected is not out of bounds but its shape or displacement has changed;
step 6.1.8: the flow is ended.
The differential processing in step 6.1.2 is implemented as follows: first, the image to be measured and the template source image are differenced and the gray values of corresponding pixels in the two frames are compared; whenever the gray-value difference is larger than a set value (80 in this embodiment), a gray statistic whose initial value is 0 is incremented by 1; after all pixels have been traversed, if the gray statistic is greater than the set threshold, the shape or displacement of the part to be detected has changed, otherwise the shape and displacement of the part to be detected are unchanged; in this embodiment the set threshold is 99% of the total number of pixels.
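The differential processing can be sketched directly from the thresholds stated above (gray difference of 80, statistic threshold of 99% of the pixels); the function name and the signed-difference handling are assumptions.

```python
import numpy as np

def shape_or_position_changed(test_gray, template_gray, diff_thresh=80, ratio=0.99):
    """Count pixels whose gray-level difference exceeds diff_thresh; if the count
    exceeds ratio of all pixels, the shape or displacement is considered changed
    (Flag = 1), otherwise unchanged (Flag = 0)."""
    diff = np.abs(test_gray.astype(np.int16) - template_gray.astype(np.int16))
    changed_pixels = int(np.count_nonzero(diff > diff_thresh))
    return changed_pixels > ratio * diff.size
```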
As shown in fig. 16, the size measurement algorithm in step 6.3 includes the following steps:
step 6.2.1: correcting the read-in real-time image to be tested by using the calibration parameters;
step 6.2.2: judging whether the shape and the displacement of the part to be detected are unchanged;
step 6.2.3: if the shape and the displacement of the part to be detected are not changed, measuring each measurement type according to the extracted characteristic information of the template source diagram;
step 6.2.4: if the shape and/or displacement of the part to be detected changes, searching and matching the object, and judging whether the part to be detected is matched with the template source image;
step 6.2.5: if the part to be detected is matched with the template source diagram, measuring the measurement types of circles, lines, arcs and angles according to the extracted characteristic information of the template source diagram, and ending the process;
step 6.2.6: and if the part to be detected is not matched with the template source diagram, indicating that the part to be detected is not found, and ending the process.
As shown in fig. 17, in step 6.2.4, the process of object searching and matching includes the following steps:
step 6.3.1: carrying out mean value filtering processing on the graph to be tested; the mean filtering window of the embodiment is 3 pixels by 3 pixels;
step 6.3.2: performing thresholding, and setting the gray value of the pixel larger than the set threshold value to be 0, otherwise, to be 255, wherein the set threshold value is 100 in the embodiment;
step 6.3.3: extracting the hierarchical contour information of the part to be detected, which includes the outer contour information and the inner contour information of the part; the outer contour and the inner contour satisfy the parent-child hierarchical relationship, where the outer contour is the parent contour and the inner contour is the child contour;
step 6.3.4: judging whether the absolute value of the difference between the minimum circumscribed rectangular area of the outer contour of the part to be detected and the minimum circumscribed rectangular area of the outer contour of the template source diagram is smaller than a set value or not; the value range of the set value is 15% of the minimum circumscribed rectangle area of the outer contour of the template source diagram;
step 6.3.5: if the absolute value of the difference between the minimum circumscribed rectangular area of the outer contour of the part to be detected and the minimum circumscribed rectangular area of the outer contour of the template source diagram is larger than or equal to a set value, ending the process;
step 6.3.6: if the absolute value of the difference between the minimum circumscribed rectangle area of the outer contour of the part to be detected and the minimum circumscribed rectangle area of the outer contour of the template source diagram is smaller than the set value, judging whether the absolute value of the ratio difference between the length-width ratio of the minimum circumscribed rectangle of the outer contour of the part to be detected and the length-width ratio of the minimum circumscribed rectangle of the outer contour of the template source diagram is smaller than the set value or not; the value range of the set value is 10% of the length-width ratio of the minimum circumscribed rectangle of the outer contour of the template source diagram;
step 6.3.7: if the absolute value of the length-width ratio difference between the minimum external rectangle length-width ratio of the outer contour of the part to be detected and the minimum external rectangle length-width ratio of the outer contour of the template source diagram is greater than or equal to a set value, ending the flow;
step 6.3.8: if the absolute value of the length-width ratio difference between the minimum external rectangle length-width ratio of the outer contour of the part to be detected and the minimum external rectangle length-width ratio of the outer contour of the template source diagram is smaller than a set value, solving a minimum external circle of the outer contour, and taking the circle center of the minimum external circle as a rotation center;
step 6.3.9: judging whether the distance between the centroid of the outer contour of the template source image and the center of the minimum circumscribed rectangle of the outer contour of the template source image is larger than a set value or not; the setting value in this embodiment is 20 pixels;
step 6.3.10: if the distance between the centroid of the outer contour of the template source diagram and the center of the minimum circumscribed rectangle of the outer contour of the template source diagram is less than or equal to a set value, the step 6.3.18 is skipped;
step 6.3.11: if the distance between the outer contour centroid of the template source image and the center of its minimum circumscribed rectangle is larger than the set value, judging whether the absolute value of the difference between the ratio, for the part to be detected, of the distance from the outer contour centroid to the center of its minimum circumscribed rectangle relative to the rectangle's length and width, and the corresponding ratio of the template source image, is smaller than a set value; the set value is 10% of the template source image's ratio of the outer contour centroid-to-rectangle-center distance relative to the rectangle's length and width;
step 6.3.12: if this absolute difference is greater than or equal to the set value, jump to step 6.3.18;
step 6.3.13: if this absolute difference is smaller than the set value, calculating the rotation angle of the part to be detected relative to the template source image;
step 6.3.14: combining the central coordinates and the angle information of the template source diagram, translating and rotating the diagram to be tested, namely extracting a region of interest (ROI) of the part to be tested in the diagram to be tested; then, newly building a blank image equal to the image to be tested, translating the part to be tested to the center of the blank image, and rotating the part to be tested to the same angle of the template source image; in the embodiment, the ROI is a rectangle, the side length of the ROI is the diameter of the minimum circumscribed circle of the template source image, and the center of the ROI is the center of the minimum circumscribed circle of the part to be detected;
step 6.3.15: and judging whether the vector angle from the outline centroid of the part to be detected to the minimum circumscribed rectangle center of the part to be detected is smaller than a set value or not by comparing the vector angle from the outline centroid of the template source diagram to the minimum circumscribed rectangle center of the template source diagram, wherein the set value of the embodiment is 7.5 degrees.
Step 6.3.16: if the difference value of the vector angles in the step 6.3.15 is smaller than the set value, the object matching is successful, and the process is ended;
step 6.3.17: if the difference value of the vector angles in the step 6.3.15 is greater than or equal to the set value, entering a step 6.3.18;
step 6.3.18: judging whether an inner contour exists according to the area of the minimum circumscribed rectangle of the maximum inner contour of the template source image; if not, the object matching is successful, the rotation angle of the part to be detected relative to the template source image is calculated, the rotating object is translated by combining the central coordinate and the angle information of the template source image, and the process is ended; if yes, judging whether the template source graph only has an effective maximum inner contour;
step 6.3.19: if the template source image has only one effective maximum inner contour, performing effective inner contour matching; effective inner contour matching judges whether the minimum circumscribed rectangle area of the maximum inner contour of the part to be detected matches that of the template source image, then judges in turn whether the length-width ratio of the minimum circumscribed rectangle of the maximum inner contour matches and whether the distance between the centers of the minimum circumscribed rectangles of the outer contour and the maximum inner contour matches, calculates the rotation angle of the part to be detected relative to the template source image, rotates and translates the object using the center coordinates and angle information of the template source image, and then checks whether the matching is successful;
step 6.3.20: if the template source image does not meet the condition of having only one effective maximum inner contour, judging whether the template source image has a plurality of maximum inner contours or has both a maximum inner contour and a minimum inner contour;
step 6.3.21: if the template source image has a plurality of maximum inner contours, traversing all the inner contours in the part to be detected, performing effective inner contour matching on each inner contour, judging whether at least one inner contour is successfully matched, if so, successfully matching the object, and ending the process; if not, the object is not successfully matched, and the process is ended;
step 6.3.22: if the template source image has both a maximum inner contour and a minimum inner contour, first performing effective inner contour matching on the maximum inner contour; if the maximum inner contour matches, the object is successfully matched and the process ends;
if the maximum inner contour does not match, performing effective inner contour matching on the minimum inner contour; if the minimum inner contour matches, the object is successfully matched and the process ends; if the minimum inner contour does not match, the object is not successfully matched and the process ends.
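The translate-and-rotate step of object matching (step 6.3.14 and the rotation performed after a successful match) can be sketched with cv2.warpAffine as below; the helper name and the black border fill are assumptions, and the centres and rotation angle are taken as already computed by the preceding steps.

```python
import cv2
import numpy as np

def align_part_to_template(test_gray, part_center, template_center, angle_deg):
    """Translate the detected part so its centre coincides with the template
    centre on a same-sized canvas, then rotate it to the template's angle."""
    h, w = test_gray.shape
    dx = float(template_center[0]) - float(part_center[0])
    dy = float(template_center[1]) - float(part_center[1])
    T = np.float32([[1, 0, dx], [0, 1, dy]])                 # pure translation
    translated = cv2.warpAffine(test_gray, T, (w, h))        # border filled with 0
    center = (float(template_center[0]), float(template_center[1]))
    R = cv2.getRotationMatrix2D(center, float(angle_deg), 1.0)
    return cv2.warpAffine(translated, R, (w, h))
```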
As shown in fig. 18, the measurement flow of the circle measurement type to be measured in step 6.2.5 includes the following steps:
step 6.4.2: extracting the ROI of the circle measurement type in the graph to be measured according to the characteristic information of the circle measurement type obtained by making the template source graph;
step 6.4.3: performing Gaussian filtering processing on the ROI gray level image in the image to be measured;
step 6.4.4: carrying out Hough circle finding processing on the filtered image to obtain a plurality of circles in the image to be detected;
step 6.4.5: comparing the circle center of the circle obtained in the graph to be measured with the template source graph, and judging whether at least one circle on the graph to be measured deviates within 2mm from the circle center of the circle selected on the template source graph;
step 6.4.6: if no circle meeting the step 6.4.5 exists on the graph to be tested, changing the Hough circle finding parameter threshold value, and skipping to the step 6.4.4;
step 6.4.7: if the circle meeting the step 6.4.5 exists on the graph to be tested, screening out the circle on the template source graph and the circle on the graph to be tested, and enabling the absolute difference value of the radii of the two Hough fitting circles to be minimum;
step 6.4.8: comparing the radii of the circles screened in the step 6.4.7, and judging whether the absolute difference of the radii is within 2 mm;
step 6.4.9: if the absolute difference value of the radii is not within 2mm, changing the Hough circle finding parameter threshold, and skipping to the step 6.4.4;
step 6.4.10: if the absolute difference of the radii is within 2mm, a proper Hough fitting circle is found;
step 6.4.20: solving the gradient of the image after Gaussian filtering;
step 6.4.21: calculating sub-pixel edge points; an edge point is defined as a local maximum of the gradient module value along the gradient direction; in this embodiment, a quadratic interpolation of the gradient module values at three adjacent points along the gradient direction is calculated, i.e., a quadratic function is fitted through the three points (point A, point B, point C) to obtain a compensation value η:
η = (‖g(A)‖ - ‖g(C)‖) / (2 × (‖g(A)‖ - 2‖g(B)‖ + ‖g(C)‖))
the edge sub-pixel point is the middle point of the three adjacent points plus the compensation value, where ‖g(A)‖ represents the gradient module value of point A, ‖g(B)‖ the gradient module value of point B, and ‖g(C)‖ the gradient module value of point C;
step 6.4.22: connecting the sub-pixel edge points into a contour;
step 6.4.23: double-threshold screening contour points;
step 6.4.24: screening out proper contour points according to the distance from the contour points to the found proper Hough fitting circle to form a new contour point set;
step 6.4.25: fitting a circle by using a least square method on the new contour point set to obtain a circle center and a radius;
step 6.4.26: the flow is ended.
In step 6.4.3, the Gaussian filtering process means sliding convolution with a discretized window. The Gaussian filtering process first needs to calculate a Gaussian weight matrix. Assuming the coordinates of the central point are (0,0), the coordinates of its 8 nearest points are:
(-1,1) (0,1) (1,1)
(-1,0) (0,0) (1,0)
(-1,-1) (0,-1) (1,-1)
Assuming the standard deviation σ is 1.5, the weight matrix with a filter radius of 1 is:
0.0453542 0.0566406 0.0453542
0.0566406 0.0707355 0.0566406
0.0453542 0.0566406 0.0453542
The sum of the weights of these 9 points is 0.4787147. Since a weighted average of the 9 points requires the weights to sum to 1, each of the 9 values is divided by 0.4787147 to obtain the final weight matrix:
0.0947416 0.1183180 0.0947416
0.1183180 0.1477761 0.1183180
0.0947416 0.1183180 0.0947416
With this weight matrix, the Gaussian filtered value of the central point is obtained by multiplying the central point and its surrounding points each by its own weight and summing the results. Repeating this process for all points yields the Gaussian filtered image.
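The weight matrix above can be reproduced with a few lines of numpy; the function name is illustrative, and applying the resulting kernel by sliding convolution (for example with cv2.filter2D) then yields the Gaussian-filtered image described above.

```python
import numpy as np

def gaussian_weight_matrix(sigma=1.5, radius=1):
    """3x3 Gaussian weights for sigma = 1.5; the raw weights sum to about
    0.4787147 and are divided by that sum so the final weights add up to 1."""
    coords = np.arange(-radius, radius + 1)
    x, y = np.meshgrid(coords, coords)
    raw = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2)) / (2 * np.pi * sigma ** 2)
    return raw / raw.sum()
```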
In step 6.4.20, the gradient of the Gaussian-filtered image is calculated using central differences to approximate the image gradient and the gradient module value. Specifically, the gradient of any pixel (x, y) in the image is split into an X component and a Y component: the X component is the gray value of pixel (x+1, y) minus the gray value of pixel (x-1, y), and the Y component is the gray value of pixel (x, y+1) minus the gray value of pixel (x, y-1); the gradient module value is the square root of the sum of the squares of the X component and the Y component.
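The central-difference gradient of step 6.4.20 and the compensation value η of step 6.4.21 can be sketched as follows; the function names and the zero-padding of the image border are assumptions.

```python
import numpy as np

def central_difference_gradient(img):
    """X component = I(x+1, y) - I(x-1, y); Y component = I(x, y+1) - I(x, y-1);
    border pixels are left at zero."""
    img = img.astype(np.float64)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]
    gy[1:-1, :] = img[2:, :] - img[:-2, :]
    magnitude = np.sqrt(gx ** 2 + gy ** 2)      # gradient module value
    return gx, gy, magnitude

def compensation_value(ga, gb, gc):
    """Parabolic-interpolation offset eta for three adjacent gradient module
    values ga, gb, gc, where gb is the local maximum."""
    denom = ga - 2.0 * gb + gc
    return 0.0 if denom == 0 else (ga - gc) / (2.0 * denom)
```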
In step 6.4.22, connecting the sub-pixel edge points into a contour means grouping the contour points that belong to the same edge into a link, where each contour point corresponds to a pixel point. First, pixels classified into the same link should have approximately the same gradient direction, i.e., the angle between the gradients of adjacent pixels on the same link should be less than 90 degrees; taking pixels A and B as an example, the mathematical expression is g(A)·g(B) > 0, where g(A) represents the gradient of point A and g(B) the gradient of point B. In addition, an image contour may separate a bright region from a dark region, so a continuous link needs to keep the dark region on the same side of the curve; a simple way is to verify that the vector from edge point A to point B is approximately orthogonal to one of the two possible gradient directions (X-axis direction or Y-axis direction) of point A.
In step 6.4.23, double-threshold screening of contour points means screening the contour points with a high threshold and a low threshold. Specifically, for each point in a link it is verified whether its gradient module value is greater than the set high threshold (4.3 in this embodiment); if it is, it is then verified whether the gradient module value of the previous point linked to it is greater than the set low threshold (0.8 in this embodiment); if that value is greater than the low threshold, the contour point is kept, and if it is less than or equal to the low threshold, the contour point is marked for removal; the next contour point linked to it is verified in the same way. Finally, after all points have been traversed, the contour points carrying removal marks are deleted and the link is re-formed.
It should be noted that feature information of a plurality of circle measurement types may exist on the template original image, and the measurement flow is performed independently for the feature information of each circle measurement type; the characteristic information of the arc, line and angle measurement types is also subjected to the measurement process independently.
It should be noted that the measurement flow for the arc measurement type in the image to be measured coincides with the measurement flow for the circle measurement type.
As shown in fig. 19, the measurement flow for the line measurement type in the image to be measured in step 6.2.5 includes the following steps (a condensed code sketch of this flow is given after the step list):
step 6.5.1: extracting a line measurement type ROI in the graph to be measured according to line measurement type characteristic information obtained by the template source graph;
step 6.5.2: carrying out Canny edge detection processing on the ROI gray level image in the image to be detected;
step 6.5.3: carrying out Hough line finding processing on the image subjected to Canny processing to obtain a plurality of straight line segments in the image to be detected;
step 6.5.4: performing oblique angle comparison on the straight line segment in the to-be-detected image obtained in the step 6.5.3 and the straight line segment of the template source image, and judging whether the oblique angle deviation between at least one straight line segment in the to-be-detected image and at least one straight line segment in the template source image is smaller than a set value or not;
step 6.5.5: if no straight line segment in the graph to be measured has an oblique angle deviation smaller than the set value, jumping to step 6.5.3;
step 6.5.6: if straight line segments with an oblique angle deviation smaller than the set value exist in the graph to be measured, obtaining the plurality of Hough-fitted straight line segments in the graph to be measured that meet the condition of step 6.5.4;
step 6.5.7: selecting the straight line segment with the longest length from the straight line segments obtained in the step 6.5.6;
step 6.5.8: performing Gaussian filtering on the ROI grayscale image corresponding to the straight line segment obtained in the step 6.5.7;
step 6.5.9: solving the gradient of the image after Gaussian filtering;
step 6.5.10: calculating sub-pixel edge points;
step 6.5.11: connecting the sub-pixel edge points into a contour;
step 6.5.12: double-threshold screening sub-pixel edge points to form a contour point set again;
step 6.5.13: according to the distance from each contour point to the selected straight line segment, selecting appropriate contour points to form a new contour point set;
step 6.5.14: fitting a straight line to the new contour point set by using a least square method to obtain an oblique angle and the coordinates of the two end points;
step 6.5.15: the flow is ended.
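A condensed OpenCV/numpy sketch of the core of the above line measurement flow (Canny edge detection, probabilistic Hough line finding, oblique-angle comparison, longest-segment selection and a least-squares refit). The function name, the Canny and Hough parameters and the 2-pixel gating distance are placeholder assumptions, and for brevity the refit is performed on integer Canny edge pixels rather than on the sub-pixel edge points of steps 6.5.10 to 6.5.12.

import cv2
import numpy as np

def measure_line(roi_gray, template_angle_deg, angle_tol_deg=5.0):
    """Locate the line in an ROI whose oblique angle matches the template line."""
    edges = cv2.Canny(roi_gray, 50, 150)                              # step 6.5.2
    segs = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                           minLineLength=30, maxLineGap=5)            # step 6.5.3
    if segs is None:
        return None                                                   # retry of step 6.5.5 omitted
    # Steps 6.5.4-6.5.7: keep segments with a small angle deviation, take the longest.
    candidates = []
    for x1, y1, x2, y2 in segs[:, 0, :]:
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180.0      # 0/180 wrap-around ignored
        if abs(angle - template_angle_deg) < angle_tol_deg:
            candidates.append(((x1, y1, x2, y2), np.hypot(x2 - x1, y2 - y1)))
    if not candidates:
        return None
    (x1, y1, x2, y2), _ = max(candidates, key=lambda c: c[1])
    # Steps 6.5.8-6.5.14 (simplified): least-squares refit on edge pixels near the segment.
    ys, xs = np.nonzero(edges)
    dist = (np.abs((y2 - y1) * xs - (x2 - x1) * ys + x2 * y1 - y2 * x1)
            / np.hypot(x2 - x1, y2 - y1))
    xs, ys = xs[dist < 2.0], ys[dist < 2.0]
    if xs.size < 2:
        return None
    vx, vy, x0, y0 = cv2.fitLine(np.column_stack([xs, ys]).astype(np.float32),
                                 cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    return np.degrees(np.arctan2(vy, vx)) % 180.0, (float(x0), float(y0))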
As shown in fig. 20, the measurement flow for the angle measurement type in the image to be measured in step 6.2.5 includes the following steps (the included-angle and vertex computation of steps 6.6.21 to 6.6.22 is sketched in code after the step list):
step 6.6.1: extracting an angle measurement type ROI in the graph to be measured according to the angle measurement type characteristic information obtained from the template source graph;
step 6.6.2: carrying out Canny edge detection processing on the ROI gray level image in the image to be detected;
step 6.6.3: carrying out Hough line finding processing on the image subjected to Canny processing to obtain a plurality of straight line segments in the image to be detected;
step 6.6.4: screening two groups of straight line segments from the plurality of straight line segments obtained in step 6.6.3 according to the oblique angles of the two straight line segments forming the included angle in the template source diagram, namely a first straight-line-segment oblique angle and a second straight-line-segment oblique angle; the two groups are a first group of straight line segments and a second group of straight line segments, where the absolute value of the difference between the oblique angle of each straight line segment in the first group and the oblique angle of the first straight line segment in the template characteristic information is smaller than a set value (7.5 degrees in this embodiment), and the absolute value of the difference between the oblique angle of each straight line segment in the second group and the oblique angle of the second straight line segment is likewise smaller than the set value; if each screened group contains at least one straight line segment, the screening is successful, otherwise the screening fails;
step 6.6.5: if the screening in the step 6.6.4 fails, changing Hough straight line finding parameters, and jumping to the step 6.6.3 for execution;
step 6.6.6: if the screening in step 6.6.4 is successful, respectively calculating the distance from the end point of each of the two straight line segments forming the included angle in the template source image (the end point being the end far from the included angle) to the other straight line segment, obtaining two distance values and taking the smaller one as D'. The first group of straight line segments obtained in step 6.6.4 is then traversed, and the straight line segments whose distance to a set point is smaller than a set value are screened out to form a new first group of straight line segments; the set point is the end point of the first straight line segment in the template source image at the end far from the included angle, and in this embodiment the set value is 70 pixels if D'/5 > 70 pixels, otherwise the set value is D'/5. Similarly, the second group of straight line segments obtained in step 6.6.4 is traversed, and the straight line segments whose distance to the corresponding set point is smaller than the set value are screened out to form a new second group of straight line segments; the set point is the end point of the second straight line segment in the template source image at the end far from the included angle, and the set value is the same as for the first group. If the new first group and the new second group each contain at least one straight line segment, the screening is successful, otherwise the screening fails;
step 6.6.7: if the screening in the step 6.6.6 fails, changing Hough line finding parameters, and jumping to a step 6.6.3;
step 6.6.8: if the screening in the step 6.6.6 is successful, traversing a random group of straight-line segments in the new first group of straight-line segments and the new second group of straight-line segments in the step 6.6.6, and screening out the straight-line segment with the longest length in the group of straight-line segments;
step 6.6.9: calculating the included angle between the straight line segment obtained in the step 6.6.8 and each straight line segment in the other group of straight line segments;
step 6.6.10: comparing the included angle obtained in step 6.6.9 with the included angle selected in the template source image, screening out from the other group the straight line segments for which the difference between this included angle and the included angle selected in the template source image is smaller than a set value, and judging whether the screening is successful; if at least one straight line segment is screened out of the other group of straight line segments, the screening is successful;
step 6.6.11: if the screening in the step 6.6.10 fails, changing Hough straight line finding parameters, and jumping to the step 6.6.3 for execution;
step 6.6.12: if the screening in step 6.6.10 is successful, screening the straight line segment with the longest length in another group of straight line segments;
step 6.6.13: obtaining the straight line segment with the longest length in each group of straight line segments, calculating the included angle between the two straight line segments, comparing the included angle with the included angle of the corresponding straight line segment in the template source image, and judging whether the deviation is smaller than a set value; in this embodiment, the set value is 10 degrees;
step 6.6.14: if the deviation is larger than or equal to the set value, the angle measurement type measurement fails, and the flow jumps to step 6.6.23;
step 6.6.15: if the deviation is smaller than the set value, performing Gaussian filtering processing on the ROI grayscale image;
step 6.6.16: solving the gradient of the image after Gaussian filtering;
step 6.6.17: calculating sub-pixel edge points;
step 6.6.18: connecting the sub-pixel edge points into a contour;
step 6.6.19: double-threshold screening sub-pixel edge points to form a contour point set again;
step 6.6.20: screening two groups of contour point sets according to the distance between the contour points and the two obtained straight line segments;
step 6.6.21: respectively fitting straight lines to the two groups of contour point sets by using a least square method to obtain, for each fitted line, the coordinates of its two end points and its oblique angle;
step 6.6.22: solving the included angle and the vertex coordinate of the two straight line segments;
step 6.6.23: the flow is ended.
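For steps 6.6.21 and 6.6.22, the included angle and the vertex of the two fitted straight lines can be obtained from the fitted end points as in the following numpy sketch (illustrative only, with made-up inputs; it returns the acute angle between the two fitted lines):

import numpy as np

def included_angle_and_vertex(p1, p2, q1, q2):
    """Included angle (degrees) and intersection point of lines p1-p2 and q1-q2."""
    p1, p2, q1, q2 = (np.asarray(p, dtype=float) for p in (p1, p2, q1, q2))
    d1, d2 = p2 - p1, q2 - q1
    # Acute angle between the two fitted line directions.
    cos_a = abs(np.dot(d1, d2)) / (np.linalg.norm(d1) * np.linalg.norm(d2))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    # Vertex: solve p1 + t*d1 = q1 + s*d2 using the 2-D cross product.
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-9:
        return angle, None                      # (nearly) parallel lines: no vertex
    r = q1 - p1
    t = (r[0] * d2[1] - r[1] * d2[0]) / denom
    vertex = p1 + t * d1
    return angle, (float(vertex[0]), float(vertex[1]))

angle, vertex = included_angle_and_vertex((0, 0), (10, 0), (5, -5), (5, 5))
print(round(angle, 1), vertex)                  # 90.0 (5.0, 0.0)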
The above description is only one specific embodiment of the present invention and should not be construed as limiting the invention in any way. It will be apparent to those skilled in the relevant art that, guided by the disclosed technical content and principles, various modifications and changes in form and detail can be made without departing from the principles and structure of the invention, and such modifications and changes nevertheless fall within the scope of the appended claims.

Claims (10)

1. A dimension measurement scoring device based on machine vision, characterized by comprising an operation table, a detection table, a light source and a camera; a through hole is formed in the middle of the detection table, and a transparent plate is arranged in the hollow part; the light source is arranged below the detection table and corresponds to the transparent plate; the camera is fixedly arranged right above the transparent plate; the operation table is electrically connected with the light source and the camera and can control the actions of the light source and the camera; a laser ranging device is arranged between the camera and the detection table; the operation table includes a display module.
2. A dimension measurement scoring method based on machine vision is characterized by comprising the following steps:
step 1: the operation table senses the operation of an operator, opens the software according to the operation of the operator, and automatically executes the software-opening initialization operation;
step 2: after the software-opening initialization operation is completed, a display module on the operation table automatically displays a user login interface; the initial user login interface is provided with a 'guest measurement' button and an 'exit system' button;
step 3: performing user login according to the operation of the operator, with two login modes: in one, a guest logs in through the 'guest measurement' button and enters the guest measurement flow; in the other, a teacher or a student logs in through identity card identification and enters the corresponding teacher operation flow or student examination flow;
step 4: the operation table completes the user login; in the case of the guest measurement flow or the student examination flow, the size measurement interface is entered automatically, parameters are set, size measurement is completed, and the flow ends; in the case of the teacher operation flow, a teacher operation panel is displayed on the user login interface, and buttons for downloading test questions, uploading test questions and making test questions are arranged on the teacher operation panel;
step 5: the operation table selects the teacher operation content according to the operation of the operator; through the 'download test questions' button, the test question downloading flow can be entered, and the flow ends after the test question download is completed; through the 'upload test questions' button, the test question uploading flow can be entered, and the flow ends after the test questions are uploaded; through the 'make test questions' button, the test question making flow can be entered, the size measurement interface is entered, and the flow ends after the test questions are made.
3. The machine vision-based dimension measurement scoring method as claimed in claim 2, wherein in step 1, when the software on the operation table is opened, the software-opening initialization operation is automatically executed; the software-opening initialization operations include reading the examination folder, hardware initialization and other initializations; the software of the operation table is provided with two folders, one being the examination folder and the other being the test question making folder generated when a teacher makes test questions.
4. A machine vision based dimensional measurement scoring method according to claim 3, wherein said reading an examination folder comprises the steps of:
step 1.1: judging whether the examination file name recording file exists; if it exists, reading the information in the file, the read information being the name of the examination folder; if it does not exist, prompting the user to download test questions from the server first, since otherwise the size measurement operation cannot be performed;
step 1.2: judging whether the examination folder exists according to the name of the examination folder read by the examination file name recording file, and if so, indicating that a file required by the examination exists; if the test question does not exist, prompting the user to download the test question from the server and then carrying out size measurement operation;
all parameters and tools required by size measurement are stored in the examination folder, wherein the parameters and the tools comprise a front measurement parameter, a side measurement parameter, a front calibration parameter, a side calibration parameter and a test question making tool;
the hardware initialization comprises the operation of opening the camera; if the camera is opened successfully, the camera image acquisition function is started when the size measurement interface is entered and real-time image acquisition is carried out; if opening the camera fails, the user is prompted that the camera was not opened successfully, the reason for the failure is given, and step 1 and the flow end;
the other initializations include interface initialization and initialization of related variables; the initialization of related variables covers the following two flag bits:
1. the flag bit indicating whether the system is reading the size measurement parameters of the examination file for the first time;
2. the flag bit indicating whether the system is obtaining the examination information for the first time;
the first flag bit is used to confirm whether the size measurement parameters need to be read; the second flag bit is used to confirm whether the examination information needs to be read.
5. The machine vision-based dimension measurement scoring method as claimed in claim 2, wherein during the user login of step 3, the identity card information is read at regular intervals when the user first enters the system, after a successful teacher login, after a login timeout, after a login exception, or after a login with a wrong account; the reading of the identity card information is realized by an external identity card reader;
when the system is entered for the first time, if an identity card is inserted, teacher login is first attempted according to the identity card information read by the identity card reader; if teacher login fails, student login is then attempted, and if student login also fails, the related information is prompted and the identity card information is read again at regular intervals; if reading the identity card information fails, which includes the cases that no identity card is inserted, the identity card cannot be identified, or the identity card information cannot complete teacher login or student login, then only a guest can log in and enter the guest measurement flow;
the steps of reading the identity card information at regular time are as follows:
step 2.1: initializing connection of an identity card reader; if the initialization is successful, carrying out the next operation; if the initialization fails, prompting the user to confirm whether the connection of the ID card reader is normal, and simultaneously finishing the timing reading operation;
step 2.2: card authentication operation between the identity card reader and the identity card; if the card authentication is successful, the next operation is carried out; if the card authentication fails, prompting the user that the identity card authentication fails, closing the connection of an identity card reader and finishing the timing reading operation;
step 2.3: reading the identity card information; if the reading is successful, filling the identity card information into the interface for display, disabling the 'guest measurement' button, and starting a login thread; if the reading fails, prompting the user that reading the identity card information failed, closing the connection of the identity card reader and finishing the timing reading operation.
6. The machine vision-based dimension measurement scoring method as claimed in claim 2, wherein the guest measurement flow includes entering the size measurement interface, and the following steps are performed before entering the size measurement interface:
step 3.1: judging whether the examination folder exists or not; if the test questions exist, the next operation is carried out, and if the test questions do not exist, the user is prompted to download the test questions first;
step 3.2: judging whether the examination file information is read for the first time; if so, reading the information parameters of the examination file and then carrying out the next operation, and if not, directly carrying out the next operation;
step 3.3: setting the operation authority as the guest authority; the guest authority only allows measuring the size of the part to be detected and does not allow uploading data to the server or making test questions;
step 3.4: entering a dimension measuring interface, and setting a camera to start to acquire images.
7. The machine vision-based dimension measurement scoring method as claimed in claim 2, wherein a login process timer is started to count the login time after the teacher login operation is completed; if the teacher login times out or the network is abnormal during the login process, the 'guest measurement' button is enabled and the identity card information is read again;
after the teacher logs in, displaying a teacher operation panel on a user login interface;
the examination information is obtained before the test question downloading flow; if the examination information is obtained successfully, the test question download file name and the examination file name are set according to the examination information, and the test questions are downloaded according to the compressed file name in the examination information; if obtaining the examination information fails, times out or is abnormal, the corresponding information is prompted, the identity card information display is cleared, and the identity card information is read again at regular time;
the test question uploading flow first needs to judge whether the test question making folder exists; if it exists, the made test question folder is compressed and the compression progress is displayed; after compression is completed, the compressed file is uploaded to the server, and the serial numbers of the dimensions to be measured and the criteria for a qualified measured size are extracted from the size measurement information in the made test question file and uploaded to the server; if it does not exist, the user is prompted; the serial numbers of the dimensions to be measured and the criteria for a qualified measurement are edited in a test question making editor before extraction, the editor providing the setting of the dimensions to be measured and of the upper and lower tolerance limits of a qualified size;
the examination question making process firstly judges whether the examination file information of the examination questions made by the teacher is read or not; if the examination file information of the test questions made by the teacher is not read before, reading the examination file information, entering a size measurement interface, and setting a camera to acquire images; otherwise, the method directly enters a size measurement interface, and a camera is set to acquire images.
8. The machine vision-based dimension measurement scoring method as claimed in claim 2, wherein a login process timer is automatically started to count the login time after the student login operation is completed; the student examination flow first needs to judge whether the examination information is being read for the first time since startup and whether the examination information is empty; if either is true, the examination information is read; if neither is true, the examination information does not need to be read and the test question information in the examination folder can be read directly.
9. The machine vision-based dimension measurement scoring method as claimed in claim 8, wherein said reading of the examination information comprises the following steps: if the examination information is read successfully, judging whether the examination file name read at startup is the same as the examination file name in the examination information; if the names are different, updating the examination file name and storing it into the examination file name recording file; if the names are the same, the reading of the examination information is completed; if reading the examination information fails, including a reading timeout or an exception during reading, prompting the user with the corresponding information, clearing the interface identity information, and reading the identity card information again at regular time.
10. The machine vision-based dimensional measurement scoring method according to claim 9, wherein if the test information exists or is read successfully, whether a test file exists is judged; if the examination file does not exist, prompting the user that the examination questions do not exist, and reading the information of the identity card at regular time again; if the examination file exists, judging whether the examination question information in the examination file is read for the first time; if the reading is the initial reading, reading relevant parameters of size measurement, including camera configuration parameters, camera calibration parameters, relevant parameters of calibration results and template information, and setting operation permission as student permission; if the reading is not the initial reading, the relevant parameters of the size measurement do not need to be read; and entering a size measurement interface to start image acquisition after finishing the judgment of whether the examination question information in the examination file is read for the first time.
CN202011104917.8A 2020-10-15 2020-10-15 Dimension measurement scoring device and scoring method based on machine vision Active CN112304217B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202011104917.8A CN112304217B (en) 2020-10-15 2020-10-15 Dimension measurement scoring device and scoring method based on machine vision
CN202210187234.6A CN114674223B (en) 2020-10-15 2020-10-15 Dimension measurement scoring system based on machine vision detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011104917.8A CN112304217B (en) 2020-10-15 2020-10-15 Dimension measurement scoring device and scoring method based on machine vision

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210187234.6A Division CN114674223B (en) 2020-10-15 2020-10-15 Dimension measurement scoring system based on machine vision detection method

Publications (2)

Publication Number Publication Date
CN112304217A true CN112304217A (en) 2021-02-02
CN112304217B CN112304217B (en) 2022-04-08

Family

ID=74327627

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210187234.6A Active CN114674223B (en) 2020-10-15 2020-10-15 Dimension measurement scoring system based on machine vision detection method
CN202011104917.8A Active CN112304217B (en) 2020-10-15 2020-10-15 Dimension measurement scoring device and scoring method based on machine vision

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210187234.6A Active CN114674223B (en) 2020-10-15 2020-10-15 Dimension measurement scoring system based on machine vision detection method

Country Status (1)

Country Link
CN (2) CN114674223B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991316A (en) * 2021-03-30 2021-06-18 中国空气动力研究与发展中心超高速空气动力研究所 Dynamic measurement technology for model edge ablation amount
WO2022206022A1 (en) * 2021-04-01 2022-10-06 浙江大学台州研究院 Size measuring instrument system based on multi-template matching and automatic focusing functions

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2268964Y (en) * 1996-07-25 1997-11-26 林永才 Operation table for working skills
JP2009193477A (en) * 2008-02-17 2009-08-27 Obirin Gakuen e-LEARNING TEST SYSTEM
CN101839700A (en) * 2010-03-29 2010-09-22 重庆建设工业(集团)有限责任公司 Non-contact image measuring system
CN204255313U (en) * 2014-11-10 2015-04-08 沈阳黎明航空发动机(集团)有限责任公司 A kind of detection seal flatness special measurement equipment
CN106767566A (en) * 2016-11-29 2017-05-31 湖北文理学院 A kind of workpiece quality monitors appraisal procedure and monitoring system on-line
US20180130226A1 (en) * 2016-11-07 2018-05-10 Lincoln Global, Inc. System and method for calibrating a welding trainer
CN208109047U (en) * 2018-02-19 2018-11-16 陈碧波 A kind of mathematical education measuring instrument
CN108986572A (en) * 2018-08-06 2018-12-11 沈机(上海)智能系统研发设计有限公司 Machinetool workpiece processes examining method, system, comprehensive examination and evaluation system and server end
CN109141232A (en) * 2018-08-07 2019-01-04 常州好迪机械有限公司 A kind of circle plate casting online test method based on machine vision
CN109406522A (en) * 2018-12-10 2019-03-01 陕西维视智造科技股份有限公司 Multi-functional automatic detecting platform based on machine vision technique
JP2019060519A (en) * 2017-09-26 2019-04-18 株式会社日立国際電気 Shooting evaluation system
CN109798833A (en) * 2019-02-25 2019-05-24 黄文广 A kind of machined piece automatic measurement system and automatic scoring method
CN110021006A (en) * 2018-09-06 2019-07-16 浙江大学台州研究院 A kind of device and method whether detection automobile parts are installed
CN110807969A (en) * 2019-11-28 2020-02-18 深圳市华兴鼎盛科技有限公司 Machine vision recognition teaching system and teaching method
CN110954023A (en) * 2019-12-23 2020-04-03 芜湖哈特机器人产业技术研究院有限公司 Multifunctional visual experiment table and working method thereof
CN110992416A (en) * 2019-12-20 2020-04-10 扬州大学 High-reflection-surface metal part pose measurement method based on binocular vision and CAD model

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6078846A (en) * 1996-02-06 2000-06-20 Perceptron, Inc. Calibration and compensation of robot-based gauging system
FR2870935A1 (en) * 2004-05-25 2005-12-02 Insidix Sarl DEVICE FOR MEASURING SURFACE DEFORMATIONS
US8494115B2 (en) * 2006-03-14 2013-07-23 The University Of Notre Dame Du Lac Methods and apparatus for hardware based radiation dose calculation
US8466380B2 (en) * 2008-11-27 2013-06-18 Teraoka Seiko Co., Ltd. Apparatus and method for measuring articles including conveyor-weighers supported on weighing unit
CN104089574A (en) * 2014-06-25 2014-10-08 镇江高等职业技术学校 New classroom teaching model based on machine vision technology
CN111189387A (en) * 2020-01-02 2020-05-22 西安工程大学 Industrial part size detection method based on machine vision


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
马如宏 et al.: "Design of an Online Examination System for Metalworking Practice", Journal of Yancheng Institute of Technology (Natural Science Edition) *


Also Published As

Publication number Publication date
CN114674223B (en) 2023-06-23
CN114674223A (en) 2022-06-28
CN112304217B (en) 2022-04-08

Similar Documents

Publication Publication Date Title
CN112284250B (en) Dimension measurement scoring system and measurement method based on machine vision
CN112304217B (en) Dimension measurement scoring device and scoring method based on machine vision
US9052253B2 (en) Method of determining at least one refraction characteristic of an ophthalmic lens
CN103185728B (en) Image processing apparatus and image processing method
CN102713671A (en) Point group data processing device, point group data processing method, and point group data processing program
CN108645345A (en) The system that pin is inserted into object
CN109509378B (en) A kind of online testing method for supporting handwriting input
JP2009544002A (en) Window glass inspection method
US20190114762A1 (en) Computer-Controlled 3D Analysis Of Collectible Objects
CN113139894A (en) Microscope and method for determining a measuring position of a microscope
CN111242902A (en) Method, system and equipment for identifying and detecting parts based on convolutional neural network
CN108548825A (en) A kind of transparent plate defect detecting device and method based on two-dimentional illumination
CN112330599B (en) Dimension measurement scoring device, adjustment method and scoring method
CN111210479A (en) Laser auxiliary calibration device and method for measuring sizes of parts with different heights
CN105391998B (en) Automatic detection method and apparatus for resolution of low-light night vision device
CN112461846B (en) Workpiece defect detection method and device
CN108760755A (en) A kind of dust granule detection method and device
CN115014248B (en) Laser projection line identification and flatness judgment method
CN118172304A (en) Size measurement scoring device
CN118172303A (en) Adjustment measurement method of size measurement scoring device
CN113111788B (en) Iris 3D information acquisition equipment with adjusting device
CN1323417C (en) Intelligent measuring instrument for shadow mask aperture
JP2014035260A (en) Inspection device, and inspection method for game board
CN207472195U (en) A kind of binocular vision volume weight measuring system
WO1999041621A2 (en) Circuit board assembly inspection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant