CN117288111B - Non-contact distance measurement method and system based on machine vision light spot detection - Google Patents


Info

Publication number
CN117288111B
CN117288111B · CN202311575240.XA
Authority
CN
China
Prior art keywords
light spot
spot
center
connected domain
determining
Prior art date
Legal status
Active
Application number
CN202311575240.XA
Other languages
Chinese (zh)
Other versions
CN117288111A (en)
Inventor
陈辽林
钟度根
肖成柱
Current Assignee
Shenzhen Reader Technology Co ltd
Original Assignee
Shenzhen Reader Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Reader Technology Co ltd filed Critical Shenzhen Reader Technology Co ltd
Priority to CN202311575240.XA priority Critical patent/CN117288111B/en
Publication of CN117288111A publication Critical patent/CN117288111A/en
Application granted granted Critical
Publication of CN117288111B publication Critical patent/CN117288111B/en


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/14Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23KSOLDERING OR UNSOLDERING; WELDING; CLADDING OR PLATING BY SOLDERING OR WELDING; CUTTING BY APPLYING HEAT LOCALLY, e.g. FLAME CUTTING; WORKING BY LASER BEAM
    • B23K26/00Working by laser beam, e.g. welding, cutting or boring
    • B23K26/70Auxiliary operations or equipment
    • B23K26/702Auxiliary equipment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B23MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23QDETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00Arrangements for observing, indicating or measuring on machine tools
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Plasma & Fusion (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention provides a non-contact distance measurement method and system based on machine vision light spot detection, and relates to the field of non-contact distance measurement. The method comprises the following steps: projecting a light spot onto a workpiece to be measured by using a light spot emitter, and shooting a light spot image of the workpiece by using a camera; determining the light spot center position according to the light spot image; determining the light spot center offset according to the light spot center position and the camera view center position; and determining the actual distance from the camera to the workpiece to be measured according to the determined light spot center offset, based on a preset mapping relation between the actual camera-to-workpiece distance and the light spot center offset. The invention realizes non-contact ranging, avoids the damage caused by contact with the material surface, improves the response speed of ranging, and improves the processing efficiency of products.

Description

Non-contact distance measurement method and system based on machine vision light spot detection
Technical Field
The invention relates to the field of non-contact distance measurement, in particular to a non-contact distance measurement method and system based on machine vision light spot detection.
Background
During product processing, the processing equipment needs to adjust corresponding parameters, such as processing power and processing height, according to the distance between the processing cutter or laser cutting head and the product. Therefore, the distance from the processing cutter or laser cutting head to the product must be obtained before processing. For softer product materials, contact ranging easily damages the material surface, and its response speed is slow, which hinders improvement of product processing efficiency.
Disclosure of Invention
The invention aims to provide a non-contact ranging method and system based on machine vision light spot detection, so as to solve the problems that contact ranging easily damages the material surface and has a slow response speed.
In order to achieve the above object, the present invention provides the following solutions:
a non-contact ranging method based on machine vision spot detection, comprising:
projecting a light spot to a workpiece to be detected by using a light spot emitter, and shooting a light spot image of the workpiece to be detected by using a camera;
determining the center position of the light spot according to the light spot image;
determining a spot center offset according to the spot center position and the camera view center position;
based on a preset mapping relation between the actual distance from the camera to the workpiece and the light spot center offset, determining the actual distance from the camera to the workpiece to be detected according to the determined light spot center offset.
Optionally, determining the center position of the light spot according to the light spot image specifically includes:
preprocessing the light spot image and an original spot-free image to obtain a preprocessed spot image and a preprocessed spot-free image, the original spot-free image being shot by the camera with no light spot projected;
subtracting pixel values corresponding to the same coordinates in the preprocessed spot image and the preprocessed spot-free image to generate a first mask;
performing a binarization operation on the preprocessed spot image to generate a second mask;
generating a third mask by performing AND operation according to the first mask and the second mask;
calculating a connected domain set in the third mask according to a search strategy;
screening the connected domains in the connected domain set to obtain a final connected domain;
determining a light spot gray center according to the final connected domain;
determining the geometrical center of the light spot according to the final connected domain;
and determining the position of the light spot center according to the light spot gray center and the light spot geometric center.
Optionally, calculating the connected domain set in the third mask according to a search strategy specifically includes:
selecting any pixel point from the queue, removing it from the queue, adding it to a connected domain, and sequentially visiting the eight neighborhood pixel points of that pixel point;
if a second unread non-0 pixel point exists among the eight neighborhood pixel points, adding the second unread non-0 pixel point to the queue, marking it as read, and repeating the selection of any pixel point from the queue until the queue is empty;
traversing the next first unread non-0 pixel point in the third mask, and constructing a connected domain through a queue until all the first unread non-0 pixel points in the third mask are traversed, so as to obtain the connected domain in the third mask.
Optionally, screening the connected domain in the connected domain set to obtain a final connected domain, which specifically includes:
calculating the center coordinates of all connected domains in the connected domain set;
calculating the difference between the central coordinates of the rest connected domains and the reference coordinates by taking the central coordinates corresponding to the maximum connected domain as the reference coordinates;
and removing the connected domains which do not meet the requirement according to the difference values and set thresholds, and merging the remaining connected domains to obtain the final connected domain.
Optionally, determining the gray center of the light spot according to the final connected domain specifically includes:
using the formulas $x_g = \sum_{i=1}^{n} (f_i - b)\, x_i \big/ \sum_{i=1}^{n} (f_i - b)$ and $y_g = \sum_{i=1}^{n} (f_i - b)\, y_i \big/ \sum_{i=1}^{n} (f_i - b)$ to calculate the light spot gray center; wherein $x_g$ is the abscissa of the light spot gray center, $y_g$ is the ordinate of the light spot gray center, $b$ is a pixel point reference threshold, $f_i$ is the pixel value corresponding to the $i$-th pixel point, $x_i$ is the abscissa corresponding to the $i$-th pixel point, $y_i$ is the ordinate corresponding to the $i$-th pixel point, and $n$ is the number of pixel points in the final connected domain.
Optionally, determining the geometrical center of the light spot according to the final connected domain specifically includes:
taking the central value of the maximum and minimum x coordinates in the final connected domain as the abscissa of the light spot geometric center;
and taking the central value of the maximum and minimum y coordinates in the final connected domain as the ordinate of the light spot geometric center.
Optionally, determining the position of the spot center according to the gray level center of the spot and the geometrical center of the spot specifically includes:
using the formulas $x_f = a\, x_g + (1 - a)\, x_s$ and $y_f = a\, y_g + (1 - a)\, y_s$ to calculate the light spot center position; wherein $x_f$ is the abscissa of the light spot center position, $y_f$ is the ordinate of the light spot center position, $x_g$ is the abscissa of the light spot gray center, $y_g$ is the ordinate of the light spot gray center, $x_s$ is the abscissa of the light spot geometric center, $y_s$ is the ordinate of the light spot geometric center, and $a$ is a mashup coefficient.
A non-contact ranging system based on machine vision spot detection, comprising:
the spot image acquisition module is used for projecting a spot to a workpiece to be detected by using a spot emitter and shooting a spot image of the workpiece to be detected by using a camera;
the spot center position determining module is used for determining the spot center position according to the spot image;
the light spot center offset determining module is used for determining the light spot center offset according to the light spot center position and the camera view center position;
the actual distance determining module is used for determining the actual distance from the camera to the workpiece to be measured according to the light spot center offset, based on a known mapping relation between the light spot center offset and the actual distance from the camera to the workpiece.
An electronic device comprising a memory for storing a computer program and a processor that runs the computer program to cause the electronic device to perform the above-described non-contact ranging method based on machine vision spot detection.
A computer readable storage medium storing a computer program which when executed by a processor implements the above-described non-contact ranging method based on machine vision spot detection.
According to the specific embodiments provided by the invention, the invention discloses the following technical effects: the invention projects a light spot onto the workpiece to be measured and shoots a corresponding light spot image, from which the center position of the light spot in the image (the light spot center position) is determined; subtracting the camera view center position from the light spot center position yields the light spot center offset; finally, the actual distance from the camera to the workpiece to be measured is obtained based on the mapping relation between the actual camera-to-workpiece distance and the light spot center offset. The whole process requires no contact with the workpiece surface and realizes ranging merely by projecting a light spot, so the damage caused by contact with the material surface is avoided, the response speed of ranging is improved, and the processing efficiency of products is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the drawings that are needed in the embodiments will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic structural diagram of a non-contact ranging device based on machine vision spot detection provided by the invention;
fig. 2 is a flow chart of a non-contact ranging method based on machine vision spot detection provided by the invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention aims to provide a non-contact ranging method and a non-contact ranging system based on machine vision facula detection, which can realize non-contact ranging, avoid the problem of damage caused by contact with the surface of a material, improve the response speed of ranging and improve the processing efficiency of products.
In order that the above-recited objects, features and advantages of the present invention will become more readily apparent, a more particular description of the invention will be rendered by reference to the appended drawings and appended detailed description.
Example 1
As shown in fig. 1, the axis of the light beam emitted by the light spot emitter forms an included angle α with the plane of the workbench, and the optical axis of the camera forms an included angle β with the plane of the workbench. During operation, the light spot emitter projects a light spot onto the workpiece; the light spot must be captured by the camera, that is, it must lie within the camera's field of view. The camera shoots light spot images on a number of workpieces, the light spot center position is detected, and the distance from the camera to the workpiece is calculated according to the mapping relation between the degree of offset of the light spot center and the actual distance.
As shown in fig. 2, the present invention provides a non-contact ranging method based on machine vision spot detection, which includes:
step 201: and projecting a light spot to a workpiece to be detected by using a light spot emitter, and shooting a light spot image of the workpiece to be detected by using a camera.
Step 202: and determining the center position of the light spot according to the light spot image.
In practical application, the actual operation process of the step 202 is specifically as follows:
1) And placing the workpiece on a workbench, opening a light spot emitter, and acquiring a light spot image L by a camera.
2) And closing the light spot emitter, and acquiring an image D of the workpiece without light spot irradiation.
3) And respectively carrying out graying treatment on the images L and D acquired by the camera.
4) And carrying out Gaussian filtering or median filtering operation on the gray-scaled images L and D to remove noise.
5) For the denoised images L and D, perform an image subtraction operation: subtract the pixel value in D from the pixel value at the same coordinates in L, traversing each pixel of L and D one by one, to obtain a first mask M_0.
6) For the denoised image L, perform a binarization operation with a threshold t: set pixels in L above t to 255 and pixels at or below t to 0, obtaining a second mask M_1.
7) Perform an image AND operation on M_0 and M_1: take the corresponding pixel of each in turn; if both values are 255, the new pixel value is 255, otherwise it is 0. Traverse M_0 and M_1 pixel by pixel to obtain a third mask M_2.
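A minimal pure-Python sketch of steps 5)-7). The function name is illustrative, and the literal rule that a pixel of M_2 is 255 only where both M_0 and M_1 equal 255 follows the description above (in practice the AND is often relaxed to "both non-zero"):

```python
def make_spot_mask(img_l, img_d, t):
    """Build the third mask M2 from a spot image L and a no-spot image D.

    img_l, img_d: 2-D lists of grayscale values in [0, 255], same shape.
    t: binarization threshold for the second mask M1.
    """
    h, w = len(img_l), len(img_l[0])
    m2 = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # First mask M0: difference of L and D at the same coordinates
            m0 = img_l[y][x] - img_d[y][x]
            # Second mask M1: binarize L at threshold t
            m1 = 255 if img_l[y][x] > t else 0
            # Third mask M2: AND -- 255 only where both masks are 255
            m2[y][x] = 255 if (m0 == 255 and m1 == 255) else 0
    return m2
```

The result keeps only pixels that are both bright in L and absent from the no-spot image D, suppressing static highlights on the workpiece.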
8) Calculate the connected domains in M_2 according to the search strategy. The search strategy is as follows:
a. a queue Q is created.
b. Create a connected domain R_i (i = 0, 1, …, n; i starts at 0 and is incremented sequentially), traverse M_2 to find an unread non-0 pixel point p_1, add it to queue Q, and mark p_1 as read.
c. Take a point p_2 from queue Q, remove it from the queue, and add it to connected domain R_i; sequentially visit the eight neighborhood pixel points of p_2, and for each neighbor that is an unread non-0 pixel point, add it to queue Q and mark it as read.
d. Repeating step c until the queue Q is empty.
e. Repeat steps b-d until M_2 is fully traversed, obtaining the connected domain set R = {R_i}.
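The search strategy of steps a-e amounts to a queue-based (breadth-first) flood fill over 8-connected neighborhoods; a hypothetical sketch, with illustrative names:

```python
from collections import deque

def connected_components(mask):
    """Queue-based 8-neighborhood connected-domain search over a binary
    mask M2, given as a 2-D list whose non-0 entries belong to spots."""
    h, w = len(mask), len(mask[0])
    read = [[False] * w for _ in range(h)]   # "marked as read" flags
    regions = []                             # the set R = {R_i}
    for y in range(h):
        for x in range(w):
            if mask[y][x] != 0 and not read[y][x]:
                read[y][x] = True
                q = deque([(x, y)])          # queue Q seeded with p_1
                region = []                  # connected domain R_i
                while q:
                    px, py = q.popleft()     # take p_2 out of Q
                    region.append((px, py))
                    # visit the eight neighborhood pixel points of p_2
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            nx, ny = px + dx, py + dy
                            if (dx or dy) and 0 <= nx < w and 0 <= ny < h \
                                    and mask[ny][nx] != 0 and not read[ny][nx]:
                                read[ny][nx] = True
                                q.append((nx, ny))
                regions.append(region)
    return regions
```

Each returned region is one connected domain R_i as a list of (x, y) pixel coordinates.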
9) Screening the connected domain. The screening steps are as follows:
presetting a first threshold t x Second t y Traversing all connected domains, and calculating the region center coordinates x of all connected domains i 、y i (i=0, 1 … n, where n is the number of connected domains).
Taking the center coordinates x_b and y_b corresponding to the largest connected domain as the reference, calculate the differences between the center coordinates x_i, y_i of the remaining connected domains and x_b, y_b, where the differences in the x direction and the y direction are denoted d_x and d_y respectively.
Remove the connected domains whose d_x is greater than the preset first threshold t_x or whose d_y is greater than the preset second threshold t_y, and merge the remaining connected domains to obtain the final connected domain H.
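The screening of step 9) can be sketched as follows, under the assumptions that a region's center is the mean of its pixel coordinates and that surviving regions are merged by pooling their pixels:

```python
def screen_regions(regions, t_x, t_y):
    """Keep connected domains whose center lies within t_x / t_y of the
    largest domain's center, then merge the survivors into the final
    connected domain H (a list of (x, y) pixel coordinates)."""
    def center(region):
        return (sum(p[0] for p in region) / len(region),
                sum(p[1] for p in region) / len(region))

    x_b, y_b = center(max(regions, key=len))  # reference: largest domain
    h = []
    for r in regions:
        x_i, y_i = center(r)
        # remove domains whose center offset exceeds either threshold
        if abs(x_i - x_b) <= t_x and abs(y_i - y_b) <= t_y:
            h.extend(r)
    return h
```

This discards stray reflections far from the main spot while keeping fragments of the spot that the mask split apart.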
10 ) Calculate the light spot gray center. Traverse all points on connected domain H and calculate the gray center. The calculation formula of the gray center in the x direction is:
$x_g = \sum_{i=1}^{n} (f_i - b)\, x_i \big/ \sum_{i=1}^{n} (f_i - b)$
The calculation formula of the gray center in the y direction is:
$y_g = \sum_{i=1}^{n} (f_i - b)\, y_i \big/ \sum_{i=1}^{n} (f_i - b)$
wherein $x_g$ is the abscissa of the light spot gray center, $y_g$ is the ordinate of the light spot gray center, $b$ is a pixel point reference threshold, $f_i$ is the pixel value corresponding to the $i$-th pixel point, $x_i$ is the abscissa corresponding to the $i$-th pixel point, $y_i$ is the ordinate corresponding to the $i$-th pixel point, and $n$ is the number of pixel points in the final connected domain.
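A sketch of the gray-center computation, assuming the reference threshold b acts as a background offset subtracted from each pixel value (the exact role of b is an assumption, since the source does not state it unambiguously):

```python
def gray_center(pixels, b=0):
    """Gray-weighted centroid of the final connected domain H.

    pixels: list of (x, y, f) tuples, f being the pixel value.
    b: pixel point reference threshold, assumed here to be a background
       offset subtracted from each pixel value before weighting.
    """
    den = sum(f - b for _, _, f in pixels)
    x_g = sum((f - b) * x for x, _, f in pixels) / den
    y_g = sum((f - b) * y for _, y, f in pixels) / den
    return x_g, y_g
```

Brighter pixels pull the centroid toward them, which is what distinguishes the gray center from the purely geometric center of step 11).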
11 ) Calculate the light spot geometric center. Traverse all points on connected domain H. Taking the x direction as an example, find the maximum and minimum x coordinates in the connected domain and take their central value as the abscissa of the geometric center; likewise, find the maximum and minimum y coordinates in the connected domain and take their central value as the ordinate of the geometric center.
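Step 11) in code form is a min/max midpoint over the domain's coordinates:

```python
def geometric_center(pixels):
    """Geometric center of the final connected domain H: the midpoint
    of the extreme x and y coordinates. pixels: list of (x, y) tuples."""
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    return (min(xs) + max(xs)) / 2, (min(ys) + max(ys)) / 2
```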
12 ) Calculate the light spot center position. Taking the x direction as an example, with $x_g$ the abscissa of the gray center and $x_s$ the abscissa of the geometric center, the abscissa $x_f$ of the light spot center position is calculated as:
$x_f = a\, x_g + (1 - a)\, x_s$
The ordinate $y_f$ of the light spot center position is calculated as:
$y_f = a\, y_g + (1 - a)\, y_s$
wherein $a$ is a mashup coefficient.
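Step 12) blends the two centers with the mashup coefficient a. Assuming the blend is the linear weighting implied by a single coefficient (and a default of a = 0.5, which the source does not fix), a sketch:

```python
def spot_center(gray_c, geo_c, a=0.5):
    """Blend the gray center and geometric center into the final spot
    center position. a is the mashup coefficient; a = 0.5 is an assumed
    default, and a = 1.0 would use the gray center alone."""
    x_g, y_g = gray_c
    x_s, y_s = geo_c
    return a * x_g + (1 - a) * x_s, a * y_g + (1 - a) * y_s
```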
Step 203: and determining the light spot center offset according to the light spot center position and the camera view center position.
In practical application, the light spot center offset Δd is calculated from the determined light spot center position, and the actual distance I(Δd) from the camera to the workpiece is calculated through the mapping function I(x) between the light spot center offset and the actual distance, completing the distance calculation.
Here x is the light spot center pixel offset, and b, c and d are flare correction coefficients of the mapping function I(x).
Light spot center offsets may also be calculated for several light spot center positions; their average value is then substituted into the mapping function I(x) of light spot center offset and actual distance to calculate the actual distance I(Δd) from the camera to the workpiece.
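The mapping function I(x) is characterized only by its three flare correction coefficients b, c and d; its exact form is not given here, so the sketch below assumes a quadratic calibration fit I(x) = b·x² + c·x + d as one plausible instance, together with the averaging of several measured offsets:

```python
def distance_from_offset(delta_d, b, c, d):
    """Map a spot-center offset to the camera-to-workpiece distance.
    Assumes a quadratic mapping I(x) = b*x**2 + c*x + d; the exact form
    of I(x) is not specified by the source and would come from
    calibration against known distances."""
    return b * delta_d ** 2 + c * delta_d + d

def distance_from_offsets(offsets, b, c, d):
    """Average several measured spot-center offsets, then apply I(x)."""
    avg = sum(offsets) / len(offsets)
    return distance_from_offset(avg, b, c, d)
```

In practice b, c and d would be fitted by placing the workpiece at several known distances and recording the measured offset at each.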
In the invention, since the position of the light spot emitter may differ, the way the light spot center offset is calculated can be adjusted according to actual needs.
Step 204: based on a preset mapping relation between the actual distance from the camera to the workpiece and the light spot center offset, determining the actual distance from the camera to the workpiece to be detected according to the determined light spot center offset.
Example two
In order to perform a corresponding method of the above embodiment to achieve the corresponding functions and technical effects, a non-contact ranging system based on machine vision spot detection is provided below.
A non-contact ranging system based on machine vision spot detection, comprising:
and the light spot image acquisition module is used for projecting light spots onto the workpiece to be detected by utilizing the light spot emitter and shooting light spot images of the workpiece to be detected by utilizing the camera.
And the light spot center position determining module is used for determining the light spot center position according to the light spot image.
And the light spot center offset determining module is used for determining the light spot center offset according to the light spot center position and the camera view center position.
The actual distance determining module is used for determining the actual distance from the camera to the workpiece to be measured according to the light spot center offset, based on a known mapping relation between the light spot center offset and the actual distance from the camera to the workpiece.
Example III
An embodiment of the present invention provides an electronic device including a memory and a processor, where the memory is configured to store a computer program, and the processor is configured to execute the computer program to cause the electronic device to perform the non-contact ranging method based on machine vision spot detection provided in the first embodiment.
In practical applications, the electronic device may be a server.
In practical applications, the electronic device includes: at least one processor (processor), memory (memory), bus, and communication interface (Communications Interface).
Wherein: the processor, communication interface, and memory communicate with each other via a communication bus.
And the communication interface is used for communicating with other devices.
And a processor, configured to execute a program, and specifically may execute the method described in the foregoing embodiment.
In particular, the program may include program code including computer-operating instructions.
The processor may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present invention. The one or more processors included in the electronic device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
And the memory is used for storing programs. The memory may comprise high-speed RAM memory or may further comprise non-volatile memory, such as at least one disk memory.
Based on the description of the embodiments above, embodiments of the present application provide a storage medium having stored thereon computer program instructions executable by a processor to implement the method of any of the embodiments.
The non-contact ranging system based on machine vision spot detection provided in the embodiments of the present application exists in a variety of forms, including but not limited to:
(1) A mobile communication device: such devices are characterized by mobile communication capabilities and are primarily aimed at providing voice, data communications. Such terminals include: smart phones (e.g., iPhone), multimedia phones, functional phones, and low-end phones, etc.
(2) Ultra mobile personal computer device: such devices are in the category of personal computers, having computing and processing functions, and generally having mobile internet access capabilities. Such terminals include: PDA, MID, and UMPC devices, etc., such as iPad.
(3) Portable entertainment device: such devices may display and play multimedia content. The device comprises: audio, video players (e.g., iPod), palm game consoles, electronic books, and smart toys and portable car navigation devices.
(4) Other electronic devices with data interaction functions.
Thus, particular embodiments of the present subject matter have been described. Other embodiments are within the scope of the following claims. In some cases, the actions recited in the claims can be performed in a different order and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may be advantageous.
The system, apparatus, module or unit set forth in the above embodiments may be implemented in particular by a computer chip or entity, or by a product having a certain function. One typical implementation is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being functionally divided into various units, respectively. Of course, the functions of each element may be implemented in one or more software and/or hardware elements when implemented in the present application. It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, random access memory (RAM) and/or nonvolatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are connected through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
In the present specification, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the other embodiments; identical and similar parts of the embodiments may be referred to one another. Since the system disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and the relevant points can be found in the description of the method section.
The principles and embodiments of the present invention have been described herein with reference to specific examples, which are provided only to assist in understanding the method of the present invention and its core ideas; meanwhile, those of ordinary skill in the art may, in accordance with the ideas of the present invention, make changes to the specific embodiments and the scope of application. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (9)

1. A non-contact ranging method based on machine vision light spot detection, characterized by comprising the following steps:
projecting a light spot to a workpiece to be detected by using a light spot emitter, and shooting a light spot image of the workpiece to be detected by using a camera;
the method for determining the center position of the light spot according to the light spot image specifically comprises the following steps:
preprocessing the light spot image and an original spot-free image to obtain a preprocessed spot image and a preprocessed spot-free image; wherein the original spot-free image is obtained by shooting with the camera when no light spot is projected;
subtracting pixel values corresponding to the same coordinates in the preprocessed spot image and the preprocessed spot-free image to generate a first mask;
performing a binarization operation on the preprocessed spot image to generate a second mask;
generating a third mask by performing an AND operation on the first mask and the second mask;
calculating a connected domain set in the third mask according to a search strategy;
screening the connected domains in the connected domain set to obtain a final connected domain;
determining a light spot gray center according to the final connected domain;
determining the geometrical center of the light spot according to the final connected domain;
determining the position of the light spot center according to the light spot gray center and the light spot geometric center;
determining a spot center offset according to the spot center position and the camera view center position;
based on a preset mapping relation between the actual distance from the camera to the workpiece and the light spot center offset, determining the actual distance from the camera to the workpiece to be detected according to the determined light spot center offset.
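The mask-generation steps above (frame differencing, binarization, and the AND combination) can be sketched in a few lines of Python; the list-of-lists image representation and the threshold values below are illustrative assumptions, not values taken from the patent:

```python
def spot_masks(spot_img, no_spot_img, diff_thresh=30, bin_thresh=200):
    # Sketch of the three masks: the thresholds are illustrative assumptions.
    h, w = len(spot_img), len(spot_img[0])
    mask3 = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # First mask: per-pixel difference between spot and spot-free frames.
            m1 = spot_img[y][x] - no_spot_img[y][x] > diff_thresh
            # Second mask: binarization of the preprocessed spot image.
            m2 = spot_img[y][x] > bin_thresh
            # Third mask: logical AND of the first two masks.
            mask3[y][x] = m1 and m2
    return mask3

# Toy 5x5 frames: one bright spot pixel on a dim background.
no_spot = [[40] * 5 for _ in range(5)]
spot = [row[:] for row in no_spot]
spot[2][2] = 250
mask3 = spot_masks(spot, no_spot)
```

On these toy frames, only the single pixel that is both much brighter than the spot-free frame and above the binarization threshold survives into the third mask.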
2. The non-contact ranging method based on machine vision spot detection according to claim 1, wherein calculating the connected domain set in the third mask according to a search strategy specifically comprises:
traversing the third mask, adding the first unread non-0 pixel point in the third mask into a queue, and marking the first unread non-0 pixel point as read;
selecting any pixel point from the queue, removing the point from the queue, adding the pixel point into a connected domain, and sequentially accessing eight neighborhood pixel points of the pixel point;
if a second unread non-0 pixel point exists in the eight neighborhood pixel points, adding the second unread non-0 pixel point into the queue, marking the second unread non-0 pixel point as read, and repeatedly selecting any pixel point in the queue until the queue is empty;
traversing the next first unread non-0 pixel point in the third mask, and constructing a connected domain through a queue until all the first unread non-0 pixel points in the third mask are traversed, so as to obtain the connected domain in the third mask.
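The queue-based search described in claim 2 is a breadth-first traversal over eight-neighbourhoods; a minimal sketch (the list-of-lists mask format is an assumption for illustration):

```python
from collections import deque

def connected_domains(mask):
    # Queue-based search over eight-neighbourhoods.
    h, w = len(mask), len(mask[0])
    read = [[False] * w for _ in range(h)]
    domains = []
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not read[y][x]:
                read[y][x] = True            # mark the seed pixel as read
                queue = deque([(y, x)])
                domain = []
                while queue:
                    cy, cx = queue.popleft() # take any pixel out of the queue
                    domain.append((cy, cx))
                    # Visit the eight neighbouring pixels in turn.
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and mask[ny][nx] and not read[ny][nx]):
                                read[ny][nx] = True
                                queue.append((ny, nx))
                domains.append(domain)
    return domains

mask = [[1, 1, 0, 0],
        [0, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 1, 1]]
doms = connected_domains(mask)
```

Note that diagonally adjacent pixels end up in the same domain, because the eight-neighbourhood includes the diagonal positions.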
3. The non-contact ranging method based on machine vision spot detection according to claim 1, wherein screening the connected domains in the connected domain set to obtain a final connected domain specifically comprises:
calculating the center coordinates of all connected domains in the connected domain set;
calculating the difference between the central coordinates of the rest connected domains and the reference coordinates by taking the central coordinates corresponding to the maximum connected domain as the reference coordinates;
and removing the connected domains which do not meet the requirement according to the difference values and a set threshold value, and merging the remaining connected domains to obtain the final connected domain.
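A minimal sketch of this screening step, assuming a Euclidean distance between domain centres and an illustrative threshold value (the patent fixes neither the metric nor the threshold):

```python
import math

def screen_domains(domains, dist_thresh=10.0):
    # dist_thresh is an illustrative assumption, not a value from the patent.
    centers = [(sum(p[0] for p in d) / len(d), sum(p[1] for p in d) / len(d))
               for d in domains]
    # The centre of the largest domain serves as the reference coordinate.
    ref = centers[max(range(len(domains)), key=lambda i: len(domains[i]))]
    merged = []
    for d, c in zip(domains, centers):
        # Keep only the domains whose centres lie within the threshold,
        # then merge the survivors into one final connected domain.
        if math.dist(c, ref) <= dist_thresh:
            merged.extend(d)
    return merged

domains = [
    [(10, 10), (10, 11), (11, 10), (11, 11)],  # largest: reference domain
    [(13, 12), (13, 13)],                      # nearby fragment: kept
    [(40, 40)],                                # far outlier: removed
]
final = screen_domains(domains)
```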
4. The non-contact ranging method based on machine vision spot detection according to claim 1, wherein the determining the spot gray center according to the final connected domain specifically comprises:
using the formulas x_g = Σ_{i=1}^{n} (f_i − b)·x_i / Σ_{i=1}^{n} (f_i − b) and y_g = Σ_{i=1}^{n} (f_i − b)·y_i / Σ_{i=1}^{n} (f_i − b) to calculate the light spot gray center;
wherein x_g is the abscissa of the light spot gray center, y_g is the ordinate of the light spot gray center, b is a pixel point reference threshold value, f_i is the pixel value corresponding to the i-th pixel point, x_i is the abscissa corresponding to the i-th pixel point, y_i is the ordinate corresponding to the i-th pixel point, and n is the number of pixel points in the final connected domain.
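The gray center is a background-subtracted weighted centroid; a small sketch (the sample pixel coordinates and values are made up for illustration):

```python
def gray_center(pixels, values, b=0.0):
    # Weighted centroid:
    #   x_g = sum((f_i - b) * x_i) / sum(f_i - b), and likewise for y_g,
    # where b is the pixel point reference threshold.
    weights = [f - b for f in values]
    total = sum(weights)
    x_g = sum(w * x for w, (x, _) in zip(weights, pixels)) / total
    y_g = sum(w * y for w, (_, y) in zip(weights, pixels)) / total
    return x_g, y_g

# Two pixels; the right one is three times as bright,
# so the centre is pulled towards it.
xg, yg = gray_center([(0, 0), (4, 0)], [50, 150], b=0.0)
```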
5. The non-contact ranging method based on machine vision spot detection according to claim 1, wherein determining a spot geometric center according to the final connected domain specifically comprises:
taking the central value of the maximum pixel coordinate and the minimum pixel coordinate in the x direction in the final connected domain as the abscissa of the light spot geometric center;
and taking the central value of the maximum pixel coordinate and the minimum pixel coordinate in the y direction in the final connected domain as the ordinate of the light spot geometric center.
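A sketch of this geometric center, i.e. the midpoint of the extreme x and y coordinates of the final connected domain (often called the bounding-box center); the sample coordinates are made up for illustration:

```python
def geometric_center(pixels):
    # Midpoint of the extreme coordinates in each direction.
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    x_s = (max(xs) + min(xs)) / 2.0
    y_s = (max(ys) + min(ys)) / 2.0
    return x_s, y_s

xs_, ys_ = geometric_center([(2, 3), (6, 3), (4, 7)])
```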
6. The non-contact ranging method based on machine vision spot detection according to claim 1, wherein determining the spot center position according to the spot gray level center and the spot geometric center specifically comprises:
using the formulas x_f = a·x_s + (1−a)·x_g and y_f = a·y_s + (1−a)·y_g to calculate the light spot center position; wherein x_f is the abscissa of the light spot center position, y_f is the ordinate of the light spot center position, x_g is the abscissa of the light spot gray center, y_g is the ordinate of the light spot gray center, x_s is the abscissa of the light spot geometric center, y_s is the ordinate of the light spot geometric center, and a is a blending coefficient.
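A sketch of this linear blend of the two centers; the value of the blending coefficient a used below is an illustrative assumption:

```python
def spot_center(x_s, y_s, x_g, y_g, a=0.5):
    # Linear blend: x_f = a*x_s + (1 - a)*x_g, y_f = a*y_s + (1 - a)*y_g.
    # a = 0.5 weights the geometric and gray centers equally (an assumption).
    x_f = a * x_s + (1 - a) * x_g
    y_f = a * y_s + (1 - a) * y_g
    return x_f, y_f

xf, yf = spot_center(4.0, 5.0, 3.0, 6.0, a=0.5)
```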
7. A non-contact ranging system based on machine vision spot detection, comprising:
the spot image acquisition module is used for projecting a spot to a workpiece to be detected by using a spot emitter and shooting a spot image of the workpiece to be detected by using a camera;
the spot center position determining module is used for determining the spot center position according to the spot image; the step of determining the center position of the light spot according to the light spot image specifically comprises the following steps:
preprocessing the light spot image and an original spot-free image to obtain a preprocessed spot image and a preprocessed spot-free image; wherein the original spot-free image is obtained by shooting with the camera when no light spot is projected;
subtracting pixel values corresponding to the same coordinates in the preprocessed spot image and the preprocessed spot-free image to generate a first mask;
performing a binarization operation on the preprocessed spot image to generate a second mask;
generating a third mask by performing an AND operation on the first mask and the second mask;
calculating a connected domain set in the third mask according to a search strategy;
screening the connected domains in the connected domain set to obtain a final connected domain;
determining a light spot gray center according to the final connected domain;
determining the geometrical center of the light spot according to the final connected domain;
determining the position of the light spot center according to the light spot gray center and the light spot geometric center;
the light spot center offset determining module is used for determining the light spot center offset according to the light spot center position and the camera view center position;
the actual distance determining module is used for determining the actual distance from the camera to the workpiece to be measured according to the light spot center offset based on the mapping relation between the known light spot center offset and the known actual distance from the camera to the workpiece.
8. An electronic device comprising a memory for storing a computer program and a processor that runs the computer program to cause the electronic device to perform the machine vision spot detection-based non-contact ranging method according to any of claims 1-6.
9. A computer-readable storage medium, characterized in that it stores a computer program, which when executed by a processor implements the machine vision spot detection-based non-contact ranging method according to any of claims 1-6.
CN202311575240.XA 2023-11-24 2023-11-24 Non-contact distance measurement method and system based on machine vision light spot detection Active CN117288111B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311575240.XA CN117288111B (en) 2023-11-24 2023-11-24 Non-contact distance measurement method and system based on machine vision light spot detection


Publications (2)

Publication Number Publication Date
CN117288111A CN117288111A (en) 2023-12-26
CN117288111B (en) 2024-02-20

Family

ID=89239345

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311575240.XA Active CN117288111B (en) 2023-11-24 2023-11-24 Non-contact distance measurement method and system based on machine vision light spot detection

Country Status (1)

Country Link
CN (1) CN117288111B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103791860A (en) * 2014-03-07 2014-05-14 哈尔滨工业大学 Tiny angle measuring device and method based on vision detecting technology
CN104501720A (en) * 2014-12-24 2015-04-08 河海大学常州校区 Non-contact object size and distance image measuring instrument
CN109099818A (en) * 2018-06-27 2018-12-28 武汉理工大学 Portable micron order high definition range-measurement system
CN112846485A (en) * 2020-12-31 2021-05-28 武汉华工激光工程有限责任公司 Laser processing monitoring method and device and laser processing equipment
CN112985259A (en) * 2021-01-25 2021-06-18 中国人民解放军军事科学院国防科技创新研究院 Target positioning method and system based on multi-view vision
CN113369990A (en) * 2021-07-06 2021-09-10 成都飞机工业(集团)有限责任公司 On-line detection device for non-contact measuring hole and use method thereof
CN114235351A (en) * 2021-12-17 2022-03-25 深圳市先地图像科技有限公司 Laser spot deviation detection method and system in laser array and related equipment
CN115358992A (en) * 2022-08-19 2022-11-18 软通动力信息技术(集团)股份有限公司 Light spot detection method and device, electronic equipment and storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant