CN117433450A - Cross line three-dimensional camera and modeling method - Google Patents

Cross line three-dimensional camera and modeling method

Info

Publication number
CN117433450A
CN117433450A (application CN202311755096.8A)
Authority
CN
China
Prior art keywords
line laser
laser
stripe
center
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202311755096.8A
Other languages
Chinese (zh)
Other versions
CN117433450B (en)
Inventor
邰大勇
吴志雄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Pmt Technology Suzhou Co ltd
Original Assignee
Pmt Technology Suzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Pmt Technology Suzhou Co ltd filed Critical Pmt Technology Suzhou Co ltd
Priority to CN202311755096.8A priority Critical patent/CN117433450B/en
Publication of CN117433450A publication Critical patent/CN117433450A/en
Application granted granted Critical
Publication of CN117433450B publication Critical patent/CN117433450B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
        • G01 MEASURING; TESTING
            • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
                • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
                • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
                • G01B 11/25 Measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
                • G01B 11/2513 Measuring contours or curvatures with several lines being projected in more than one direction, e.g. grids, patterns
                • G01B 11/254 Projection of a pattern, viewing through a pattern, e.g. moiré
            • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
                • G01N 21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
                • G01N 21/01 Arrangements or apparatus for facilitating the optical investigation
                • G01N 21/84 Systems specially adapted for particular applications
                • G01N 21/88 Investigating the presence of flaws or contamination
                • G01N 21/8806 Specially adapted optical and illumination features
                • G01N 21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
                • G01N 2021/0106 General arrangement of respective parts
                • G01N 2021/0112 Apparatus in one mechanical, optical or electronic block
                • G01N 2021/8829 Shadow projection or structured background, e.g. for deflectometry
                • G01N 2021/8883 Scan or image signal processing involving the calculation of gauges, generating models
                • G01N 2021/8887 Scan or image signal processing based on image processing techniques

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Biochemistry (AREA)
  • Chemical & Material Sciences (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Signal Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a cross-line three-dimensional camera and a modeling method, belonging to the fields of optical elements and machine vision. The laser lines emitted by the two line lasers are first distinguished by periodically switching the two line lasers off and on; both line lasers are then turned on together, and the line laser to which each currently extracted laser point belongs is determined by comparison with the positions extracted in the preceding frames, so that the point-cloud frame rate can be raised to about 1.3 times the original. The invention can effectively distinguish the light planes of a cross-line laser.

Description

Cross line three-dimensional camera and modeling method
Technical Field
The invention belongs to the fields of optical elements and machine vision, and particularly relates to a cross line three-dimensional camera and a modeling method.
Background
A cross-line structured-light camera is a device for capturing the shape and surface geometry of an object. Because the two projected line-structured light planes intersect each other, it avoids the problem that arises when a measured edge is parallel to a single light stripe.
The traditional way to distinguish the two emission planes of a cross-line laser is to project with the two lasers alternately, so that only one laser line appears in each camera frame; this keeps the point-cloud frame rate low. Another way is to use two lasers of different colors and a color camera to separate the laser planes, which requires a more expensive color camera.
Disclosure of Invention
The invention aims to provide a cross-line laser three-dimensional camera and modeling method that can effectively distinguish the light planes of a cross-line laser. The invention distinguishes the laser lines emitted by the two lasers by periodically switching the two lasers off and on, then turns both lasers on together and determines the laser to which each currently extracted laser point belongs by comparing it with the positions extracted in the preceding frames. Since each three-frame period then yields four line profiles instead of three, the point-cloud frame rate can be raised to about 4/3 ≈ 1.3 times the original. The cross-line laser three-dimensional camera is suitable for a wide range of objects, including irregularly shaped ones.
The technical solution for realizing the purpose of the invention is as follows:
a cross-line three-dimensional camera comprises a camera, a line laser A, a line laser B, a controller and a computer;
the two line lasers output laser lines that are arranged to cross each other;
the camera is used for capturing the stripes formed where the laser strikes the object;
the controller is used for triggering the camera and for controlling the switching sequence of the two line lasers: line laser A on; line laser A off and line laser B on; both line lasers on simultaneously;
the computer is used for extracting the centers of the captured laser stripes and processing the stripe positions in the image: the centers of the two stripes captured when both line lasers are on simultaneously are compared with the stripe centers captured when each line laser was on alone, to decide which line laser each stripe belongs to; once the line laser to which a stripe belongs is determined, three-dimensional reconstruction of the extracted points is carried out.
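As an illustration of the switching sequence described above, the following is a minimal sketch of one acquisition cycle; the camera, laser_a and laser_b objects and their grab()/set_on() methods are hypothetical placeholders, not an interface defined by the invention.

```python
# Minimal sketch of the periodic three-phase switching cycle (assumed interface).
def acquisition_cycle(camera, laser_a, laser_b):
    """Yield (phase, frame) pairs: phase 0 = A only, 1 = B only, 2 = A and B."""
    phases = [(True, False), (False, True), (True, True)]
    while True:
        for phase, (a_on, b_on) in enumerate(phases):
            laser_a.set_on(a_on)          # controller switches line laser A
            laser_b.set_on(b_on)          # controller switches line laser B
            yield phase, camera.grab()    # controller triggers the camera
```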
Compared with the prior art, the invention has the following notable advantages:
(1) High efficiency: the cross-line laser three-dimensional camera uses an interleaved-frame design, giving about 1.3 times the efficiency of a traditional alternating three-dimensional camera;
(2) Low cost: the cross-line laser three-dimensional camera uses a black-and-white camera, which is cheaper than the color camera required by the traditional color-separation approach;
(3) Ease of use: the cross-line laser three-dimensional camera needs no complex setup or calibration, so it can easily be used by non-specialists.
Drawings
Fig. 1 is a diagram of the working state of line laser A in the cross-line three-dimensional camera.
Fig. 2 is a diagram of the working state of line laser B in the cross-line three-dimensional camera.
Fig. 3 is a diagram of the working state with line lasers A and B both on in the cross-line three-dimensional camera.
Fig. 4 is a schematic diagram of the image acquisition and processing of a cross-line three-dimensional camera.
Detailed Description
The invention is further described with reference to the drawings and specific embodiments.
Example 1
Referring to figs. 1 to 3, the cross-line three-dimensional camera of this embodiment includes a camera 1, a connecting plate 2, a line laser A3, a line laser B4, a controller 8, and a computer;
the camera 1, the word line laser A3 and the word line laser B4 are fixed together through the connecting plate 2, the camera 1 and the two word line lasers are connected to the controller 8, and the controller 8 is responsible for opening and closing the two word line lasers and photographing the camera 1. When a word line laser A3 is turned on, a first stripe 6 formed by laser striking on an object 5 is shot by a camera 1, the shot laser line center is extracted by a computer through a gray center method, the word line laser A3 is turned off by a controller 8, the laser 4 is turned on, and a second stripe 7 formed by the laser on the object is shot by the camera 1. And extracting the center of the shot laser line by a gray level center method. After a word line laser A3 and a word line laser B4 are simultaneously turned on, a first stripe 6 and a second stripe 7 are simultaneously formed on the measured object, and the centers of the two stripes are extracted to pass through and are compared with the center point of the previous grabbing laser. The turn-on sequence of the two one-word line lasers is: a word line laser a is on, a word line laser a is off and a word line laser B, two word line lasers are on at the same time, so that periodic cyclic opening is performed.
The computer processes the images captured by the camera 1 as follows:
In fig. 4, the frame captured when line laser A3 is on is denoted frame N. In column j of the image, the center of the light stripe is extracted, with abscissa u_j^N; the set of stripe centers extracted over all C columns of the image is

U^N = { u_j^N | j = 1, 2, …, C }   (1)

When line laser B4 is on, the current frame is denoted frame N+1. The stripe center extracted in column j has abscissa u_j^{N+1}, and the set of stripe centers extracted over all C columns of the image is

U^{N+1} = { u_j^{N+1} | j = 1, 2, …, C }   (2)

When line laser A3 and line laser B4 are on simultaneously, the current frame is denoted frame N+2. The stripe centers extracted in column j have abscissae u_j^{N+2}, and the set of stripe centers extracted over all C columns of the image is

U^{N+2} = { u_j^{N+2} | j = 1, 2, …, C }   (3)

The stripe positions recorded in U^{N+2} are ambiguous and must be attributed to one of the two lasers. Each center u_j^{N+2} is therefore tested against

| u_j^{N+2} - u_j^N | < ε   (4)

and

| u_j^{N+2} - u_j^{N+1} | < ε   (5)

where ε is the allowed tolerance of the stripe-position extraction. If formula (4) is satisfied, the stripe can be judged to belong to line laser A3; if formula (5) is satisfied, the stripe is judged to belong to line laser B4. If formulas (4) and (5) are both satisfied, the two lines are very close, possibly at the crossing point of the laser lines or overlapping because of vibration during acquisition, and the extracted point is not reconstructed in three dimensions. Once the laser to which an extracted center belongs has been determined, three-dimensional reconstruction of the extracted point can be performed through the light-plane equation of that laser, calibrated in advance.
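A minimal sketch of the attribution test in formulas (4) and (5), applied per column, is given below; the variable names and the handling of centers that match neither previous frame are illustrative assumptions rather than part of the patent.

```python
def attribute_center(u_n2: float, u_n: float, u_n1: float, eps: float):
    """Assign a frame-(N+2) stripe center to line laser A or B, or discard it."""
    near_a = abs(u_n2 - u_n) < eps    # formula (4): close to the frame-N center
    near_b = abs(u_n2 - u_n1) < eps   # formula (5): close to the frame-(N+1) center
    if near_a and near_b:
        return None                   # crossing point or jitter: skip 3D reconstruction
    if near_a:
        return "A"
    if near_b:
        return "B"
    return None                       # matches neither previous stripe (assumed: discard)
```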
Example 2
Based on the above cross-line three-dimensional camera, this embodiment proposes a modeling method for the cross-line three-dimensional camera, comprising the following steps:
First, line laser A is turned on and the laser-line center of the stripe it forms on the object is extracted. The current image frame is denoted N; the abscissa of the stripe center extracted in column j of the image is u_j^N, and the set of stripe centers extracted over all C columns of the image is:

U^N = { u_j^N | j = 1, 2, …, C }   (1)

Line laser A is then turned off and line laser B is turned on, and the laser-line center of the stripe it forms on the object is extracted. The current image frame is denoted N+1; the abscissa of the stripe center extracted in column j of the image is u_j^{N+1}, and the set of stripe centers extracted over all C columns of the image is:

U^{N+1} = { u_j^{N+1} | j = 1, 2, …, C }   (2)

Line laser A and line laser B are then turned on simultaneously, and the laser-line centers of the stripes formed on the object are extracted. The current image frame is denoted N+2; the abscissae of the stripe centers extracted in column j of the image are u_j^{N+2}, and the set of stripe centers extracted over all C columns of the image is:

U^{N+2} = { u_j^{N+2} | j = 1, 2, …, C }   (3)

Each center in U^{N+2} is then compared against

| u_j^{N+2} - u_j^N | < ε   (4)

| u_j^{N+2} - u_j^{N+1} | < ε   (5)

If formula (4) is satisfied, the stripe is determined to belong to line laser A; if formula (5) is satisfied, the stripe is determined to belong to line laser B; if formulas (4) and (5) are both satisfied, the extracted point is not reconstructed in three dimensions. Here ε is the allowed tolerance of the stripe-position extraction.
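Both embodiments conclude with three-dimensional reconstruction of the attributed points through a light-plane equation calibrated in advance. The sketch below shows one common way to carry out that step: back-projecting the stripe center as a camera ray and intersecting it with the calibrated plane of the laser it belongs to. The intrinsic matrix K and the plane coefficients are placeholder values for illustration, not calibration data from the patent.

```python
import numpy as np

def reconstruct_point(u, v, K, plane_n, plane_d):
    """Intersect the camera ray through pixel (u, v) with the plane n·X + d = 0."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction in camera coordinates
    t = -plane_d / float(plane_n @ ray)              # ray parameter where it meets the plane
    return t * ray                                   # 3D point in the camera frame

# Illustrative calibration values (not from the patent):
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 512.0],
              [0.0, 0.0, 1.0]])
plane_n, plane_d = np.array([0.0, 0.70, -0.71]), 350.0   # example light plane of line laser A
point = reconstruct_point(700.0, 530.0, K, plane_n, plane_d)
```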
The invention provides a novel cross-line laser three-dimensional camera in which a cross-line laser composed of two line lasers is projected onto the surface of the measured object; a black-and-white camera captures the reflected stripes on the object surface, and a three-dimensional model is generated by computer processing. Compared with a traditional three-dimensional camera, this camera offers high efficiency, low cost, ease of use and a wide range of applications. The technology can be applied to fields such as robot vision, unmanned operation and medical treatment. The scope of this patent application is intended to cover the various embodiments and applications of the technology, together with all claims and benefits associated with it.

Claims (5)

1. A cross-line three-dimensional camera, comprising a camera, a line laser A, a line laser B, a controller and a computer, characterized in that:
the two line lasers output laser lines that are arranged to cross each other;
the camera is used for capturing the stripes formed where the laser strikes the object;
the controller is used for triggering the camera and for controlling the switching sequence of the two line lasers: line laser A on; line laser A off and line laser B on; both line lasers on simultaneously;
the computer is used for extracting the centers of the captured laser stripes and processing the stripe positions in the image: the centers of the two stripes captured when both line lasers are on simultaneously are compared with the stripe centers captured when each line laser was on alone, to decide which line laser each stripe belongs to; once the line laser to which a stripe belongs is determined, three-dimensional reconstruction of the extracted points is carried out.
2. The cross-line three-dimensional camera according to claim 1, wherein the computer performs the following processing: the frame captured when line laser A is on is denoted N, the abscissa of the stripe center extracted in column j of the image is u_j^N, and the set of stripe centers extracted over all C columns of the image is:

U^N = { u_j^N | j = 1, 2, …, C }   (1)

the frame captured when line laser B is on is denoted N+1, the abscissa of the stripe center extracted in column j of the image is u_j^{N+1}, and the set of stripe centers extracted over all C columns of the image is:

U^{N+1} = { u_j^{N+1} | j = 1, 2, …, C }   (2)

the frame captured when line laser A and line laser B are on simultaneously is denoted N+2, the abscissae of the stripe centers extracted in column j of the image are u_j^{N+2}, and the set of stripe centers extracted over all C columns of the image is:

U^{N+2} = { u_j^{N+2} | j = 1, 2, …, C }   (3)

each center in U^{N+2} is compared against

| u_j^{N+2} - u_j^N | < ε   (4)

| u_j^{N+2} - u_j^{N+1} | < ε   (5)

if formula (4) is satisfied, the stripe is determined to belong to line laser A; if formula (5) is satisfied, the stripe is determined to belong to line laser B; if formulas (4) and (5) are both satisfied, the extracted point is not reconstructed in three dimensions; ε is the allowed tolerance of the stripe-position extraction.
3. The cross-line three-dimensional camera according to claim 1, wherein the center of the captured laser line is extracted by the gray-level centroid method.
4. The cross-line three-dimensional camera according to claim 1, wherein the camera, line laser A and line laser B are fixed together by a connecting plate.
5. A modeling method of a cross-line three-dimensional camera, characterized by comprising the following steps:
first, line laser A is turned on, the laser-line center of the stripe formed on the object is extracted, the current image frame is denoted N, the abscissa of the stripe center extracted in column j of the image is u_j^N, and the set of stripe centers extracted over all C columns of the image is:

U^N = { u_j^N | j = 1, 2, …, C }   (1)

line laser A is turned off and line laser B is turned on, the laser-line center of the stripe formed on the object is extracted, the current image frame is denoted N+1, the abscissa of the stripe center extracted in column j of the image is u_j^{N+1}, and the set of stripe centers extracted over all C columns of the image is:

U^{N+1} = { u_j^{N+1} | j = 1, 2, …, C }   (2)

line laser A and line laser B are turned on simultaneously, the laser-line centers of the stripes formed on the object are extracted, the current image frame is denoted N+2, the abscissae of the stripe centers extracted in column j of the image are u_j^{N+2}, and the set of stripe centers extracted over all C columns of the image is:

U^{N+2} = { u_j^{N+2} | j = 1, 2, …, C }   (3)

each center in U^{N+2} is compared against

| u_j^{N+2} - u_j^N | < ε   (4)

| u_j^{N+2} - u_j^{N+1} | < ε   (5)

if formula (4) is satisfied, the stripe is determined to belong to line laser A; if formula (5) is satisfied, the stripe is determined to belong to line laser B; if formulas (4) and (5) are both satisfied, the extracted point is not reconstructed in three dimensions; ε is the allowed tolerance of the stripe-position extraction.
CN202311755096.8A 2023-12-20 2023-12-20 Cross line three-dimensional camera and modeling method Active CN117433450B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311755096.8A CN117433450B (en) 2023-12-20 2023-12-20 Cross line three-dimensional camera and modeling method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311755096.8A CN117433450B (en) 2023-12-20 2023-12-20 Cross line three-dimensional camera and modeling method

Publications (2)

Publication Number Publication Date
CN117433450A 2024-01-23
CN117433450B 2024-04-19

Family

ID=89556853

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311755096.8A Active CN117433450B (en) 2023-12-20 2023-12-20 Cross line three-dimensional camera and modeling method

Country Status (1)

Country Link
CN (1) CN117433450B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101856773A (en) * 2010-04-22 2010-10-13 广州中国科学院工业技术研究院 Focusing positioning method based on initial laser processing position and laser processing device
KR20130016838A (en) * 2011-08-09 2013-02-19 (주)위시스 Apparatus for measuring device unit using the cross beam laser
CN103727927A (en) * 2013-12-19 2014-04-16 大连理工大学 High-velocity motion object pose vision measurement method based on structured light
CN104501768A (en) * 2014-08-14 2015-04-08 武汉武大卓越科技有限责任公司 Rail rigidity measuring method based on machine vision
CN105300310A (en) * 2015-11-09 2016-02-03 杭州讯点商务服务有限公司 Handheld laser 3D scanner with no requirement for adhesion of target spots and use method thereof
CN107907048A (en) * 2017-06-30 2018-04-13 长沙湘计海盾科技有限公司 A kind of binocular stereo vision method for three-dimensional measurement based on line-structured light scanning
CN108458670A (en) * 2018-05-10 2018-08-28 清华大学深圳研究生院 A kind of the three-D profile scanning means and method of dual laser
CN112161574A (en) * 2020-10-12 2021-01-01 昆明理工大学 Three-dimensional measurement system and measurement method based on divergent multi-line laser projection
CN112571159A (en) * 2020-12-10 2021-03-30 高铭科维科技无锡有限公司 Component polishing method based on visual detection and polishing system thereof
CN116934826A (en) * 2023-06-14 2023-10-24 昆明理工大学 Multi-line laser stripe clustering and matching method based on custom window iteration


Also Published As

Publication number Publication date
CN117433450B (en) 2024-04-19

Similar Documents

Publication Publication Date Title
CN106338521B (en) Increasing material manufacturing surface and internal flaw and pattern composite detection method and device
CN108510476B (en) Mobile phone screen circuit detection method based on machine vision
CN105069789B (en) Structure light dynamic scene depth acquisition methods based on coding grid template
CN102020036A (en) Visual detection method for transparent paper defect of outer package of strip cigarette
CN106210444A (en) Kinestate self adaptation key frame extracting method
CN112884880B (en) Line laser-based honey pomelo three-dimensional modeling device and method
CN105069751A (en) Depth image missing data interpolation method
CN104331924A (en) Three-dimensional reconstruction method based on single camera SFS algorithm
CN107463659A (en) Object search method and its device
CN107264570A (en) steel rail light band distribution detecting device and method
CN206638598U (en) A kind of electric connector housing defect detecting device based on the comprehensive active vision of single camera
CN117433450B (en) Cross line three-dimensional camera and modeling method
CN110348344B (en) Special facial expression recognition method based on two-dimensional and three-dimensional fusion
Li et al. Face detection based on depth information using HOG-LBP
CN109816738A (en) A kind of striped boundary extraction algorithm based on coded structured light
CN112991517B (en) Three-dimensional reconstruction method for texture image coding and decoding automatic matching
CN114494169A (en) Industrial flexible object detection method based on machine vision
CN204329903U (en) Hand-held laser three-dimensional scanning equipment
CN106937105A (en) The 3D rendering method for building up of three-dimensional scanner and target object based on structure light
CN102682293A (en) Method and system for identifying salient-point mould number of revolution-solid glass bottle based on images
CN113486878B (en) Graphical interactive machine vision system
CN115641577A (en) Fruit identification method based on improved YOLOv5 network
CN113237895B (en) Metal surface defect detection system based on machine vision
Germain et al. Non destructive counting of wheatear with picture analysis
CN208383079U (en) A kind of laser strobe difference three-dimension sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant