CN109064474B - Method for automatically acquiring mask diagram by interactive classroom teaching system - Google Patents

Method for automatically acquiring mask diagram by interactive classroom teaching system

Info

Publication number
CN109064474B
Authority
CN
China
Prior art keywords
pixel, image, value, difference value, obtaining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810852828.8A
Other languages
Chinese (zh)
Other versions
CN109064474A (en)
Inventor
汪俊锋
邓宏平
高祥
戴平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Anhui Huishi Jintong Technology Co., Ltd.
Original Assignee
Anhui Huishi Jintong Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Anhui Huishi Jintong Technology Co., Ltd.
Priority to CN201810852828.8A
Publication of CN109064474A
Application granted
Publication of CN109064474B
Legal status: Active

Classifications

    • G06T7/11 Image analysis; Segmentation; Edge detection; Region-based segmentation
    • G06T7/187 Image analysis; Segmentation; Edge detection; involving region growing, region merging or connected component labelling
    • G06T2207/20221 Indexing scheme for image analysis; Special algorithmic details; Image combination; Image fusion; Image merging
    • G06T2207/20224 Indexing scheme for image analysis; Special algorithmic details; Image combination; Image subtraction

Abstract

The invention discloses a method by which an interactive classroom teaching system automatically acquires a mask image, and relates to the technical field of mask image acquisition. The method comprises collecting samples, fusing the outer frame images, and extracting the mask image. Eight stripe patterns in different directions are drawn and projected onto a wall, and a camera captures them to obtain the corresponding outer frame images in different directions. The outer frame images are processed and then fused into a single fused image, which effectively avoids interference from other software: if another application suddenly pops up while one outer frame image is being captured, the remaining outer frame images fill in the region that the pop-up occluded. The problem of incomplete mask images caused by sudden pop-ups is thereby effectively solved, and both the completeness of the mask image and the efficiency of acquisition are improved.

Description

Method for automatically acquiring mask diagram by interactive classroom teaching system
Technical Field
The invention belongs to the technical field of mask image acquisition, and in particular relates to a method by which an interactive classroom teaching system automatically acquires a mask image.
Background
An interactive classroom teaching system integrates a computer, a projector, a camera, an infrared emitter and similar devices to realize an interactive function. "Interaction" here means that touching the projection surface with a finger or an infrared laser pen produces the same effect as operating the computer screen directly.
When interactive projection is used, automatic calibration and manual calibration are needed to calculate the mapping between the projection surface and the computer screen, so that operations on the projection surface can simulate mouse events and realize the interactive function. Before automatic and manual calibration, a mask image is required to determine the range that can be operated when using interactive projection; outside this range the interactive projection cannot be used. The mask image is obtained by projecting an outer frame image with the projector, capturing it with the camera, and applying a series of processing steps; the white area of the resulting image is the control range available to the user of the interactive teaching system. The existing method is to draw a single outer frame image, project it onto a wall, capture it with the camera, and obtain the mask image after image processing. If, while the software is capturing the projected outer frame image for calibration, some other software suddenly pops up, for example 360 Security Guard or an input method window, and occludes the outer frame image, the captured outer frame image has a large gap. Subsequent image processing then yields an incomplete mask image, and the software cannot be used normally.
The invention aims to provide a method by which an interactive classroom teaching system automatically acquires a mask image, so as to solve the problem of incomplete mask images caused by interfaces that pop up unexpectedly during the existing mask acquisition process.
Disclosure of Invention
The invention aims to provide a method by which an interactive classroom teaching system automatically acquires a mask image: eight stripe patterns in different directions are drawn and projected onto a wall, and a camera captures the corresponding outer frame images in different directions; the outer frame images are processed and then fused into a fused image, which solves the problem that the mask image is incomplete because an interface pops up suddenly during the existing mask acquisition process.
In order to solve the technical problems, the invention is realized by the following technical scheme:
The invention discloses a method by which an interactive classroom teaching system automatically acquires a mask image, comprising: collecting samples, fusing the outer frame images, and extracting the mask image; wherein:
Collecting samples specifically comprises:
A000: drawing a pure black full-screen image and eight stripe patterns in different directions on pure black backgrounds;
A001: capturing the pure black full-screen image with a camera to obtain a background image;
A002: capturing the eight stripe patterns with a camera to obtain eight outer frame images;
wherein the eight outer frame images comprise a four-sided outer frame image and outer frame images at 0, 15, 30, 45, 60, 75 and 90 degrees;
Fusing the outer frame images specifically comprises pre-fusion processing of each outer frame image and fusion processing of the eight outer frame images;
The pre-fusion processing of an outer frame image comprises the following steps:
B000: subtracting the background image pixel values from the outer frame image pixel values to obtain a difference image;
B001: traversing the difference image and applying eight-direction gradient processing to each pixel in turn to obtain a gradient image;
B002: binarizing the gradient image to obtain a binary image;
The fusion processing of the eight outer frame images comprises the following steps:
C000: fusing the binary image of the four-sided outer frame image with the background image by a pixel-wise merge (logical OR) operation to obtain the pre-fused image of the four-sided outer frame image;
C001: applying the C000 processing to the binary images of the remaining seven outer frame images to obtain the corresponding pre-fused images;
C002: fusing the pre-fused image of the four-sided outer frame image with the pre-fused images of the remaining seven outer frame images by a pixel-wise merge (logical OR) operation to obtain the fused image;
wherein extracting the mask image comprises the following steps:
acquiring the maximum connected domain maxContour from the set of connected domains of the fused image, and writing the pixels of the maximum connected domain maxContour onto the background image to obtain the mask image.
Preferably, each stripe pattern is a white stripe 40 pixels wide drawn on a pure black background, and the eight stripe patterns comprise a four-sided stripe pattern and stripe patterns at 0, 15, 30, 45, 60, 75 and 90 degrees.
Preferably, traversing the difference image in B001 specifically comprises the following:
confirming the traversal range and the traversal order of the difference image;
the traversal range is a rectangular frame on the difference image, each of whose four sides lies at a distance step from the corresponding side of the difference image;
the traversal order is row by row, starting from point (0, 0).
Preferably, the eight-direction gradient processing of a pixel proceeds as follows:
T000: confirm the current pixel CurrPt(x, y) and the gradient distance step, where x ≥ step and y ≥ step;
T001: take the difference between the pixel CurrPt(x, y) and the upper-left pixel LeftTopPt(x-step, y-step) to obtain the upper-left difference leftTopDiff; take the difference between the pixel CurrPt(x, y) and the lower-right pixel RightBottomPt(x+step, y+step) to obtain the lower-right difference rightBottomDiff; take the minimum of leftTopDiff and rightBottomDiff and assign it to edgeVal;
T002: take the difference between the pixel CurrPt(x, y) and the upper pixel TopPt(x, y-step) to obtain the upper difference topDiff; take the difference between the pixel CurrPt(x, y) and the lower pixel BottomPt(x, y+step) to obtain the lower difference bottomDiff; take the minimum of topDiff and bottomDiff and assign it to tempVal1;
T003: take the difference between the pixel CurrPt(x, y) and the upper-right pixel RightTopPt(x+step, y-step) to obtain the upper-right difference rightTopDiff; take the difference between the pixel CurrPt(x, y) and the lower-left pixel LeftBottomPt(x-step, y+step) to obtain the lower-left difference leftBottomDiff; take the minimum of rightTopDiff and leftBottomDiff and assign it to tempVal2;
T004: take the difference between the pixel CurrPt(x, y) and the left pixel LeftPt(x-step, y) to obtain the left difference leftDiff; take the difference between the pixel CurrPt(x, y) and the right pixel RightPt(x+step, y) to obtain the right difference rightDiff; take the minimum of leftDiff and rightDiff and assign it to tempVal3;
T005: take the maximum of tempVal1, tempVal2, tempVal3 and edgeVal and assign it to edgeVal;
T006: assign edgeVal to the gray value of the pixel CurrPt(x, y).
Preferably, the binarization processing comprises the following steps:
traversing the two-dimensional data of the gradient image and subtracting the gray value of the background image from the gray value of the gradient image to obtain a difference; if the difference is greater than a threshold, the gray value at the current row and column of the gradient image is set to 255, otherwise it is set to 0; the threshold ranges from 20 to 50 gray levels.
Preferably, a connected domain is the outline of an image region formed by positionally adjacent foreground pixels having the same pixel value.
The invention has the following beneficial effects:
eight stripe patterns in different directions are drawn and projected onto a wall, and a camera captures them to obtain the corresponding outer frame images in different directions; the outer frame images are processed and then fused into a single fused image, which effectively avoids interference from other software: if another application suddenly pops up while one outer frame image is being captured, the remaining outer frame images fill in the region that the pop-up occluded. The problem of incomplete mask images caused by sudden pop-ups is thereby effectively solved, and both the completeness of the mask image and the efficiency of acquisition are improved.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention, and those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flowchart of collecting samples in the present invention;
FIG. 2 is a flowchart of the pre-fusion processing of an outer frame image in the present invention;
FIG. 3 is a flowchart of the fusion processing of the eight outer frame images in the present invention;
FIG. 4 is the background image in the present invention;
FIG. 5 is the four-sided stripe pattern in the present invention;
FIG. 6 is the 60 degree stripe pattern in the present invention;
FIG. 7 is the 75 degree stripe pattern in the present invention;
FIG. 8 is the 45 degree stripe pattern in the present invention;
FIG. 9 is the 90 degree stripe pattern in the present invention;
FIG. 10 is the 0 degree stripe pattern in the present invention;
FIG. 11 is the 30 degree stripe pattern in the present invention;
FIG. 12 is the 15 degree stripe pattern in the present invention;
FIG. 13 is the mask image in the present invention;
FIG. 14 is a schematic diagram of a pixel and its gradient points in the present invention;
FIG. 15 is a schematic diagram of the traversal range in the present invention;
FIG. 16 is a flowchart of the eight-direction gradient processing of a pixel in the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings of the embodiments. The described embodiments are obviously only a part of the embodiments of the present invention, not all of them; all other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
The invention discloses a method by which an interactive classroom teaching system automatically acquires a mask image, comprising: collecting samples, fusing the outer frame images, and extracting the mask image; wherein:
referring to fig. 1 and 4, the acquiring the sample specifically includes:
a000: drawing a pure black full screen image and eight pure black stripe images with different directions of a background;
a001: capturing a pure black full screen image through a camera to obtain a background image;
a002: capturing eight fringe patterns through a camera to obtain eight external block diagrams;
referring to fig. 5-12, eight outer frame diagrams include: a four-edge out-of-range block diagram, a 0-degree out-of-range block diagram, a 90-degree out-of-range block diagram, a 30-degree out-of-range block diagram, a 60-degree out-of-range block diagram, a 15-degree out-of-range block diagram, a 75-degree out-of-range block diagram, and a 45-degree out-of-range block diagram;
Fusing the outer frame images specifically comprises pre-fusion processing of each outer frame image and fusion processing of the eight outer frame images;
Referring to FIG. 2, the pre-fusion processing of an outer frame image comprises the following steps:
B000: subtracting the background image pixel values from the outer frame image pixel values to obtain a difference image;
B001: traversing the difference image and applying eight-direction gradient processing to each pixel in turn to obtain a gradient image;
B002: binarizing the gradient image to obtain a binary image;
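A minimal sketch of the B000 subtraction, assuming grayscale processing and illustrative file names for the two camera grabs (in practice the frames come straight from the capture device):

    import cv2

    frame_capture = cv2.imread("outer_frame.png")      # one of the eight captures
    background_capture = cv2.imread("background.png")  # capture of the black screen

    # B000: difference image = outer frame capture minus background capture.
    # cv2.subtract saturates at 0, so pixels darker than the background clamp to black.
    frame_gray = cv2.cvtColor(frame_capture, cv2.COLOR_BGR2GRAY)
    bg_gray = cv2.cvtColor(background_capture, cv2.COLOR_BGR2GRAY)
    diff = cv2.subtract(frame_gray, bg_gray)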
referring to fig. 3, the fusion process of the eight outer frame images includes the following steps:
c000: fusing the binary image and the background image of the four-outside block diagram through the function processing of phase combination to obtain a pre-fused image of the four-outside block diagram;
c001: c000 processing is carried out on the binarization images of the remaining seven outer block diagrams to obtain corresponding pre-fusion images;
c002: fusing the pre-fused graph of the four outer frame diagrams and the pre-fused graphs of the remaining seven outer frame diagrams through a phase and phase function to obtain a fused graph;
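The translated "merge" function is read here as a pixel-wise logical OR, one natural implementation: OR-ing the eight pre-fused images means that stripes occluded in any single capture are supplied by the others. A sketch under that assumption, with binary_images and bg_gray carried over from the previous steps:

    import cv2

    def fuse(binary_images, bg_gray):
        # C000/C001: overlay each binary outer frame image on the background.
        prefused = [cv2.bitwise_or(b, bg_gray) for b in binary_images]
        # C002: union of the eight pre-fused images; a gap occluded in one
        # capture is filled by the corresponding stripes of the others.
        fused = prefused[0]
        for p in prefused[1:]:
            fused = cv2.bitwise_or(fused, p)
        return fused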
referring to fig. 13, extracting the mask map includes the following steps:
acquiring a maximum connected domain maxContour in the connected domain set of the fusion graph; and assigning the image pixel of the maximum connected domain maxContour to a background image and obtaining a mask image.
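A sketch of the extraction step, assuming OpenCV 4's findContours signature; the Otsu threshold is an added assumption of this sketch, used to make the fused image strictly binary before the contour search:

    import cv2
    import numpy as np

    def extract_mask(fused):
        _, binary = cv2.threshold(fused, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        max_contour = max(contours, key=cv2.contourArea)   # maxContour
        # Fill the largest connected domain white on a black canvas of the
        # same size: the white area is the operable range of the mask image.
        mask = np.zeros_like(binary)
        cv2.drawContours(mask, [max_contour], -1, 255, thickness=cv2.FILLED)
        return mask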
Each stripe pattern is a white stripe 40 pixels wide drawn on a pure black background; the eight stripe patterns comprise a four-sided stripe pattern and stripe patterns at 0, 15, 30, 45, 60, 75 and 90 degrees.
Traversing the difference image in B001 specifically comprises the following:
confirming the traversal range and the traversal order of the difference image;
the traversal range is a rectangular frame on the difference image, each of whose four sides lies at a distance step from the corresponding side of the difference image;
the traversal order is row by row, starting from point (0, 0).
Referring to FIGS. 14-16, the eight-direction gradient processing of a pixel proceeds as follows:
T000: confirm the current pixel CurrPt(x, y) and the gradient distance step, where x ≥ step and y ≥ step;
T001: take the difference between the pixel CurrPt(x, y) and the upper-left pixel LeftTopPt(x-step, y-step) to obtain the upper-left difference leftTopDiff; take the difference between the pixel CurrPt(x, y) and the lower-right pixel RightBottomPt(x+step, y+step) to obtain the lower-right difference rightBottomDiff; take the minimum of leftTopDiff and rightBottomDiff and assign it to edgeVal;
T002: take the difference between the pixel CurrPt(x, y) and the upper pixel TopPt(x, y-step) to obtain the upper difference topDiff; take the difference between the pixel CurrPt(x, y) and the lower pixel BottomPt(x, y+step) to obtain the lower difference bottomDiff; take the minimum of topDiff and bottomDiff and assign it to tempVal1;
T003: take the difference between the pixel CurrPt(x, y) and the upper-right pixel RightTopPt(x+step, y-step) to obtain the upper-right difference rightTopDiff; take the difference between the pixel CurrPt(x, y) and the lower-left pixel LeftBottomPt(x-step, y+step) to obtain the lower-left difference leftBottomDiff; take the minimum of rightTopDiff and leftBottomDiff and assign it to tempVal2;
T004: take the difference between the pixel CurrPt(x, y) and the left pixel LeftPt(x-step, y) to obtain the left difference leftDiff; take the difference between the pixel CurrPt(x, y) and the right pixel RightPt(x+step, y) to obtain the right difference rightDiff; take the minimum of leftDiff and rightDiff and assign it to tempVal3;
T005: take the maximum of tempVal1, tempVal2, tempVal3 and edgeVal and assign it to edgeVal;
T006: assign edgeVal to the gray value of the pixel CurrPt(x, y).
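A direct transcription of T000-T006 as a sketch, assuming that "taking the difference" means current pixel minus neighbour, so a bright stripe pixel scores high only when it exceeds both opposite neighbours of at least one direction:

    import numpy as np

    def eight_direction_gradient(diff, step):
        h, w = diff.shape
        src = diff.astype(np.int16)   # keep negative differences
        out = np.zeros_like(diff)
        pairs = (((-step, -step), (step, step)),   # T001: upper-left / lower-right
                 ((0, -step), (0, step)),          # T002: up / down
                 ((step, -step), (-step, step)),   # T003: upper-right / lower-left
                 ((-step, 0), (step, 0)))          # T004: left / right
        for y in range(step, h - step):            # traversal range inset by step
            for x in range(step, w - step):
                c = src[y, x]
                edge_val = max(                    # T005: max over the four pairs
                    min(c - src[y + dy1, x + dx1], c - src[y + dy2, x + dx2])
                    for (dx1, dy1), (dx2, dy2) in pairs
                )
                out[y, x] = np.clip(edge_val, 0, 255)   # T006: write back as gray value
        return out

The pure-Python double loop mirrors the stepwise description; a production version would vectorize the four shifted subtractions with NumPy slicing.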
The binarization processing comprises the following steps:
traversing the two-dimensional data of the gradient image and subtracting the gray value of the background image from the gray value of the gradient image to obtain a difference; if the difference is greater than a threshold, the gray value at the current row and column of the gradient image is set to 255, otherwise it is set to 0; the threshold ranges from 20 to 50 gray levels.
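A sketch of the binarization, with the threshold fixed at 30 purely as an example from the stated 20-50 range:

    import numpy as np

    THRESHOLD = 30   # any value in the stated 20-50 range

    def binarize(grad, bg_gray, threshold=THRESHOLD):
        # 255 where the gradient image exceeds the background gray value by
        # more than the threshold, 0 elsewhere.
        d = grad.astype(np.int16) - bg_gray.astype(np.int16)
        return np.where(d > threshold, 255, 0).astype(np.uint8)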
A connected domain is the outline of an image region formed by positionally adjacent foreground pixels having the same pixel value.
It should be noted that the units included in the above embodiment are divided only according to functional logic, but the division is not limited thereto as long as the corresponding functions can be implemented; in addition, the specific names of the functional units are only for convenience of distinguishing them from each other and do not limit the protection scope of the present invention.
In addition, those skilled in the art will understand that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing the relevant hardware.
The preferred embodiments of the invention disclosed above are intended to be illustrative only; they are not exhaustive and do not limit the invention to the precise forms disclosed. Many modifications and variations are obviously possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to make best use of the invention. The invention is limited only by the claims and their full scope and equivalents.

Claims (6)

1. A method by which an interactive classroom teaching system automatically acquires a mask image, characterized by comprising: collecting samples, fusing outer frame images, and extracting a mask image; wherein:
Collecting samples specifically comprises:
A000: drawing a pure black full-screen image and eight stripe patterns in different directions on pure black backgrounds;
A001: capturing the pure black full-screen image with a camera to obtain a background image;
A002: capturing the eight stripe patterns with a camera to obtain eight outer frame images;
wherein the eight outer frame images comprise a four-sided outer frame image and outer frame images at 0, 15, 30, 45, 60, 75 and 90 degrees;
Fusing the outer frame images specifically comprises pre-fusion processing of each outer frame image and fusion processing of the eight outer frame images;
The pre-fusion processing of an outer frame image comprises the following steps:
B000: subtracting the background image pixel values from the outer frame image pixel values to obtain a difference image;
B001: traversing the difference image and applying eight-direction gradient processing to each pixel in turn to obtain a gradient image;
B002: binarizing the gradient image to obtain a binary image;
The fusion processing of the eight outer frame images comprises the following steps:
C000: fusing the binary image of the four-sided outer frame image with the background image by a pixel-wise merge (logical OR) operation to obtain the pre-fused image of the four-sided outer frame image;
C001: applying the C000 processing to the binary images of the remaining seven outer frame images to obtain the corresponding pre-fused images;
C002: fusing the pre-fused image of the four-sided outer frame image with the pre-fused images of the remaining seven outer frame images by a pixel-wise merge (logical OR) operation to obtain the fused image;
wherein extracting the mask image comprises the following steps:
acquiring the maximum connected domain maxContour from the set of connected domains of the fused image, and writing the pixels of the maximum connected domain maxContour onto the background image to obtain the mask image.
2. The method by which an interactive classroom teaching system automatically acquires a mask image as claimed in claim 1, wherein each stripe pattern is a white stripe 40 pixels wide drawn on a pure black background, and the eight stripe patterns comprise a four-sided stripe pattern and stripe patterns at 0, 15, 30, 45, 60, 75 and 90 degrees.
3. The method of claim 1, wherein traversing the difference image in B001 specifically comprises:
confirming the traversal range and the traversal order of the difference image;
the traversal range is a rectangular frame on the difference image, each of whose four sides lies at a distance step from the corresponding side of the difference image;
the traversal order is row by row, starting from point (0, 0).
4. The method by which an interactive classroom teaching system automatically acquires a mask image as claimed in claim 1, wherein the eight-direction gradient processing of a pixel proceeds as follows:
T000: confirm the current pixel CurrPt(x, y) and the gradient distance step, where x ≥ step and y ≥ step;
T001: take the difference between the pixel CurrPt(x, y) and the upper-left pixel LeftTopPt(x-step, y-step) to obtain the upper-left difference leftTopDiff; take the difference between the pixel CurrPt(x, y) and the lower-right pixel RightBottomPt(x+step, y+step) to obtain the lower-right difference rightBottomDiff; take the minimum of leftTopDiff and rightBottomDiff and assign it to edgeVal;
T002: take the difference between the pixel CurrPt(x, y) and the upper pixel TopPt(x, y-step) to obtain the upper difference topDiff; take the difference between the pixel CurrPt(x, y) and the lower pixel BottomPt(x, y+step) to obtain the lower difference bottomDiff; take the minimum of topDiff and bottomDiff and assign it to tempVal1;
T003: take the difference between the pixel CurrPt(x, y) and the upper-right pixel RightTopPt(x+step, y-step) to obtain the upper-right difference rightTopDiff; take the difference between the pixel CurrPt(x, y) and the lower-left pixel LeftBottomPt(x-step, y+step) to obtain the lower-left difference leftBottomDiff; take the minimum of rightTopDiff and leftBottomDiff and assign it to tempVal2;
T004: take the difference between the pixel CurrPt(x, y) and the left pixel LeftPt(x-step, y) to obtain the left difference leftDiff; take the difference between the pixel CurrPt(x, y) and the right pixel RightPt(x+step, y) to obtain the right difference rightDiff; take the minimum of leftDiff and rightDiff and assign it to tempVal3;
T005: take the maximum of tempVal1, tempVal2, tempVal3 and edgeVal and assign it to edgeVal;
T006: assign edgeVal to the gray value of the pixel CurrPt(x, y).
5. The method by which an interactive classroom teaching system automatically acquires a mask image as claimed in claim 1, wherein the binarization processing comprises the following steps:
traversing the two-dimensional data of the gradient image and subtracting the gray value of the background image from the gray value of the gradient image to obtain a difference; if the difference is greater than a threshold, the gray value at the current row and column of the gradient image is set to 255, otherwise it is set to 0; the threshold ranges from 20 to 50 gray levels.
6. The method as claimed in claim 1, wherein a connected domain is the outline of an image region formed by positionally adjacent foreground pixels having the same pixel value.
CN201810852828.8A (priority date 2018-07-30, filed 2018-07-30): Method for automatically acquiring mask diagram by interactive classroom teaching system. Status: Active. Granted as CN109064474B (en).

Priority Applications (1)

Application Number: CN201810852828.8A (granted as CN109064474B)
Priority Date: 2018-07-30
Filing Date: 2018-07-30
Title: Method for automatically acquiring mask diagram by interactive classroom teaching system


Publications (2)

Publication Number / Publication Date
CN109064474A (en): 2018-12-21
CN109064474B (grant): 2022-01-04

Family

ID=64831766

Family Applications (1)

Application Number: CN201810852828.8A (Active; granted as CN109064474B)
Priority Date: 2018-07-30
Filing Date: 2018-07-30
Title: Method for automatically acquiring mask diagram by interactive classroom teaching system

Country Status (1)

Country: CN
Publication: CN109064474B (en)




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 230000 Yafu Park, Juchao Economic Development Zone, Chaohu City, Hefei City, Anhui Province

Applicant after: ANHUI HUISHI JINTONG TECHNOLOGY Co.,Ltd.

Address before: 230000 Room 102, 1st Floor, C District, Science Park, Hefei National University, 602 Huangshan Road, Hefei High-tech Zone, Anhui Province

Applicant before: ANHUI HUISHI JINTONG TECHNOLOGY Co.,Ltd.

GR01 Patent grant