CN114356189A - Editing method and device for panel image to be detected, electronic equipment and storage medium - Google Patents

Editing method and device for panel image to be detected, electronic equipment and storage medium

Info

Publication number
CN114356189A
CN114356189A
Authority
CN
China
Prior art keywords
detection frame, detection frames, image
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111544959.8A
Other languages
Chinese (zh)
Inventor
匡梦良
张鑫
朱小明
殷亚男
许超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Mega Technology Co Ltd
Original Assignee
Suzhou Mega Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Mega Technology Co Ltd filed Critical Suzhou Mega Technology Co Ltd
Priority to CN202111544959.8A
Publication of CN114356189A
Legal status: Pending

Abstract

An embodiment of the invention provides a method and an apparatus for editing an image of a panel to be detected, an electronic device, and a storage medium. The method comprises the following steps: displaying a user interface, wherein the user interface is used for displaying an image of the panel to be detected and detection frames of electrodes of the panel to be detected; sorting at least a part of the detection frames according to the positions of the detection frames in the image; and grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group. This effectively improves the efficiency and speed of panel detection, saves the user's time and effort, and provides a better user experience.

Description

Editing method and device for panel image to be detected, electronic equipment and storage medium
Technical Field
The present invention relates to the field of panel detection, and more particularly, to a method and an apparatus for editing an image of a panel to be detected, an electronic device, and a storage medium.
Background
Chip On Glass (COG) is a technology in which a driving circuit chip is bonded directly onto a glass substrate; it is widely applied in display products such as liquid crystal and electroluminescent displays. In the COG process, the conductive pins of the driving circuit are aligned with the electrodes (bumps) on the glass substrate, an Anisotropic Conductive Film (ACF) is used as the bonding dielectric material, and the conductive pins of the driving circuit are connected and conducted with the electrodes on the glass substrate under high temperature and pressure applied for a certain period of time. Similarly, FPC On Glass (FOG) is a technology in which a flexible printed circuit (FPC) board is bonded directly onto a glass substrate, with a process similar to COG. Likewise, Chip On Film (COF) is a technology in which a semiconductor chip is first packaged on a flexible substrate and the packaged product is then bonded onto a glass substrate; its manufacturing process is also similar to COG. Panel detection technology may be used to inspect the quality of the connection and conduction between the conductive pins of the driving circuit and the electrodes on the glass substrate, so that the quality of the panel can be judged against certain criteria.
At present, the operations involved in panel detection are generally performed per electrode of the panel. For example, detection parameters are set for one electrode at a time. This seriously affects the efficiency of panel detection and greatly reduces its speed. In particular, a panel often has a large number of electrodes, so the exact same operation may need to be repeated many times, which degrades the user experience.
Disclosure of Invention
The present invention has been made in view of the above problems. According to an aspect of the present invention, there is provided a method for editing an image of a panel to be detected, comprising: displaying a user interface, wherein the user interface is used for displaying an image of the panel to be detected and detection frames of electrodes of the panel to be detected; sorting at least a part of the detection frames according to the positions of the detection frames in the image; and grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group.
Illustratively, sorting at least a part of the detection frames according to their positions in the image comprises: determining the detection frames of each row according to the positions of the at least a part of the detection frames in the image, wherein the longitudinal distance between adjacent detection frames within each row is smaller than a first preset value; and continuously numbering all of the at least a part of the detection frames according to their positions in the image, wherein, within each row, the further left a detection frame is, the earlier its sequence number, and the number of the leftmost detection frame in the next adjacent row is consecutive with, and follows, the number of the rightmost detection frame in the current row.
Illustratively, grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group comprises: dividing detection frames belonging to the same row into the same detection frame group according to the sorting result, and dividing detection frames of different rows into different detection frame groups.
Illustratively, grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group comprises: judging whether the lateral distance between two adjacent detection frames in each row of the at least a part of the detection frames is smaller than a second preset value, and if so, dividing them into the same detection frame group, otherwise into different detection frame groups; and dividing detection frames of different rows in the at least a part of the detection frames into different detection frame groups.
Illustratively, the user interface is further used for displaying a second operable control for setting, in response to a user operation, the minimum sequence number and the maximum sequence number of the detection frames in at least one of the detection frame groups; and grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group comprises: dividing the detection frames whose sequence numbers are greater than or equal to the minimum sequence number and less than or equal to the maximum sequence number into the corresponding detection frame group.
Illustratively, the user interface is further used for displaying a third operable control for initiating, in response to a user operation, a selection operation on the detection frames; the method further comprises: determining the at least a part of the detection frames in response to the user's selection of them.
Illustratively, the user interface is further used for displaying a fourth operable control, and the method further comprises: switching the currently activated detection frame group in response to the user's operation of the fourth operable control.
Illustratively, the user interface is further used for displaying a fifth operable control, and grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group comprises: identifying the sorted detection frames with a uniform group identifier; the method further comprises: deleting the group identifier of the detection frames in a detection frame group in response to the user's operation of the fifth operable control, so as to delete that detection frame group.
Illustratively, the method further comprises: performing unified parameter setting on the detection frames in a detection frame group in response to a parameter setting operation of the user.
According to another aspect of the present invention, there is also provided an apparatus for editing an image of a panel to be detected, the apparatus comprising: a display module for displaying a user interface, the user interface being used for displaying an image of the panel to be detected and detection frames of electrodes of the panel to be detected; a sorting module for sorting at least a part of the detection frames according to their positions in the image; and a grouping module for grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group.
According to yet another aspect of the present invention, there is also provided an electronic device comprising a display, a processor and a memory, wherein the display is used for displaying a user interface, and the memory stores computer program instructions which, when executed by the processor, perform the method for editing an image of a panel to be detected as described above.
According to a further aspect of the present invention, there is also provided a storage medium on which program instructions are stored, the program instructions, when run, executing the method of editing an image of a panel to be detected as described above.
In the technical solution of the present application, the detection frames of some or all of the electrodes of the panel to be detected are divided into groups, so that uniform subsequent operations can be performed on each detection frame group obtained after grouping. This effectively improves the efficiency and speed of panel detection, saves the user's time and effort, and provides a better user experience.
The foregoing is only an overview of the technical solutions of the present invention. In order that the technical means of the present invention may be understood more clearly, and that the above and other objects, features, and advantages of the present invention may become more readily apparent, embodiments of the invention are described below.
Drawings
The above and other objects, features and advantages of the present invention will become more apparent by describing in more detail embodiments of the present invention with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 shows a schematic flow diagram of a method of editing an image of a panel to be inspected according to an embodiment of the invention;
FIG. 2 shows a schematic diagram of a user interface according to one embodiment of the invention;
FIG. 3 shows a schematic flow diagram of ordering at least a portion of the detection frames according to their position in the image according to one embodiment of the invention;
FIG. 4 shows a schematic diagram of a user interface according to another embodiment of the invention;
FIG. 5 shows a schematic flow diagram of grouping at least a portion of the detection frames according to the sorting result to generate a detection frame group, according to one embodiment of the invention;
fig. 6 shows a schematic block diagram of an editing apparatus of an image of a panel to be detected according to an embodiment of the present invention; and
FIG. 7 shows a schematic block diagram of an electronic device according to an embodiment of the invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, exemplary embodiments according to the present invention will be described in detail below with reference to the accompanying drawings. It should be understood that the described embodiments are merely some, and not all, of the embodiments of the invention, and that the invention is not limited to the example embodiments described herein. All other embodiments obtained by a person skilled in the art from the embodiments of the invention described herein without inventive effort shall fall within the scope of protection of the invention.
Fig. 1 shows a schematic flow diagram of a method 100 for editing an image of a panel to be detected, according to an embodiment of the invention. As shown in fig. 1, the method 100 may include the following steps.
Step S110: displaying a user interface, wherein the user interface is used for displaying an image of the panel to be detected and detection frames of electrodes of the panel to be detected.
It will be appreciated that a user interface may be provided to the user for human-computer interaction, in order to facilitate user operation.
FIG. 2 shows a schematic diagram of a user interface according to one embodiment of the invention. As shown in fig. 2, the user interface may include an image display area 210 for displaying an image of the panel to be detected. The image may be an original image acquired by an image acquisition device, such as a camera, in the panel detection system, or an image obtained by preprocessing the original image. The preprocessing may include any operation that makes panel detection clearer; for example, it may include a denoising operation such as filtering. The image may contain all or some of the electrodes of the panel to be detected. It will be appreciated that the image of the panel to be detected is generally rectangular. The image may include detection frames for the electrodes, each detection frame corresponding to one electrode. Ideally, the edge of a detection frame coincides exactly with the edge of its corresponding electrode. Illustratively, the detection frame of an electrode may be drawn along the electrode boundary in response to a user operation, or drawn automatically based on the gray scale of the image.
Step S120: sorting at least a part of the detection frames according to the positions of the detection frames in the image.
At least a part of the detection frames may be sorted according to their positions in the image. These detection frames can be considered to be arranged approximately in an array. Illustratively, the row in which each detection frame lies can be determined from its position in the image, and the detection frames can then be ordered with row priority and from left to right within the same row.
The aforementioned at least a part of the detection frames may be determined in response to a selection operation by the user. Illustratively, the user interface is further configured to display a third operable control for initiating the selection operation in response to a user operation. For example, the user may click the third operable control with an input device such as a mouse to initiate the selection operation, and then select the detection frames to be sorted on the image of the panel to be detected using an input device such as a mouse or a touch screen, thereby determining the at least a part of the detection frames. In this way, the user can select different detection frames according to actual requirements, which meets the needs of different users and improves the user experience. If the user does not select any detection frame, the detection frames of all the electrodes of the panel to be detected may be sorted by default.
It is to be understood that the above-mentioned manner of sorting the detection frames is only exemplary, and does not constitute a limitation on the sorting manner.
Step S130: grouping at least a part of the detection frames according to the sorting result to generate a detection frame group.
The sorting result can be represented by the numbering sequence described above. For example, the sorted detection frames may be grouped according to the numbering sequence so that detection frames located close to each other fall into one group, which facilitates subsequent uniform processing of the detection frames within a detection frame group. For example, the at least a part of the detection frames may be divided equally into n groups according to the numbering sequence, and the generated n groups are collectively referred to as detection frame groups.
In the technical solution of the present application, the detection frames of some or all of the electrodes of the panel to be detected are divided into groups, so that uniform subsequent operations can be performed on each detection frame group obtained after grouping. This effectively improves the efficiency and speed of panel detection, saves the user's time and effort, and provides a better user experience.
Fig. 3 shows a schematic flow chart of step S120 of sorting at least a part of the detection frames according to their positions in the image, according to one embodiment of the present invention. As shown in fig. 3, step S120 may include the following steps.
Step S121: determining the detection frames of each row according to the positions of the at least a part of the detection frames in the image, wherein the longitudinal distance between adjacent detection frames within each row is smaller than a first preset value.
It should be noted that a "row" in this embodiment is a row in the broad sense: it covers both the case where the upper and lower boundaries of all detection frames in the row are flush with one another and the case where they are not completely flush. In the latter case, the detection frames can still be considered to be in the same row as long as the longitudinal distance between adjacent detection frames is smaller than the first preset value.
For example, a rectangular coordinate system may be established with the upper left vertex of the image of the panel to be detected as the origin, and with the two boundaries of the image that meet at that vertex as the x and y axes, respectively.
The position of each detection frame can be expressed by the position coordinate of its upper left vertex. The abscissa values of the upper left vertices of the detection frames to be sorted are compared with one another. Optionally, the detection frame with the smallest abscissa may be taken as the first detection frame, and all detection frames to be sorted are then ordered from the first detection frame in increasing order of abscissa, up to the detection frame with the largest abscissa.
For every two adjacent detection frames, the ordinate values of their upper left vertices are compared to obtain an ordinate difference, which represents the longitudinal distance between the adjacent detection frames. When this difference is smaller than the first preset value, the two adjacent detection frames are regarded as being in the same row; otherwise, the detection frame with the larger ordinate is determined to be in the next row. The detection frames of the first row are thus determined. The first preset value can be set in response to a user operation, or can be set at the factory to suit the image editing requirements of most panels to be detected.
After the detection frames of the first row have been determined, the above operations may be repeated for the remaining detection frames, from which the detection frames of each row can be determined, as in the sketch below.
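For illustration only, this row-splitting step can be sketched in a few lines of Python. The sketch assumes each detection frame is represented by an (x, y, w, h) tuple, where (x, y) is the position coordinate of the upper left vertex and w and h are the width and height; this representation and all names are assumptions made for the sketch, not part of the claimed method.

```python
# A minimal sketch of step S121, assuming each detection frame is an
# (x, y, w, h) tuple with (x, y) the upper left vertex.
def determine_rows(boxes, first_preset_value):
    """Peel off one row at a time: frames are taken in order of abscissa,
    and an adjacent pair whose ordinate difference is smaller than the
    first preset value is treated as lying in the same (broad-sense) row."""
    remaining = sorted(boxes, key=lambda b: b[0])  # order by abscissa
    rows = []
    while remaining:
        row, rest = [remaining[0]], []
        for box in remaining[1:]:
            # Longitudinal distance to the previously accepted frame.
            if abs(box[1] - row[-1][1]) < first_preset_value:
                row.append(box)
            else:
                rest.append(box)  # ordinate gap too large: another row
        rows.append(row)
        remaining = rest
    return rows
```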
Step S122: continuously numbering all of the at least a part of the detection frames according to their positions in the image, wherein, for the detection frames in each row, the further left a detection frame is, the earlier its sequence number; and the number of the leftmost detection frame in the next adjacent row is consecutive with, and follows, the number of the rightmost detection frame in the current row.
Illustratively, two rows of detection frames as shown in fig. 2 are determined according to step S121 above; the upper row is referred to as the first row of detection frames and the lower row as the second row. The two rows are adjacent. Within each row, the detection frames are sorted in increasing order of the abscissa of the upper left vertex: in the same row, the smaller the abscissa, the further left the corresponding detection frame. For the first row, consecutive sequence numbers are assigned to the detection frames in this sorted order, so that the first row is ordered sequentially. The sequence numbers may be consecutive numbers, letters, and so on. As shown in fig. 2, using numbers as sequence numbers, the first row of detection frames carries the consecutive numbers 1-8; that is, the last (rightmost) detection frame of the first row is numbered 8. The second row is then numbered continuously from the last number of the first row, again in increasing order of abscissa: since the last detection frame of the first row is numbered 8, the first (leftmost) detection frame of the second row is numbered 9, and the second row carries the consecutive numbers 9-15, as shown in fig. 2. A sketch of this numbering step follows.
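Continuing the sketch above with the same assumed (x, y, w, h) representation, the continuous row-priority numbering of step S122 could look as follows; the function name is illustrative.

```python
def number_boxes(rows):
    """Step S122 sketch: row-priority, left-to-right continuous numbering.
    Returns (sequence_number, box) pairs; each row's numbering continues
    from the rightmost frame of the row above it."""
    numbered, serial = [], 1
    # Process rows top to bottom (smallest ordinate first).
    for row in sorted(rows, key=lambda r: min(b[1] for b in r)):
        for box in sorted(row, key=lambda b: b[0]):  # left to right
            numbered.append((serial, box))
            serial += 1
    return numbered
```

For the two rows of fig. 2, this would yield the consecutive numbers 1-8 for the first row and 9-15 for the second.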
Illustratively, step S130 may comprise: dividing the detection frames belonging to the same row into the same detection frame group according to the sorting result, and dividing detection frames of different rows into different detection frame groups. With continued reference to fig. 2, and in conjunction with fig. 4, the detection frames numbered 1-8 belong to the first row and the detection frames numbered 9-15 belong to the second row. Thus, the detection frames numbered 1-8 can be divided into one detection frame group, i.e. the group numbered 1 in fig. 4, and the detection frames numbered 9-15 into another, i.e. the group numbered 2 in fig. 4.
In a panel to be detected, detection frames of the same type are typically in the same row, and detection frames of the same type typically have similar function and shape. In the above technical solution, the detection frames of the electrodes are sorted according to a row-priority principle, which guarantees an orderly sorting result while anticipating how the detection frames will be grouped: detection frames of the same type end up in the same group in the subsequent grouping operation, which facilitates subsequent uniform editing and processing.
FIG. 4 shows a schematic diagram of a user interface according to another embodiment of the invention. As shown in fig. 4, the user interface is further configured to display a first operable control 410 for setting the first preset value in response to a user operation. The first operable control 410 may be a text entry box, a spin box, or the like. For example, in fig. 4 the first operable control 410 is a spin box shown as the "split gap" control. The user can adjust the first preset value by clicking the up and down arrows beside the "split gap" control: clicking the up arrow increases the first preset value, and clicking the down arrow decreases it. It is to be understood that setting the first preset value to 100 in this embodiment is merely exemplary and is not a limitation. In fact, the user can set the first preset value reasonably according to the distance between two adjacent rows of detection frames, that is, the distance between the lower boundary of the previous row and the upper boundary of the next row; for example, setting it to 80% of that distance tolerates a certain error.
In this way, the user can set an appropriate first preset value through the first operable control 410 according to the actual situation, so that the detection frames of the electrodes are grouped reasonably; this meets different requirements in different situations and improves the user experience.
Fig. 5 shows a schematic flowchart of step S130 of grouping at least a part of the detection frames according to the sorting result to generate a detection frame group, according to an embodiment of the present invention. As shown, step S130 may include the following steps.
Step S131: judging whether the lateral distance between two adjacent detection frames in each row of the at least a part of the detection frames is smaller than a second preset value; if so, dividing them into the same detection frame group, otherwise into different detection frame groups.
For example, the lateral distance between two adjacent detection frames in the same row may be represented by the difference between the abscissa values of the vertices of their adjacent boundaries. Take the first two detection frames of the first row in fig. 2 as an example: call the left one the first detection frame and the right one the second detection frame, so that the right boundary of the first frame and the left boundary of the second frame are adjacent boundaries. First, the difference between the abscissa of the upper right vertex of the first detection frame and the abscissa of the upper left vertex of the second detection frame is calculated. This difference is then compared with the second preset value: if it is smaller, the two frames are divided into the same detection frame group; otherwise they are divided into different groups. The second preset value can be set according to the user's actual requirements. Alternatively, the lateral distance between two adjacent detection frames can be represented by the distance between their upper left vertices, and whether frames in the same row fall into the same group is determined similarly; the process is not repeated here.
Step S132: dividing the detection frames of different rows in the at least a part of the detection frames into different detection frame groups.
Step S131 groups the detection frames within the same row, while step S132 groups detection frames across rows. Dividing widely separated detection frames into the same group could increase the difficulty of subsequent panel detection; therefore, detection frames in different rows of the at least a part of the detection frames are divided into different detection frame groups.
Illustratively, the detection frames of the first row are grouped according to step S131. Then, the first detection frame of the second row is placed into a new group according to step S132, and the remaining detection frames of the second row are again grouped according to step S131 until the second row is fully grouped. The detection frames of the remaining rows are grouped in the same way as the second row, until all detection frames to be grouped have been grouped; a sketch of the combined procedure follows.
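A minimal sketch of steps S131 and S132 combined, under the same assumed (x, y, w, h) representation; computing the gap as box[0] - (prev[0] + prev[2]) stands in for the adjacent-boundary abscissa difference described above, and all names are illustrative.

```python
def group_by_gap(rows, second_preset_value):
    """Steps S131/S132 sketch: split each row wherever the lateral gap
    between adjacent frames reaches the second preset value; frames of
    different rows never share a group."""
    groups = []
    for row in rows:
        row = sorted(row, key=lambda b: b[0])  # left to right
        current = [row[0]]
        for prev, box in zip(row, row[1:]):
            # Gap between the right boundary of the previous frame (x + w)
            # and the left boundary of the current frame.
            if box[0] - (prev[0] + prev[2]) < second_preset_value:
                current.append(box)
            else:
                groups.append(current)
                current = [box]
        groups.append(current)  # a group never spills into the next row
    return groups
```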
In a panel to be detected, electrodes of the same type are usually arranged together. Based on this regularity, the above technical solution not only automatically divides detection frames of different rows into different detection frame groups, but also automatically groups the detection frames within a row according to the distance between frames. In this way, detection frames of different types fall into different groups, which facilitates subsequent group-wise editing and processing of the detection frames.
As shown in fig. 4, the user interface may also be used to display a second operable control 420. The second operable control 420 is used for setting the minimum sequence number and the maximum sequence number of the detection frames in at least one of the detection frame groups in response to a user operation. The second operable control 420 may include the "bump number grouping" control, the "add bump grouping" control, and the "bump number" control shown in fig. 4.
In step S130, grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group may further include dividing the detection frames whose sequence numbers are greater than or equal to the minimum sequence number and less than or equal to the maximum sequence number into the corresponding detection frame group, according to the minimum and maximum sequence numbers.
Illustratively, the user first checks the checkbox in front of the "bump number grouping" control by clicking it with an input device such as a mouse; fig. 4 shows the checked state. In this embodiment, checking the "bump number grouping" control turns on the manual grouping mode. The user can then adjust the maximum and minimum sequence numbers of a detection frame group by clicking the up and down arrows beside "bump number": the arrows on the left adjust the minimum sequence number (up increases it, down decreases it), and the arrows on the right adjust the maximum sequence number in the same manner. After the maximum and minimum sequence numbers are set, the user clicks the "add bump grouping" control, and the detection frames whose sequence numbers are greater than or equal to the set minimum and less than or equal to the set maximum are divided into the same detection frame group. With this technical solution, the detection frame grouping result 430 shown in fig. 4 can be obtained; a sketch of the underlying range selection follows.
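A sketch of this manual range grouping, reusing the (sequence_number, box) pairs produced by the numbering sketch above; the names are illustrative assumptions.

```python
def group_by_serial_range(numbered_boxes, min_seq, max_seq):
    """Manual grouping sketch: every frame whose sequence number lies in
    [min_seq, max_seq] is divided into the same detection frame group."""
    return [box for serial, box in numbered_boxes
            if min_seq <= serial <= max_seq]
```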
In this way, the user can manually group the detection frames according to actual requirements; the needs of different users are met, the operation is simple, and the user experience is improved.
Illustratively, the panel to be detected may comprise a plurality of boundary marks, including a left mark and a right mark. The boundary marks in an image can be identified by acquiring the position coordinates of the left and right marks. After the detection frame grouping result has been obtained by the above technical solution, the grouping information of the detection frame groups in the area between the left mark and the right mark can be saved in response to the user's click on the "save" control in fig. 4.
As shown in fig. 4, the user interface may also be used to display a fourth operable control 440. Illustratively, the currently activated detection frame group may be switched in response to the user's operation of the fourth operable control 440. Grouping at least a part of the detection frames may yield one or more detection frame groups, and the user may perform different operations on each of them. Before operating on a particular detection frame group, the user activates it with the fourth operable control 440. For example, the user may click the "previous group" and "next group" controls shown in fig. 4 with an input device such as a mouse to switch the currently activated group. When the grouping operation of step S130 ends, the currently activated group may be the last one; clicking "previous group" then switches to the second-to-last group, while clicking "next group" loops back to the first group. Alternatively, the fourth operable control 440 may be a text entry box, in which the user enters a group number from the detection frame grouping result 430 to switch the currently activated group.
In this way, the detection frame group on which operations are intended can be located quickly and conveniently, so that subsequent operations can be performed uniformly on the detection frames within it; this effectively improves the efficiency and speed of panel detection.
Illustratively, step S130 of grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group may further include identifying the sorted detection frames with a uniform group identifier. For example, the sorted detection frames may be marked with a uniform character or color to distinguish the detection frame groups from one another. It is understood that each detection frame group corresponds one-to-one with its group identifier. Alternatively, the group identifier may be the sequence number of a detection frame.
Illustratively, the user interface may also be used to display a fifth operable control 450, shown in fig. 4 as the "delete current grouping" and "clear all groupings" controls. The group identifier of the detection frames in a detection frame group may be deleted in response to the user's operation of the fifth operable control 450, thereby deleting that group. The user can click the appropriate control as needed: for example, after selecting a detection frame group, the user may click the "delete current grouping" control with an input device such as a mouse to delete the currently selected group; or, to regroup from scratch, the user may click the "clear all groupings" control to delete all detection frame groups. A sketch of this identifier-based deletion follows.
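A sketch of the identifier-based deletion, assuming group membership is stored as a mapping from a frame's sequence number to its group identifier; the storage scheme and names are illustrative assumptions.

```python
def delete_group(group_ids, target_gid):
    """'Delete current grouping' sketch: deleting a detection frame group
    amounts to removing its group identifier from every frame carrying it."""
    return {serial: gid for serial, gid in group_ids.items()
            if gid != target_gid}

def clear_all_groups(group_ids):
    """'Clear all groupings' sketch: every group identifier is removed."""
    return {}
```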
The above technical solution uses an algorithm that is simple and easy to implement; for the user, detection frame groups can be deleted in batches without cumbersome operations, which saves time and effort and improves the user experience.
Illustratively, the method further comprises performing unified parameter setting on the detection frames in a detection frame group in response to a parameter setting operation of the user. The parameter setting may include setting the size, position, and the like of the detection frames. Within one detection frame group, when the user adjusts the size of one detection frame, the other detection frames are adjusted automatically and correspondingly, with a consistent adjustment amplitude; a sketch follows.
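A sketch of unified parameter setting under the same assumed (x, y, w, h) representation: the size adjustment applied to one frame is propagated unchanged to every frame of the activated group; the function is illustrative.

```python
def resize_group(group, dw, dh):
    """Unified parameter setting sketch: apply the same width/height
    adjustment to every detection frame in the activated group, keeping
    the adjustment amplitude consistent across the group."""
    return [(x, y, w + dw, h + dh) for (x, y, w, h) in group]
```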
In this way, batch operations on detection frames can be performed in units of detection frame groups. This further improves the efficiency of panel detection, saves the user's time and effort, and improves the user experience.
According to another aspect of the present invention, there is also provided an editing apparatus for an image of a panel to be detected, and fig. 6 shows a schematic block diagram of an editing apparatus 600 for an image of a panel to be detected according to an embodiment of the present invention. As shown in fig. 6, the apparatus 600 includes a display module 610, a sorting module 620, and a grouping module 630.
The display module 610 is configured to display a user interface, where the user interface is configured to display an image of the panel to be detected and a detection frame of the electrode of the panel to be detected.
The sorting module 620 is configured to sort at least a part of the detection frames according to the positions of the detection frames in the image.
The grouping module 630 is configured to group at least a portion of the detection boxes according to the sorting result to generate a detection box group.
According to another aspect of the invention, an electronic device is also provided. Fig. 7 shows a schematic block diagram of an electronic device 700 according to an embodiment of the invention. As shown in fig. 7, the electronic device 700 may include a display 710, a processor 720, and a memory 730. The display 710 is used for displaying a user interface, and the memory 730 stores computer program instructions which, when executed by the processor 720, perform the editing method 100 for an image of a panel to be detected.
According to a further aspect of the present invention, there is also provided a storage medium on which program instructions are stored; when run, the program instructions perform the method 100 for editing an image of a panel to be detected as described above. The storage medium may include, for example, a storage component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the above. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
A person skilled in the art can understand the specific implementations of the editing apparatus, the electronic device, and the storage medium by reading the above description of the editing method for an image of a panel to be detected; for brevity, details are not repeated here.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the foregoing illustrative embodiments are merely exemplary and are not intended to limit the scope of the invention thereto. Various changes and modifications may be effected therein by one of ordinary skill in the pertinent art without departing from the scope or spirit of the present invention. All such changes and modifications are intended to be included within the scope of the present invention as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, and for example, the division of the units is only one logical functional division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another device, or some features may be omitted, or not executed.
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the invention and aiding in the understanding of one or more of the various inventive aspects. However, the method of the present invention should not be construed to reflect the intent: that the invention as claimed requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where such features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the invention may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. It will be appreciated by those skilled in the art that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some of the modules in the apparatus for editing an image of a panel to be inspected according to embodiments of the present invention. The present invention may also be embodied as apparatus programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing the present invention may be stored on computer-readable media or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etcetera does not indicate any ordering; these words may be interpreted as names.
The above description is only for the specific embodiment of the present invention or the description thereof, and the protection scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and the changes or substitutions should be covered within the protection scope of the present invention. The protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. A method for editing an image of a panel to be detected, the method comprising:
displaying a user interface, wherein the user interface is used for displaying an image of a panel to be detected and a detection frame of an electrode of the panel to be detected;
sorting at least a part of the detection frames according to the positions of the detection frames in the image;
and grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group.
2. The method of claim 1, wherein sorting the at least a part of the detection frames according to their positions in the image comprises:
determining the detection frames of each row according to the positions of the at least a part of the detection frames in the image, wherein the longitudinal distance between adjacent detection frames within each row is smaller than a first preset value;
and continuously numbering all of the at least a part of the detection frames according to their positions in the image, wherein, for the detection frames in each row, the further left a detection frame is, the earlier its sequence number; and the number of the leftmost detection frame in the next adjacent row is consecutive with, and follows, the number of the rightmost detection frame in the current row.
3. The method of claim 2, wherein grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group comprises:
dividing the detection frames belonging to the same row into the same detection frame group according to the sorting result, and otherwise dividing the detection frames into different detection frame groups.
4. The method of claim 1 or 2, wherein grouping the at least a part of the detection frames according to the sorting result to generate a detection frame group comprises:
judging whether the lateral distance between two adjacent detection frames in each row of the at least a part of the detection frames is smaller than a second preset value; if so, dividing them into the same detection frame group, otherwise into different detection frame groups;
and dividing the detection frames of different rows in the at least a part of the detection frames into different detection frame groups.
5. The method of claim 1 or 2, wherein the user interface is further used for displaying a second operable control for setting a minimum sequence number and a maximum sequence number of the detection frames in at least one of the detection frame groups in response to a user operation;
the grouping at least a part of the detection frames according to the sorting result to generate a detection frame group, including:
and dividing the detection frames with the sequence numbers which are more than or equal to the minimum sequence number and less than or equal to the maximum sequence number into corresponding detection frame groups according to the minimum sequence number and the maximum sequence number.
6. The method of any one of claims 1 to 3, wherein the user interface is further used for displaying a third operable control for initiating a selection operation on the detection frames in response to an operation by the user;
the method further comprises the following steps:
determining the at least a part of the detection frames in response to the user's selection operation on the at least a part of the detection frames.
7. The method of any of claims 1-3, wherein the user interface is further for displaying a fourth operable control, the method further comprising:
switching the currently activated detection frame group in response to the user's operation of the fourth operable control.
8. The method of any of claims 1-3, wherein the user interface is further for displaying a fifth operable control,
the grouping at least a part of the detection frames according to the sorting result to generate a detection frame group, including: identifying the sorted detection frames by using a uniform group identifier;
the method further comprises: deleting the group identifier of the detection frames in the detection frame group in response to the user's operation of the fifth operable control, so as to delete the detection frame group.
9. The method of any of claims 1 to 3, wherein the method further comprises:
performing unified parameter setting on the detection frames in the detection frame group in response to a parameter setting operation of the user.
10. An apparatus for editing an image of a panel to be detected, the apparatus comprising:
the display module is used for displaying a user interface, and the user interface is used for displaying an image of the panel to be detected and a detection frame of an electrode of the panel to be detected;
the sorting module is used for sorting at least one part of the detection frames according to the positions of the detection frames in the image;
and the grouping module is used for grouping at least one part of the detection frames according to the sorting result so as to generate a detection frame group.
11. An electronic device comprising a display, a processor and a memory, wherein the display is used for displaying the user interface, and the memory stores computer program instructions which, when run by the processor, execute the method of editing an image of a panel to be detected according to any one of claims 1 to 9.
12. A storage medium on which program instructions are stored, wherein the program instructions, when run, execute the method of editing an image of a panel to be detected according to any one of claims 1 to 9.
CN202111544959.8A 2021-12-16 2021-12-16 Editing method and device for panel image to be detected, electronic equipment and storage medium Pending CN114356189A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111544959.8A CN114356189A (en) 2021-12-16 2021-12-16 Editing method and device for panel image to be detected, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN114356189A true CN114356189A (en) 2022-04-15

Family

ID=81099661

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111544959.8A Pending CN114356189A (en) 2021-12-16 2021-12-16 Editing method and device for panel image to be detected, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114356189A (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020113234A1 (en) * 2001-02-21 2002-08-22 Hirohito Okuda Method and system for inspecting electronic circuit pattern
JP2004048002A (en) * 2003-06-27 2004-02-12 Hitachi Ltd Circuit-pattern inspecting apparatus and method
KR100608225B1 (en) * 2005-02-24 2006-08-08 테크밸리 주식회사 Method for verification of the components of printed circuit board
JP2007206050A (en) * 2000-11-17 2007-08-16 Ebara Corp Substrate inspection method, substrate inspection device, and electron beam unit
JP2008004863A (en) * 2006-06-26 2008-01-10 Hitachi High-Technologies Corp Appearance inspection method and device therefor
JP2012049503A (en) * 2010-07-27 2012-03-08 Fujitsu Semiconductor Ltd Inspection device for semiconductor device, and inspection method for semiconductor device
KR101256369B1 (en) * 2012-05-15 2013-04-25 (주) 에스엘테크 Flat display pannel test equipment and test method using multi ccd camera
JP2017207972A (en) * 2016-05-19 2017-11-24 キヤノン株式会社 Image processing device, image processing method, and program
CN111582267A (en) * 2020-04-08 2020-08-25 北京皮尔布莱尼软件有限公司 Text detection method, computing device and readable storage medium
KR20210024767A (en) * 2019-08-26 2021-03-08 레이디소프트 주식회사 Method for non-destructive inspection based on image and Computer-readable storage medium


Similar Documents

Publication Publication Date Title
US11054936B2 (en) Touch panel with non-uniform touch node layout
US9892504B2 (en) Image inspection method and inspection region setting method
CN113448787B (en) Wafer abnormity analysis method and device, electronic equipment and readable storage medium
US11423531B2 (en) Image-recognition apparatus, image-recognition method, and non-transitory computer-readable storage medium thereof
CN114266773A (en) Display panel defect positioning method, device, equipment and storage medium
Akhtar et al. A methodology for evaluating accuracy of capacitive touch sensing grid patterns
CN103399674B (en) A kind of multipoint touch detection method and device
CN108694265A (en) Intelligent pre-diagnosis system and method for failure risk of design layout
CN107492079A (en) A kind of image mill skin method and mobile terminal
CN112362679A (en) Image recognition device, image recognition method and computer program product thereof
CN112419229A (en) Display screen linear defect detection method and device and storage medium
Sokolov et al. Automatic vision system for final test of liquid crystal displays
CN114356189A (en) Editing method and device for panel image to be detected, electronic equipment and storage medium
Lin et al. An automatic inspection method for the fracture conditions of anisotropic conductive film in the TFT-LCD assembly process
CN113608805A (en) Mask prediction method, image processing method, display method and equipment
CN112308816A (en) Image recognition device, image recognition method and computer program product thereof
CN112015634A (en) Page structure information generation method and device and electronic equipment
US10241618B2 (en) Touchscreen display with monitoring functions
CN103325704B (en) Method for inspecting chip quality
CN114359179A (en) Panel detection method, system, electronic device and storage medium
CN114359176A (en) Panel detection method and device, electronic equipment and storage medium
CN114372959A (en) Method, device, equipment and storage medium for determining detection area in panel image
CN110781973B (en) Article identification model training method, article identification device and electronic equipment
CN115294078B (en) Glass asymmetric chamfer identification method, device, equipment and storage medium
CN116627372B (en) PNL material alignment preview display method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination