CN117876490A - Feature determination method and device, electronic equipment and storage medium - Google Patents

Feature determination method and device, electronic equipment and storage medium

Info

Publication number
CN117876490A
Authority
CN
China
Prior art keywords
image, detected, area, target, region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410102145.6A
Other languages
Chinese (zh)
Inventor
李天洲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Original Assignee
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd filed Critical Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority to CN202410102145.6A
Publication of CN117876490A
Legal status: Pending


Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a feature determination method and device, an electronic device, and a storage medium. The method comprises the following steps: acquiring a 3D image of a target material processing scene; extracting a 3D reference region and a 3D region to be measured from the 3D image, the 3D region to be measured containing a 3D target image to be measured; and, when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to that image. By extracting from the 3D image a 3D region to be measured that contains the 3D target image, and determining the three-dimensional geometric features of the target once the image satisfies the preset condition, the scheme achieves accurate positioning of the region to be measured in 3D vision.

Description

Feature determination method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of machine vision, and in particular to a feature determination method and device, an electronic device, and a storage medium.
Background
With the technological breakthroughs brought by the four visual revolutions, machine vision has ushered in a brand-new 3D stereoscopic "field of view". The precision requirements of precision manufacturing have likewise driven the adoption of 3D machine vision in consumer-grade and industrial-grade applications. Meanwhile, because 3D hardware is built from 2D hardware, a 3D hardware scheme can be regarded, from a hardware perspective, as a "specialized 2D hardware scheme". In particular, the 3D line profiler commonly used in the 3C industry integrates three major components: a line laser (light source), a 2D CMOS sensor, and a 3D imaging algorithm, making it a highly integrated, specialized 2D imaging scheme. However, because of this high degree of hardware integration, the light source and lens of a line profiler cannot be swapped to suit the imaging effect as they can in a 2D scheme, 3D hardware cannot resolve contour detail as finely as 2D imaging, and traditional positioning methods cannot accurately locate the region to be measured in 3D vision.
Disclosure of Invention
The invention provides a feature determination method and device, an electronic device, and a storage medium, which enable accurate positioning of a region to be measured in 3D vision.
According to an aspect of the present invention, there is provided a feature determination method, the method comprising:
acquiring a 3D image of a target material processing scene;
extracting a 3D reference region and a 3D region to be measured from the 3D image, the 3D region to be measured containing a 3D target image to be measured;
and when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
According to another aspect of the present invention, there is provided a feature determination apparatus, the apparatus comprising:
a 3D image acquisition module, configured to acquire a 3D image of a target material processing scene;
a 3D image extraction module, configured to extract a 3D reference region and a 3D region to be measured from the 3D image, the 3D region to be measured containing a 3D target image to be measured;
and a geometric feature determination module, configured to determine, when the 3D target image to be measured satisfies a preset condition, the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the feature determination method of any one of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute a feature determining method according to any one of the embodiments of the present invention.
According to the technical scheme of the invention, a 3D image of the target material processing scene is acquired; a 3D reference region and a 3D region to be measured are extracted from the 3D image, the 3D region to be measured containing a 3D target image to be measured; and when the 3D target image to be measured satisfies a preset condition, the three-dimensional geometric features of the corresponding target to be measured are determined. By extracting from the 3D image a 3D region to be measured that contains the 3D target image, and determining the three-dimensional geometric features of the target once the image satisfies the preset condition, the scheme achieves accurate positioning of the region to be measured in 3D vision.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments are briefly introduced below. It is apparent that the drawings described below show only some embodiments of the present invention, and that other drawings may be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a feature determination method provided in accordance with a first embodiment of the present invention;
FIG. 2 is a schematic view of a scene in which at least two separated parts clamped on a material tray are integrated into an integral material, according to the first embodiment of the present invention;
FIG. 3 is a flow chart of a feature determination method according to a second embodiment of the present invention;
FIG. 4 is a schematic diagram of a scene in which the integration effect of an integral material integrated from at least two separated parts is detected, according to the second embodiment of the present invention;
FIG. 5 is a schematic structural diagram of a feature determination apparatus according to a third embodiment of the present invention;
FIG. 6 is a schematic structural diagram of an electronic device implementing a feature determination method according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art may better understand the present invention, the technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art from the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a feature determination method according to the first embodiment of the present invention. The method may be performed by a feature determination apparatus, which may be implemented in hardware and/or software and configured in an electronic device. As shown in Fig. 1, the method includes:
s110, acquiring a 3D image of the target material processing scene.
Here, a target material processing scene generally refers to an industrial site or production line where materials need to be handled. In the embodiment of the invention, the target material processing scene includes, but is not limited to, a scene in which at least two separated parts clamped on a material tray are integrated into an integral material, and a scene in which the integration effect of an integral material integrated from at least two separated parts is detected.
In the embodiment of the invention, the 3D image of the target material processing scene may be acquired by collecting 3D image data of the material to be measured in the scene with a 3D line laser profiler. When a 3D line laser profiler is used to collect the 3D image data, relative motion between the material to be measured and the profiler is required in order to collect true 3D image data.
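The patent does not prescribe a data format for the profiler output. Purely as an illustration, the following Python sketch assumes the profiler delivers one 1D height profile per trigger while the material moves past the sensor, and stacks the profiles into a height map; `assemble_height_map` and the simulated `scans` are hypothetical names, not part of the patent.

```python
import numpy as np

def assemble_height_map(profiles):
    """Stack successive line profiles (one per trigger of the 3D line
    laser profiler) into a 2D height map; the relative motion between
    the material and the profiler supplies the second image axis.

    profiles: iterable of equal-length 1D arrays of heights.
    Returns an (n_scans, n_points) float array.
    """
    return np.vstack([np.asarray(p, dtype=np.float64) for p in profiles])

# Simulated pass: 200 scan lines of 1024 points each.
scans = [np.random.rand(1024) for _ in range(200)]
height_map = assemble_height_map(scans)
print(height_map.shape)  # (200, 1024)
```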
S120, extracting a 3D reference region and a 3D region to be measured from the 3D image; the 3D region to be measured contains a 3D target image to be measured.
The 3D reference region is a relatively stable and reliable region in the 3D image; it can serve as a reference for determining the states and changes of other regions, so as to ensure the accuracy of material processing.
In the embodiment of the invention, after the 3D image of the target material processing scene is acquired, the 3D reference region and the 3D region to be measured containing the 3D target image to be measured can be extracted from the 3D image.
Optionally, extracting the 3D reference region and the 3D region to be measured from the 3D image includes: determining a 3D reference region from the 3D image and determining a reference plane based on the 3D reference region; and extracting the 3D region to be detected from the 3D image based on the reference plane.
In the embodiment of the invention, after the 3D reference region is determined from the 3D image, the reference plane can be determined based on it. Optionally, a relatively flat surface may be selected from the 3D reference region as the reference plane, or a plane may be fitted by a fitting algorithm. The reference plane should be chosen with its position and angle relative to the 3D region to be measured in mind, so as to ensure the accuracy and reliability of measurement and analysis.
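The patent leaves the fitting algorithm open. Below is a minimal sketch of one common choice, a least-squares fit of a plane to the points of the 3D reference region; the function names are illustrative, not from the patent.

```python
import numpy as np

def fit_reference_plane(points):
    """Least-squares fit of the plane z = a*x + b*y + c to the points
    of the 3D reference region.

    points: (N, 3) array of x, y, z samples.
    Returns the plane coefficients (a, b, c).
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    design = np.column_stack([x, y, np.ones_like(x)])
    (a, b, c), *_ = np.linalg.lstsq(design, z, rcond=None)
    return a, b, c

def height_above_plane(points, plane):
    """Signed height of each point above the fitted reference plane,
    usable when extracting the 3D region to be measured."""
    a, b, c = plane
    return points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c)
```

Where the reference region may contain outliers such as dust or dropouts, a robust variant such as RANSAC over the same plane model would be the usual substitute.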
For example, Fig. 2 shows a scene in which at least two separated parts clamped on a material tray are integrated into an integral material. As shown in Fig. 2, the dotted-line region indicated by arrow 1 is the region to be measured, and the material indicated by arrow 2 is the target to be measured; the material indicated by arrow 3 has little room to move and is in a relatively stable state, so the region where it sits can serve as a reference region. After the 3D image of the scene is acquired, the region where the material indicated by arrow 3 is located may be determined as the 3D reference region, a reference plane may then be determined based on that region, and finally the dotted-line region containing the image of the material indicated by arrow 2 may be extracted from the 3D image based on the reference plane.
S130, when the 3D target image to be measured satisfies the preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
In the embodiment of the invention, when the 3D target image to be measured satisfies the preset condition, the three-dimensional geometric features of the corresponding target to be measured can be determined. The preset condition may be set according to the actual target material processing scene, which is not limited in the embodiment of the present invention. For example, in the scene of Fig. 2, in which at least two separated parts clamped on the material tray are integrated into an integral material, the preset condition may be that the specific position of the material indicated by arrow 2 within the dotted-line region indicated by arrow 1 can be determined from the 3D target image to be measured.
Optionally, when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured includes: converting the 3D region to be measured into a 2D region to be measured; determining, from the 2D region to be measured, the 2D target position region where the target to be measured is located; determining, from the 2D target position region, the 3D target position region where the target to be measured is located in the 3D region to be measured; and analyzing the 3D target position region to determine the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
In the embodiment of the invention, when the 3D target image to be measured satisfies the preset condition, the 3D region to be measured is first converted into a 2D region to be measured, so that further processing and analysis can be performed in 2D. Optionally, the conversion may use orthographic or perspective projection, chosen according to the actual situation. The 2D target position region where the target to be measured is located is then determined from the 2D region to be measured, and the corresponding 3D target position region in the 3D region to be measured is computed back from it. Finally, the 3D target position region is analyzed to determine the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
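As an illustration of this 3D-to-2D-and-back pipeline, the sketch below rasterizes the region to be measured into a height image by orthographic projection and maps a 2D target mask, found by any 2D localization method such as template matching or blob analysis, back to the corresponding 3D points. The grid resolution and all function names are assumptions made for the example.

```python
import numpy as np

def project_to_2d(points, resolution=0.1):
    """Orthographic projection of a 3D region onto the x-y plane:
    rasterize the points into a 2D grid whose pixel value is the
    height, producing an image on which ordinary 2D localization
    can run. Returns the image and the grid origin for back-mapping."""
    origin = points[:, :2].min(axis=0)
    cols = np.floor((points[:, 0] - origin[0]) / resolution).astype(int)
    rows = np.floor((points[:, 1] - origin[1]) / resolution).astype(int)
    image = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    image[rows, cols] = points[:, 2]
    return image, origin

def back_project(mask, origin, resolution, points):
    """Recover the 3D target position region: keep the 3D points
    whose projected pixel lies inside the 2D target mask."""
    cols = np.floor((points[:, 0] - origin[0]) / resolution).astype(int)
    rows = np.floor((points[:, 1] - origin[1]) / resolution).astype(int)
    return points[mask[rows, cols]]
```

Three-dimensional geometric features such as height, flatness, or volume can then be computed directly on the points returned by `back_project`.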
Optionally, when the 3D target image to be measured does not satisfy the preset condition, the method returns to the step of extracting a 3D reference region and a 3D region to be measured from the 3D image, until the 3D target image to be measured satisfies the preset condition.
In the embodiment of the invention, when the 3D target image to be measured does not satisfy the preset condition, it can be understood that, during the extraction of the 3D reference region and the 3D region to be measured, the reference plane determined from the 3D reference region failed to provide an effective reference for extracting the 3D region to be measured. In that case, the extraction of the 3D reference region and the 3D region to be measured from the 3D image is performed again, until the 3D target image to be measured satisfies the preset condition.
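Putting steps S120-S130 and this retry branch together, the overall control flow might look like the following sketch; the callables are stand-ins for the concrete steps described above, and the bounded retry count is an added safeguard, not something the patent specifies.

```python
def determine_features(image_3d, extract_regions, condition, features,
                       max_attempts=5):
    """Re-extract the 3D reference region and the 3D region to be
    measured until the 3D target image satisfies the preset
    condition, then compute the three-dimensional geometric features.

    extract_regions, condition and features are caller-supplied
    callables implementing the steps described above."""
    for _ in range(max_attempts):
        reference_region, region_to_measure = extract_regions(image_3d)
        if condition(region_to_measure):
            return features(region_to_measure)
    raise RuntimeError("preset condition never satisfied")
```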
According to the technical scheme of this embodiment, a 3D image of the target material processing scene is acquired; a 3D reference region and a 3D region to be measured are extracted from the 3D image, the 3D region to be measured containing a 3D target image to be measured; and when the 3D target image to be measured satisfies the preset condition, the three-dimensional geometric features of the corresponding target to be measured are determined. By extracting from the 3D image a 3D region to be measured that contains the 3D target image, and determining the three-dimensional geometric features of the target once the image satisfies the preset condition, the scheme achieves accurate positioning of the region to be measured in 3D vision.
Example two
Fig. 3 is a flowchart of a feature determination method according to the second embodiment of the present invention. This embodiment is an optimization of the foregoing embodiment; for details not described here, see the foregoing embodiment. As shown in Fig. 3, the method includes:
s210, acquiring a 3D image of the target material processing scene.
S220, preprocessing the 3D image.
In the embodiment of the invention, the acquired 3D image of the target material processing scene also needs to be preprocessed before the 3D reference region and the 3D region to be measured are extracted from it. Optionally, noise reduction may be applied to the 3D image to improve its quality and thereby the accuracy of measurement and analysis.
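The patent names only noise reduction in general. One plausible concrete choice for line-laser height maps is a median filter, sketched below; the NaN-filling strategy for dropout pixels is an assumption of this example.

```python
import numpy as np
from scipy.ndimage import median_filter

def denoise_height_map(height_map, size=3):
    """Median-filter a height map: suppresses the speckle and dropout
    noise typical of line-laser data while preserving edges better
    than a mean filter.

    NaN dropouts are replaced with the global median first so the
    filter does not propagate invalid values."""
    filled = np.where(np.isnan(height_map),
                      np.nanmedian(height_map), height_map)
    return median_filter(filled, size=size)
```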
S230, judging the category of the target material processing scene.
In the embodiment of the invention, the category of the target material processing scene needs to be judged manually, and the category corresponding to the judgment result is selected in the corresponding window for the subsequent flow. Optionally, when the target material processing scene is the first scene, i.e. a scene in which at least two separated parts clamped on the material tray are integrated into an integral material, steps S240-S250 are performed; when it is the second scene, i.e. a scene in which the integration effect of an integral material integrated from at least two separated parts is detected, steps S260-S270 are performed.
S240, extracting, from the 3D image, the region of the clamp whose corresponding pocket is smallest among the at least two clamps as the 3D reference region, and extracting the pocket regions of the remaining clamps from the 3D image as the 3D regions to be measured.
The pocket corresponding to a clamp is the region in which the clamp holds a separated part; it represents the area within which that part can move.
In the embodiment of the invention, the region where a separated part is most stable, i.e. the region of the clamp whose corresponding pocket is smallest, is taken as the 3D reference region, and the pocket regions of the remaining clamps are extracted from the 3D image as the 3D regions to be measured. Illustratively, as shown in Fig. 2, the concave portion on a clamp is its pocket: the region corresponding to the pocket of clamp 02 may be used as the 3D reference region, and the region corresponding to the pocket of clamp 01 may be extracted from the 3D image as the 3D region to be measured.
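The patent does not say how the smallest pocket is identified. Purely as an illustration, the sketch below approximates each pocket's clearance by the spread of heights measured inside it and picks the smallest as the reference; `pocket_heights` and the height-spread criterion are assumptions of this example.

```python
import numpy as np

def pick_reference_pocket(pocket_heights):
    """Choose as the 3D reference the clamp pocket whose part has the
    least room to move, i.e. the most stable one; the clearance of a
    pocket is approximated here by the spread of its height samples.

    pocket_heights: dict mapping pocket id to a 1D array of height
    samples inside that pocket.
    Returns (reference_pocket_id, ids_of_pockets_to_measure)."""
    spread = {pid: np.ptp(h) for pid, h in pocket_heights.items()}
    reference = min(spread, key=spread.get)
    to_measure = [pid for pid in pocket_heights if pid != reference]
    return reference, to_measure
```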
S250, when the 3D separated-part image in the 3D region to be measured satisfies the preset condition, determining the three-dimensional geometric features of the target separated part to be measured corresponding to the 3D separated-part image in the 3D region to be measured.
In the embodiment of the invention, when the 3D separated-part image in the 3D region to be measured satisfies the preset condition, the three-dimensional geometric features of the corresponding target separated part can be determined. For example, in the scene of Fig. 2, in which at least two separated parts clamped on the material tray are integrated into an integral material, the preset condition may be that the specific position of the separated part in the 3D region to be measured can be determined from the 3D separated-part image. In this way, the problem that traditional positioning methods either fail to position, or position inaccurately, in this scene is solved.
S260, extracting the non-joined regions of the integral material from the 3D image as the 3D reference region, and extracting the joint regions of the at least two separated parts contained in the integral material from the 3D image as the 3D region to be measured.
A joint region is a region where the at least two separated parts contained in the integral material are connected during integration; the connection may be by adhesion, welding, or the like.
In the embodiment of the invention, a relatively stable non-joined region of the integral material can be used as the 3D reference region, and the joint regions of the at least two separated parts contained in the integral material are extracted from the 3D image as the 3D region to be measured. For example, Fig. 4 shows a scene in which the integration effect of an integral material integrated from at least two separated parts is detected. As shown in Fig. 4, the plane indicated by arrow 1 is a reference plane determined from a non-joined region of the integral material, and the fixation point indicated by arrow 2 is the glue dot or weld spot left where the joint regions were connected. The non-joined region where the reference plane lies can serve as the 3D reference region, and the joint region where the glue dots or weld spots lie can serve as the 3D region to be measured.
S270, when the 3D fixation-point image in the 3D region to be measured is a complete fixation-point image, determining the three-dimensional geometric features of the fixation point corresponding to the 3D fixation-point image in the 3D region to be measured.
In the embodiment of the invention, a complete 3D fixation-point image in the 3D region to be measured indicates that the bonding or welding surface is well exposed; when this condition is met, the three-dimensional geometric features of the fixation point corresponding to the 3D fixation-point image can be determined. In the scene of Fig. 4, in which the integration effect of an integral material integrated from at least two separated parts is detected, the elliptical point indicated by arrow 2 is a complete fixation point, so the three-dimensional geometric features of the corresponding fixation point in the 3D region to be measured can be determined. In this way, the problem that traditional positioning methods either fail to position, or position inaccurately, when 3D machine vision is used to detect the integration effect of such an integral material is solved.
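The patent likewise leaves the completeness test for the 3D fixation-point image open. The sketch below uses one plausible criterion: the fixation point (glue dot or weld spot) counts as complete if its pixels form a single connected blob of sufficient area that does not touch the border of the region to be measured. The binary mask input and the area threshold are assumptions of this example.

```python
from scipy.ndimage import label

def fixation_point_is_complete(mask, min_area=50):
    """Completeness test for a fixation point: its pixels must form
    exactly one connected blob, large enough, that does not touch the
    border of the region to be measured (a border contact suggests
    the point was cut off)."""
    labeled, num_blobs = label(mask)
    if num_blobs != 1:
        return False
    blob = labeled == 1
    touches_border = (blob[0].any() or blob[-1].any()
                      or blob[:, 0].any() or blob[:, -1].any())
    return blob.sum() >= min_area and not touches_border
```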
According to the technical scheme of this embodiment, a 3D image of the target material processing scene is acquired and preprocessed, and the category of the scene is judged. When the scene is one in which at least two separated parts clamped on a material tray are integrated into an integral material, the region of the clamp whose corresponding pocket is smallest is extracted from the 3D image as the 3D reference region, the pocket regions of the remaining clamps are extracted as the 3D regions to be measured, and, when the 3D separated-part image in the 3D region to be measured satisfies the preset condition, the three-dimensional geometric features of the corresponding target separated part are determined. When the scene is one in which the integration effect of an integral material integrated from at least two separated parts is detected, the non-joined regions of the integral material are extracted from the 3D image as the 3D reference region, the joint regions of the separated parts are extracted as the 3D region to be measured, and, when the 3D fixation-point image in the 3D region to be measured is complete, the three-dimensional geometric features of the corresponding fixation point are determined. By extracting from the 3D image a 3D region to be measured that contains the 3D target image and determining the three-dimensional geometric features of the target once the image satisfies the preset condition, the scheme achieves accurate positioning of the region to be measured in 3D vision and solves the problem that traditional positioning methods fail to position, or position inaccurately, in both scenes.
Example III
Fig. 5 is a schematic structural diagram of a feature determination apparatus according to the third embodiment of the present invention. As shown in Fig. 5, the apparatus includes:
a 3D image acquisition module 310, configured to acquire a 3D image of a target material processing scene;
a 3D image extraction module 320, configured to extract a 3D reference region and a 3D region to be measured from the 3D image, the 3D region to be measured containing a 3D target image to be measured;
and a geometric feature determination module 330, configured to determine, when the 3D target image to be measured satisfies a preset condition, the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
Optionally, the 3D image extraction module 320 includes:
a reference plane determination unit, configured to determine a 3D reference region from the 3D image and determine a reference plane based on the 3D reference region;
and a 3D region extraction unit, configured to extract the 3D region to be measured from the 3D image based on the reference plane.
Optionally, the geometric feature determination module 330 includes:
a 2D region conversion unit, configured to convert the 3D region to be measured into a 2D region to be measured when the 3D target image to be measured satisfies a preset condition;
a 2D target position determination unit, configured to determine, from the 2D region to be measured, the 2D target position region where the target to be measured is located;
a 3D target position determination unit, configured to determine, from the 2D target position region, the 3D target position region where the target to be measured is located in the 3D region to be measured;
and a geometric feature determination unit, configured to analyze the 3D target position region and determine the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
Optionally, the apparatus further comprises:
a return execution module, configured to, when the 3D target image to be measured does not satisfy the preset condition, return to the step of extracting a 3D reference region and a 3D region to be measured from the 3D image, until the 3D target image to be measured satisfies the preset condition.
Optionally, the target material processing scene is a scene in which at least two separated parts clamped on the material tray are integrated into an integral material; the material tray comprises at least two clamps for clamping the separated parts, the clamps corresponding to the separated parts one to one;
the 3D image extraction module 320 is configured to:
extract, from the 3D image, the region of the clamp whose corresponding pocket is smallest among the at least two clamps as the 3D reference region, and extract the pocket regions of the remaining clamps from the 3D image as the 3D regions to be measured;
the geometric feature determination module 330 is configured to:
determine, when the 3D separated-part image in the 3D region to be measured satisfies a preset condition, the three-dimensional geometric features of the target separated part to be measured corresponding to the 3D separated-part image in the 3D region to be measured.
Optionally, the target material processing scene is a scene in which the integration effect of an integral material integrated from at least two separated parts is detected;
the 3D image extraction module 320 is configured to:
extract the non-joined regions of the integral material from the 3D image as the 3D reference region, and extract the joint regions of the at least two separated parts contained in the integral material from the 3D image as the 3D region to be measured;
the geometric feature determination module 330 is configured to:
determine, when the 3D fixation-point image in the 3D region to be measured is a complete fixation-point image, the three-dimensional geometric features of the fixation point corresponding to the 3D fixation-point image in the 3D region to be measured.
Optionally, the apparatus further comprises:
a preprocessing module, configured to preprocess the 3D image.
The feature determination apparatus provided by the embodiment of the invention can execute the feature determination method provided by any embodiment of the invention, and has functional modules and beneficial effects corresponding to the executed method.
Example IV
Fig. 6 shows a schematic structural diagram of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant as examples only, and are not meant to limit implementations of the invention described and/or claimed herein.
As shown in Fig. 6, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various processors running machine learning model algorithms, digital signal processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 11 performs the respective methods and processes described above, such as the feature determination method.
In some embodiments, the feature determination method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. One or more of the steps of the feature determination method described above may be performed when the computer program is loaded into RAM 13 and executed by processor 11. Alternatively, in other embodiments, the processor 11 may be configured to perform the feature determination method in any other suitable way (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special-purpose or general-purpose, and which may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a backend component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a frontend component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such backend, middleware, or frontend components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network; their relationship arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, a host product in a cloud computing service system that overcomes the drawbacks of traditional physical hosts and VPS services, namely high management difficulty and weak service scalability.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (10)

1. A feature determination method, comprising:
acquiring a 3D image of a target material processing scene;
extracting a 3D reference region and a 3D region to be measured from the 3D image, the 3D region to be measured containing a 3D target image to be measured;
and when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
2. The method according to claim 1, wherein extracting a 3D reference region and a 3D region to be measured from the 3D image comprises:
determining a 3D reference region from the 3D image and determining a reference plane based on the 3D reference region;
and extracting the 3D region to be measured from the 3D image based on the reference plane.
3. The method according to claim 1, wherein, when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured comprises:
when the 3D target image to be measured satisfies a preset condition, converting the 3D region to be measured into a 2D region to be measured;
determining, from the 2D region to be measured, the 2D target position region where the target to be measured is located;
determining, from the 2D target position region, the 3D target position region where the target to be measured is located in the 3D region to be measured;
and analyzing the 3D target position region to determine the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
4. The method according to claim 1, further comprising:
when the 3D target image to be measured does not satisfy the preset condition, returning to the step of extracting a 3D reference region and a 3D region to be measured from the 3D image, until the 3D target image to be measured satisfies the preset condition.
5. The method according to claim 1, wherein the target material processing scene is a scene in which at least two separated parts clamped on a material tray are integrated into an integral material; the material tray comprises at least two clamps for clamping the separated parts, the clamps corresponding to the separated parts one to one;
extracting a 3D reference region and a 3D region to be measured from the 3D image comprises:
extracting, from the 3D image, the region of the clamp whose corresponding pocket is smallest among the at least two clamps as the 3D reference region, and extracting the pocket regions of the remaining clamps from the 3D image as the 3D regions to be measured;
correspondingly, when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured comprises:
when the 3D separated-part image in the 3D region to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target separated part to be measured corresponding to the 3D separated-part image in the 3D region to be measured.
6. The method according to claim 1, wherein the target material processing scene is a scene in which the integration effect of an integral material integrated from at least two separated parts is detected;
extracting a 3D reference region and a 3D region to be measured from the 3D image comprises:
extracting the non-joined regions of the integral material from the 3D image as the 3D reference region, and extracting the joint regions of the at least two separated parts contained in the integral material from the 3D image as the 3D region to be measured;
correspondingly, when the 3D target image to be measured satisfies a preset condition, determining the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured comprises:
when the 3D fixation-point image in the 3D region to be measured is a complete fixation-point image, determining the three-dimensional geometric features of the fixation point corresponding to the 3D fixation-point image in the 3D region to be measured.
7. The method of claim 1, further comprising, prior to extracting a 3D reference region and a 3D region to be measured from the 3D image:
preprocessing the 3D image.
8. A feature determination apparatus, comprising:
a 3D image acquisition module, configured to acquire a 3D image of a target material processing scene;
a 3D image extraction module, configured to extract a 3D reference region and a 3D region to be measured from the 3D image, the 3D region to be measured containing a 3D target image to be measured;
and a geometric feature determination module, configured to determine, when the 3D target image to be measured satisfies a preset condition, the three-dimensional geometric features of the target to be measured corresponding to the 3D target image to be measured.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the feature determination method of any one of claims 1-7.
10. A computer-readable storage medium storing computer instructions which, when executed, cause a processor to perform the feature determination method of any one of claims 1-7.
CN202410102145.6A 2024-01-24 2024-01-24 Feature determination method and device, electronic equipment and storage medium Pending CN117876490A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410102145.6A CN117876490A (en) 2024-01-24 2024-01-24 Feature determination method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410102145.6A CN117876490A (en) 2024-01-24 2024-01-24 Feature determination method and device, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117876490A true CN117876490A (en) 2024-04-12

Family

ID=90586655

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410102145.6A Pending CN117876490A (en) 2024-01-24 2024-01-24 Feature determination method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117876490A (en)

Similar Documents

Publication Publication Date Title
CN116596854A (en) Equipment defect identification method, device, equipment and medium
CN116051558B (en) Defect image labeling method, device, equipment and medium
CN116124081B (en) Non-contact workpiece detection method and device, electronic equipment and medium
CN117372663A (en) Method, device, equipment and storage medium for supplementing log end face shielding
CN117876490A (en) Feature determination method and device, electronic equipment and storage medium
CN116208853A (en) Focusing angle determining method, device, equipment and storage medium
CN116072575A (en) Wafer alignment method, device, electronic equipment and readable storage medium
CN112785650A (en) Camera parameter calibration method and device
CN109523530A (en) A kind of micro strip circular pad detection method and system
CN115908581A (en) Vehicle-mounted camera pitch angle calibration method, device, equipment and storage medium
CN113781392A (en) Method for detecting adhesive path, electronic device and storage medium
CN116952166B (en) Method, device, equipment and medium for detecting parts of automobile door handle assembly
CN105654473A (en) Sub-pixel edge detection method
CN117788423A (en) Fitting contour degree determining method, device, equipment and storage medium
CN117350995A (en) Product defect detection method, device, equipment and storage medium
CN117826167A (en) Ranging method, device, equipment and medium
CN114812391B (en) Minimum safe distance measuring method, device, equipment and storage medium for power equipment
CN116182807B (en) Gesture information determining method, device, electronic equipment, system and medium
CN117804337A (en) Transformer coil size determining method, device, equipment and storage medium
CN117689660B (en) Vacuum cup temperature quality inspection method based on machine vision
CN115953469A (en) Positioning method and device based on single and binocular vision, electronic equipment and storage medium
CN116258714A (en) Defect identification method and device, electronic equipment and storage medium
CN115713496A (en) Belt coal quantity detection method, device, equipment and storage medium
CN118031838A (en) Three-dimensional object height detection method, device, equipment and medium
CN116977930A (en) Image-based oil trace detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination