CN114170419A - Equipment region image extraction method and device under natural scene


Info

Publication number
CN114170419A
CN114170419A
Authority
CN
China
Prior art keywords
image
equipment
recovery
detection
area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111491142.9A
Other languages
Chinese (zh)
Inventor
田寨兴
余卫宇
廖伟权
刘嘉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Epbox Information Technology Co ltd
Original Assignee
Guangzhou Epbox Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Epbox Information Technology Co ltd filed Critical Guangzhou Epbox Information Technology Co ltd
Priority to CN202111491142.9A
Publication of CN114170419A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 - Administration; Management
    • G06Q10/30 - Administration of product recycling or disposal
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02W - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
    • Y02W90/00 - Enabling technologies or technologies with a potential or indirect contribution to greenhouse gas [GHG] emissions mitigation


Abstract

The invention relates to a method and a device for extracting an equipment area image in a natural scene. After an appearance image of the equipment to be recovered is acquired, color space conversion processing is performed on the appearance image to generate a corresponding grayscale image. A preset range area of the grayscale image is then located, and the equipment area is extracted from the preset range area; the equipment area is used as detection data for recovery detection. In this way, during recovery detection in a natural scene, the effective equipment area is extracted from the appearance image, so that environmental interference in the natural scene is reduced and the accuracy of subsequent recovery image detection is improved.

Description

Equipment region image extraction method and device under natural scene
Technical Field
The invention relates to the technical field of recovery detection, in particular to a method and a device for extracting an equipment region image in a natural scene.
Background
With the development of electronic product technology, a wide range of intelligent devices has emerged, such as smartphones, notebook computers and tablet computers. With the rapid development of the economy and of technology, intelligent devices are being adopted and replaced ever faster. Taking the smartphone as an example, the arrival of the 5G era has accelerated the pace at which handsets are replaced. In this iteration process, effective recycling is one of the main means of exploiting the residual value of intelligent devices, and it reduces chemical pollution of the environment and waste.
Accordingly, various recycling channels, such as recycling machines and after-sales recycling services, have been put into practice around the equipment recycling process. In this process, appearance detection is an extremely important step. Taking mobile phone recycling as an example, when device appearance images are collected for appearance detection, the environment can introduce considerable interference. Current recycling detection therefore prevents interference from lighting and other factors by providing a stable background environment for photographing the equipment, in the form of a box or cabinet. However, this approach limits the usage scenarios and flexibility of recovery detection; in ordinary outdoor or indoor environments, the accuracy of recovery detection performed on captured images is affected.
Disclosure of Invention
Based on this, in view of the limitations that the conventional shooting environment imposes on the usage scenarios and flexibility of recovery detection, it is necessary to provide a method and an apparatus for extracting an equipment region image in a natural scene.
An equipment area image extraction method under a natural scene comprises the following steps:
acquiring an appearance image of equipment to be recovered;
performing color space conversion processing on the appearance image to generate a corresponding gray level image;
positioning a preset range area of the gray level image;
extracting an equipment area from a preset range area; wherein the device area is used as detection data for recovery detection.
According to the method for extracting the device area image in the natural scene, after the appearance image of the device to be recovered is obtained, the appearance image is subjected to color space conversion processing, and a corresponding gray level image is generated. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
In one embodiment, before the process of performing color space conversion processing on the appearance image to generate the corresponding gray scale image, the method further comprises the step of:
performing Gaussian filtering processing on the appearance image.
In one embodiment, the process of gaussian filtering the appearance image is as follows:
[Equation image in the original: Gaussian filtering formula]
wherein X1 represents the appearance image after Gaussian filtering, and X represents the appearance image; j = 1, 2, ..., H and i = 1, 2, ..., W, where j and i respectively represent coordinate values in the horizontal and vertical directions relative to the origin at the upper left corner of the appearance image, H represents the height of X, and W represents the width of X; w represents the length of the rectangular window, set to 3; A represents the amplitude of the corresponding rectangular window, set to 16; the rectangular window is the template corresponding to the Gaussian filter, as follows:
[Equation image in the original: 3x3 Gaussian filter template]
In one embodiment, the process of performing color space conversion processing on the appearance image to generate the corresponding gray image is as follows:
G1(j,i)=0.1140*X1b(j,i)+0.5870*X1g(j,i)+0.2989*X1r(j,i)
wherein G1(j, i) represents the grayscale image, and X1r, X1g and X1b represent the color components of the R, G and B channels of the appearance image X1.
In one embodiment, the process of locating the preset range region of the grayscale image includes the steps of:
extracting gradient information of the gray level image in the horizontal direction and the vertical direction;
determining the corner point coordinates of the grayscale image according to the gradient information;
and selecting a range according to the coordinates of each corner point to obtain the preset range area.
In one embodiment, the process of extracting the device region from the preset range region includes the steps of:
calculating texture features in a preset range area;
and carrying out binarization processing on the texture features, and screening out the equipment area.
In one embodiment, the process of calculating texture features in the preset range region includes the steps of:
calculating a gray level co-occurrence matrix of a preset range area; wherein, the entropy value of the gray level co-occurrence matrix is positively correlated with the complexity of the texture features.
An apparatus for extracting an image of a device region in a natural scene includes:
the image acquisition module is used for acquiring an appearance image of the equipment to be recovered;
the image conversion module is used for performing color space conversion processing on the appearance image to generate a corresponding gray level image;
the area positioning module is used for positioning a preset range area of the gray level image;
the region extraction module is used for extracting an equipment region from a preset range region; wherein the device area is used as detection data for recovery detection.
After the device area image extraction device under the natural scene obtains the appearance image of the device to be recovered, the device area image extraction device performs color space conversion processing on the appearance image to generate a corresponding gray level image. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
A computer storage medium having stored thereon computer instructions which, when executed by a processor, implement the method for extracting device region images in natural scenes according to any of the above embodiments.
After the computer storage medium obtains the appearance image of the device to be recovered, the appearance image is subjected to color space conversion processing to generate a corresponding gray level image. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
A computer device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor executes the computer program to implement the device region image extraction method in the natural scene according to any of the embodiments.
After the computer equipment obtains the appearance image of the equipment to be recovered, the appearance image is subjected to color space conversion processing to generate a corresponding gray level image. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
Drawings
FIG. 1 is a flowchart of an apparatus region image extraction method in a natural scene according to an embodiment;
FIG. 2 is a flowchart of an apparatus region image extraction method in a natural scene according to another embodiment;
FIG. 3 is a block diagram of an apparatus for extracting device area images in a natural scene according to an embodiment;
FIG. 4 is a schematic diagram of an internal structure of a computer according to an embodiment.
Detailed Description
For better understanding of the objects, technical solutions and effects of the present invention, the present invention will be further explained with reference to the accompanying drawings and examples. Meanwhile, the following described examples are only for explaining the present invention, and are not intended to limit the present invention.
The embodiment of the invention provides an equipment recovery system.
The equipment recovery system of an embodiment comprises a detection server and an equipment recovery terminal.
In one embodiment, the device recycling system of an embodiment further includes a third party server or a payment server.
The user side (the user who holds the device to be recovered) comprises terminal devices that can communicate with each server or with terminal devices on other sides; such a terminal device can be the device to be recovered itself or another intelligent device.
The staff side includes terminal equipment capable of communicating with each server or other side terminal equipment.
In one embodiment, the terminal device performing communication includes a smart device such as a mobile phone or a computer. As a preferred embodiment, the terminal device performing the communication has a network communication capability.
The equipment to be recovered comprises intelligent equipment or non-intelligent equipment such as a mobile phone, a computer, a watch, a television, furniture and the like.
The user side may communicate with the detection server, the third party server or the payment server, and the staff side may communicate with the detection server or the payment server.
The payment server is in communication interaction with the user side or the staff side and is used for completing collection and payment of a user account or a staff account.
The third-party server communicates with the user side; it comprises a shopping platform, a recycling platform or a communication platform, and can be used to capture a recovery request for the device to be recovered that is raised by the user.
The equipment recovery terminal establishes communication with the detection server, and the detection server executes detection analysis on detection data obtained by executing recovery detection on the equipment to be recovered.
In one embodiment, an apparatus recycling terminal of an embodiment includes:
the detection module is used for detecting the equipment to be recovered to obtain detection data;
the display module is used for displaying the detection result of the detection server;
the human-computer interaction module is used for collecting display feedback of the detection result;
and the communication module is respectively connected with the detection module, the display module and the human-computer interaction module and is used for communicating with each server.
The detection module is used for detecting the equipment to be recovered, obtaining detection data and sending the detection data to the detection server through the communication module.
In one embodiment, the detection data includes an appearance image or interaction data of the device to be recycled. A worker can operate the device recovery terminal to photograph the appearance of the device to be recovered, generating a video or pictures that are uploaded to the detection server. The worker can also operate the device recovery terminal to establish a communication connection with the device to be recovered and obtain the interaction data.
In one embodiment, the detection module comprises a camera unit or a data interaction unit.
The camera shooting unit is used for shooting the appearance of the equipment to be recovered. The data interaction unit is used for establishing communication connection with the equipment to be recovered and acquiring interaction data.
In one embodiment, the data interaction unit comprises a wireless interaction unit or a wired interaction unit.
The wireless interaction unit comprises a WIFI interaction unit, a 4G interaction unit, an infrared interaction unit or a ZIGBEE interaction unit and the like. The wired interaction unit comprises a USB interaction unit or a bus interaction unit.
As a preferred embodiment, the data interaction unit comprises a USB interaction unit. The device recovery terminal establishes USB connection with the device to be recovered through the integrated USB interactive unit when recovery detection is executed, and acquires USB interactive data required by the detection server for acquiring the detection result.
The display module is used for displaying the detection result, and comprises display, acousto-optic display or voice display and the like.
In one embodiment, the display module includes a display unit. Wherein, the display unit comprises a display screen or a nixie tube display and the like. And the detection result is displayed to a user or staff and the like through a display unit.
The human-computer interaction module is used for realizing human-computer interaction between related personnel and the equipment recovery terminal and can be used for collecting display feedback or adjusting detection result display.
As a better implementation mode, the human-computer interaction module and the display module are the same touch display screen.
The communication module has the capability of communicating with each server, and comprises a plurality of communication modes such as 4G communication, WLAN communication, local area network communication and the like.
In order to better explain the deployment characteristics of the device recycling terminal according to the embodiment of the present invention, the device recycling terminal development and deployment in an application are described in the following with a specific application example. In a specific application example, the equipment recovery terminal development can be performed by a common tablet computer, and the tablet computer has network communication and shooting capabilities and can be used for shooting equipment to be recovered and uploading a shot appearance image to a detection server. Meanwhile, the tablet personal computer is improved, and is provided with a USB external data line (which can simultaneously comprise USB data lines of various external interface types) through software integration or hardware integration, and is connected with equipment to be recovered in the recovery process. And through software integration, required interactive data is rapidly read or captured and uploaded to a detection server. Meanwhile, when a detection result sent by the detection server is received, display and display can be carried out through the display screen, or voice display can be carried out through the power amplifier. Meanwhile, relevant personnel such as users or workers can operate the tablet personal computer to realize acquisition and display feedback or adjustment and display of detection results.
The device recycling terminal of any of the above embodiments comprises a detection module, a display module, a human-computer interaction module and a communication module. On this basis, while the recovery process is still completed together with the detection server, the recovery detection itself is carried out by the detection server in the cloud, which reduces the cost of the device recycling terminal and improves its stability and portability.
Based on this, the embodiment of the invention provides a device recovery method on the side of a detection server.
The apparatus recovery method of an embodiment includes the steps of:
acquiring a device recovery request;
according to the equipment recovery request, sending recovery information to one side of a worker; the recovery information is used for indicating a worker to obtain a recovery terminal of the equipment from the terminal address, and holding the recovery terminal of the equipment to perform recovery detection on the equipment to be recovered from the recovery address;
receiving detection data sent by a device recovery terminal executing recovery detection, executing detection on the detection data and feeding back a detection result to the device recovery terminal; the equipment recovery terminal is used for displaying a detection result;
and executing corresponding equipment recovery operation according to the display feedback of the detection result.
The device recycle request is used to trigger the start of device recycle. The equipment recovery request can be sent by a user side, a worker side and various server sides, and is acquired by the detection server. Or the detection server executes final acquisition through indirect transmission from a user side, a worker side or multiple sides of various servers and the like.
In one embodiment, the process of obtaining a device recycle request includes the steps of:
and receiving request information sent by a user of the equipment to be recovered, and determining the request as an equipment recovery request.
The request information sent by the user side comprises the request information directly sent to the detection server or indirectly sent to the detection server through other sides. The user communicates with the detection server by operating the intelligent device (including the device to be recovered) to send the request information.
In one embodiment, the process of obtaining a device recycle request includes the steps of:
and acquiring communication interaction data between the user of the equipment to be recovered and the third-party server as an equipment recovery request.
The initiation of the device recycle request is not limited to the direct communication between the user and the detection server. In the communication interaction data of the user and the third-party server, a device recovery request exists, and the communication interaction data serves as the recovery request.
To better explain the embodiment, the device to be recycled is taken as a mobile phone as an example. A user purchases a new mobile phone on an online shopping platform (a third-party server) to replace an original old mobile phone, determines that the old mobile phone can be recycled in communication interaction of the online shopping platform, and sends data to a detection server by the online shopping platform as an equipment recycling request based on the communication interaction data.
The detection server may be implemented by an application, such as an APP or various applets. Sales drainage or user drainage by an application, providing online services related to device recovery requests: purchase new equipment, exchange for old equipment, or equipment recycling. And the user directly operates the application program or the third-party server to communicate with the interface of the application program, so that the triggering of the equipment recycling request is realized.
In one embodiment, before the process of sending the recycling information to the staff side according to the device recycling request, the method further comprises the following steps:
and determining corresponding staff according to the communication interaction data.
The communication interaction data of the user and the third-party server correspond to logistics services such as new equipment purchase or distribution, and based on the logistics services, the interaction process of the user and the third-party server is determined according to the communication interaction data, and corresponding staff are determined. For example, if the interaction between the user and the third-party server is online shopping, the corresponding logistics distribution personnel for online shopping goods is determined as staff.
In one embodiment, the method for recycling plant of the further embodiment further comprises the steps of:
and sending delivery information to the staff according to the communication interaction data so as to instruct the staff to deliver the equipment to be delivered to the recycling address according to the delivery information.
The delivery information is used for realizing the purpose of communication interaction between the user and the third-party server. In this embodiment, the third-party server implements delivery information delivery by interacting with the detection server, so as to reduce communication cost. Meanwhile, the multi-side servers required to be communicated by the staff are reduced, so that the staff can complete corresponding work only by communicating with the detection server, the work efficiency is improved, and the error rate is reduced.
And after the equipment recovery request is determined to be initiated, sending recovery information to one side of a worker, and instructing the worker to execute a series of operations such as equipment recovery terminal acquisition, recovery address confirmation, recovery detection and the like.
In one embodiment, the reclamation information is used to determine a terminal address and a reclamation address. The intelligent equipment on one side of the staff can be used for displaying the terminal address and the recovery address corresponding to the recovery information to the staff.
As a preferred embodiment, the terminal address is associated with a recycle address.
The recycling address may be predetermined during the obtaining process of the device recycling request, for example, a receiving address of the user or address information to be filled in. The terminal address is associated with the recycling address, such as determining the terminal address with the minimum distance according to the recycling address.
In a specific application example, each device recycling terminal is placed in a corresponding store or warehouse, and the address of the store or warehouse is the terminal address. And the staff acquires the equipment recovery terminal from the terminal address with the minimum distance to the recovery address according to the recovery information, and the store or warehouse updates the inventory information of the equipment recovery terminal according to the acquisition information.
After the device to be recovered is recovered, the worker can return the device recovery terminal to the corresponding terminal address, or execute recovery detection to the next recovery address according to the recovery information.
In one embodiment, the recycling information is further used for instructing a worker to operate the device recycling terminal. The recovery information is sent to the intelligent equipment on one side of the worker, and operation display is provided for the worker. The operation display comprises character display, picture display or video display.
In one embodiment, the recycle information includes a boot code. The boot code is used for unlocking the corresponding equipment recovery terminal.
In one embodiment, after the device recycling terminal is acquired, the method further includes the steps of:
and the updating equipment recovers the acquired information corresponding to the terminal.
Taking the device recycling terminal attached to the store or the warehouse as an example, the relevant personnel can log in the detection server to check whether the device recycling terminal of each terminal address is acquired or used, and know the state of the device recycling terminal in time. For example, the detection server can attach corresponding small programs, APPs and the like, and the staff can check the number of idle equipment recovery terminals of a store or a warehouse, whether the equipment recovery terminals are damaged or not by logging in the corresponding small programs. After the staff acquire the equipment recovery terminal, updating the corresponding equipment recovery terminal to be acquired; and after the staff returns the equipment recovery terminal, updating the corresponding equipment recovery terminal to be not acquired.
The staff holds the equipment recovery terminal to the target address, collects the detection data of the equipment to be recovered through the equipment recovery terminal, and sends the detection data to the detection server to obtain the detection result.
And the equipment recovery terminal displays the detection result, including character display, picture display or video display and the like.
As a preferred embodiment, when the detected data is a picture of the device to be recovered, the detection result includes marking a flaw or damage trace on the picture, and prompting the user through marking. And when the detected data is interactive data of the equipment to be recovered and the equipment recovery terminal, displaying the data to the user in a text form through data exception marking or data processing results.
In one embodiment, in addition to the detected problems of the device to be recovered, the detection result includes a recovery price. By showing the recovery price to the user, the detection and quotation process is brought forward and completed before the device is taken to the target address.
And after the detection result is displayed, executing corresponding equipment recovery operation according to the display feedback.
In one embodiment, the presentation feedback may be sent to the detection server from the user side, the staff side, or the servers side at the data communication level. On the man-machine interaction level, the display feedback can be made by an intelligent device on one side of a user operation user, a user operation device recovery terminal, an intelligent device on one side of a worker operation worker, a worker operation device recovery terminal and the like.
According to the display feedback, corresponding equipment recycling operation is carried out, such as recycling determination, non-recycling determination, recycling price adjustment and the like. In one embodiment, the detection server completes the device recovery operation through interaction with a third-party server or a payment server and the like. For example, in the process of replacing old equipment with new equipment, when the equipment to be recovered is determined to be recovered, the payment server completes corresponding operations of deducting money of the user, adjusting payment of the user, paying remuneration by staff and the like.
Based on this, in one embodiment, the process of executing the corresponding device recycling operation according to the display feedback of the detection result includes the following steps:
and indicating the staff to hold the equipment to be recovered to the target address according to the display feedback.
Target address information is directly or indirectly issued to one side of a worker through the detection server, and the worker holds the equipment to be recovered to the target address to complete recovery.
In one embodiment, the process of executing the corresponding device recycling operation according to the display feedback of the detection result includes the steps of:
and adjusting the recycling price of the equipment to be recycled according to the display feedback.
And the detection server and the payment server complete interaction through the display feedback, and the recovery price of the equipment to be recovered is adjusted.
In one embodiment, the process of executing the corresponding device recycling operation according to the display feedback of the detection result includes the steps of:
and generating logistics information according to the display feedback to instruct the staff to carry out logistics delivery on the equipment to be recovered according to the logistics information.
The detection server can directly issue the logistics information to one side of the worker, or a third-party server executes indirect issuing to indicate the worker to execute logistics delivery of the equipment to be recovered, and therefore the recovery efficiency of the equipment is further improved.
In one embodiment, after the process of performing the corresponding device recycling operation according to the display feedback of the detection result, the method further includes the steps of:
and sending a payment instruction to the payment server to instruct the payment server to complete the payment and receipt for the account of the target person.
After the recovery is completed, the detection server sends a payment instruction to the payment server to instruct the payment server to complete the collection and payment of the account of the target person.
In one embodiment, the target personnel comprises users, staff or third-party merchants and the like, so that timely settlement and intelligentization of payment received by all parties in the equipment recycling process are achieved.
In the device recovery method of any of the above embodiments, after the device recovery request is obtained, recovery information is sent to the worker side, instructing the worker to obtain the device recovery terminal from the terminal address and to take it to the recovery address to perform recovery detection on the device to be recovered. The detection data sent by the device recovery terminal performing the recovery detection is received, detection is performed on the detection data, and the detection result is fed back to the device recovery terminal, which displays it; the corresponding device recovery operation is then performed according to the display feedback of the detection result. Based on this, remote feedback of detection results lowers the professional requirements on the staff and the hardware requirements on the device recovery terminal, reducing recovery costs in all respects. Meanwhile, the detection result is displayed and fed back at the recovery address in advance, which reduces disputes and communication costs during recovery.
Based on the method, after the recovery detection is carried out on the equipment to be recovered by the equipment recovery terminal held by the staff to obtain the appearance image in the detection data, the equipment region image extraction method in the natural scene is provided.
Fig. 1 is a flowchart illustrating a method for extracting a device region image in a natural scene according to an embodiment, where as shown in fig. 1, the method for extracting a device region image in a natural scene according to an embodiment includes steps S100 to S103:
s100, acquiring an appearance image of equipment to be recovered;
s101, performing color space conversion processing on the appearance image to generate a corresponding gray level image;
s102, positioning a preset range area of the gray level image;
s103, extracting an equipment area from a preset range area; wherein the device area is used as detection data for recovery detection.
The appearance image can be obtained from the detection data sent by the detection server through the equipment recovery terminal.
In one embodiment, fig. 2 is a flowchart of an apparatus region image extraction method in a natural scene according to another embodiment, and as shown in fig. 2, before the process of performing color space conversion processing on the appearance image in step S101 to generate a corresponding grayscale image, the method further includes step S200:
s200, Gaussian filtering processing is carried out on the appearance image.
White noise in the appearance image of the intelligent device is removed by the Gaussian filtering processing.
In one embodiment, the process of gaussian filtering the appearance image is as follows:
[Equation image in the original: Gaussian filtering formula]
wherein X1 represents the appearance image after Gaussian filtering, and X represents the appearance image; j = 1, 2, ..., H and i = 1, 2, ..., W, where j and i respectively represent coordinate values in the horizontal and vertical directions relative to the origin at the upper left corner of the appearance image, H represents the height of X, and W represents the width of X; w represents the length of the rectangular window, set to 3; A represents the amplitude of the corresponding rectangular window, set to 16; the rectangular window is the template corresponding to the Gaussian filter, as follows:
[Equation image in the original: 3x3 Gaussian filter template]
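The filtering formula and template appear only as equation images in the original. The minimal sketch below assumes the common 3x3 template [[1, 2, 1], [2, 4, 2], [1, 2, 1]] / 16, which is consistent with the stated window length w = 3 and amplitude A = 16 but is not confirmed by the text, and applies it channel by channel.

```python
import numpy as np
from scipy.ndimage import convolve

def gaussian_filter_3x3(appearance_image: np.ndarray) -> np.ndarray:
    """Smooth an H x W x 3 appearance image X with an assumed 3x3 Gaussian template.

    Assumption: the template is [[1, 2, 1], [2, 4, 2], [1, 2, 1]] / 16,
    matching the stated window length w = 3 and amplitude A = 16.
    """
    template = np.array([[1, 2, 1],
                         [2, 4, 2],
                         [1, 2, 1]], dtype=np.float64) / 16.0
    x = appearance_image.astype(np.float64)
    filtered = np.empty_like(x)
    # Filter each colour channel independently; border pixels are mirrored.
    for c in range(x.shape[2]):
        filtered[..., c] = convolve(x[..., c], template, mode="mirror")
    return filtered  # X1 in the notation of the description
```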
In one embodiment, the process of performing color space conversion processing on the external image in step S101 to generate a corresponding gray image is as follows:
G1(j,i)=0.1140*X1b(j,i)+0.5870*X1g(j,i)+0.2989*X1r(j,i)
wherein G1(j, i) represents the grayscale image, and X1r, X1g and X1b represent the color components of the R, G and B channels of the appearance image X1.
Through the color space conversion process, a corresponding grayscale image is generated. The parameter setting of the above formula is convenient for adapting to image processing in natural scenes.
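A minimal sketch of this conversion is given below; the H x W x 3 array with R, G, B channel ordering is an assumption about the input layout, not something stated in the text.

```python
import numpy as np

def to_grayscale(x1: np.ndarray) -> np.ndarray:
    """Convert the filtered appearance image X1 (H x W x 3, assumed RGB order)
    to the grayscale image G1 using the weights from the description."""
    r, g, b = x1[..., 0], x1[..., 1], x1[..., 2]
    return 0.2989 * r + 0.5870 * g + 0.1140 * b
```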
In one embodiment, as shown in fig. 2, the process of positioning the preset range region of the grayscale image in step S102 includes steps S300 to S302:
s300, extracting gradient information of the gray level image in the horizontal direction and the vertical direction;
s301, determining the point coordinates of the gray image angle according to the gradient information;
s302, selecting according to the range of the coordinates of each point to obtain a preset range area.
Here, gradient information in the horizontal direction and in the vertical direction is extracted from the grayscale image G1. Kx and Ky, the convolution kernels applied to G1 in the horizontal and vertical directions, are:
[Equation images in the original: definitions of the convolution kernels Kx and Ky]
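The entries of Kx and Ky appear only as equation images in the original. The sketch below assumes the standard Sobel pair for horizontal and vertical gradient extraction; the patent's actual kernels may differ.

```python
import numpy as np
from scipy.ndimage import convolve

# Assumed kernels (Sobel); the patent's Kx and Ky are shown only as images.
K_X = np.array([[-1, 0, 1],
                [-2, 0, 2],
                [-1, 0, 1]], dtype=np.float64)
K_Y = K_X.T

def gradient_images(g1: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return the horizontal and vertical gradient images G1x and G1y."""
    g1 = g1.astype(np.float64)
    g1x = convolve(g1, K_X, mode="mirror")
    g1y = convolve(g1, K_Y, mode="mirror")
    return g1x, g1y
```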
take the example that the gray-scale image is a quadrangle including four corners. For gradient image G1xAnd G1yPoint coordinates P1, P2, P3, and P4 of four corners of the virtual frame are obtained.
L1 and L2 represent the strongest gradient edges of the two perpendicular directions to which the gradient image G1x corresponds; l3 and L4 indicate the strongest gradient edges in the two horizontal directions corresponding to the gradient image G1 y.
The preset range area is obtained from the range framed by these corner points. For example:
P1, P2, P3 and P4 are connected to obtain the preset range area Y of the appearance image.
L1:a1x+b1y+c1=0
L2:a2x+b2y+c2=0
L3:a3x+b3y+c3=0
L4:a4x+b4y+c4=0
The calculation formulas of P1, P2, P3 and P4 are similar, taking the P1 intersection as an example:
P1x=(b1c3-b3c1)/(a1b3-a3b1)
P1y=(a3c1-a1c3)/(a1b3-a3b1)
wherein, a1, a2, a3, a4, b1, b2, b3, b4, c1, c2, c3 and c4 are general straight line equation parameters of L1, L2, L3 and L4 respectively, and P1x and P1y represent coordinate values of the intersection point P1.
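As a small worked sketch of the corner computation, the function below solves two general-form lines for their intersection, reproducing the P1x and P1y formulas above; pairing L1 with L3 for P1 (and the remaining edges for P2 to P4) is inferred from those formulas rather than stated explicitly.

```python
def line_intersection(a1, b1, c1, a3, b3, c3):
    """Intersection of two lines a*x + b*y + c = 0, e.g. L1 and L3 for P1."""
    denom = a1 * b3 - a3 * b1
    if denom == 0:
        raise ValueError("lines are parallel; no unique intersection")
    x = (b1 * c3 - b3 * c1) / denom   # matches P1x = (b1c3 - b3c1)/(a1b3 - a3b1)
    y = (a3 * c1 - a1 * c3) / denom   # matches P1y = (a3c1 - a1c3)/(a1b3 - a3b1)
    return x, y

# Example: P1 as the intersection of L1 (a1, b1, c1) and L3 (a3, b3, c3);
# P2, P3 and P4 follow by pairing the remaining edges in the same way.
```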
In one embodiment, as shown in fig. 2, the process of extracting the device region from the preset range region in step S103 includes steps S400 and S401:
s400, calculating texture features in a preset range area;
s401, carrying out binarization processing on the texture features, and screening out the equipment area.
In one embodiment, the size of the texture features may be characterized by a gray level co-occurrence matrix. Namely, the process of calculating the texture features in the preset range region in step S400 includes the steps of:
calculating a gray level co-occurrence matrix of a preset range area; wherein, the entropy value of the gray level co-occurrence matrix is positively correlated with the complexity of the texture features.
In one embodiment, for the preset range area Y, its gray level co-occurrence matrix GM is calculated by the GLCM (gray-level co-occurrence matrix) method. An entropy value is then calculated for GM; the larger the entropy value, the greater the texture complexity of the image it represents. The entropy value is calculated as:
entropy(GM) = -Σx p(x) log p(x)
wherein x represents a pixel point (entry) in GM, and p(x) is the probability of that pixel point.
Binarization processing is performed on the texture features of the image inside the virtual frame: the gray value of the pixel points in regions whose GM entropy value is greater than a threshold T is set to 255, and the corresponding binarization mask image Mask is calculated. The set of binarized rectangular frames is then extracted, and the device area is screened out from this set.
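A rough sketch of this texture step using scikit-image's graycomatrix follows. The GLCM distance and angle, the sliding-window size and the per-window thresholding scheme are all assumptions, since the text only states that the entropy of GM is compared with a threshold T.

```python
import numpy as np
from skimage.feature import graycomatrix

def glcm_entropy(patch: np.ndarray, levels: int = 256) -> float:
    """Entropy of the gray level co-occurrence matrix of a grayscale patch.

    Distance 1 and angle 0 are assumptions; the patent does not state them.
    """
    glcm = graycomatrix(patch.astype(np.uint8), distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]
    p = p[p > 0]                      # ignore zero entries so log is defined
    return float(-np.sum(p * np.log(p)))

def texture_mask(region_y: np.ndarray, t: float, win: int = 32) -> np.ndarray:
    """Binarization mask: 255 where the local GLCM entropy exceeds threshold T.

    The sliding-window size `win` is illustrative only.
    """
    h, w = region_y.shape
    mask = np.zeros((h, w), dtype=np.uint8)
    for r in range(0, h - win + 1, win):
        for c in range(0, w - win + 1, win):
            patch = region_y[r:r + win, c:c + win]
            if glcm_entropy(patch) > t:
                mask[r:r + win, c:c + win] = 255
    return mask
```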
In the method for extracting an image of an equipment region in a natural scene according to any of the embodiments, after an appearance image of equipment to be recovered is acquired, color space conversion processing is performed on the appearance image to generate a corresponding gray image. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
Fig. 3 is a block diagram of an apparatus region image extraction device in a natural scene according to an embodiment, and as shown in fig. 3, the apparatus region image extraction device in a natural scene according to an embodiment includes:
an image obtaining module 100, configured to obtain an appearance image of a device to be recovered;
the image conversion module 101 is configured to perform color space conversion processing on the appearance image to generate a corresponding grayscale image;
the area positioning module 102 is configured to position a preset range area of the grayscale image;
the region extraction module 103 is configured to extract a device region from a preset range region; wherein the device area is used as detection data for recovery detection.
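For orientation, a compact pipeline sketch chaining the four modules is shown below. It reuses the hedged helper sketches from the method description above (gaussian_filter_3x3, to_grayscale, texture_mask); the corner-based localization and the final bounding-box crop are illustrative placeholders, not the patent's exact rules.

```python
import numpy as np

class DeviceRegionExtractor:
    """Mirrors the four modules of the apparatus; the helper functions are the
    hedged sketches defined earlier in this description (assumptions)."""

    def __init__(self, entropy_threshold: float):
        self.t = entropy_threshold  # threshold T for the GLCM entropy

    def extract(self, appearance_image: np.ndarray) -> np.ndarray:
        x1 = gaussian_filter_3x3(appearance_image)   # pre-processing (S200)
        g1 = to_grayscale(x1)                        # image conversion module (S101)
        region_y = self.locate_preset_range(g1)      # area positioning module (S102)
        mask = texture_mask(region_y, self.t)        # region extraction module (S103)
        ys, xs = np.nonzero(mask)
        if ys.size == 0:
            raise ValueError("no device region found above the entropy threshold")
        # Illustrative rule: crop the bounding box of the mask as the device area.
        return region_y[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

    def locate_preset_range(self, g1: np.ndarray) -> np.ndarray:
        # Placeholder: the strongest-gradient edge and corner selection described
        # above is not reproduced here; the full grayscale image is returned
        # as the preset range area.
        return g1
```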
After the device area image extraction device under the natural scene obtains the appearance image of the device to be recovered, the device area image extraction device performs color space conversion processing on the appearance image to generate a corresponding gray level image. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
The embodiment of the invention also provides a computer storage medium, on which computer instructions are stored, and the instructions are executed by a processor to implement the device region image extraction method in the natural scene in any one of the embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware related to instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, the computer program can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a terminal, or a network device) to execute all or part of the methods of the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a RAM, a ROM, a magnetic or optical disk, or various other media that can store program code.
Corresponding to the computer storage medium, in one embodiment, a computer device is further provided, where the computer device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor executes the computer program to implement the device region image extraction method in a natural scene as in any one of the embodiments.
The computer device may be a terminal, and its internal structure diagram may be as shown in fig. 4. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a device region image extraction method in a natural scene. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
After the computer equipment acquires the appearance image of the equipment to be recovered, the appearance image is subjected to color space conversion processing to generate a corresponding gray level image. Further, positioning a preset range area of the gray level image, and extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection. Based on the method, in the recovery detection in the natural scene, the effective equipment area is extracted from the appearance image, so that the environmental interference in the natural scene is reduced, and the accuracy of the subsequent recovery image detection is improved.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above examples only show some embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. An equipment region image extraction method under a natural scene is characterized by comprising the following steps:
acquiring an appearance image of equipment to be recovered;
performing color space conversion processing on the appearance image to generate a corresponding gray level image;
positioning a preset range area of the gray level image;
extracting an equipment area from the preset range area; wherein the device area is used as detection data for recovery detection.
2. The method for extracting a device region image in a natural scene according to claim 1, wherein before the process of performing color space conversion processing on the appearance image to generate the corresponding gray image, the method further comprises:
carrying out Gaussian filtering processing on the appearance image.
3. The method according to claim 2, wherein the process of gaussian filtering the appearance image is as follows:
[Equation image in the original: Gaussian filtering formula]
wherein X1 represents the appearance image after Gaussian filtering, and X represents the appearance image; j = 1, 2, ..., H and i = 1, 2, ..., W, where j and i respectively represent coordinate values in the horizontal and vertical directions relative to the origin at the upper left corner of the appearance image, H represents the height of X, and W represents the width of X; w represents the length of the rectangular window, set to 3; A represents the amplitude of the corresponding rectangular window, set to 16; the rectangular window is the template corresponding to the Gaussian filter, as follows:
[Equation image in the original: 3x3 Gaussian filter template]
4. The method for extracting device region image in natural scene according to claim 1, wherein the process of performing color space conversion processing on the appearance image to generate the corresponding gray image is as follows:
G1(j,i)=0.1140*X1b(j,i)+0.5870*X1g(j,i)+0.2989*X1r(j,i)
wherein G1(j, i) represents the grayscale image, and X1r, X1g and X1b represent the color components of the R, G and B channels of the appearance image X1.
5. The method for extracting the device region image in the natural scene according to claim 1, wherein the process of locating the preset range region of the grayscale image includes the steps of:
extracting gradient information of the gray level image in the horizontal direction and the vertical direction;
determining the corner point coordinates of the gray-scale image according to the gradient information;
and selecting a range according to the coordinates of each corner point to obtain the preset range area.
6. The method for extracting the device region image in the natural scene according to claim 1, wherein the process of extracting the device region from the preset range region includes:
calculating texture features in the preset range area;
and carrying out binarization processing on the texture features, and screening out the equipment area.
7. The method for extracting the device region image in the natural scene according to claim 1, wherein the process of calculating the texture feature in the preset range region includes:
calculating a gray level co-occurrence matrix of the preset range area; wherein the entropy of the gray level co-occurrence matrix is positively correlated with the complexity of the texture feature.
8. An apparatus for extracting an image of a device region in a natural scene, comprising:
the image acquisition module is used for acquiring an appearance image of the equipment to be recovered;
the image conversion module is used for performing color space conversion processing on the appearance image to generate a corresponding gray level image;
the area positioning module is used for positioning a preset range area of the gray level image;
the region extraction module is used for extracting an equipment region from the preset range region; wherein the device area is used as detection data for recovery detection.
9. A computer storage medium having computer instructions stored thereon, wherein the computer instructions, when executed by a processor, implement the device region image extraction method in a natural scene according to any one of claims 1 to 7.
10. A computer device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the device region image extraction method in a natural scene according to any one of claims 1 to 7 when executing the program.
CN202111491142.9A 2021-12-08 2021-12-08 Equipment region image extraction method and device under natural scene Pending CN114170419A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111491142.9A CN114170419A (en) 2021-12-08 2021-12-08 Equipment region image extraction method and device under natural scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111491142.9A CN114170419A (en) 2021-12-08 2021-12-08 Equipment region image extraction method and device under natural scene

Publications (1)

Publication Number Publication Date
CN114170419A true CN114170419A (en) 2022-03-11

Family

ID=80484316

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111491142.9A Pending CN114170419A (en) 2021-12-08 2021-12-08 Equipment region image extraction method and device under natural scene

Country Status (1)

Country Link
CN (1) CN114170419A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11989701B2 (en) 2014-10-03 2024-05-21 Ecoatm, Llc System for electrically testing mobile devices at a consumer-operated kiosk, and associated devices and methods



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination