CN117461051A - Detection method and detection device for object to be detected - Google Patents


Info

Publication number
CN117461051A
CN117461051A (application CN202280032539.8A)
Authority
CN
China
Prior art keywords: detection, sampling, images, detected, server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280032539.8A
Other languages
Chinese (zh)
Inventor
张焜杰
Current Assignee
Yixin Co ltd
Original Assignee
Yixin Co ltd
Priority date
Filing date
Publication date
Application filed by Yixin Co ltd
Publication of CN117461051A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A method (600) for detecting an object to be detected by a detection device (40) is disclosed, the method (600) comprising: transmitting a request for detecting the object to be detected (S610); receiving a positioning point of the object to be detected (S620); acquiring a plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected (S630); and obtaining a detection result according to the plurality of detection images (S640), wherein the detection result is generated based on a comparison of the plurality of detection images with a corresponding positioning point of a reference object corresponding to the object to be detected.

Description

Detection method and detection device for object to be detected
Cross Reference to Related Applications
The present disclosure claims the benefit of and priority to U.S. provisional patent application serial No. 63/183643 (hereinafter "the '643 provisional"), filed on 5/4/2021 and entitled "Surface Micro Feature Recognition Using Blockchain". The disclosure of the '643 provisional is hereby fully incorporated by reference into this disclosure.
Technical Field
The present invention relates to the field of object detection, and in particular to a method and a device for detecting an object to be detected.
Background
Although anti-counterfeiting technologies are now highly varied, many problems remain in the authentication of articles such as artworks, jewelry, and antiques. Some authentication techniques require taking a physical sample from the item to be authenticated, destroying its integrity. With naked-eye observation alone, it is difficult to fully assess the actual condition of the item to be authenticated. Moreover, if authentication is performed by an electronic device, it is difficult to guarantee that the information stored in the device has not been tampered with or replaced. Therefore, authentication with electronic equipment requires ensuring that the data are secure and tamper-proof.
Disclosure of Invention
In view of the above problems, the present invention provides a method and a device for detecting an object to be detected.
In order to solve the above problems, the present invention provides a method for detecting an object to be detected by a detection device, the method comprising: transmitting a request for detecting the object to be detected; receiving the positioning point of the object to be detected; acquiring a plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a corresponding positioning point of a reference object corresponding to the object to be detected.
In order to solve the above problems, the present invention further provides a detection device for detecting an object to be detected, the detection device comprising: an image capturing unit for obtaining a plurality of detection images; a movement control unit for moving the image capturing unit when detecting the object to be detected; a processor coupled to the image capturing unit and the movement control unit; a transmitting unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to: transmit a request for detecting the object to be detected through the transmitting unit; receive the positioning point of the object to be detected through the receiving unit; cause the image capturing unit, via the movement control unit, to obtain a plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and obtain a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a corresponding positioning point of a reference object corresponding to the object to be detected.
In order to solve the above problems, the present invention further provides a method for detecting an object to be detected by a server, the method comprising: receiving a request for detecting the object to be detected; transmitting the positioning point of the object to be detected; receiving a plurality of detection images acquired along a plurality of detection directions at the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a reference object corresponding to the object to be detected.
In this way, the detection system maintains the information of the reference object and thereby controls the sampling point (i.e., the corresponding positioning point) of the reference object. The detection device must first upload the information of the object to be detected so that the detection system can identify the corresponding reference object; only then does the device obtain the positioning point at which it must capture images. This method stores the identification information of the reference object securely, and the sampling point of the reference object cannot easily be obtained directly, which raises the difficulty of a counterfeit passing detection. In addition, if the information of the reference object is stored in a blockchain storage device, the security and tamper-resistance of the information can be further improved.
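The exchange summarized above (request, positioning point, image capture, comparison) can be sketched as a minimal client-server interaction. Everything here is an illustrative assumption rather than the patent's actual implementation: the `DetectionServer` class, the tuple positioning point, and the SHA-256 hashes standing in for real image comparison are all invented names.

```python
import hashlib


class DetectionServer:
    """Illustrative server: keeps reference-object records; the positioning
    point is revealed only after a detection request names a known object."""

    def __init__(self):
        self._references = {}  # object id -> {"anchor": ..., "features": ...}

    def register_reference(self, object_id, anchor, sample_images):
        # Stand-in for the stored verification feature: one SHA-256 per image.
        features = [hashlib.sha256(img).hexdigest() for img in sample_images]
        self._references[object_id] = {"anchor": anchor, "features": features}

    def handle_request(self, object_id):
        # S610/S620: the device transmits a request and receives the
        # positioning point of the matching reference object (or None).
        ref = self._references.get(object_id)
        return None if ref is None else ref["anchor"]

    def detect(self, object_id, detection_images):
        # S640: compare the uploaded detection images with the stored
        # reference features; exact-match hashing is a toy comparison.
        ref = self._references[object_id]
        features = [hashlib.sha256(img).hexdigest() for img in detection_images]
        return features == ref["features"]
```

The key property the passage describes is visible in the sketch: the anchor never leaves the server until a request identifies a registered reference object, so a counterfeiter cannot know in advance where images must be taken.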
Drawings
The aspects of the disclosure are best understood from the following detailed description when read with the accompanying drawing figures. The various features are not drawn to scale and the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion.
FIG. 1A is a block diagram of a sampling system of the present invention for sampling a reference object to create identification data.
FIG. 1B is a block diagram of another sampling system of the present invention for sampling a reference object to create identification data.
Fig. 2 is a block diagram of a sampling device for sampling a reference object to create identification data according to the present invention.
FIG. 3 is a flow chart of a sampling method for sampling a reference object to create identification data according to the present invention.
FIG. 4A is a block diagram of a detection system for detecting an object to be detected according to the present invention.
FIG. 4B is a block diagram of another detection system for detecting an object to be detected according to the present invention.
Fig. 5 is a block diagram of a detection apparatus for detecting an object to be detected according to the present invention.
FIG. 6 is a flow chart of a detection method for detecting an object to be detected according to the present invention.
Fig. 7A is a schematic diagram showing an image capturing unit capturing a sampled image in a sampling direction at a positioning point of a reference object according to an exemplary embodiment of the present invention.
Fig. 7B is a schematic diagram showing the image capturing unit capturing a sampled image in another sampling direction at the positioning point of the reference object according to the exemplary embodiment of the present invention.
Fig. 8A-8E are photographs of different sampled images captured on a reference object according to an exemplary embodiment of the present invention.
Fig. 8F-8J are photographs of different inspection images captured on an object to be inspected according to an exemplary embodiment of the present invention.
Fig. 9A and 9B are sampled images, taken in the same sampling direction, of different stones with the same processing pattern, according to an exemplary embodiment of the invention.
Fig. 10A is a photograph of two different antique articles having the same pattern.
Fig. 10B and 10C are sampled images of the different antique articles of Fig. 10A, sampled in the same sampling direction, according to an exemplary embodiment of the present invention.
Detailed Description
The following description contains specific information pertaining to exemplary embodiments in the present disclosure. The drawings in the present disclosure and their accompanying detailed description are directed merely to exemplary embodiments. However, the present disclosure is not limited to these exemplary embodiments; other variations and embodiments will occur to those skilled in the art. Similar or corresponding elements in the drawings may be indicated by similar or corresponding reference numerals unless otherwise indicated. The drawings and illustrations in the present disclosure are generally not drawn to scale and are not intended to correspond to actual relative dimensions.
For consistency and ease of understanding, similar features are identified by numerals in the exemplary figures (although not shown in some examples). However, the features in the different embodiments may differ in other respects and therefore should not be narrowly limited to what is shown in the drawings.
The terms "at least one embodiment," "an embodiment," "embodiments," "different embodiments," "some embodiments," "the present embodiment," and the like may indicate that an embodiment so described includes a particular feature, structure, or characteristic, but not every possible embodiment of the invention necessarily includes that particular feature, structure, or characteristic. Repeated use of the phrases "in an embodiment" or "in the present embodiment" does not necessarily refer to the same embodiment, although it may. Furthermore, the use of such phrases in connection with the present invention does not mean that all embodiments of the invention must include the particular feature, structure, or characteristic; it should be understood that at least some embodiments of the invention include it. The term "coupled" is defined as connected, whether directly or indirectly through intervening elements, and is not necessarily limited to physical connections. The terms "comprises," "comprising," and "includes," when used, mean "including but not limited to" and specify the presence of the stated features without excluding others.
For purposes of explanation and not limitation, specific details are set forth, such as functional entities, techniques, protocols, and standards, in order to provide an understanding of the disclosed techniques. In other instances, detailed descriptions of well-known methods, techniques, systems, and architectures are omitted so as not to obscure the description with unnecessary detail.
The terms "first," "second," "third," and the like in the description and in the above drawings are used to distinguish different objects and not necessarily to describe a particular sequential or chronological order. Furthermore, the terms "comprise" and "include," and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or modules is not limited to those steps or modules, but may include other steps or modules not expressly listed or inherent to such process, method, system, article, or apparatus.
Those of skill in the art will directly appreciate that any one or more of the functions or algorithms described in this disclosure may be implemented in hardware, software, or a combination of software and hardware. The described functions may correspond to modules, which may be software, hardware, firmware, or any combination thereof. A software implementation may include computer-executable instructions stored on a computer-readable medium, such as a memory or other type of storage device. For example, one or more microprocessors or general-purpose computers with communication processing capabilities may be programmed with executable instructions and perform one or more of the disclosed functions or algorithms. The microprocessor or general-purpose computer may be formed from an application-specific integrated circuit (ASIC), a programmable logic array, and/or one or more digital signal processors (DSPs). While some of the disclosed embodiments are directed to software installed and executed on computer hardware, alternative embodiments implemented as firmware, as hardware, or as a combination of hardware and software are well within the scope of the disclosure.
Computer-readable media include, but are not limited to, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), magnetic cassettes, magnetic tape, magnetic disk storage devices, or any other equivalent medium capable of storing computer-readable instructions.
The coupling between the devices of the present invention may employ custom protocols or follow existing or de facto standards, including but not limited to Ethernet, the IEEE 802.11 or IEEE 802.15 families, Wireless USB, or telecommunications standards including but not limited to GSM (Global System for Mobile Communications), CDMA2000 (Code Division Multiple Access 2000), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), WiMAX (Worldwide Interoperability for Microwave Access), 3GPP-LTE (Long Term Evolution), or TD-LTE (Time Division Long Term Evolution). Furthermore, each apparatus of the invention may include any device configured to transmit data to and receive data from a computer-readable medium, and may include a computer system interface that enables data to be stored on or received from a storage device. For example, the apparatus of the present invention may include a chipset supporting the Peripheral Component Interconnect (PCI) and Peripheral Component Interconnect Express (PCIe) bus protocols, a proprietary bus protocol, the Universal Serial Bus (USB) protocol, I2C, or any other logical and physical structure that may be used to interconnect peer devices.
The invention is described in further detail below with reference to the drawings and embodiments.
Referring to fig. 1A, fig. 1A is a block diagram of a sampling system 101 for sampling a reference object to create identification data. In one embodiment, the sampling system 101 includes a server 10 and a sampling device 20. The server 10 may include an internal storage device 11 to store the sampling results. In one embodiment, when the sampling device 20 receives a sampling request, the sampling device 20 is coupled to the server 10 to carry out the sampling method of the present invention. When the sampling system 101 completes the sampling method, the sampling device 20 may be decoupled from the server 10.
Referring to fig. 1B, fig. 1B is a block diagram of another sampling system 102 for sampling a reference object to create identification data. In one embodiment, the sampling system 102 includes a server 10, a sampling device 20, and an online storage device 30. In one embodiment, when the sampling device 20 receives a sampling request, the sampling device 20 is coupled to the server 10, and the server 10 is further coupled to the online storage device 30 to carry out the sampling method of the present invention. When the sampling system 102 completes the sampling method, the sampling device 20 may be decoupled from the server 10, and the server 10 may also be decoupled from the online storage device 30. In another embodiment, when the sampling device 20 receives a sampling request, the sampling device 20 is coupled to both the server 10 and the online storage device 30 to carry out the sampling method of the present invention. When the sampling system 102 completes the sampling method, the sampling device 20 can be decoupled from the server 10 and the online storage device 30.
In the illustrated embodiments, the online storage device 30 may be a network data storage or a blockchain storage device. When the online storage device 30 is a blockchain storage device, the characteristics of the blockchain reduce the possibility that the sampling results are tampered with or replaced. In this embodiment, the online storage device 30 can store the records of all transactions, the time and result of each verification, and updated pictures of the reference object.
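The tamper-resistance the passage attributes to the blockchain comes from chaining each stored record to a hash of the record before it. The following is a minimal append-only ledger sketch; the `SamplingLedger` name and the record layout are assumptions, not the patent's storage format:

```python
import hashlib
import json


def _block_hash(block):
    # Canonical JSON so the same block always hashes identically.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


class SamplingLedger:
    """Append-only hash chain for sampling results and verification records."""

    def __init__(self):
        # Genesis block anchors the chain.
        self.chain = [{"index": 0, "prev": "0" * 64, "record": None}]

    def append(self, record):
        # Each new block commits to the hash of the previous block.
        self.chain.append({"index": len(self.chain),
                           "prev": _block_hash(self.chain[-1]),
                           "record": record})

    def verify(self):
        # Any edit to an earlier block breaks every later "prev" link.
        return all(self.chain[i]["prev"] == _block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))
```

Altering any stored sampling record invalidates the links of all later blocks, which is why a stored verification result cannot be replaced without detection.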
Referring to fig. 2, fig. 2 is a block diagram of a sampling device 20 for sampling a reference object to create identification data. The sampling device 20 may be, without limitation, a mobile phone, tablet computer, desktop computer, notebook computer, camera, video recorder, or other electronic device. The sampling device 20 includes a movement control unit 21, an image capturing unit 22, a processor 23, a memory 24, and a transmitting unit 25.
The movement control unit 21 enables the sampling device 20 to perform the movements required by the sampling process. In one embodiment, the movement control unit 21 is configured to move the image capturing unit 22 when sampling the reference object. In one embodiment, the movement control unit 21 may be a display screen on the sampling device 20 or an automatic moving device coupled to the image capturing unit 22.
When the display screen serves as the movement control unit 21, the display screen provides movement instructions to the user, instructing the user to move the image capturing unit 22 to the sampling position. In this embodiment, the sampling position may be determined by a sampling distance and a sampling direction. For example, when the image capturing distance between the image capturing unit 22 and a positioning point on the reference object equals the sampling distance, and the image capturing direction from the image capturing unit 22 to the positioning point equals the sampling direction, the display screen indicates that the image capturing unit 22 has reached the sampling position, and capture of the required sampled image can begin. When the image capturing distance does not equal the sampling distance, the display screen instructs the user to move the image capturing unit 22 closer to or farther from the reference object. When the image capturing direction does not equal the sampling direction, the display screen instructs the user to move the image capturing unit 22 left, right, up, or down to correct the image capturing direction. In another embodiment, the sampling position may be determined by the sampling direction alone; in this case, the sampling distance can be accommodated by adjusting the focal length of the image capturing unit 22 to ensure that detailed surface information of the reference object is obtained.
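A sampling position defined by a sampling distance and a sampling direction can be checked with ordinary vector math. The function below is a sketch of such a check; the tolerances and the movement hints are invented for illustration, as the patent does not specify any:

```python
import math


def at_sampling_position(cam_pos, anchor, sampling_distance, sampling_direction,
                         dist_tol=0.01, angle_tol_deg=2.0):
    """Return (ok, hint): is the camera at the prescribed sampling position?

    cam_pos and anchor are (x, y, z) coordinates; sampling_direction is a
    unit vector pointing from the positioning point toward the camera.
    """
    vec = tuple(c - a for c, a in zip(cam_pos, anchor))
    dist = math.sqrt(sum(v * v for v in vec))
    # Distance check first: mirrors the display screen's zoom-in/out prompt.
    if abs(dist - sampling_distance) > dist_tol:
        return False, "move closer" if dist > sampling_distance else "move away"
    # Direction check: angle between the actual and prescribed directions.
    unit = tuple(v / dist for v in vec)
    cos_a = max(-1.0, min(1.0, sum(u * d for u, d in zip(unit, sampling_direction))))
    if math.degrees(math.acos(cos_a)) > angle_tol_deg:
        return False, "adjust direction"
    return True, "capture"
```

The returned hint plays the role of the on-screen instruction: distance corrections first, then direction corrections, then capture.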
When the automatic moving device serves as the movement control unit 21, the automatic moving device may be a mechanical arm or another device capable of moving the image capturing unit 22; it receives the sampling position indicated by the processor 23 and moves the image capturing unit 22 to that position. In another embodiment, the sampling distance can be accommodated by adjusting the focal length of the image capturing unit 22 to ensure that detailed surface information of the reference object is obtained, so the automatic moving device only needs to adjust the sampling direction of the image capturing unit 22 to obtain the required sampled image.
The image capturing unit 22 is used for capturing a plurality of sampled images. When the sampling device 20 is to sample the reference object, the image capturing unit 22 moves to a plurality of different sampling positions, so that it can capture the reference object at each of those positions and obtain a plurality of sampled images. The image capturing unit 22 may be a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, or a camera. In one embodiment, the image capturing unit 22 may include a high-magnification lens to obtain detailed surface information of the reference object. In this embodiment, the image capturing unit 22 may include a microscope lens. In one embodiment, the plurality of sampled images captured by the image capturing unit 22 are all surface micro-images of the reference object. In this embodiment, the surface micro-image is a surface texture image.
The processor 23 is coupled to the memory 24. The memory 24 stores a plurality of instructions, and the processor 23 executes the sampling method of the sampling device 20 according to these instructions. To carry out the sampling method of the present invention, the memory 24 stores a sampling program 240. In this embodiment, the sampling program 240 further includes a positioning module 241 and a sampling module 242. The positioning module 241 enables the processor 23 to assist the movement control unit 21 in moving the image capturing unit 22 to the sampling position. The sampling module 242 enables the processor 23 to assist the image capturing unit 22 in obtaining the desired sampled image at the sampling position.
The transmitting unit 25 may employ custom protocols or conform to existing or de facto standards, including but not limited to Ethernet, the IEEE 802.11 or IEEE 802.15 families, Wireless USB, or telecommunications standards (including but not limited to GSM, CDMA2000, TD-SCDMA, WiMAX, 3GPP-LTE, or TD-LTE), to transmit the sampling feature to devices other than the sampling device 20.
FIG. 3 is a flow chart of a sampling method 300 for sampling a reference object to create identification data. The sampling method 300 shown in fig. 3 is merely an example, as there are a variety of ways to perform it. The sampling method 300 may be performed using the configurations shown in figs. 1A, 1B, and 2; please refer to the elements of those figures while reading the description of the sampling method 300. Each step shown in fig. 3 may represent one or more processes, methods, or subroutines, and the order of the steps may be adjusted without departing from the spirit of the sampling method 300.
In step S310, the sampling device 20 obtains a positioning point on the reference object.
In one embodiment, the sampling device 20 can set the positioning point of the reference object by itself. In one embodiment, when the sampling device 20 receives a request to establish the identification data for the reference object, the sampling device 20 may select the positioning point on the reference object by itself. In one embodiment, the image capturing unit 22 of the sampling device 20 obtains an overall image of the reference object, and the user selects a position on the reference object from the overall image as the positioning point. In another embodiment, the image capturing unit 22 of the sampling device 20 obtains an overall image of the reference object, and a position on the reference object is selected as the positioning point randomly or through a preset selection manner.
In one embodiment, the sampling device 20 may receive the positioning point of the reference object from a device other than the sampling device 20. In this embodiment, the sampling device 20 may further include a receiving unit (not shown). The receiving unit may receive data provided to the sampling device 20 from a device other than the sampling device 20. In one embodiment, the receiving unit may be integrated with the transmitting unit 25 as a communication unit.
In this embodiment, when the sampling device 20 receives the request to establish the identification data for the reference object, the image capturing unit 22 of the sampling device 20 obtains an overall image of the reference object and transmits the overall image to the server 10 through the transmitting unit 25. When the server 10 receives the request and the overall image, the server 10 may select the positioning point from the overall image based on a preset selection manner and transmit the positioning point back to the receiving unit of the sampling device 20. In one embodiment, the server 10 may store the positioning point in the internal storage device 11 of the server 10. In another embodiment, the server 10 may transmit the positioning point to the online storage device 30 for storage.
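One way a "preset selection manner" could be both reproducible and hard to guess from the object alone is to derive the positioning point deterministically from the overall image itself. This is purely a sketch of that idea; the patent leaves the selection method unspecified, and the function name and hash-based scheme below are assumptions:

```python
import hashlib


def select_positioning_point(overall_image, width, height):
    """Map a hash of the overall image to pixel coordinates inside it.

    overall_image: raw image bytes. The same image always yields the same
    positioning point, but the point cannot be predicted without the image.
    """
    digest = hashlib.sha256(overall_image).digest()
    x = int.from_bytes(digest[0:4], "big") % width   # column within the image
    y = int.from_bytes(digest[4:8], "big") % height  # row within the image
    return x, y
```

Because the mapping is deterministic, the server can re-derive the same point later without storing it, yet a counterfeiter without the overall image cannot anticipate where sampling will occur.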
In step S320, the sampling device 20 obtains a plurality of sampled images along a plurality of sampling directions at the positioning point.
In one embodiment, the sampling device 20 can set the sampling directions at the positioning point by itself. In this embodiment, when the sampling device 20 starts to capture sampled images at the positioning point, the sampling device 20 may select the plurality of sampling directions for the positioning point by itself. In one embodiment, the sampling device 20 further includes a positioning unit (not shown), which may include a positioning device such as a gyroscope. In one embodiment, when the image capturing unit 22 captures a sampled image at the positioning point, the sampling device 20 records, through the positioning unit, the sampling direction corresponding to that sampled image. In another embodiment, the sampling device 20 may select the plurality of sampling directions in advance based on a preset orientation manner, and the image capturing unit 22 captures the plurality of sampled images along the plurality of sampling directions via the movement control unit 21.
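When a gyroscope-style positioning unit reports the camera's orientation, the recorded sampling direction can be stored as a unit vector. The yaw/pitch convention below (yaw about the vertical axis, pitch as elevation above the horizontal plane) is an assumption chosen for illustration:

```python
import math


def direction_from_orientation(yaw_deg, pitch_deg):
    """Convert yaw and pitch (in degrees) into a unit direction vector.

    Convention assumed here: yaw rotates about the z axis, pitch is the
    elevation above the x-y plane, so (0, 0) points along +x.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    return (math.cos(pitch) * math.cos(yaw),
            math.cos(pitch) * math.sin(yaw),
            math.sin(pitch))
```

Storing directions as unit vectors makes the later direction-by-direction comparison between sampled and detection images straightforward (e.g., via dot products, as in the sampling-position check sketched earlier in this document's register of examples).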
In one embodiment, the sampling device 20 may receive the plurality of sampling directions from a device other than the sampling device 20. In this embodiment, after the overall image is transmitted to the server 10 by the transmitting unit 25, the server 10 may set the plurality of sampling directions based on a preset orientation manner, and when the positioning point is transmitted back to the receiving unit of the sampling device 20, the set sampling directions are transmitted back together with it. In one embodiment, the server 10 may store the plurality of sampling directions in the internal storage device 11 of the server 10. In another embodiment, the server 10 may send the plurality of sampling directions to the online storage device 30 for storage. In yet another embodiment, the server 10 may neither store the plurality of sampling directions in the internal storage device 11 nor transmit them to the online storage device 30; in other words, the sampling system does not store the plurality of sampling directions.
In one embodiment, the image capturing unit 22 obtains the plurality of sampled images based on the plurality of sampling directions. In this embodiment, the image capturing unit 22 may obtain a plurality of first sampled images based on a first sampling direction of the plurality of sampling directions, and a plurality of second sampled images based on a second sampling direction of the plurality of sampling directions.
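The correspondence between sampling directions and the images captured along each of them can be held in a simple mapping. `SampleSet` below is an illustrative container for that sampling correspondence, not a structure named in the patent:

```python
from collections import defaultdict


class SampleSet:
    """Groups sampled images by the sampling direction they were taken along."""

    def __init__(self, positioning_point):
        self.positioning_point = positioning_point
        self._by_direction = defaultdict(list)

    def add(self, direction, image):
        # E.g., the "first sampled images" all share the first direction.
        self._by_direction[direction].append(image)

    def images_along(self, direction):
        return list(self._by_direction[direction])

    def directions(self):
        return sorted(self._by_direction)
```

Keeping the direction as the key preserves exactly the sampling correspondence that is later transmitted alongside the images in step S330.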
In step S330, the sampling device 20 transmits the sampling feature established from the plurality of sampled images to an online device for storage.
In one embodiment, the online device may be the internal storage device 11 in the server 10 or the online storage device 30. In one embodiment, the sampling device 20 may transmit the sampling feature established from the plurality of sampled images to the server 10 through the transmitting unit 25. In one embodiment, the server 10 may store the sampling feature in the internal storage device 11 of the server 10, or the server 10 may transfer the sampling feature to the online storage device 30 for storage. In another embodiment, the sampling device 20 likewise transmits the sampling feature to the server 10 through the transmitting unit 25, but the server 10 derives a verification feature through a preset image processing manner and stores the verification feature in the internal storage device 11 of the server 10, or transmits the verification feature to the online storage device 30 for storage. In yet another embodiment, the sampling device 20 can transmit the sampling feature established from the plurality of sampled images directly to the online storage device 30 through the transmitting unit 25.
In one embodiment, the sampling feature may be the set of the plurality of sampled images themselves. In other words, the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence between them together constitute the sampling feature. In one embodiment, when the server 10 receives the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence, the server 10 may derive the verification feature according to a preset image processing manner and store the verification feature in the internal storage device 11 or the online storage device 30. In another embodiment, after the server 10 receives the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence, the server 10 directly stores them in the internal storage device 11 or the online storage device 30 and uses them directly as the subsequent verification feature.
In one embodiment, the sampling feature may be the verification feature generated from the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence. In other words, the processor 23 of the sampling device 20 can derive the verification feature through the preset image processing manner. In one embodiment, the sampling device 20 may transmit the verification feature to the server 10 via the transmitting unit 25. In one embodiment, the server 10 may store the verification feature in the internal storage device 11 in the server 10, or the server 10 may transmit the verification feature to the online storage device 30 for storage. In another embodiment, the sampling device 20 can directly transmit the verification feature to the online storage device 30 for storage via the transmitting unit 25.
In one embodiment, if the positioning point is set by the sampling device 20, then when the sampling device 20 transmits the sampling feature to the server 10 or the online storage device 30, the sampling device 20 may also transmit the positioning point to the server 10 or the online storage device 30 for storage.
In an embodiment, the sampling device 20 also transmits the object information of the reference object, and the online storage device 30 or the internal storage device 11 stores the object information. Thus, when an object to be detected is detected later, the server 10 can compare the object information with the information of the object to be detected to determine whether the positioning point and the verification feature of the reference object should be invoked as the authentication data for detecting the object to be detected.
In one embodiment, since the reference object changes slightly over time, if the reference object is sampled again in the same manner after a long period (e.g., 10 or 20 years), the new sampling result will differ from the sampling feature or verification feature stored in the server 10 or the online storage device 30, so the genuine reference object could be mistaken for a counterfeit. Therefore, the sampling device 20 can transmit change information to the server 10 together with the sampling feature. In the embodiment, the change information may be the material information or the object information of the reference object, and the server 10 may look up degradation information of the reference object according to the material information and the object information, so that the possible degradation of the reference object can be taken into account when the sampling feature is later used to detect an object to be detected. In the embodiment, the server 10 may store the degradation information in the internal storage device 11 or the online storage device 30. In another embodiment, the server 10 may directly store the change information in the internal storage device 11 or the online storage device 30, and look up the degradation information according to the material information and the object information when an object to be detected is detected later. In yet another embodiment, the change information may be the degradation information of the reference object itself. In the embodiment, the sampling device 20 can look up the degradation information of the reference object by itself according to the material information or the object information of the reference object, and transmit the degradation information to the server 10 or directly to the online storage device 30.
In the embodiment, when the server 10 receives the degradation information, the server 10 may store the degradation information in the internal storage device 11 or the online storage device 30.
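The use of degradation information described above can be sketched as a tolerance adjustment at detection time. The rate table, function name, and values below are invented for illustration only; the patent does not specify how degradation information is encoded:

```python
# Hypothetical degradation rates (fractional feature drift per year); the real
# degradation information would be looked up from the material/object info.
DEGRADATION_RATES = {"leather": 0.015, "steel": 0.002}

def degradation_allowance(material, age_years):
    # Widen the match tolerance for older reference objects so that natural
    # ageing is not mistaken for a counterfeit, capped at full tolerance.
    rate = DEGRADATION_RATES.get(material, 0.01)
    return min(1.0, rate * age_years)

print(degradation_allowance("leather", 10))  # roughly 0.15 after 10 years
```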
The sampling method 300 of the present invention may include, but is not limited to, all of the following embodiments:
in a first embodiment of the sampling method 300 of the present invention, the set of the plurality of sampled images is stored directly as the verification feature in the internal storage device 11 or the online storage device 30, and the positioning point and the plurality of sampling directions are determined by the sampling device 20. In the embodiment, the sampling device 20 first sets the positioning point and the plurality of sampling directions by itself, obtains the plurality of sampled images based on them, and uses the set of the plurality of sampled images as the sampling feature, which it then transmits to the server 10, or transmits together with the plurality of sampling directions directly to the online storage device 30 for storage. In the embodiment, when the server 10 receives the plurality of sampled images, the positioning point, the plurality of sampling directions, and the sampling correspondence, the server 10 may store them in the internal storage device 11 or the online storage device 30. In the embodiment, the data transmitted by the sampling device 20 may further include the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information, or the degradation information.
In a second embodiment of the sampling method 300 of the present invention, the set of the plurality of sampled images is stored directly as the verification feature in the internal storage device 11 or the online storage device 30; one of the positioning point and the plurality of sampling directions is determined by the sampling device 20, and the other is determined by the server 10. In one embodiment, the plurality of sampling directions are determined by the sampling device 20, and the positioning point is determined by the server 10. Accordingly, when the sampling device 20 receives the request to establish the authentication data for the reference object, the sampling device 20 can capture a whole image of the reference object and transmit it to the server 10, and the server 10 can select the positioning point from the whole image based on a preset selection manner and provide the positioning point to the sampling device 20. Meanwhile, the sampling device 20 may select the plurality of sampling directions based on a preset orientation manner. In another embodiment, the positioning point is determined by the sampling device 20, and the plurality of sampling directions are determined by the server 10. Accordingly, when the sampling device 20 receives the request to establish the authentication data for the reference object, the sampling device 20 can capture the whole image of the reference object and select the positioning point from the whole image based on a preset selection manner. Meanwhile, the sampling device 20 transmits a sampling request to the server 10, so that the server 10 selects the plurality of sampling directions based on a preset orientation manner and provides them to the sampling device 20.
In one embodiment, after obtaining the positioning point and the plurality of sampling directions, the sampling device 20 further obtains the plurality of sampled images, uses the set of the plurality of sampled images as the sampling feature, and transmits it, together with the positioning point or the plurality of sampling directions and the sampling correspondence, to the server 10 or directly to the online storage device 30 for storage. In the embodiment, when the server 10 receives the plurality of sampled images, the server 10 may store the plurality of sampled images, the positioning point, the plurality of sampling directions, and the sampling correspondence in the internal storage device 11 or the online storage device 30. In the embodiment, when the online storage device 30 receives the plurality of sampled images directly from the sampling device 20, the online storage device 30 may obtain from the server 10 whichever of the positioning point and the plurality of sampling directions it did not receive from the sampling device 20. In the embodiment, if the positioning point is determined by the server 10, the sampling device 20 need not additionally transmit the positioning point; if the plurality of sampling directions are determined by the server 10, the sampling device 20 need not additionally transmit the plurality of sampling directions, but only the sampling correspondence. In the embodiment, the data transmitted by the sampling device 20 may further include the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information, or the degradation information.
In a third embodiment of the sampling method 300 of the present invention, the set of the plurality of sampled images is stored directly as the verification feature in the internal storage device 11 or the online storage device 30, and the positioning point and the plurality of sampling directions are determined by the server 10. In the embodiment, when the sampling device 20 receives the request to establish the authentication data for the reference object, the sampling device 20 may capture a whole image of the reference object and transmit it to the server 10, and the server 10 may select the positioning point from the whole image based on a preset selection manner, determine the plurality of sampling directions based on a preset orientation manner, and provide both to the sampling device 20. The sampling device 20 obtains the plurality of sampled images based on the received positioning point and plurality of sampling directions, uses the set of the plurality of sampled images as the sampling feature, and then transmits the sampling feature to the server 10 or directly to the online storage device 30 for storage. In the embodiment, when the server 10 receives the plurality of sampled images and the sampling correspondence, the server 10 may store the plurality of sampled images, the positioning point, the plurality of sampling directions, and the sampling correspondence in the internal storage device 11 or the online storage device 30. In the embodiment, when the online storage device 30 receives the plurality of sampled images and the sampling correspondence directly from the sampling device 20, the online storage device 30 can obtain the positioning point and the plurality of sampling directions from the server 10.
In the embodiment, the data transmitted by the sampling device 20 may further include the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information, or the degradation information.
In a fourth embodiment of the sampling method 300 of the present invention, the verification feature is generated from the plurality of sampled images through the preset image processing manner before being stored in the internal storage device 11 or the online storage device 30, and the positioning point and the plurality of sampling directions are determined by the sampling device 20. In the embodiment, the sampling device 20 first sets the positioning point and the plurality of sampling directions by itself, and obtains the plurality of sampled images based on them. In an embodiment, the sampling device 20 generates the verification feature for the positioning point from the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence through the preset image processing manner, uses the verification feature as the sampling feature, and transmits it together with the positioning point to the server 10 or directly to the online storage device 30 for storage. In the embodiment, when the server 10 receives the verification feature and the positioning point, the server 10 may store them in the internal storage device 11 or the online storage device 30. In another embodiment, the sampling device 20 directly transmits the positioning point, the plurality of sampling directions, the plurality of sampled images, and the sampling correspondence to the server 10, and the server 10 generates the verification feature for the positioning point through the preset image processing manner and stores the verification feature together with the positioning point in the internal storage device 11 or the online storage device 30.
In the embodiment, the data transmitted by the sampling device 20 may further include the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information, or the degradation information.
In a fifth embodiment of the sampling method 300 of the present invention, the verification feature is generated from the plurality of sampled images through the preset image processing manner before being stored in the internal storage device 11 or the online storage device 30; one of the positioning point and the plurality of sampling directions is determined by the sampling device 20, and the other is determined by the server 10. According to the fifth embodiment, the sampling device 20 and the server 10 respectively generate the positioning point according to the preset selection manner and the whole image of the reference object, or the plurality of sampling directions according to the preset orientation manner, and the plurality of sampled images are acquired accordingly. In an embodiment, the sampling device 20 generates the verification feature for the positioning point from the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence through the preset image processing manner, and transmits the verification feature as the sampling feature to the server 10 or directly to the online storage device 30 for storage. In the embodiment, if the positioning point is determined by the server 10, the sampling device 20 need not additionally transmit the positioning point; if the positioning point is determined by the sampling device 20, the sampling device 20 additionally transmits the positioning point to the server 10 or directly to the online storage device 30 for storage. In the embodiment, when the server 10 receives the verification feature, the server 10 may store the verification feature and the positioning point in the internal storage device 11 or the online storage device 30.
In another embodiment, the sampling device 20 directly transmits the plurality of sampled images and the sampling correspondence to the server 10, and the server 10 generates the verification feature for the positioning point from the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence through the preset image processing manner and stores the verification feature together with the positioning point in the internal storage device 11 or the online storage device 30. In the embodiment, if the plurality of sampling directions are determined by the sampling device 20, the sampling device 20 additionally transmits the plurality of sampling directions to the server 10; if the positioning point is determined by the sampling device 20, the sampling device 20 additionally transmits the positioning point to the server 10 or directly to the online storage device 30 for storage. In the embodiment, the data transmitted by the sampling device 20 may further include the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information, or the degradation information.
In a sixth embodiment of the sampling method 300 of the present invention, the verification feature is generated from the plurality of sampled images through the preset image processing manner before being stored in the internal storage device 11 or the online storage device 30, and the positioning point and the plurality of sampling directions are determined by the server 10. In the same manner as the aforementioned third embodiment, the server 10 generates the positioning point according to a preset selection manner and the whole image of the reference object received from the sampling device 20, generates the plurality of sampling directions according to a preset orientation manner, and provides the positioning point and the plurality of sampling directions to the sampling device 20. In an embodiment, the sampling device 20 generates the verification feature for the positioning point from the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence through the preset image processing manner, and transmits the verification feature as the sampling feature to the server 10 or directly to the online storage device 30 for storage. In the embodiment, since the positioning point is determined by the server 10, the sampling device 20 need not additionally transmit the positioning point. In the embodiment, when the server 10 receives the verification feature, the server 10 may store the verification feature and the positioning point in the internal storage device 11 or the online storage device 30.
In another embodiment, the sampling device 20 directly transmits the plurality of sampled images and the sampling correspondence to the server 10, and the server 10 generates the verification feature for the positioning point from the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence through the preset image processing manner and stores the verification feature together with the positioning point in the internal storage device 11 or the online storage device 30. In the embodiment, the data transmitted by the sampling device 20 may further include the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information, or the degradation information.
In the embodiment, the preset selection manner may be a random selection mode, so that the sampling device 20 or the server 10 can randomly select the positioning point from the whole image of the reference object. In the embodiment, the preset selection manner may also be a surface analysis technique by which the sampling device 20 or the server 10 selects the positioning point from the whole image of the reference object. The surface analysis technique can be an image analysis technique such as image complexity analysis, which takes a high-complexity region in the whole image as the positioning point. The preset selection manner includes, but is not limited to, the manners described above; any manner of selecting a particular location on the reference object may be used with the sampling method 300 described herein.
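A minimal sketch of the image-complexity variant of the preset selection manner, assuming local pixel variance as the complexity measure; the function name, the patch-based scan, and the variance criterion are hypothetical stand-ins for whatever surface analysis technique is actually used:

```python
from statistics import pvariance

def select_positioning_point(image, patch=16):
    # Scan the whole image in non-overlapping patches and return the centre
    # of the patch with the highest pixel variance (a simple stand-in for
    # "image complexity"); that centre serves as the positioning point.
    h, w = len(image), len(image[0])
    best, best_score = (0, 0), -1.0
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            vals = [image[y + i][x + j] for i in range(patch) for j in range(patch)]
            score = pvariance(vals)
            if score > best_score:
                best, best_score = (y + patch // 2, x + patch // 2), score
    return best

# Flat 64x64 image with one textured (high-variance) patch at (48, 48)
img = [[0.0] * 64 for _ in range(64)]
for i in range(16):
    for j in range(16):
        img[48 + i][48 + j] = float(i * 16 + j)
print(select_positioning_point(img))  # -> (56, 56), centre of the textured patch
```

The random selection mode would simply return uniformly random coordinates instead.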
In the embodiment, the preset orientation manner may be a random orientation mode, so that the sampling device 20 or the server 10 can select the plurality of sampling directions arbitrarily. In the embodiment, if the sampling device 20 or the server 10 has obtained the positioning point in the whole image in advance, the plurality of sampling directions can be selected by a surface analysis technique according to the positioning point. The surface analysis technique can assign more sampling directions to the portions around the positioning point with a higher degree of color change, according to the gradient of the pixel-value variation, so as to obtain a more complete sampling feature around the positioning point. The preset orientation manner includes, but is not limited to, the manners described above; any manner of selecting a plurality of different sampling directions may be used with the sampling method 300 described herein.
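A minimal sketch of the gradient-weighted orientation manner, assuming the surroundings of the positioning point are divided into angular sectors whose gradient magnitudes are already known; the function name and the proportional allocation scheme are hypothetical illustrations:

```python
import math

def allocate_sampling_directions(gradients, total=8):
    # Split the surroundings of the positioning point into equal angular
    # sectors and assign more of the `total` sampling directions to sectors
    # with larger pixel-value gradients (higher colour change).
    s = sum(gradients)
    counts = [max(1, round(total * g / s)) for g in gradients]
    n = len(gradients)
    width = 2 * math.pi / n
    directions = []
    for sector, count in enumerate(counts):
        base = sector * width  # sector start angle
        for k in range(count):
            directions.append(base + (k + 0.5) * width / count)
    return directions

# Four sectors; the second sector has the strongest colour change
dirs = allocate_sampling_directions([1.0, 5.0, 1.0, 1.0], total=8)
print(len(dirs))  # -> 8, five of them inside the high-gradient sector
```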
In the embodiment, the preset image processing manner may be a three-dimensional modeling process. In the embodiment, the sampling device 20 or the server 10 may generate a sampling model for the positioning point by a machine learning technique according to the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence between them. The sampling model may be a three-dimensional sampling model.
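The machine-learned sampling model itself is not specified in this excerpt. As a drastically simplified, hypothetical stand-in, a model built over the sampling correspondence could answer a query direction with the image sampled from the closest known sampling direction:

```python
def nearest_direction_model(correspondence):
    # Simplified stand-in for the machine-learned 3-D sampling model: given
    # the sampling correspondence (direction vector -> image), return a
    # predictor that answers a query direction with the image captured from
    # the closest known sampling direction (by cosine similarity).
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def cosine(a, b):
        return dot(a, b) / ((dot(a, a) ** 0.5) * (dot(b, b) ** 0.5))

    def predict(query):
        best = max(correspondence, key=lambda d: cosine(d, query))
        return correspondence[best]

    return predict

model = nearest_direction_model({(0, 0, 1): "top_view", (1, 0, 0): "side_view"})
print(model((0.1, 0, 0.9)))  # -> top_view (closest to the top direction)
```

A real three-dimensional sampling model would interpolate appearance between directions rather than look up the nearest one.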
In the embodiment, the sampling device 20 may select a plurality of positioning points on the reference object and obtain a plurality of sampled images in a plurality of sampling directions for each positioning point, so as to obtain a different sampling feature for each positioning point of the reference object. In one embodiment, the sampling directions of the positioning points may be entirely different from one another. In another embodiment, the same sampling combination, which contains a plurality of sampling directions, may be used for every positioning point, so that the sampling directions are identical across positioning points. In yet another embodiment, the sampling directions of the various positioning points may be partially identical and partially different.
Referring to fig. 4A, fig. 4A is a block diagram of a detection system 401 for detecting an object to be detected according to the present invention. In one embodiment, the detection system 401 includes the server 10 and the detection device 40. The server 10 may include the internal storage device 11, which stores in advance the sampling feature obtained by the sampling device 20 of fig. 2. In one embodiment, when the detection device 40 receives a detection request, the detection device 40 can be coupled to the server 10 to perform the detection method of the present invention. When the detection system 401 completes the detection method, the detection device 40 may be decoupled from the server 10.
Referring to fig. 4B, fig. 4B is a block diagram of another detection system 402 for detecting an object to be detected according to the present invention. In one embodiment, the detection system 402 includes the server 10, the detection device 40, and the online storage device 30. In one embodiment, when the detection device 40 receives a detection request, the detection device 40 may be coupled to the server 10, and the server 10 may be further coupled to the online storage device 30 to perform the detection method of the present invention. When the detection system 402 completes the detection method, the detection device 40 may be decoupled from the server 10, and the server 10 may likewise be decoupled from the online storage device 30. In another embodiment, when the detection device 40 receives a detection request, the detection device 40 can be coupled to both the server 10 and the online storage device 30 to perform the detection method of the present invention. After the detection system 402 completes the detection method, the detection device 40 may be decoupled from the server 10 and the online storage device 30.
In the illustrated embodiment, the online storage device 30 may be a network data storage or a blockchain storage device. When the online storage device 30 is a blockchain storage device, the tamper-resistant characteristics of the blockchain reduce the possibility that the previously stored sampling feature is altered or replaced. In the embodiment, the online storage device 30 can store the records of all transactions, the times and results of each verification, and updated pictures of the reference object.
Referring to fig. 5, fig. 5 is a block diagram of a detection device 40 for detecting an object to be detected according to the present invention. The detection device 40 may be a mobile phone, a tablet computer, a desktop computer, a notebook computer, a camera, a video recorder, or another electronic device, without limitation. The detection device 40 includes a movement control unit 41, an image capturing unit 42, a processor 43, a storage 44, a transmitting unit 45, and a receiving unit 46.
The movement control unit 41 is used to enable the detection device 40 to accomplish the movements required by the detection process. In one embodiment, the movement control unit 41 is configured to enable the image capturing unit 42 to complete the movements required during the detection of the object to be detected. In one embodiment, the movement control unit 41 may be a display screen on the detection device 40 or an automatic moving device coupled to the image capturing unit 42.
When the display screen serves as the movement control unit 41, the display screen may provide movement instructions to the user to guide the user in moving the image capturing unit 42 to the detection position. In the embodiment, the detection position may be determined by a detection distance and a detection direction. For example, when the image capturing distance between the image capturing unit 42 and a positioning point on the object to be detected equals the detection distance, and the image capturing direction from the image capturing unit 42 to the positioning point equals the detection direction, the display screen indicates that the image capturing unit 42 has reached the detection position, so that capture of the required detection image can begin. When the image capturing distance does not equal the detection distance, the display screen can instruct the user to move the image capturing unit 42 closer to or farther from the positioning point. When the image capturing direction does not equal the detection direction, the display screen can instruct the user to shift the image capturing unit 42 left, right, up, or down until the image capturing direction matches the detection direction. In another embodiment, the detection position may be determined by the detection direction alone. In the embodiment, the detection distance can instead be accommodated by adjusting the focal length of the image capturing unit 42 to ensure that the detailed surface information of the object to be detected is obtained.
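The display-screen guidance described above can be sketched as a comparison of the current capture state against the target detection position. The function name, direction encoding, tolerance, and instruction strings below are hypothetical, not from the patent:

```python
def movement_instruction(capture_distance, capture_direction,
                         detection_distance, detection_direction,
                         tolerance=1e-3):
    # Compare the image capturing distance/direction with the detection
    # position and return the next instruction to show on the display screen.
    if abs(capture_distance - detection_distance) > tolerance:
        return ("move closer" if capture_distance > detection_distance
                else "move farther")
    dx = detection_direction[0] - capture_direction[0]
    dy = detection_direction[1] - capture_direction[1]
    if abs(dx) > tolerance:
        return "pan right" if dx > 0 else "pan left"
    if abs(dy) > tolerance:
        return "tilt up" if dy > 0 else "tilt down"
    return "at detection position"  # capture of the detection image can begin

print(movement_instruction(12.0, (0.0, 0.0), 10.0, (0.0, 0.0)))  # -> move closer
```

In the focal-length variant, the distance branch would be replaced by a zoom adjustment and only the direction branches would remain.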
When the automatic moving device serves as the movement control unit 41, it may be a mechanical arm or another device capable of moving the image capturing unit 42, and it may receive the detection position indicated by the processor 43 and move the image capturing unit 42 to that position. In another embodiment, the detection distance can be accommodated by adjusting the focal length of the image capturing unit 42 to ensure that the detailed surface information of the object to be detected is obtained; in that case, the automatic moving device need only adjust the detection direction of the image capturing unit 42 to obtain the required detection images.
The image capturing unit 42 is used for capturing a plurality of detection images. When the detection device 40 is to detect the object to be detected, the image capturing unit 42 can move to a plurality of different detection positions and capture images of the object to be detected at each of them, thereby obtaining the plurality of detection images. The image capturing unit 42 may be a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, or a camera. In one embodiment, the image capturing unit 42 may include a high-magnification lens to obtain the detailed surface information of the object to be detected. In the embodiment, the image capturing unit 42 may include a microscope lens. In one embodiment, the plurality of detection images captured by the image capturing unit 42 are all surface micro-images of the object to be detected. In the embodiment, the surface micro-images are surface texture images.
The processor 43 is coupled to the storage 44. The storage 44 stores a plurality of instructions, and the processor 43 executes the detection method of the detection device 40 according to those instructions. To carry out the detection method of the present invention, the storage 44 stores a detection program 440. In the embodiment, the detection program 440 further includes a positioning module 441 and a detection module 442. The positioning module 441 enables the processor 43 to assist the movement control unit 41 in moving the image capturing unit 42 to the detection position. The detection module 442 enables the processor 43 to assist the image capturing unit 42 in obtaining the desired detection image at the detection position.
The transmitting unit 45 and the receiving unit 46 may use custom protocols or conform to existing or de facto standards, including but not limited to Ethernet, the IEEE 802.11 or IEEE 802.15 series, Wireless USB, or telecommunications standards (including but not limited to GSM, CDMA2000, TD-SCDMA, WiMAX, 3GPP-LTE, or TD-LTE), so as to transmit the detection feature to devices other than the detection device 40 and to receive the sampling feature transmitted from devices other than the detection device 40.
Fig. 6 is a flow chart of a detection method 600 for detecting an object to be detected. The detection method 600 shown in fig. 6 is merely an example, as there are many ways to carry out the detection. The detection method 600 may be performed using the configurations shown in figs. 4A, 4B, and 5; please refer to the elements of figs. 4A, 4B, and 5 while reading the description of the detection method 600. Each step shown in fig. 6 may represent one or more processes, methods, or subroutines, and the order of the steps may be adjusted without departing from the spirit and scope of the detection method 600.
In step S610, the detection device 40 transmits a request for detecting the object to be detected.
In an embodiment, before capturing the detection images, the detection device 40 needs to know which area of the object to be detected should be examined so that it can be compared with the verification feature stored in the online storage device 30 or the internal storage device 11. The detection device 40 therefore transmits a request for the positioning point of the object to be detected in order to obtain it. In the embodiment, the detection device 40 may transmit the request to the server 10, and the request may include the object information of the object to be detected, so that the server 10 can determine from the object information which reference object's corresponding positioning point and verification feature should be invoked as the authentication data required for the detection. In the embodiment, the corresponding positioning point of the reference object serves as the positioning point of the object to be detected.
In one embodiment, when the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected are stored in the internal storage device 11, the server 10 transmits the corresponding positioning point of the reference object to the detection device 40. In another embodiment, when they are stored in the online storage device 30, the server 10 first transmits a request for the positioning point of the object to be detected to the online storage device 30, so that the online storage device 30 can determine which reference object's positioning point and verification feature to retrieve as the authentication data required for the detection; the online storage device 30 then returns the positioning point and the verification feature to the server 10, and the server 10 transmits the positioning point to the detection device 40. In the embodiment, the online storage device 30 also transmits the change information of the reference object to the server 10.
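The server-side lookup in this step can be sketched as a dictionary search keyed by object information, checking the internal storage device before falling back to the online storage device. The function name and record layout are hypothetical illustrations:

```python
def fetch_identification_data(object_info, internal_store, online_store):
    # Use the object information from the detection request to decide which
    # reference object's positioning point and verification feature to
    # retrieve; try the internal storage device first, then the online one.
    for store in (internal_store, online_store):
        if object_info in store:
            record = store[object_info]
            return record["positioning_point"], record["verification_feature"]
    return None  # no matching reference object registered

internal = {}
online = {"bag-001": {"positioning_point": (56, 56),
                      "verification_feature": "feature-blob"}}
print(fetch_identification_data("bag-001", internal, online))
```

Only the positioning point would then be forwarded to the detection device when the comparison is performed server-side.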
In step S620, the detection device 40 receives information containing the positioning point of the object to be detected.
In one embodiment, the server 10 transmits the positioning point of the object to be detected to the detection device 40 for subsequent image capturing. In one embodiment, if the verification feature obtained by the server 10 is a plurality of sampling images of the corresponding positioning point of the reference object, the information transmitted by the server 10 also includes the respective sampling directions of the plurality of sampling images, so that the detection device 40 can capture the detection images along the same directions.
In one embodiment, if the detection device 40 itself performs the comparison between the plurality of detection images and the verification feature, the server 10 may transmit the verification feature of the reference object to the detection device 40. In the embodiment, the server 10 transmits the change information of the reference object to the detection device 40. In another embodiment, if the comparison is performed by the server 10, the server 10 need not transmit the verification feature of the reference object to the detection device 40; the verification feature may instead remain in the server 10 for the subsequent comparison.
In step S630, the detection device 40 acquires a plurality of detection images along a plurality of detection directions at the positioning point.
In one embodiment, the detection device 40 can set the detection directions of the positioning point by itself. In the embodiment, when the detection device 40 starts to capture detection images for the positioning point, the detection device 40 may select the plurality of detection directions for the positioning point on its own. In an embodiment, the detection device 40 further includes a positioning unit (not shown), which may include a positioning device such as a gyroscope. In one embodiment, when the image capturing unit 42 captures a detection image of the positioning point, the detection device 40 may use the positioning unit to record the detection direction corresponding to that detection image. In another embodiment, the detection device 40 may select the plurality of detection directions in advance based on a predetermined orientation mode, and the moving control unit 41 causes the image capturing unit 42 to capture the plurality of detection images along the plurality of detection directions.
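The pairing of each captured image with the direction reported by the positioning unit can be sketched as follows. This is a minimal illustration only; the `DetectionCapture` structure, the `record_capture` helper, and the (azimuth, elevation) representation of a detection direction are assumptions, not part of the disclosed device.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DetectionCapture:
    """One detection image tagged with the direction it was captured along."""
    image_id: str          # identifier of the captured detection image
    azimuth_deg: float     # horizontal angle from the positioning unit (e.g. gyroscope)
    elevation_deg: float   # vertical angle from the positioning unit

def record_capture(image_id, orientation):
    """Pair a newly captured image with the orientation read at capture time."""
    azimuth, elevation = orientation
    return DetectionCapture(image_id, azimuth, elevation)

capture = record_capture("img_001", (45.0, 10.0))
```

Recording the direction at capture time is what later allows the image to be matched against a sampling image taken along the same direction.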
In one embodiment, the detection device 40 may receive the plurality of detection directions from a device other than the detection device 40. In the embodiment, the detecting device 40 may obtain the plurality of detecting directions through the receiving unit 46. In an embodiment, if the internal storage device 11 and the online storage device 30 do not store the plurality of sampling directions of the corresponding positioning points of the reference object, the server 10 may select the plurality of detection directions in advance based on a predetermined orientation mode and transmit the plurality of detection directions to the receiving unit 46. In another embodiment, if the internal storage device 11 or the online storage device 30 stores the plurality of sampling directions of the corresponding positioning points of the reference object, the plurality of sampling directions can be re-sent to the detection device 40 as the plurality of detection directions of the positioning points of the object to be detected. In one embodiment, if the plurality of sampling directions are stored in the internal storage device 11, the server 10 may transmit the plurality of sampling directions to the receiving unit 46 as the plurality of detection directions. In another embodiment, if the multiple sampling directions are stored in the online storage device 30, the online storage device 30 may directly transmit the multiple sampling directions to the receiving unit 46 as the multiple detection directions, or indirectly transmit the multiple sampling directions to the receiving unit 46 through the server 10 as the multiple detection directions.
In one embodiment, the image capturing unit 42 obtains the plurality of detection images based on the plurality of detection directions. In the embodiment, the image capturing unit 42 may obtain a plurality of first detection images based on a first detection direction of the plurality of detection directions, and may obtain a plurality of second detection images based on a second detection direction of the plurality of detection directions.
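Capturing several detection images per detection direction, as with the first and second detection directions above, can be sketched as below; `capture_fn` is a hypothetical stand-in for the image capturing unit 42 and returns a placeholder string rather than real image data.

```python
def capture_images(directions, shots_per_direction, capture_fn):
    """Capture `shots_per_direction` detection images along each direction.

    `capture_fn` stands in for the image capturing unit: it takes a
    direction and returns one image (here just a placeholder string).
    """
    return {d: [capture_fn(d) for _ in range(shots_per_direction)]
            for d in directions}

# Two detection images per direction, for a first and a second direction.
images = capture_images(("first", "second"), 2, lambda d: f"frame@{d}")
```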
In step S640, the detection device 40 obtains a detection result according to the plurality of detection images.
In an embodiment, the detection result is generated based on the comparison of the plurality of detection images with the reference object corresponding to the object to be detected. In one embodiment, the detection device 40 can itself compare the plurality of detection images with the verification feature of the reference object to obtain the detection result. In another embodiment, the detection device 40 may generate a detection feature according to the plurality of detection images and transmit the detection feature to the server 10 for the server 10 to compare with the verification feature; finally, the server 10 transmits the detection result to the detection device 40.
In one embodiment, when the detection result is generated by the detection device 40, the detection device 40 compares the plurality of detection images with the received verification feature. In one embodiment, the verification feature may be a sampling model of the corresponding positioning point of the reference object. The sampling model may be a three-dimensional sampling model. In one embodiment, the detection device 40 may generate the detection feature based on the plurality of detection images, the plurality of detection directions, and a detection correspondence between the plurality of detection images and the plurality of detection directions. In other words, the processor 43 of the detection device 40 can obtain the detection feature based on a preset image processing mode. In the embodiment, the detection feature is a detection model of the positioning point of the object to be detected. The detection model may be a three-dimensional detection model. Therefore, the detection device 40 can directly compare the similarity between the three-dimensional sampling model and the three-dimensional detection model to generate the detection result. In another embodiment, the detection device 40 can compare against the three-dimensional sampling model based on the plurality of detection images, the plurality of detection directions and the detection correspondence. In the embodiment, the detection device 40 can estimate a plurality of verification images through the three-dimensional sampling model according to the plurality of detection directions, and then compare the similarities of the plurality of verification images and the plurality of detection images one by one according to the detection correspondence, so as to generate the detection result. In an embodiment, the verification feature may be the plurality of sampling images, the plurality of sampling directions and the sampling correspondence of the corresponding positioning point of the reference object.
In the embodiment, the plurality of sampling directions may be identical to the plurality of detection directions, so the detection device 40 may compare the similarities of the plurality of sampling images and the plurality of detection images one by one according to the sampling correspondence and the detection correspondence, so as to generate the detection result.
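The one-by-one comparison through the sampling correspondence and the detection correspondence might look like the sketch below. The dictionary representation of the correspondences, the `similarity` metric, and the threshold value are assumptions for illustration; a real implementation would use an image similarity measure rather than string equality.

```python
def compare_one_by_one(sampling_images, detection_images,
                       sampling_corr, detection_corr,
                       similarity, threshold=0.8):
    """Pair up the images captured along the same direction and require
    every pair to clear the similarity threshold.

    sampling_corr / detection_corr map a direction to the index of the
    image taken along it; `similarity` returns a score in [0, 1].
    """
    for direction, s_idx in sampling_corr.items():
        d_idx = detection_corr[direction]
        if similarity(sampling_images[s_idx], detection_images[d_idx]) < threshold:
            return False  # low similarity: object differs from the reference
    return True

# Toy metric: identical placeholder images score 1.0, otherwise 0.0.
def same(a, b):
    return 1.0 if a == b else 0.0

result = compare_one_by_one(["front", "left"], ["front", "left"],
                            {"d1": 0, "d2": 1}, {"d1": 0, "d2": 1}, same)
```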
In one embodiment, when the detection result is generated by the server 10, the detection device 40 transmits the detection feature to the server 10. In one embodiment, the transmitted detection feature may be the three-dimensional detection model or the combination of the plurality of detection images, the plurality of detection directions, and the detection correspondence. In an embodiment, the detection device 40 may generate the three-dimensional detection model based on the plurality of detection images, the plurality of detection directions and the detection correspondence. In other words, the processor 43 of the detection device 40 can obtain the three-dimensional detection model based on the preset image processing mode. The detection device 40 can then transmit the three-dimensional detection model directly to the server 10, and the server 10 compares the similarity between the three-dimensional sampling model and the three-dimensional detection model to provide the detection result to the detection device 40.
In another embodiment, the detection device 40 can transmit the combination of the plurality of detection images, the plurality of detection directions and the detection correspondence to the server 10. In one embodiment, the server 10 may generate the three-dimensional detection model based on the plurality of detection images, the plurality of detection directions and the detection correspondence. In other words, the server 10 can obtain the three-dimensional detection model based on a preset image processing mode. In the embodiment, the server 10 may directly compare the similarity between the three-dimensional sampling model and the three-dimensional detection model to provide the detection result to the detection device 40. In another embodiment, the server 10 may compare against the three-dimensional sampling model based on the plurality of detection images, the plurality of detection directions and the detection correspondence. In the embodiment, the server 10 can estimate a plurality of verification images through the three-dimensional sampling model according to the plurality of detection directions, and then compare the similarities between the plurality of verification images and the plurality of detection images one by one according to the detection correspondence, so as to provide the detection result to the detection device 40. In yet another embodiment, the verification feature may be the plurality of sampling images, the plurality of sampling directions, and the sampling correspondence of the corresponding positioning point of the reference object. In the embodiment, the plurality of sampling directions may be identical to the plurality of detection directions, so the server 10 may compare the similarities of the plurality of sampling images and the plurality of detection images one by one according to the sampling correspondence and the detection correspondence, so as to provide the detection result to the detection device 40.
In the embodiment, the plurality of detection directions are the plurality of sampling directions obtained from the online storage device 30 or the internal storage device 11.
In one embodiment, if the detection result shows that the similarity between the reference object and the object to be detected is high, the detection result obtained by the detection device 40 can determine that the object to be detected is the reference object. If the detection result shows that the similarity between the reference object and the object to be detected is low, the detection result obtained by the detection device 40 can determine that the object to be detected is different from the reference object. In another embodiment, if the detection result shows that the similarity between the reference object and the object to be detected is low, so that the detection result determines that the object to be detected is different from the reference object, the detection device 40 or the server 10 can update the detection result according to the change information of the reference object. In the embodiment, the detection device 40 or the server 10 can adjust the verification feature according to the change information of the reference object to obtain an adjustment feature, and update the detection result according to the comparison of the adjustment feature with the plurality of detection images.
In one embodiment, the change information may be the material information, the object information, or the degradation information of the reference object. If the change information is the material information or the object information, the detection device 40 or the server 10 can search the degradation information of the reference object through a network or an internal database according to the material information or the object information. The detection device 40 or the server 10 can estimate the possible degradation degree of the reference object by the degradation information and the time difference between the detection time point and the sampling time point. Therefore, the detection device 40 or the server 10 can obtain the adjustment feature according to the verification feature and the degradation degree.
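Estimating the possible degradation degree from the degradation information and the time difference between the sampling time point and the detection time point could be sketched as below. The linear per-year rate and the day-based timestamps are assumptions for illustration; the patent does not specify a particular degradation model.

```python
def estimate_degradation(rate_per_year, sampling_day, detection_day):
    """Estimate the expected degradation degree of the reference object.

    `rate_per_year` is a hypothetical rate derived from the degradation
    information; the result is clamped to [0, 1] (1.0 = fully degraded).
    Timestamps are given in days for simplicity.
    """
    years = (detection_day - sampling_day) / 365.0
    return min(1.0, max(0.0, rate_per_year * years))

degree = estimate_degradation(0.1, 0, 730)  # two years at 10 %/year
```

The adjustment feature would then be obtained by applying this estimated degree (e.g., a fading or crack-growth amount) to the stored verification feature.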
In one embodiment, the degradation information may be texture degradation information, which may be related to a color change (e.g., fading). If the degradation information is texture degradation information, the actual texture degradation degree can be estimated from the texture degradation information and the time difference; for example, the degree of possible discoloration may be estimated. In another embodiment, the degradation information may be material condition information, which may be related to material decomposition or breakage. If the degradation information is material condition information, the actual material damage degree can be estimated from the material condition information and the time difference; for example, the possible crack size or number of cracks can be estimated.
In one embodiment, if the updated detection result shows that the similarity between the reference object and the object to be detected is high, the detection result obtained by the detection device 40 can determine that the object to be detected is the reference object. In the embodiment, the detection device 40 or the server 10 may store the detection feature generated from the plurality of detection images in the internal storage device 11 or the online storage device 30. In one embodiment, the detection feature may directly replace the verification feature as the basis for subsequent detection. If the change information is relied on over a long period to predict the change of the reference object, and the actual change condition of the reference object diverges too far from the predicted change condition, a future object to be detected will be difficult to identify accurately. Therefore, if the updated detection result shows that the similarity between the reference object and the object to be detected is high, the detection feature can be taken to represent the current actual change condition of the reference object. The detection device 40 or the server 10 can transmit and store the detection feature; in this way, the actual change condition of the reference object can be appropriately tracked, preventing the actual change condition from drifting beyond the predicted change condition, and maintaining the correctness of the verification feature or the detection feature for subsequent verification.
In another embodiment, if the updated detection result shows that the similarity between the reference object and the object to be detected is high, the detection device 40 or the server 10 may store the detection feature generated from the plurality of detection images together with the verification feature. In other words, the online storage device 30 or the internal storage device 11 will hold the characteristic information of the reference object at two different times. Therefore, if a new object to be detected is compared with the detection feature and the verification feature after a period of time (for example, after 1 year), it is possible to check not only whether the new object to be detected is similar to the detection feature and the verification feature, but also whether the degradation degree of the new object to be detected relative to the verification feature is greater than the degradation degree of the detection feature. If the degradation degree of the new object to be detected is smaller than the degradation degree of the detection feature, the detection system can still issue a warning for the new object to be detected, even if the new object to be detected is similar to the detection feature and the verification feature. In this way, the irreversibility of the degradation degree can further improve the accuracy of detection.
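The irreversibility check described above, where a new object that appears less degraded than the stored detection feature is flagged even when its images match, could be sketched as follows; the three outcome labels and the scalar degradation degrees are illustrative assumptions.

```python
def check_with_degradation(similar_to_features, new_degree, stored_degree):
    """Accept only when the images match AND the degradation degree is
    plausible: degradation cannot decrease over time, so a new object
    that is less degraded than the stored detection feature is flagged.

    Returns one of "accept", "warn", "reject" (labels are illustrative).
    """
    if not similar_to_features:
        return "reject"          # images do not match the stored features
    if new_degree < stored_degree:
        return "warn"            # similar, but implausibly fresh
    return "accept"
```

For example, an object whose images match but whose estimated degradation degree is below that of the last accepted detection feature would be warned about rather than accepted.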
The detection method 600 of the present invention may include, but is not limited to, all of the following embodiments:
In a first embodiment of the detection method 600 of the present invention, the comparison of the plurality of detection images and the verification feature is performed by the detection device 40, and the verification feature includes the plurality of sampling images, the plurality of sampling directions, and the sampling correspondence between the plurality of sampling images and the plurality of sampling directions of the corresponding positioning points of the reference object corresponding to the object to be detected. In one embodiment, the detection device 40 transmits a request to detect the object. After receiving the request, the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30, and transmits the plurality of sampling directions in the corresponding positioning point and the verification feature to the detection device 40. In one embodiment, the server 10 may simultaneously transmit the plurality of sampled images and the sampling correspondence to the detecting device 40.
In an embodiment, after the detection device 40 receives the corresponding positioning point and the plurality of sampling directions, it directly sets them as the positioning point and the plurality of detection directions of the object to be detected, respectively, and obtains the plurality of detection images accordingly. In one embodiment, if the detection device 40 has already acquired the plurality of sampling images before acquiring the plurality of detection images, the comparison may start directly. In another embodiment, if the detection device 40 has not acquired the plurality of sampling images before acquiring the plurality of detection images, the detection device 40 may send a second request to the server 10 for comparison with the reference object, and after receiving that request the server 10 provides the plurality of sampling images and the sampling correspondence to the detection device 40. By requiring this second transmission, the server 10 prevents a user from obtaining all of the verification information merely by casually issuing the first request. In the embodiment, the second request sent by the detection device 40 may include the plurality of detection images for the server 10 to perform a preliminary verification; if the server 10 considers that the plurality of detection images are obviously completely unrelated to the sampling images, the server 10 may refuse to provide the plurality of sampling images. If the server 10 considers that there is at least a slight correlation between the plurality of detection images and the plurality of sampling images, the server 10 may transmit the plurality of sampling images for comparison by the detection device 40.
In an embodiment, the detection device 40 compares the plurality of sampling images with the plurality of detection images through the sampling correspondence and the detection correspondence; if the similarity between the plurality of sampling images and the plurality of detection images is high, the detection device 40 can determine that the object to be detected is the reference object and transmit the detection result to the server 10. If the similarity between the plurality of sampling images and the plurality of detection images is low, the detection device 40 may determine that the object to be detected is not the reference object, and transmit the detection result to the server 10. In another embodiment, if the similarity between the plurality of sampling images and the plurality of detection images is low, the detection device 40 may transmit a request for the change information of the reference object to the server 10. In the embodiment, upon this request for the change information, the server 10 also performs another preliminary verification (e.g., requiring the complete plurality of detection images) to confirm whether the detection device 40 genuinely needs the change information. In an embodiment, the detection device 40 adjusts the plurality of sampling images according to the change information to generate a plurality of adjustment images as a part of the adjustment feature, and then compares the plurality of adjustment images with the plurality of detection images according to the sampling correspondence and the detection correspondence. If the similarity between the plurality of adjustment images and the plurality of detection images is low, the detection device 40 can determine that the object to be detected is not the reference object, and transmit the detection result to the server 10.
If the similarity between the plurality of adjustment images and the plurality of detection images is high, the detection device 40 can update the original detection result, now identifying the object to be detected as the reference object, and transmit the detection result to the server 10. In the embodiment, the detection device 40 may upload the plurality of detection images to the server 10, so that the server 10 stores the plurality of detection images in the internal storage device 11 or the online storage device 30. In one embodiment, the detection images and the sampling images may both be stored in the internal storage device 11 or the online storage device 30 in association with the reference object, so as to jointly serve as the verification feature of the reference object. In another embodiment, the plurality of detection images may directly replace the plurality of sampling images as the verification feature of the reference object. In one embodiment, the server 10 may, upon receiving the plurality of detection images, confirm them once more by checking the similarity between the plurality of detection images and the plurality of adjustment images, so as to avoid storing a wrong verification feature.
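Storing the accepted detection images alongside the existing sampling images, or letting them replace the sampling images outright, can be sketched with a simple in-memory store; the dictionary layout, `reference_id`, and the `keep_history` flag are assumptions for illustration.

```python
def update_verification_feature(store, reference_id, detection_feature,
                                keep_history=True):
    """Record a newly accepted detection feature for a reference object.

    With keep_history=True the earlier verification feature is kept
    alongside the new one (two points in time, enabling the degradation
    irreversibility check); otherwise the new feature replaces it.
    """
    if keep_history:
        store.setdefault(reference_id, []).append(detection_feature)
    else:
        store[reference_id] = [detection_feature]
    return store

store = {"ref_42": ["sampling_images_t0"]}
update_verification_feature(store, "ref_42", "detection_images_t1")
```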
In a second embodiment of the detection method 600 of the present invention, the comparison of the plurality of detection images and the verification feature is performed by the detection device 40, and the verification feature includes a sampling model of the corresponding positioning point of the reference object corresponding to the object to be detected. In one embodiment, the detection device 40 transmits a request to detect the object. After receiving the request, the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30, and transmits the corresponding positioning point to the detection device 40. In one embodiment, the server 10 may transmit the sampling model to the detection device 40 at this time.
In an embodiment, after receiving the corresponding positioning point, the detection device 40 directly sets the corresponding positioning point as the positioning point of the object to be detected, selects the plurality of detection directions by itself, and thereby obtains the plurality of detection images. In one embodiment, if the detection device 40 has already acquired the sampling model before acquiring the plurality of detection images, the comparison may start directly. In another embodiment, if the detection device 40 has not acquired the sampling model before acquiring the plurality of detection images, the detection device 40 may send a second request to the server 10 for comparison with the reference object, and after receiving that request the server 10 provides the sampling model to the detection device 40. By requiring this second transmission, the server 10 prevents a user from obtaining all of the verification information merely by casually issuing the first request. In the embodiment, the second request sent by the detection device 40 may include the plurality of detection images for the server 10 to perform a preliminary verification; if the server 10 considers that the plurality of detection images are obviously completely unrelated to the sampling model, the server 10 may refuse to provide the sampling model. If the server 10 considers that there is at least a slight correlation between the plurality of detection images and the sampling model, the server 10 may transmit the sampling model for comparison by the detection device 40.
In an embodiment, the detection device 40 may generate a detection model based on the plurality of detection images, the plurality of detection directions and the detection correspondence, and generate a detection result by comparing the similarity between the detection model and the sampling model. In another embodiment, the detecting device 40 may estimate a plurality of verification images based on the plurality of detection directions and the sampling model, and then compare the similarities of the plurality of verification images and the plurality of detection images one by one according to the detection correspondence between the plurality of detection images and the plurality of detection directions, so as to generate a detection result.
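Estimating verification images from the sampling model along the detection directions and comparing them one by one with the detection images might look like the sketch below. The `render` callable standing in for the three-dimensional sampling model, the string placeholders, and the threshold are assumptions; a real implementation would render views of an actual 3D model and use an image similarity metric.

```python
def verify_against_model(render, detection_corr, detection_images,
                         similarity, threshold=0.8):
    """For each detection direction, estimate a verification image from
    the sampling model (via `render`) and compare it with the detection
    image captured along that direction."""
    for direction, idx in detection_corr.items():
        verification_image = render(direction)
        if similarity(verification_image, detection_images[idx]) < threshold:
            return False
    return True

# Toy stand-ins: the "model" renders a placeholder view per direction.
def render(direction):
    return f"view@{direction}"

def same(a, b):
    return 1.0 if a == b else 0.0

ok = verify_against_model(render, {"d1": 0, "d2": 1},
                          ["view@d1", "view@d2"], same)
```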
In one embodiment, if the similarity between the plurality of verification images and the plurality of detection images is high, or the similarity between the sampling model and the detection model is high, the detection device 40 may determine that the object to be detected is the reference object, and transmit the detection result to the server 10. If the similarity between the plurality of verification images and the plurality of detection images is low, or the similarity between the sampling model and the detection model is low, the detection device 40 may determine that the object to be detected is not the reference object, and transmit the detection result to the server 10. In another embodiment, if the similarity between the plurality of verification images and the plurality of detection images is low, or the similarity between the sampling model and the detection model is low, the detection device 40 may transmit a request for the change information of the reference object to the server 10. In the embodiment, upon this request for the change information, the server 10 also performs another preliminary verification (e.g., requiring the complete plurality of detection images or the detection model) to confirm whether the detection device 40 genuinely needs the change information. In an embodiment, the detection device 40 adjusts the sampling model according to the change information to generate an adjustment model as the adjustment feature, and then compares the adjustment model with the plurality of detection images through the detection correspondence, or directly compares the adjustment model with the detection model. If the similarity between the adjustment model and the plurality of detection images is low, or the similarity between the adjustment model and the detection model is low, the detection device 40 may determine that the object to be detected is not the reference object, and transmit the detection result to the server 10.
If the similarity between the adjustment model and the plurality of detection images is high, or the similarity between the adjustment model and the detection model is high, the detection device 40 can update the original detection result, now identifying the object to be detected as the reference object, and transmit the detection result to the server 10. In the embodiment, the detection device 40 may upload the plurality of detection images or the detection model to the server 10. If the detection device 40 uploads the plurality of detection images to the server 10, the detection device 40 needs to upload the plurality of detection directions and the detection correspondence together, for the server 10 to generate the detection model. In the embodiment, the server 10 stores the detection model in the internal storage device 11 or the online storage device 30. In one embodiment, the detection model and the sampling model may both be stored in the internal storage device 11 or the online storage device 30 in association with the reference object, so as to jointly serve as the verification feature of the reference object. In another embodiment, the detection model may directly replace the sampling model as the verification feature of the reference object. In one embodiment, the server 10 may, upon obtaining the detection model, confirm it once more by checking the similarity between the detection model and the sampling model, so as to avoid storing a wrong verification feature.
In a third embodiment of the detection method 600 of the present invention, the comparison of the plurality of detection images and the verification feature is performed by the server 10, and the verification feature includes the plurality of sampling images, the plurality of sampling directions, and the sampling correspondence between the plurality of sampling images and the plurality of sampling directions of the corresponding positioning points of the reference object corresponding to the object to be detected. In one embodiment, the detection device 40 transmits a request to detect the object. After receiving the request, the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30, and transmits the plurality of sampling directions in the corresponding positioning point and the verification feature to the detection device 40. In the embodiment, the server 10 does not need to transmit the plurality of sampled images and the sampling correspondence to the detecting device 40, so that there is no concern that the plurality of sampled images and the sampling correspondence are obtained in an improper manner.
In an embodiment, after the detection device 40 receives the corresponding positioning point and the plurality of sampling directions, the corresponding positioning point and the plurality of sampling directions are directly set as the positioning point and the plurality of detection directions of the object to be detected, respectively, and the plurality of detection images are obtained thereby. In an embodiment, the detecting device 40 transmits the plurality of detected images and the detected correspondence between the plurality of detected images and the plurality of detected directions to the server 10 for the server 10 to perform the comparison with the verification feature.
In an embodiment, the server 10 compares the plurality of sampled images with the plurality of detected images according to the sampling correspondence and the detection correspondence, and if the similarity between the plurality of sampled images and the plurality of detected images is high, the server 10 can determine that the object to be detected is the reference object and send back the detection result to the detection device 40. If the similarity between the plurality of sampled images and the plurality of detected images is low, the server 10 may determine that the object to be detected is not the reference object, and transmit the detection result to the detecting device 40. In another embodiment, if the similarity between the plurality of sampled images and the plurality of detected images is low, the server 10 may obtain the change information of the reference object from the online storage device 30 or the internal storage device 11. In the embodiment, the server 10 does not need to transmit the change information to the detecting device 40, and thus there is no fear that the change information is obtained in an improper manner.
In an embodiment, the server 10 adjusts the plurality of sampling images according to the change information to generate a plurality of adjustment images as part of the adjustment feature, and then compares the plurality of adjustment images with the plurality of detection images according to the sampling correspondence and the detection correspondence. If the similarity between the plurality of adjustment images and the plurality of detection images is low, the server 10 can determine that the object to be detected is not the reference object and transmit the detection result to the detection device 40. If the similarity is high, the server 10 can update the original detection result, identify the object to be detected as the reference object, and transmit the updated detection result to the detection device 40. In this embodiment, the server 10 may store the plurality of detection images in the internal storage device 11 or the online storage device 30. In one embodiment, the detection images and the sampling images may both be stored in the internal storage device 11 or the online storage device 30 in correspondence with the reference object, to serve together as the verification feature of the reference object. In another embodiment, the plurality of detection images may directly replace the plurality of sampling images as the verification feature of the reference object. In one embodiment, because the server 10 itself has performed a complete comparison, there is no concern that false verification features are stored.
In a fourth embodiment of the detection method 600, the comparison of the plurality of detection images with the verification feature is performed by the server 10, and the verification feature includes a sampling model of the corresponding positioning point of the reference object corresponding to the object to be detected. In one embodiment, the detection device 40 transmits a request to detect the object to be detected. After receiving the request, the server 10 obtains the corresponding positioning point and the verification feature of the reference object from the internal storage device 11 or the online storage device 30, and transmits the corresponding positioning point to the detection device 40. In this embodiment, the server 10 does not need to transmit the sampling model to the detection device 40, so there is no risk of the sampling model being obtained improperly.
In an embodiment, after receiving the corresponding positioning point, the detection device 40 directly sets it as the positioning point of the object to be detected, selects the plurality of detection directions by itself, and thereby obtains the plurality of detection images. In an embodiment, the detection device 40 transmits the plurality of detection images, the plurality of detection directions, and the detection correspondence between the plurality of detection images and the plurality of detection directions to the server 10, so that the server 10 can perform the comparison with the verification feature. In another embodiment, the detection device 40 may generate the detection model from the plurality of detection images, the plurality of detection directions, and the detection correspondence, and transmit the detection model to the server 10 for comparison with the verification feature.
In an embodiment, the server 10 may generate the detection model from the plurality of detection images, the plurality of detection directions, and the detection correspondence, and generate the detection result by comparing the similarity between the detection model and the sampling model. In another embodiment, the server 10 may generate the detection result by comparing the similarity between the sampling model and the received detection model. In yet another embodiment, the server 10 may estimate a plurality of verification images from the plurality of detection directions and the sampling model, and then compare the similarities of the plurality of verification images and the plurality of detection images one by one according to the detection correspondence between the plurality of detection images and the plurality of detection directions, so as to generate the detection result.
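The estimate-then-compare variant above can be sketched as follows. The patent does not define the sampling model's interface or the similarity measure; here the model is assumed to be any callable mapping a direction to a predicted image, and the measure is one minus the mean absolute difference on images scaled to [0, 1]. All names and the threshold are hypothetical.

```python
import numpy as np

def verify_with_model(sampling_model, detected: dict, threshold: float = 0.9) -> bool:
    """Estimate a verification image for each detection direction from the
    sampling model, then compare it with the image actually captured there.

    `sampling_model`: callable, direction -> predicted verification image.
    `detected`: maps each detection direction to its captured image.
    """
    for direction, det_img in detected.items():
        ver_img = sampling_model(direction)           # estimated verification image
        score = 1.0 - float(np.abs(ver_img - det_img).mean())
        if score < threshold:
            return False                              # any mismatch rejects
    return True
```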
In one embodiment, if the similarity between the plurality of verification images and the plurality of detection images is high, or the similarity between the sampling model and the detection model is high, the server 10 can determine that the object to be detected is the reference object and transmit the detection result to the detection device 40. If either similarity is low, the server 10 can determine that the object to be detected is not the reference object and transmit that detection result to the detection device 40. In another embodiment, if either similarity is low, the server 10 may instead obtain the change information of the reference object from the online storage device 30 or the internal storage device 11. In this embodiment, the server 10 does not need to transmit the change information to the detection device 40, so there is no risk of the change information being obtained improperly.
In one embodiment, the server 10 adjusts the sampling model according to the change information to generate an adjustment model as the adjustment feature, and then compares the adjustment model with the plurality of detection images through the detection correspondence, or directly compares the adjustment model with the detection model. If the similarity between the adjustment model and the plurality of detection images is low, or the similarity between the adjustment model and the detection model is low, the server 10 can determine that the object to be detected is not the reference object and transmit the detection result to the detection device 40. If either similarity is high, the server 10 can update the original detection result, identify the object to be detected as the reference object, and send the updated detection result to the detection device 40. In this embodiment, the server 10 may store the detection model in the internal storage device 11 or the online storage device 30. If the server 10 only has the plurality of detection images and the detection correspondence, it may store the detection model it generated from the plurality of detection images, the plurality of detection directions, and the detection correspondence in the internal storage device 11 or the online storage device 30. In one embodiment, the detection model and the sampling model may both be stored in the internal storage device 11 or the online storage device 30 in correspondence with the reference object, to serve together as the verification feature of the reference object. In another embodiment, the detection model may directly replace the sampling model as the verification feature of the reference object.
In one embodiment, because the server 10 itself has performed a complete comparison, there is no concern that false verification features are stored.
Fig. 7A and 7B are schematic diagrams showing the image capturing unit 22 capturing sampling images in different sampling directions at the positioning point of the reference object according to an exemplary embodiment of the present invention. The image capturing unit 22 in fig. 7A is located directly above the positioning point of the reference object, while the image capturing unit 22 in fig. 7B is rotated by a deflection angle compared with that in fig. 7A.
In one embodiment, the image capturing unit 22 faces the positioning point of the reference object and captures a first sampling image 760 within a first capturing range 710. The sampling device 20 or the server 10 can extract a first sampling area 761 from the first sampling image 760, divide the first sampling area 761 into a plurality of first sampling blocks 7611-7616, and calculate the values of the plurality of first sampling blocks 7611-7616. In this embodiment, the first sampling area 761 corresponds to an imaging area 711 in the first capturing range 710, and the plurality of first sampling blocks 7611-7616 correspond to the plurality of imaging blocks 7111-7116.
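The division of a sampling area into blocks and the calculation of one representative value per block can be sketched as follows. The 2×3 block layout (matching the six blocks 7611-7616), the choice of the mean as the representative value, and all names are illustrative assumptions.

```python
import numpy as np

def block_values(sampling_area: np.ndarray, rows: int = 2, cols: int = 3) -> list:
    """Divide a sampling area into rows*cols blocks and return one
    representative value (here the mean) per block, row-major order."""
    h, w = sampling_area.shape
    bh, bw = h // rows, w // cols
    values = []
    for r in range(rows):
        for c in range(cols):
            block = sampling_area[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            values.append(float(block.mean()))
    return values
```

The same routine applies unchanged to the second sampling area 771, yielding the values of the blocks 7711-7716.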
In one embodiment, the image capturing unit 22 is positioned at the deflection angle relative to the positioning point of the reference object and captures a second sampling image 770 within a second capturing range 720. The sampling device 20 or the server 10 can extract a second sampling area 771 from the second sampling image 770, divide the second sampling area 771 into a plurality of second sampling blocks 7711-7716, and calculate the values of the plurality of second sampling blocks 7711-7716. In this embodiment, the sampling device 20 or the server 10 makes the first sampling area 761 and the second sampling area 771 correspond to the same imaging area, so that the two sampling areas cover the same region viewed from different directions. To this end, the sampling device 20 or the server 10 can project the first sampling area 761 onto the second sampling image 770 through the deflection angle to locate the second sampling area 771. Accordingly, when the first sampling area 761 and the second sampling area 771 are projected back onto the first capturing range 710 and the second capturing range 720, respectively, the same imaging area 711 is obtained, and the plurality of second sampling blocks 7711-7716 correspond to the plurality of imaging blocks 7111-7116.
In the illustrated embodiment, the sampling device 20 or the server 10 may calculate one value for each of the plurality of first sampling blocks 7611-7616 and the plurality of second sampling blocks 7711-7716, these values being representative of their respective blocks. In one embodiment, each value may be the mean, mode, median, or the like of the pixels within its block. For example, the values of the first sampling blocks 7611-7616 may be a11, a12, a13, a14, a15, and a16, respectively, and the values of the second sampling blocks 7711-7716 may be a21, a22, a23, a24, a25, and a26, respectively. In one embodiment, the sampling device 20 or the server 10 may obtain further values at other sampling angles, for example a31, a32, a33, a34, a35, a36, …, an1, an2, an3, an4, an5, and an6. The sampling device 20 or the server 10 can thus obtain a plurality of image data training sets as follows:
A1=[a11,a21,a31,…,an1]
A2=[a12,a22,a32,…,an2]
A3=[a13,a23,a33,…,an3]
A4=[a14,a24,a34,…,an4]
A5=[a15,a25,a35,…,an5]
A6=[a16,a26,a36,…,an6]
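The grouping of per-angle block values into the training sets A1-A6 above amounts to a transpose: each row of measurements per angle becomes one column of a training set. The numeric values below are hypothetical stand-ins for a11…an6.

```python
import numpy as np

# Hypothetical block values for three sampling angles; row i holds the six
# block values [ai1, ..., ai6] measured at the i-th sampling direction.
per_angle_values = np.array([
    [0.11, 0.12, 0.13, 0.14, 0.15, 0.16],   # angle 1: a11..a16
    [0.21, 0.22, 0.23, 0.24, 0.25, 0.26],   # angle 2: a21..a26
    [0.31, 0.32, 0.33, 0.34, 0.35, 0.36],   # angle 3: a31..a36
])

# Transposing groups the values per block: training set Ak collects the
# k-th block's value across every sampling angle, i.e. Ak = [a1k, ..., ank].
training_sets = per_angle_values.T           # shape: (6 blocks, n angles)
A1 = training_sets[0]                        # [a11, a21, a31]
```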
In this embodiment, the sampling device 20 or the server 10 may generate the sampling model by three-dimensionally modeling the sampling sample point data from the plurality of image data training sets through machine learning techniques. In one embodiment, the detection device 40 or the server 10 may likewise extract a plurality of detection sample points from the plurality of detection images in the same manner, build a plurality of image data training sets, and generate the detection model by three-dimensionally modeling the detection sample points through machine learning techniques.
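The specification leaves the machine learning technique open. As a minimal, hypothetical stand-in for the modeling step, one polynomial can be fitted per block over the sampling angle, yielding a callable that predicts the block values for any direction; this is only a sketch of the idea, not the claimed three-dimensional modeling.

```python
import numpy as np

def fit_sampling_model(angles, training_sets, degree: int = 2):
    """Fit one polynomial per block over the sampling angle and return a
    callable that predicts all block values for an arbitrary direction."""
    coeffs = [np.polyfit(angles, block_series, degree)
              for block_series in training_sets]

    def predict(angle: float) -> np.ndarray:
        # One predicted representative value per block at this direction.
        return np.array([np.polyval(c, angle) for c in coeffs])

    return predict
```

A detection model built the same way from the detection images could then be compared with this sampling model by evaluating both at the same directions.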
Fig. 8A-8E are photographs of different sampling images captured on a painting serving as a reference object, according to an exemplary embodiment of the present invention. Referring to fig. 1 and 2, the image capturing unit 22 captures a plurality of sampling images of the reference object, and the sampling device 20 or the server 10 can generate the verification feature from the plurality of sampling images. In one embodiment, the verification feature may simply be the plurality of sampling images themselves. In another embodiment, the verification feature may be generated by analyzing and training on the plurality of sampling images through machine learning techniques. In the described embodiment, the verification feature may be a three-dimensionally modeled sampling model.
Fig. 8F-8J are photographs of different detection images captured on a painting serving as an object to be detected, according to an exemplary embodiment of the present invention. Referring to fig. 4 and 5, a plurality of detection images are captured of the object to be detected by the image capturing unit 42, and the detection device 40 or the server 10 can compare the verification feature with the plurality of detection images.
In one embodiment, if the verification feature is the plurality of sampling images, the detection device 40 or the server 10 can directly compare fig. 8A-8E with fig. 8F-8J and thereby recognize that the object to be detected differs significantly from the reference object. In another embodiment, if the verification feature is the sampling model, then, as shown in fig. 8A-8E, the sampling model should exhibit no obvious roughness or steep gradients; when the detection images of fig. 8F-8J are compared with the sampling model, the detection device 40 or the server 10 can find that they differ significantly in gradient from the sampling model and can recognize that the object to be detected is not the reference object.
Referring to fig. 9A and 9B, fig. 9A and 9B are sampling images of different gemstones, processed in the same manner and captured in the same sampling direction, according to an exemplary embodiment of the present invention. As can be seen from fig. 9A and 9B, even when the sampling direction and the processing are the same, different gemstones may still yield different sampling images due to factors such as the coloration or purity of the gemstones themselves. Therefore, as long as a gemstone is used as the reference object and sampling images are acquired in advance to generate the verification feature, the detection device and the detection method can determine whether an object to be detected is the previously sampled gemstone.
Referring to fig. 10A to 10C, fig. 10A is a photograph of antique wares having the same pattern but being distinct objects. Fig. 10B and 10C are sampling images of the different antique wares of fig. 10A captured in the same sampling direction according to an exemplary embodiment of the present invention. As seen from fig. 10B and 10C, even though the antique wares bear the same pattern, the variation of the pattern may still differ between them due to slight differences in the distribution of the glaze during firing. Therefore, as long as an antique ware is used as the reference object and sampling images are acquired in advance to generate the verification feature, the detection device and the detection method can determine whether an object to be detected is the previously sampled antique ware.
The method of the present invention is not limited to artwork (including but not limited to paintings, carvings, and the like), gemstones (including but not limited to diamonds, sapphires, jade, and the like), or antiques (including but not limited to pottery, porcelain, and the like); as long as the verification features of a reference object (including but not limited to color distribution, grain details, scoring flaws, and the like) are obtained in advance, they can serve as the basis for subsequent verification of an object to be detected.
Accordingly, in one aspect, the present disclosure provides a method for detecting an object to be detected by a detection device, the method comprising: transmitting a request for detecting the object to be detected; receiving the positioning point of the object to be detected; acquiring a plurality of detection images along a plurality of detection directions on the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on the comparison of the plurality of detection images and a corresponding positioning point of a reference object corresponding to the object to be detected.
Another aspect of the present disclosure provides a detection apparatus for detecting an object to be detected, the detection apparatus including: the image capturing unit is used for obtaining a plurality of detection images; the mobile control unit is used for enabling the image capturing unit to move when detecting the object to be detected; the processor is coupled with the image capturing unit and the mobile control unit; a transfer unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to: transmit a request for detecting the object to be detected through the transmitting unit; receive the positioning point of the object to be detected through the receiving unit; obtain a plurality of detection images with the image capturing unit along a plurality of detection directions on the positioning point of the object to be detected through the mobile control unit; and obtain a detection result according to the plurality of detection images, wherein the detection result is generated based on the comparison of the plurality of detection images and a corresponding positioning point of a reference object corresponding to the object to be detected.
In some embodiments of the detection apparatus or the detection method thereof, when the detection result indicates that the reference object is different from the object to be detected, an adjustment feature is generated according to change data of the reference object and a verification feature of the reference object, wherein: the comparison is to compare the plurality of detection images with the verification feature, and the change data of the reference object is from an online storage device; and the detection result is updated according to the plurality of detection images and the adjustment feature.
In some embodiments of the detection apparatus or the detection method thereof, when the updated detection result indicates that the reference object is the same as the object to be detected, a detection feature is transmitted to the online storage device to store the detection feature in the online storage device, wherein: the detection feature is generated based on the plurality of detection images, and the detection feature stored in the online storage device corresponds to the reference object.
In some embodiments of the detection apparatus or the detection method thereof, the detection feature replaces the verification feature of the reference object stored in the online storage device.
In some embodiments of the detection apparatus or the detection method thereof, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from a plurality of sampling images obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, and the detection result is obtained by comparing the similarity between the plurality of detection images and the sampling model according to the plurality of detection directions.
In some embodiments of the detection apparatus or the detection method thereof, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from a plurality of sampling images obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, a detection model is reconstructed from the plurality of detection images according to the plurality of detection directions, and the detection result is a similarity of the detection model and the sampling model.
In some embodiments of the detection apparatus or the detection method thereof, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a plurality of sampling images acquired along a plurality of sampling directions at the corresponding positioning point of the reference object, the detection result is obtained by comparing a similarity between the plurality of detection images and the plurality of sampling images, and the plurality of detection directions are the plurality of sampling directions acquired from an online storage device.
An aspect of the present disclosure provides a method for detecting an object to be detected by a server, the method including: receiving a request for detecting the object to be detected; transmitting the positioning point of the object to be detected; receiving a plurality of detection images acquired along a plurality of detection directions on the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on the comparison of the plurality of detection images and a reference object corresponding to the object to be detected.
An aspect of the present disclosure provides a server for detecting an object to be detected, the server including: a processor; a transfer unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that when executed by the processor cause the processor to: receiving a request for detecting the object to be detected through the receiving unit; transmitting the positioning point of the object to be detected through the transmission unit; receiving a plurality of detection images acquired along a plurality of detection directions on the positioning point of the object to be detected through the receiving unit; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on the comparison of the plurality of detection images and a reference object corresponding to the object to be detected.
In some embodiments of the server or the detection method thereof, when the detection result indicates that the reference object is different from the object to be detected, an adjustment feature is generated according to change data of the reference object and a verification feature of the reference object, wherein: the comparison is to compare the plurality of detection images with the verification feature, and the change data of the reference object is from an online storage device; and the detection result is updated according to the plurality of detection images and the adjustment feature.
In some embodiments of the server or the detection method thereof, when the updated detection result indicates that the reference object is the same as the object to be detected, a detection feature is transmitted to the online storage device to store the detection feature in the online storage device, wherein: the detection feature is generated based on the plurality of detection images, and the detection feature stored in the online storage device corresponds to the reference object.
In some embodiments of the server or the detection method thereof, the detection feature replaces the verification feature of the reference object stored in the online storage device.
In some embodiments of the server or the detection method thereof, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from a plurality of sampling images obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, and the detection result is obtained by comparing the similarity between the plurality of detection images and the sampling model according to the plurality of detection directions.
In some embodiments of the server or the detection method thereof, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from a plurality of sampling images obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, a detection model is reconstructed from the plurality of detection images according to the plurality of detection directions, and the detection result is a similarity of the detection model and the sampling model.
In some embodiments of the server or the detection method thereof, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a plurality of sampling images acquired along a plurality of sampling directions at the corresponding positioning point of the reference object, the detection result is obtained by comparing a similarity between the plurality of detection images and the plurality of sampling images, and the plurality of detection directions are the plurality of sampling directions acquired from an online storage device.
An aspect of the present disclosure provides a sampling method for establishing verification data for a reference object by a sampling device, the method comprising: obtaining a positioning point on the reference object; acquiring a plurality of sampling images along a plurality of sampling directions on the positioning point of the reference object; generating a sampling feature of the positioning point of the reference object according to the plurality of sampling images and the plurality of sampling directions; and transmitting the sampling feature for storage by a network device.
An aspect of the present disclosure provides a sampling device for sampling a reference object to establish verification data, the sampling device comprising: the image capturing unit is used for obtaining a plurality of sampling images; the movement control unit is used for enabling the image capturing unit to move when the reference object is sampled; the processor is coupled with the image capturing unit and the movement control unit; a transfer unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to: obtain a positioning point on the reference object; obtain a plurality of sampling images with the image capturing unit along a plurality of sampling directions on the positioning point of the reference object through the movement control unit; generate a sampling feature of the positioning point of the reference object according to the plurality of sampling images and the plurality of sampling directions; and transmit the sampling feature through the transmitting unit for storage by a network device.
In some embodiments of the sampling device or the sampling method thereof, the positioning point set for the reference object is received from a server, wherein: when the network device is the server, the server stores the positioning point, and when the network device is an online storage device, the server transmits the positioning point to the online storage device.
In some embodiments of the sampling device or the sampling method thereof, when the positioning point is set by the sampling device, the positioning point is transmitted to the network device.
The above embodiments are only for illustrating the technical solutions of the present invention, not for limiting them. Although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments can be modified, or some of their technical features can be replaced by equivalents, and that such modifications or substitutions do not depart from the spirit and scope of the present invention.
In summary, the present invention meets the requirements for patentability, and the patent application is filed according to law. However, the foregoing is only a preferred embodiment of the invention, and all equivalent modifications and variations apparent to those skilled in the art are intended to fall within the scope of the following claims.

Claims (10)

  1. A method for detecting an object to be detected by a detection device, the method comprising:
    transmitting a request for detecting the object to be detected;
    receiving a positioning point of the object to be detected;
    acquiring a plurality of detection images along a plurality of detection directions on the positioning point of the object to be detected; and
    obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images and a corresponding positioning point of a reference object corresponding to the object to be detected.
  2. The method of claim 1, further comprising:
    when the detection result shows that the reference object is different from the object to be detected, generating an adjustment feature according to change data of the reference object and a verification feature of the reference object, wherein:
    the comparison is performed by comparing the plurality of detection images with the verification feature,
    the change data of the reference object is from an online storage device; and
    and updating the detection result according to the plurality of detection images and the adjustment characteristic.
  3. The method of claim 2, further comprising:
    when the updated detection result indicates that the reference object is the same as the object to be detected, transmitting a detection feature to the online storage device so as to store the detection feature in the online storage device, wherein:
    the detection feature is generated based on the plurality of detection images, and
    the detection feature stored in the online storage device corresponds to the reference object.
  4. The method of claim 3, wherein the detection feature replaces the verification feature of the reference object stored in the online storage device.
  5. The method of claim 1, wherein,
    the plurality of detection images are compared with a verification feature of the reference object to generate the detection result,
    the verification feature is a sampling model reconstructed from a plurality of sampling images,
    the plurality of sampling images are taken along a plurality of sampling directions at the corresponding positioning point of the reference object, and
    the detection result is obtained by comparing the similarity between the plurality of detection images and the sampling model according to the plurality of detection directions.
  6. The method of claim 1, wherein,
    the plurality of detection images are compared with a verification feature of the reference object to generate the detection result,
    the verification feature is a sampling model reconstructed from a plurality of sampling images,
    the plurality of sampling images are taken along a plurality of sampling directions at the corresponding positioning point of the reference object,
    a detection model is reconstructed from the plurality of detection images according to the plurality of detection directions, and
    the detection result is a similarity between the detection model and the sampling model.
  7. The method of claim 1, wherein,
    the plurality of detection images are compared with a verification feature of the reference object to generate the detection result,
    the verification feature is a plurality of sampling images taken along a plurality of sampling directions at the corresponding positioning point of the reference object,
    the detection result is obtained by comparing the similarity between the plurality of detection images and the plurality of sampling images, and
    the plurality of detection directions are the plurality of sampling directions obtained from an online storage device.
  8. A detection apparatus for detecting an object to be detected, the detection apparatus comprising:
    an image capturing unit configured to obtain a plurality of detection images;
    a mobile control unit configured to move the image capturing unit when detecting the object to be detected;
    a processor coupled to the image capturing unit and the mobile control unit;
    a transmitting unit coupled to the processor;
    a receiving unit coupled to the processor; and
    a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to:
    transmitting a request for detecting the object to be detected through the transmitting unit;
    receiving the positioning point of the object to be detected through the receiving unit;
    obtaining, by the image capturing unit moved by the mobile control unit, a plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and
    obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images and a corresponding positioning point of a reference object corresponding to the object to be detected.
  9. The detection apparatus of claim 8, wherein the plurality of instructions, when executed by the processor, further cause the processor to:
    when the detection result indicates that the reference object differs from the object to be detected, generating an adjustment feature according to change data of the reference object and a verification feature of the reference object, wherein:
    the comparison is performed by comparing the plurality of detection images with the verification feature, and
    the change data of the reference object is from an online storage device;
    updating the detection result according to the plurality of detection images and the adjustment feature; and
    when the updated detection result indicates that the reference object is the same as the object to be detected, transmitting a detection feature to the online storage device so as to store the detection feature in the online storage device, wherein:
    the detection feature is generated based on the plurality of detection images, and
    the detection feature stored in the online storage device corresponds to the reference object.
  10. A method for detecting an object to be detected by a server, the method comprising:
    receiving a request for detecting the object to be detected;
    transmitting the positioning point of the object to be detected;
    receiving a plurality of detection images acquired along a plurality of detection directions on the positioning point of the object to be detected; and
    obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images and a reference object corresponding to the object to be detected.
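The claims above specify that detection images taken along the sampling directions are compared with the reference object's sampling images, but they do not fix a concrete similarity measure. As an illustrative sketch only (not the patented implementation), using cosine similarity and an assumed acceptance threshold, a per-direction comparison as in claim 7 might look like:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two flattened image arrays."""
    a, b = a.ravel().astype(float), b.ravel().astype(float)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def compare_by_direction(detection_images: dict, sampling_images: dict,
                         threshold: float = 0.9) -> tuple:
    """Compare detection images with sampling images direction by direction.

    Both dicts map a direction (e.g. an angle) to an image array; per
    claim 7, the detection directions are the sampling directions obtained
    from the online storage device, so the two dicts share keys. The
    threshold value is a hypothetical parameter for this sketch.
    """
    scores = [cosine_similarity(detection_images[d], sampling_images[d])
              for d in sampling_images]
    mean_score = float(np.mean(scores))
    # Detection result: the object matches the reference if the mean
    # per-direction similarity clears the threshold.
    return mean_score, mean_score >= threshold
```

In practice a server implementing claim 10 could run such a comparison after receiving the detection images, while the claimed "sampling model" variants (claims 5 and 6) would instead reconstruct a 3D model from the sampling images before comparing.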
CN202280032539.8A 2021-05-04 2022-04-29 Detection method and detection device for object to be detected Pending CN117461051A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202163183643P 2021-05-04 2021-05-04
US63/183643 2021-05-04
PCT/CN2022/090341 WO2022233277A1 (en) 2021-05-04 2022-04-29 Method for detecting object to be detected, and detection apparatus

Publications (1)

Publication Number Publication Date
CN117461051A true CN117461051A (en) 2024-01-26

Family

ID=83931983

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280032539.8A Pending CN117461051A (en) 2021-05-04 2022-04-29 Detection method and detection device for object to be detected

Country Status (3)

Country Link
CN (1) CN117461051A (en)
TW (1) TWI827030B (en)
WO (1) WO2022233277A1 (en)

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101582162B (en) * 2008-05-14 2012-01-25 上海锦渡信息科技有限公司 Art identifying method based on texture analysis
JP2014006840A (en) * 2012-06-27 2014-01-16 Dainippon Printing Co Ltd Individual identification method, individual identification device, and program
CN103164699B (en) * 2013-04-09 2016-06-15 北京盛世融宝国际艺术品投资有限公司 Painting and calligraphy pieces fidelity identification systems
WO2015082332A1 (en) * 2013-12-02 2015-06-11 Leonhard Kurz Stiftung & Co. Kg Method for authenticating a security element, and optically variable security element
CN104636733A (en) * 2015-02-12 2015-05-20 湖北华中文化产权交易所有限公司 Image characteristic-based painting and calligraphy work authenticating method
CN108292456B (en) * 2015-11-30 2020-11-27 凸版印刷株式会社 Identification method and identification medium
CN106447361A (en) * 2016-10-28 2017-02-22 王友炎 Anti-counterfeiting authentication and recording tracing system and method for paper-medium-based artworks
CN107507090A (en) * 2017-08-23 2017-12-22 重庆艺邦动力科技有限公司 The online guarantee transaction method of the art work and the storage device and mobile terminal for realizing this method
CN112446312A (en) * 2020-11-19 2021-03-05 深圳市中视典数字科技有限公司 Three-dimensional model identification method and device, electronic equipment and storage medium

Also Published As

Publication number Publication date
TW202314228A (en) 2023-04-01
TWI827030B (en) 2023-12-21
WO2022233277A1 (en) 2022-11-10

Similar Documents

Publication Publication Date Title
CN111060023B (en) High-precision 3D information acquisition equipment and method
CN109977770B (en) Automatic tracking shooting method, device, system and storage medium
CN110570477B (en) Method, device and storage medium for calibrating relative attitude of camera and rotating shaft
US7554575B2 (en) Fast imaging system calibration
KR101634966B1 (en) Image tracking system using object recognition information based on Virtual Reality, and image tracking method thereof
CN105812746B (en) A kind of object detection method and system
CN110278382B (en) Focusing method, device, electronic equipment and storage medium
CN111091063A (en) Living body detection method, device and system
TW201118791A (en) System and method for obtaining camera parameters from a plurality of images, and computer program products thereof
US20130058537A1 (en) System and method for identifying a region of interest in a digital image
CN109670444B (en) Attitude detection model generation method, attitude detection device, attitude detection equipment and attitude detection medium
JP3859371B2 (en) Picking equipment
CN109657576A (en) Image Acquisition control method, device, storage medium and system
CN110751728B (en) Virtual reality equipment with BIM building model mixed reality function and method
TWI509466B (en) Object recognition method and object recognition apparatus using the same
US20120162387A1 (en) Imaging parameter acquisition apparatus, imaging parameter acquisition method and storage medium
CN113129383A (en) Hand-eye calibration method and device, communication equipment and storage medium
WO2016208404A1 (en) Device and method for processing information, and program
TW202244680A (en) Pose acquisition method, electronic equipment and storage medium
CN113066132A (en) 3D modeling calibration method based on multi-device acquisition
CN111340959B (en) Three-dimensional model seamless texture mapping method based on histogram matching
CN117461051A (en) Detection method and detection device for object to be detected
CN109460077B (en) Automatic tracking method, automatic tracking equipment and automatic tracking system
CN111721201A (en) Temperature detection method
US10148929B2 (en) Method of prompting proper rotation angle for image depth establishing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination