WO2022233277A1 - A detection method and detection device for an object to be detected
- Publication number: WO2022233277A1
- Application number: PCT/CN2022/090341
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the present invention relates to the field of detection articles, in particular, to a method and detection device for detecting an object to be detected.
- the present invention is made in view of the above problems, and provides a method and a detection device for detecting an object to be detected.
- the present invention provides a method for detecting an object to be detected by a detection device.
- the method includes: transmitting a request for detecting the object to be detected; receiving the positioning point of the object to be detected; obtaining a plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on an alignment between the plurality of detection images and a corresponding positioning point of a reference object that corresponds to the object to be detected.
- the present invention further provides a detection device for detecting an object to be detected.
- the detection device includes: an image capture unit used to obtain a plurality of detection images; a movement control unit used to move the image capture unit when the object to be detected is detected; a processor coupled with the image capture unit and the movement control unit; a transmission unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions for execution by the processor.
- the processor transmits a request for detecting the object to be detected through the transmission unit; receives the positioning point of the object to be detected through the receiving unit; causes, through the movement control unit, the image capture unit to obtain the plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and obtains a detection result according to the plurality of detection images, wherein the detection result is generated based on an alignment between the plurality of detection images and a corresponding positioning point of a reference object.
- the present invention also provides a method for detecting an object to be detected by a server. The method includes: receiving a request for detecting the object to be detected; transmitting the positioning point of the object to be detected; receiving a plurality of detection images obtained along a plurality of detection directions at the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on an alignment between the plurality of detection images and a corresponding reference object of the object to be detected.
- the detection system saves the information of the reference object, and thereby limits the sampling point (ie: the corresponding positioning point) of the reference object.
- the detection device must first upload the information of the object to be detected, so that the detection system can confirm the corresponding reference object, and then can obtain the positioning point of the object to be detected that needs to capture the image.
- this method also makes it more difficult for counterfeit products to pass detection, because the sampling point of the reference object is difficult to obtain directly.
- if the information of the reference object is stored in a blockchain storage device, the security and immutability of the information can be further improved.
- FIG. 1A is a block diagram of a sampling system of the present invention for establishing identification data for fiducials and sampling.
- FIG. 1B is a block diagram of another sampling system of the present invention for establishing identification data for fiducials.
- FIG. 2 is a block diagram of a sampling device for establishing identification data for a reference object and sampling in accordance with the present invention.
- FIG. 3 is a flow chart of a sampling method for establishing identification data for a reference object according to the present invention.
- FIG. 4A is a block diagram of a detection system for detecting objects to be detected according to the present invention.
- FIG. 4B is a block diagram of another detection system for detecting objects to be detected according to the present invention.
- FIG. 5 is a block diagram of a detection device for detecting an object to be detected according to the present invention.
- FIG. 6 is a flow chart of a detection method for detecting an object to be detected according to the present invention.
- FIG. 7A shows a schematic diagram of an image capturing unit capturing a sampling image in a sampling direction on a positioning point of a reference object according to an exemplary embodiment of the present invention.
- FIG. 7B shows a schematic diagram of an image capturing unit capturing a sampling image in another sampling direction on the positioning point of the fiducial object according to an exemplary embodiment of the present invention.
- FIGS. 8A-8E are photographs of different sampled images captured on a fiducial object according to an exemplary embodiment of the present invention.
- FIGS. 8F-8J are photographs of different inspection images captured on an object to be inspected according to an exemplary embodiment of the present invention.
- FIGS. 9A and 9B are sampling images of different gemstones sampled with the same processing method in the same sampling direction according to an exemplary embodiment of the present invention.
- FIG. 10A is a photograph of antique utensils that share the same pattern but are not the same object.
- FIGS. 10B and 10C are sampled images of the different antique utensils shown in FIG. 10A, sampled in the same sampling direction according to an exemplary embodiment of the present invention.
- Coupled is defined as connected, whether directly or indirectly through intervening elements, and is not necessarily limited to physical connections.
- comprising means "including but not limited to," which explicitly indicates the open-ended inclusion or relationship of the stated combinations, groups, series, and equivalents.
- any one or more of the disclosed encoding functions or algorithms described in this disclosure may be implemented by hardware, software, or a combination of software and hardware.
- the functions described may correspond to modules, which may be software, hardware, firmware, or any combination thereof.
- Software implementations may include computer-executable instructions stored on a computer-readable medium, such as a memory or other type of storage device.
- a microprocessor or a general-purpose computer with communication processing capabilities can be programmed with executable instructions to perform one or more of the disclosed functions or algorithms.
- a microprocessor or general-purpose computer may be implemented using application-specific integrated circuits (ASICs), programmable logic arrays, and/or one or more digital signal processors (DSPs).
- Computer-readable media include but are not limited to random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), magnetic cartridges, magnetic tapes, magnetic disk storage devices, or any other equivalent medium capable of storing computer-readable instructions.
- the coupling between the devices of the present invention can adopt customized protocols or follow existing standards or de facto standards, including but not limited to Ethernet, the IEEE 802.11 or IEEE 802.15 series, wireless USB, or telecommunication standards (including but not limited to GSM (Global System for Mobile Communications), CDMA2000 (Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), WiMAX (Worldwide Interoperability for Microwave Access), 3GPP-LTE (Long Term Evolution) or TD-LTE (Time Division Long Term Evolution)).
- each apparatus of the present invention may include any device configured to transmit and/or store data to, and receive data from, a computer-readable medium.
- each apparatus of the present invention may include a computer system interface that may enable data to be stored on or received from a storage device.
- each device of the present invention may include a chipset or dedicated bus protocol that supports the Peripheral Component Interconnect (PCI) and Peripheral Component Interconnect Express (PCIe) bus protocols, the Universal Serial Bus (USB) protocol, I2C, or any other logical and physical structure that can be used to interconnect peer devices.
- FIG. 1A is a block diagram of a sampling system 101 provided by the present invention for establishing identification data for a reference object and sampling.
- the sampling system 101 includes a server 10 and a sampling device 20 .
- the server 10 may include an internal storage device 11 to store the sampled results.
- when the sampling device 20 receives a sampling request, the sampling device 20 can be coupled to the server 10 to complete the sampling method of the present invention. After the sampling system 101 completes the sampling method, the sampling device 20 can be decoupled from the server 10 .
- FIG. 1B is a block diagram of another sampling system 102 provided by the present invention for establishing identification data for fiducial objects and sampling.
- the sampling system 102 includes a server 10 , a sampling device 20 and an online storage device 30 .
- when the sampling device 20 receives a sampling request, the sampling device 20 can be coupled to the server 10, and the server 10 can be further coupled to the online storage device 30 to complete the sampling method of the present invention.
- the sampling device 20 can be decoupled from the server 10 , and the server 10 can also be decoupled from the online storage device 30 .
- when the sampling device 20 receives a sampling request, the sampling device 20 can be coupled with the server 10 and the online storage device 30 to complete the sampling method of the present invention. After the sampling system 102 completes the sampling method, the sampling device 20 can be decoupled from the server 10 and the online storage device 30 .
- the online storage device 30 may be a network data storage device or a blockchain storage device.
- when the online storage device 30 is the blockchain storage device, the possibility of tampering with or replacement of the sampling results can be reduced through the characteristics of the blockchain.
- the online storage device 30 can store the records of all transactions of the reference object, the time and result of the inspection, and the updated pictures.
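- purely as an illustration of such an append-only record, the sketch below assumes a simple hash-chained structure (the names and fields are hypothetical and stand in for whatever blockchain implementation the online storage device 30 actually uses); it shows why an altered sampling or inspection record would be detectable.

```python
import hashlib
import json
import time

def record_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the hash of the previous entry."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

class AppendOnlyLedger:
    """Toy hash-chained store for reference-object records (illustrative only)."""

    def __init__(self):
        self.entries = []  # list of (record, hash) pairs

    def append(self, record: dict) -> str:
        prev = self.entries[-1][1] if self.entries else "0" * 64
        h = record_hash(record, prev)
        self.entries.append((record, h))
        return h

    def verify(self) -> bool:
        prev = "0" * 64
        for record, h in self.entries:
            if record_hash(record, prev) != h:
                return False  # chain broken: some stored record was altered
            prev = h
        return True

ledger = AppendOnlyLedger()
ledger.append({"object_id": "ref-001", "event": "sampling", "time": time.time()})
ledger.append({"object_id": "ref-001", "event": "inspection", "result": "genuine"})
print(ledger.verify())  # True while no stored record has been tampered with
```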
- FIG. 2 is a block diagram of a sampling device 20 provided by the present invention for establishing identification data for a reference object and sampling.
- the sampling device 20 may be a mobile phone, a tablet computer, a desktop computer, a notebook computer, a camera, a video recorder, or other electronic devices, etc., which are not limited herein.
- the sampling device 20 includes a movement control unit 21 , an image capture unit 22 , a processor 23 , a storage 24 and a transmission unit 25 .
- the movement control unit 21 is used to enable the sampling device 20 to achieve the movement required for the sampling process. In one embodiment, the movement control unit 21 is used to make the image capture unit 22 complete the movement when sampling the reference object. In one embodiment, the movement control unit 21 can be a display screen on the sampling device 20 or an automatic movement device coupled with the image capture unit 22 .
- the display screen can be used to provide a movement instruction to the user to instruct the user to move the image capture unit 22 to the sampling position.
- the sampling position can be determined by a sampling distance and a sampling direction. For example, when a camera distance between the image capture unit 22 and a positioning point on the reference object is equal to the sampling distance, and a camera direction of the image capture unit 22 toward the positioning point on the reference object is equal to the sampling direction, the display screen indicates that the image capturing unit 22 has moved to the sampling position and can start to obtain the desired sampling image. When the camera distance is not equal to the sampling distance, the display screen may instruct the user to move the image capturing unit 22 further to shorten or lengthen the camera distance.
- when the camera direction is not equal to the sampling direction, the display screen may instruct the user to move the image capturing unit 22 to adjust the camera direction to the left, right, up or down.
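- as a rough illustration of the guidance logic described above, the sketch below compares the current camera distance and camera direction with the sampling distance and sampling direction and produces a movement hint; the function name, angle convention and tolerance values are assumptions, not part of the disclosed device.

```python
def movement_instruction(camera_distance, camera_direction,
                         sampling_distance, sampling_direction,
                         dist_tol=0.002, angle_tol=2.0):
    """Return a simple hint telling the user how to move the capture unit.

    camera_direction / sampling_direction are (azimuth, elevation) in degrees.
    The names and tolerances here are illustrative assumptions.
    """
    hints = []
    if camera_distance > sampling_distance + dist_tol:
        hints.append("move closer")
    elif camera_distance < sampling_distance - dist_tol:
        hints.append("move farther away")

    d_az = camera_direction[0] - sampling_direction[0]
    d_el = camera_direction[1] - sampling_direction[1]
    if abs(d_az) > angle_tol:
        hints.append("pan left" if d_az > 0 else "pan right")
    if abs(d_el) > angle_tol:
        hints.append("tilt down" if d_el > 0 else "tilt up")

    return "in position, capture now" if not hints else ", ".join(hints)

print(movement_instruction(0.105, (31.0, 44.0), 0.100, (30.0, 45.0)))
```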
- the sampling position may be determined by the sampling direction.
- the sampling distance can be adjusted by adjusting the focal length of the image capturing unit 22 to ensure that the surface information of the details of the reference object is obtained.
- the automatic moving device can be a robotic arm or another device that can move the image capture unit 22 , and the automatic moving device can receive the sampling position indicated by the processor 23 and move the image capturing unit 22 to the sampling position.
- the sampling distance can be ensured by adjusting the focal length of the image capturing unit 22 to obtain the surface information of the details of the reference object. Therefore, the automatic moving device only needs to adjust the sampling direction by moving the image capturing unit 22 to obtain the desired sampling image.
- the image capturing unit 22 is used for acquiring a plurality of sampled images.
- the image capture unit 22 can move to a plurality of different sampling positions, so that the image capture unit 22 can capture images of the reference object at the different sampling positions to obtain the plurality of sampled images.
- the image capturing unit 22 can be a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, or a camera.
- the image capturing unit 22 may include a high-magnification image capturing lens to obtain the surface information of the details of the fiducial object.
- the image capturing unit 22 may include a microscope image capturing lens.
- the plurality of sampled images captured by the image capturing unit 22 are all surface microscopic images of the reference object.
- the surface microscopic image is a surface texture image.
- the processor 23 and the storage 24 are coupled to each other.
- the storage 24 stores a plurality of instructions for the processor 23 to execute the sampling method of the sampling device 20 according to the plurality of instructions stored in the storage 24 .
- the storage 24 stores a sampling procedure 240 .
- the sampling program 240 further includes a positioning module 241 and a sampling module 242 .
- the positioning module 241 is used for enabling the processor 23 to assist the moving control unit 21 to move the image capturing unit 22 to the sampling position.
- the sampling module 242 is used for enabling the processor 23 to assist the image capturing unit 22 to obtain the required sampling image at the sampling position.
- Transmission unit 25 may utilize custom protocols or follow existing or de facto standards, including but not limited to Ethernet, the IEEE 802.11 or IEEE 802.15 series, wireless USB, or telecommunications standards (including but not limited to GSM, CDMA2000, TD-SCDMA, WiMAX, 3GPP-LTE or TD-LTE), so as to transmit the sampling feature to devices other than the sampling device 20 .
- FIG. 3 is a flow chart of a sampling method 300 provided by the present invention for establishing identification data for a reference object and sampling.
- the sampling method 300 shown in FIG. 3 is merely an example, as there are many ways to perform the described sampling method 300 .
- the sampling method 300 may be performed using the configurations shown in FIGS. 1A , 1B and 2 , and while the sampling method 300 is described, please refer to the various elements in FIGS. 1A , 1B and 2 .
- Each step shown in FIG. 3 can represent one or more processes, methods or subroutines to be executed, and the sequence of the steps can be adjusted arbitrarily as long as the essence of the sampling method 300 does not deviate from the scope of its technical solution.
- in step S310, the sampling device 20 obtains the positioning point on the reference object.
- the sampling device 20 can set the positioning point of the reference object by itself. In one embodiment, when the sampling device 20 receives a request to establish the identification data for the reference object, the sampling device 20 can select the positioning point on the reference object by itself. In one embodiment, the image capturing unit 22 of the sampling device 20 can obtain an overall image of the reference object, and the user can select a position of the reference object from the overall image as the positioning point. In another embodiment, the image capture unit 22 of the sampling device 20 can obtain an overall image of the reference object, and randomly or through a predetermined selection method, select a position on the reference object as the positioning point.
- the sampling device 20 may receive the positioning point for the fiducial from a device other than the sampling device 20 .
- the sampling device 20 may further include a receiving unit (not shown).
- the receiving unit can receive data supplied to the sampling device 20 from devices other than the sampling device 20 .
- the receiving unit and the transmitting unit 25 can be integrated into a communication unit.
- the image capturing unit 22 of the sampling device 20 can obtain an overall image of the reference object and transmit the overall image to the server 10 through the transmission unit 25 .
- when the server 10 receives the request for establishing the identification data for the reference object and the overall image, the server 10 can select the positioning point from the overall image based on a preset selection method and send the positioning point back to the receiving unit of the sampling device 20 .
- the server 10 may store the location point in the internal storage device 11 in the server 10 .
- the server 10 may transmit the location point to the online storage device 30 for storage.
- in step S320, the sampling device 20 acquires a plurality of sampled images along a plurality of sampling directions at the positioning point.
- the sampling device 20 can set the sampling directions of the positioning point by itself. In the embodiment, when the sampling device 20 starts to capture the sampled images for the positioning point, the sampling device 20 can select the plurality of sampling directions for the positioning point by itself. In one embodiment, the sampling device 20 further includes a positioning unit (not shown in the figure), and the positioning unit may include a positioning device such as a gyroscope. In one embodiment, when the image capturing unit 22 captures a sampling image of the positioning point, the sampling device 20 can also record the sampling direction corresponding to that sampling image through the positioning unit. In another embodiment, the sampling device 20 may select the plurality of sampling directions in advance based on a preset orientation method, and use the movement control unit 21 to make the image capturing unit 22 capture the plurality of sampling images in the plurality of sampling directions.
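- a minimal sketch of the direction-tagged capture loop implied here, assuming hypothetical read_orientation and capture_frame callables in place of the real positioning unit and image capturing unit 22:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class SampledImage:
    direction: Tuple[float, float]   # (azimuth, elevation) in degrees
    pixels: bytes                    # raw image data (placeholder)

def capture_with_direction(read_orientation: Callable[[], Tuple[float, float]],
                           capture_frame: Callable[[], bytes],
                           n_samples: int) -> List[SampledImage]:
    """Pair every captured frame with the orientation reported at capture time.

    read_orientation and capture_frame are stand-ins for the positioning unit
    and the image capturing unit; the real device APIs are not specified here.
    """
    samples = []
    for _ in range(n_samples):
        direction = read_orientation()   # e.g. gyroscope reading at capture time
        frame = capture_frame()          # sampling image taken in that direction
        samples.append(SampledImage(direction, frame))
    return samples

fake_samples = capture_with_direction(
    read_orientation=lambda: (30.0, 45.0),   # stand-in for a gyroscope reading
    capture_frame=lambda: b"\x00" * 16,      # stand-in for a captured frame
    n_samples=3,
)
print(len(fake_samples))  # 3 direction-tagged sample images
```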
- sampling device 20 may receive the plurality of sampling directions from devices other than sampling device 20 .
- the server 10 can set the sampling directions based on the preset orientation method and, when sending the positioning point back to the receiving unit of the sampling device 20, send the set sampling directions back to the receiving unit together.
- the server 10 may store the plurality of sampling directions in the internal storage device 11 of the server 10 .
- the server 10 may transmit the plurality of sampling directions to the online storage device 30 for storage.
- the server 10 may not store the plurality of sampling directions in the internal storage device 11 and also not transmit them to the online storage device 30 . In other words, the sampling system does not store the plurality of sampling directions.
- the image capturing unit 22 obtains the plurality of sampled images based on the plurality of sampling directions.
- the image capturing unit 22 can obtain a plurality of first sampling images based on a first sampling direction among the plurality of sampling directions, and can obtain a plurality of second sampling images based on a second sampling direction among the plurality of sampling directions.
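- the sampling correspondence between directions and images can be pictured as a simple mapping; the layout below is only an assumed illustration of the data involved, not the format actually used by the sampling device 20:

```python
from collections import defaultdict

# Hypothetical layout: each sampling direction maps to the images captured along it.
sampling_correspondence = defaultdict(list)

first_direction = (30.0, 45.0)    # illustrative (azimuth, elevation) in degrees
second_direction = (120.0, 30.0)

sampling_correspondence[first_direction].extend([b"first-image-1", b"first-image-2"])
sampling_correspondence[second_direction].extend([b"second-image-1", b"second-image-2"])

# In the embodiment where the set of sampled images itself serves as the sampling
# feature, that feature is simply the images plus this correspondence.
sampling_feature = {
    "directions": list(sampling_correspondence.keys()),
    "correspondence": dict(sampling_correspondence),
}
print(len(sampling_feature["directions"]))  # 2 sampling directions recorded
```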
- in step S330, the sampling device 20 transmits the sampling feature created from the plurality of sampled images for storage by a network device.
- the network device may be the internal storage device 11 or the online storage device 30 in the server 10 .
- the sampling device 20 can transmit the sampling features established by the plurality of sampled images to the server 10 through the transmitting unit 25 .
- the server 10 may store the sampled features in the internal storage device 11 in the server 10 , or the server 10 may transmit the sampled features to the online storage device 30 for storage.
- the sampling device 20 may also transmit the sampling feature established from the plurality of sampled images to the server 10 through the transmitting unit 25, and the server 10 obtains a verification feature through a preset image processing method.
- the verification feature is stored in the internal storage device 11 in the server 10, or the server 10 may transmit the verification feature to the online storage device 30 for storage.
- the sampling device 20 can directly transmit the sampling features established by the plurality of sampled images to the online storage device 30 through the transmitting unit 25 .
- the sampled feature may be a sampled set of the plurality of sampled images.
- the plurality of sampling images, the plurality of sampling directions, and the sampling correspondence between them together constitute the sampling feature.
- when the server 10 receives the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence, the server 10 can obtain the verification feature according to a preset image processing method and store the verification feature in the internal storage device 11 or the online storage device 30 .
- after the server 10 receives the plurality of sampling images, the plurality of sampling directions, and the sampling correspondence, the server 10 may also directly store them in the internal storage device 11 or the online storage device 30, and the plurality of sampling images, the plurality of sampling directions and the sampling correspondence are directly used as the subsequent verification feature.
- the sampling feature may be the verification feature generated based on the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence.
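- the preset image processing method is left open by this disclosure; purely as an assumed stand-in, the sketch below reduces each image to simple intensity statistics per direction and fingerprints the result, to show how a compact verification feature could be derived from the images, directions and correspondence:

```python
import hashlib
from statistics import mean, pstdev

def texture_descriptor(pixels: bytes) -> tuple:
    """Toy stand-in for the unspecified preset image processing method:
    summarise an image by simple intensity statistics."""
    values = list(pixels)
    return (round(mean(values), 3), round(pstdev(values), 3))

def build_verification_feature(correspondence: dict) -> dict:
    """Reduce (sampling direction -> images) into a compact verification feature."""
    per_direction = {
        direction: [texture_descriptor(img) for img in images]
        for direction, images in correspondence.items()
    }
    fingerprint = hashlib.sha256(
        repr(sorted(per_direction.items())).encode("utf-8")
    ).hexdigest()
    return {"per_direction": per_direction, "fingerprint": fingerprint}

feature = build_verification_feature({
    (30.0, 45.0): [bytes([10, 12, 11, 13])],
    (120.0, 30.0): [bytes([200, 198, 202, 201])],
})
print(feature["fingerprint"][:16])  # short id for the stored verification feature
```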
- the processor 23 of the sampling device 20 can automatically obtain the verification feature based on the predetermined image processing method.
- the sampling device 20 may transmit the verification feature to the server 10 via the transmitting unit 25 .
- the server 10 may store the verification feature in the internal storage device 11 in the server 10 , or the server 10 may transmit the verification feature to the online storage device 30 for storage.
- the sampling device 20 can directly transmit the verification feature to the online storage device 30 through the transmitting unit 25 for storage.
- the sampling device 20 can also transmit the positioning point to the server 10 or the online storage device 30 for storage.
- the sampling device 20 also transmits the object information of the reference object, and the online storage device 30 or the internal storage device 11 stores the object information, so that when an object to be detected is to be detected later, the server 10 can compare the object information with the information of the object to be detected to determine whether the server 10 should retrieve the positioning point and the verification feature of the reference object as identification information for detecting the object to be detected.
- since the reference object changes slightly over time, if a long period of time (for example, 10 years or 20 years) passes and the same sampling steps are performed on the reference object again, the second sampling result will be different from the sampling feature or the verification feature stored in the server 10 or the online storage device 30, resulting in the genuine reference object being identified as a fake. Therefore, the sampling device 20 can transmit a change message to the server 10 at the same time as transmitting the sampling feature to the server 10 .
- the change information may be the material information or the object information of the reference object, and the server 10 can search for deterioration information of the reference object according to the material information or the object information, so that when the sampling feature is subsequently used to detect an object to be detected, the possible deterioration status of the reference object can be considered together.
- the server 10 may store the degradation information in the internal storage device 11 or the online storage device 30 .
- the server 10 may directly store the change information in the internal storage device 11 or the online storage device 30 for subsequent detection of the to-be-detected object, and then search for the deterioration information according to the material information and the object information .
- the change information may be the deterioration information of the reference object.
- the sampling device 20 can search for the degradation information of the reference object according to the material information or the object information of the reference object, and transmit the degradation information to the server 10 or directly to the online storage device 30 .
- when the server 10 receives the degradation information, the server 10 can store the degradation information in the internal storage device 11 or the online storage device 30 .
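- one way to picture how the deterioration information could be considered together at detection time is to relax the match threshold as the reference object ages; the linear drift model and the numbers below are illustrative assumptions only:

```python
def match_threshold(base_threshold: float, years_elapsed: float,
                    annual_drift: float) -> float:
    """Relax the similarity threshold as the reference object ages.

    annual_drift would come from the deterioration information looked up for the
    reference object's material; the linear model is only an illustration.
    """
    return max(0.0, base_threshold - annual_drift * years_elapsed)

def is_genuine(similarity: float, base_threshold: float = 0.90,
               years_elapsed: float = 0.0, annual_drift: float = 0.005) -> bool:
    return similarity >= match_threshold(base_threshold, years_elapsed, annual_drift)

# A 20-year-old reference object is allowed to drift further from its stored
# sampling feature before being flagged as a fake.
print(is_genuine(0.85, years_elapsed=20))   # True with these assumed numbers
```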
- the sampling method 300 of the present invention may include, but is not limited to, the following embodiments:
- the set of the plurality of sampled images is directly stored in the internal storage device 11 or the online storage device 30 as the verification feature, and the positioning point and the plurality of sampling directions are all determined by the sampling device 20 .
- the sampling device 20 first sets the positioning point and the sampling directions by itself, obtains the sampling images based on the positioning point and the sampling directions, and uses the set of the sampling images as the sampling feature, which is then sent to the server 10 or directly to the online storage device 30 for storage together with the positioning point, the sampling directions, and the sampling correspondence between the sampling images and the sampling directions.
- when the server 10 receives the plurality of sampled images, the positioning point, the plurality of sampling directions, and the sampling correspondence, the server 10 can store the plurality of sampled images, the positioning point, the plurality of sampling directions and the sampling correspondence in the internal storage device 11 or the online storage device 30 .
- when the data transmitted by the sampling device 20 further includes the change information, the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information or the deterioration information.
- the set of the plurality of sampled images is directly stored in the internal storage device 11 or the online storage device 30 as the verification feature, and one of the positioning point and the plurality of sampling directions is determined by the sampling device 20 while the other is determined by the server 10 .
- when the sampling directions are determined by the sampling device 20 and the positioning point is determined by the server 10, the sampling device 20, upon receiving a request to establish the identification data for the reference object, can obtain the overall image of the reference object and transmit it to the server 10, and the server 10 can select the positioning point from the overall image and provide it to the sampling device 20 .
- the sampling device 20 can also select the plurality of sampling directions based on the preset orientation.
- when the positioning point is determined by the sampling device 20 and the plurality of sampling directions are determined by the server 10, the sampling device 20, upon receiving a request to establish the identification data for the reference object, can obtain the overall image of the reference object and select the positioning point from the overall image based on a preset selection method. At the same time, the sampling device 20 transmits the sampling request to the server 10 , so that the server 10 selects the plurality of sampling directions based on the preset orientation method and provides them to the sampling device 20 .
- after obtaining the positioning point and the sampling directions, the sampling device 20 further obtains the sampling images, uses the set of the sampling images as the sampling feature, and transmits it, together with whichever of the positioning point and the plurality of sampling directions it determined and the sampling correspondence, to the server 10 or directly to the online storage device 30 for storage.
- the server 10 may store the plurality of sampled images, the positioning point, the plurality of sampling directions and the sampling correspondence in the internal storage device 11 or the online storage device 30 .
- when the online storage device 30 directly receives the plurality of sampled images from the sampling device 20 , the online storage device 30 can obtain from the server 10 whichever of the positioning point and the plurality of sampling directions it has not received from the sampling device 20. In the embodiment, if the positioning point is determined by the server 10, the sampling device 20 does not need to additionally transmit the positioning point. If the plurality of sampling directions are determined by the server 10, the sampling device 20 does not need to additionally transmit the plurality of sampling directions, but only needs to transmit the sampling correspondence.
- the data transmitted by the sampling device 20 further includes the change information, and the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information or the deterioration information.
- the set of the plurality of sampled images is directly stored in the internal storage device 11 or the online storage device 30 as the verification feature, and the positioning point and the plurality of sampling directions are all determined by the server 10 .
- when the sampling device 20 receives a request to establish the identification data for the reference object, the sampling device 20 can obtain the overall image of the reference object and transmit it to the server 10, and the server 10 can, based on the preset selection method and the preset orientation method, select the positioning point from the overall image and additionally determine the plurality of sampling directions to provide to the sampling device 20 .
- the sampling device 20 obtains the plurality of sampled images based on the received positioning point and the plurality of sampling directions, uses the set of the plurality of sampled images as the sampling feature, and then transmits it together with the sampling correspondence to the server 10 or directly to the online storage device 30 for storage.
- the server 10 may store the sampling images, the positioning point, the sampling directions and the sampling correspondence in the internal storage device 11 or the online storage device 30 .
- when the online storage device 30 directly receives the sampling images and the sampling correspondence from the sampling device 20 , the online storage device 30 can obtain the positioning point and the sampling directions from the server 10 .
- when the data transmitted by the sampling device 20 further includes the change information, the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information or the deterioration information.
- the verification feature generated from the plurality of sampled images by a preset image processing method is stored in the internal storage device 11 or the online storage device 30 , and the positioning point and the plurality of sampling directions are determined by the sampling device 20 .
- the sampling device 20 first sets the positioning point and the sampling directions by itself, and obtains the sampling images based on the positioning point and the sampling directions.
- the sampling device 20 generates the verification feature for the positioning point through a preset image processing method according to the plurality of sampling images, the plurality of sampling directions and the sampling correspondence, and the sampling device 20 stores the verification feature for the location point.
- the verification feature is used as the sampling feature, and is transmitted to the server 10 or directly to the online storage device 30 together with the positioning point for storage.
- the server 10 can store the verification feature and the positioning point in the internal storage device 11 or the online storage device 30 .
- the sampling device 20 may also directly transmit the positioning point, the sampling directions, the sampling images and the sampling correspondence to the server 10, and the server 10 generates the verification feature for the positioning point from the sampling images, the sampling directions and the sampling correspondence through a preset image processing method, and stores it together with the positioning point in the internal storage device 11 or the online storage device 30 .
- when the data transmitted by the sampling device 20 further includes the change information, the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information or the deterioration information.
- the verification feature is stored in the internal storage device 11 or the online storage device 30, and one of the positioning point and the plurality of sampling directions is determined by the sampling device 20 while the other is determined by the server 10 .
- the sampling device 20 and the server 10 respectively generate the positioning point (according to a preset selection method and the overall image of the reference object) and the plurality of sampling directions (according to a preset orientation method), and the plurality of sampling images are obtained accordingly.
- the sampling device 20 generates the verification feature for the positioning point through a preset image processing method according to the plurality of sampling images, the plurality of sampling directions and the sampling correspondence, and the sampling device 20 stores the verification feature for the location point.
- the verification feature is transmitted to the server 10 or directly to the online storage device 30 for storage as the sampled feature.
- the sampling device 20 may additionally transmit the positioning point to the server 10 or directly transmit the positioning point to the online storage device 30 for storage.
- the server 10 when the server 10 receives the verification feature, the server 10 may store the verification feature and the location point in the internal storage device 11 or the online storage device 30 .
- the sampling device 20 may also directly transmit the plurality of sampled images and the sampling correspondence to the server 10 , and the server 10 generates the verification feature for the positioning point through the preset image processing method, and stores the verification feature together with the positioning point in the internal storage device 11 or the online storage device 30 .
- if the sampling directions are determined by the sampling device 20 , the sampling device 20 may additionally transmit the sampling directions to the server 10 .
- the sampling device 20 may additionally transmit the positioning point to the server 10 or directly transmit the positioning point to the online storage device 30 for storage.
- when the data transmitted by the sampling device 20 further includes the change information, the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information or the deterioration information.
- the verification feature is stored in the internal storage device 11 or the online storage device 30, and the positioning point and the plurality of sampling directions are all determined by the server 10 .
- the server 10 generates the positioning point according to a preset selection method and the overall image of the reference object received from the sampling device 20, generates the plurality of sampling directions according to a preset orientation method, and then provides the positioning point and the plurality of sampling directions to the sampling device 20 .
- the sampling device 20 generates the verification feature for the positioning point through a preset image processing method according to the plurality of sampling images, the plurality of sampling directions and the sampling correspondence, and the sampling device 20 stores the verification feature for the location point.
- the verification feature is transmitted to the server 10 or directly to the online storage device 30 for storage as the sampled feature.
- since the positioning point is determined by the server 10, the sampling device 20 does not need to additionally transmit the positioning point.
- the server 10 may store the verification feature and the location point in the internal storage device 11 or the online storage device 30 .
- the sampling device 20 may also directly transmit the plurality of sampled images and the sampling correspondence to the server 10 , and the server 10 generates the verification feature for the positioning point through the preset image processing method, and stores the verification feature together with the positioning point in the internal storage device 11 or the online storage device 30 .
- when the data transmitted by the sampling device 20 further includes the change information, the data stored in the internal storage device 11 or the online storage device 30 may further include the material information, the object information or the deterioration information.
- the preset selection method may be a random selection method for the sampling device 20 or the server 10 to arbitrarily select the positioning point from the overall image of the reference object.
- the predetermined selection method may be through a surface analysis technique for the sampling device 20 or the server 10 to select the positioning point from the overall image of the reference object.
- the surface analysis technology may be an image analysis technology such as image complexity analysis to obtain a high-complexity region in the overall image as a positioning point.
- the preset selection method includes but is not limited to the above-mentioned selection method. Any method that can be used to select a specific location of the fiducial can be used for the sampling method 300 described herein.
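- as a rough sketch of the image-complexity selection mentioned above, the function below scans an overall image in fixed windows and returns the window with the highest local variance as the positioning point; the window size and the use of variance as the complexity measure are assumptions:

```python
def pick_anchor_point(image, window=16):
    """Pick the window with the highest local variance as the positioning point.

    image is a 2D list of grayscale values; local variance stands in for the
    unspecified image-complexity measure. Returns the window's top-left corner.
    """
    h, w = len(image), len(image[0])
    best, best_score = (0, 0), -1.0
    for y in range(0, h - window + 1, window):
        for x in range(0, w - window + 1, window):
            block = [image[y + dy][x + dx] for dy in range(window) for dx in range(window)]
            m = sum(block) / len(block)
            var = sum((v - m) ** 2 for v in block) / len(block)
            if var > best_score:
                best, best_score = (x, y), var
    return best

tiny = [[(x * y) % 7 for x in range(32)] for y in range(32)]
print(pick_anchor_point(tiny, window=8))  # top-left corner of the busiest window
```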
- the preset orientation mode may be a random orientation mode for the sampling device 20 or the server 10 to arbitrarily select the plurality of sampling directions.
- the plurality of sampling directions can be selected by a surface analysis technique according to the positioning point.
- the surface analysis technology can set more sampling directions for the part with higher degree of color change around the positioning point according to the gradient of the pixel value change, so as to obtain more complete sampling features around the positioning point.
- the preset orientation mode includes but is not limited to the above-mentioned orientation mode. Any method that can be used to select a plurality of different sampling directions can be used for the sampling method 300 described herein.
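- the gradient-weighted allocation of sampling directions described above could look roughly like the sketch below; the angular sectors, the proportional rule and the counts are illustrative assumptions:

```python
def allocate_directions(gradient_by_sector: dict, total_directions: int) -> dict:
    """Give each angular sector around the positioning point a share of the
    sampling directions proportional to its colour-change gradient.

    gradient_by_sector maps a sector label (e.g. 'N', 'E', ...) to an average
    gradient magnitude; both the sectors and the proportional rule are assumptions.
    """
    total = sum(gradient_by_sector.values()) or 1.0
    return {sector: max(1, round(total_directions * g / total))
            for sector, g in gradient_by_sector.items()}

print(allocate_directions({"N": 0.8, "E": 0.1, "S": 0.4, "W": 0.2}, total_directions=12))
```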
- the preset image processing method may be three-dimensional modeling processing.
- the sampling device 20 or the server 10 may use machine learning to build a sampling model according to the sampling images, the sampling directions, and the sampling correspondence between the sampling images and the sampling directions.
- the sampling model may be a three-dimensional sampling model.
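- the machine-learned three-dimensional sampling model is not specified in this disclosure; the sketch below substitutes a trivial nearest-direction lookup so the idea of predicting the expected appearance for a given direction is concrete (the model keys, descriptor values and angular metric are assumptions):

```python
def angular_distance(a: float, b: float) -> float:
    """Smallest absolute difference between two azimuth angles in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def predict_descriptor(model: dict, direction: float) -> float:
    """Nearest-neighbour stand-in for the learned sampling model: look up the
    sampled direction closest to the requested one. A real implementation would
    fit a 3-D model rather than use a lookup table."""
    nearest = min(model, key=lambda az: angular_distance(az, direction))
    return model[nearest]

sampling_model = {0.0: 0.31, 90.0: 0.48, 180.0: 0.27, 270.0: 0.52}  # illustrative values
print(predict_descriptor(sampling_model, 100.0))   # 0.48 (closest to 90 degrees)
```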
- the sampling device 20 can select a plurality of positioning points for the reference object and obtain, for each positioning point, a plurality of sampling images along a plurality of sampling directions, thereby obtaining different sampling features for different positioning points of the reference object.
- the plurality of sampling directions for each anchor point may be completely different.
- each positioning point can use the same set of sampling combinations, and the sampling combination has multiple sampling directions, so the multiple sampling directions among the positioning points are completely the same.
- the plurality of sampling directions of each positioning point may be partially the same and partially different.
- FIG. 4A is a block diagram of a detection system 401 for detecting an object to be detected provided by the present invention.
- the detection system 401 includes the server 10 and the detection device 40 .
- the server 10 may include an internal storage device 11 to store in advance the sampling characteristics obtained by the sampling device 20 of FIG. 2 .
- when the detection device 40 receives a detection request, the detection device 40 can be coupled with the server 10 to complete the detection method of the present invention. After the detection system 401 completes the detection method, the detection device 40 can be decoupled from the server 10 .
- FIG. 4B is a block diagram of another detection system 402 for detecting an object to be detected provided by the present invention.
- the detection system 402 includes the server 10 , the detection device 40 and the online storage device 30 .
- when the detection device 40 receives a detection request, the detection device 40 can be coupled with the server 10, and the server 10 can be further coupled with the online storage device 30 to complete the detection method described in the present invention.
- the detection device 40 can be decoupled from the server 10 , and the server 10 can also be decoupled from the online storage device 30 .
- when the detection device 40 receives a detection request, the detection device 40 can be coupled with the server 10 and the online storage device 30 to complete the detection method of the present invention. After the detection system 402 completes the detection method, the detection device 40 can be decoupled from the server 10 and the online storage device 30 .
- the online storage device 30 may be a network data storage device or a blockchain storage device.
- when the online storage device 30 is the blockchain storage device, the possibility of tampering with or replacement of the pre-stored sampling features can be reduced through the characteristics of the blockchain.
- the online storage device 30 can store the records of all transactions of the reference object, the time and result of the inspection, and the updated pictures.
- FIG. 5 is a block diagram of a detection device 40 for detecting an object to be detected provided by the present invention.
- the detection device 40 may be a mobile phone, a tablet computer, a desktop computer, a notebook computer, a camera, a video recorder, or other electronic devices, etc., which are not limited herein.
- the detection device 40 includes a movement control unit 41 , an image capture unit 42 , a processor 43 , a storage 44 , a transmission unit 45 and a reception unit 46 .
- the movement control unit 41 is used to enable the detection device 40 to achieve the movement required for the detection process. In one embodiment, the movement control unit 41 is used for enabling the image capture unit 42 to complete the movement when the object to be detected is detected. In one embodiment, the movement control unit 41 can be a display screen on the detection device 40 or an automatic movement device coupled with the image capture unit 42 .
- the display screen can be used to provide a movement instruction to the user to instruct the user to move the image capture unit 42 to the detection position.
- the detection position can be determined by a detection distance and a detection direction. For example, when a camera distance between the image capturing unit 42 and a positioning point on the object to be detected is equal to the detection distance, and a camera direction of the image capturing unit 42 toward the positioning point on the object to be detected is equal to the detection direction, the display screen indicates that the image capturing unit 42 has moved to the detection position and can start to acquire the desired detection image.
- when the camera distance is not equal to the detection distance, the display screen may instruct the user to move the image capturing unit 42 further to shorten or lengthen the camera distance.
- when the camera direction is not equal to the detection direction, the display screen may instruct the user to move the image capturing unit 42 to adjust the camera direction to the left, right, up or down.
- the detection position may be determined by the detection direction. In the described embodiment, the detection distance can be ensured by adjusting the focal length of the image capturing unit 42 to obtain the surface information of the details of the object to be detected.
- the automatic moving device can be a robotic arm or another device that can move the image capturing unit 42 , and the automatic moving device can receive the detection position indicated by the processor 43 and move the image capturing unit 42 to the detection position.
- the detection distance can be ensured by adjusting the focal length of the image capturing unit 42 to obtain the surface information of the details of the object to be detected. Therefore, the automatic moving device only needs to adjust the detection direction by moving the image capturing unit 42 to obtain the required detection image.
- the image capturing unit 42 is used to obtain a plurality of detection images.
- the image capture unit 42 can be moved to a plurality of different detection positions, so that the image capture unit 42 can capture images of the object to be detected at different detection positions to obtain the plurality of detection images.
- the image capturing unit 42 may be a charge-coupled device (CCD) image sensor, a complementary metal-oxide-semiconductor (CMOS) image sensor, or a camera.
- the image capturing unit 42 may include a high-magnification image capturing lens to obtain the surface information of the details of the object to be inspected.
- the image capture unit 42 may include a microscope image capture lens.
- the plurality of detection images captured by the image capturing unit 42 are all surface microscopic images of the object to be detected. In such embodiments, the surface microscopic image is a surface texture image.
- the processor 43 and the storage 44 are coupled to each other.
- the storage 44 stores a plurality of instructions for the processor 43 to execute the detection method of the detection device 40 according to the plurality of instructions stored in the storage 44 .
- the storage 44 stores the detection program 440 .
- the detection program 440 further includes a positioning module 441 and a detection module 442 .
- the positioning module 441 is used for enabling the processor 43 to assist the moving control unit 41 to move the image capturing unit 42 to the detection position.
- the detection module 442 is used to enable the processor 43 to assist the image capture unit 42 to obtain the required detection image at the detection position.
- Transmitting unit 45 and receiving unit 46 may utilize custom protocols or follow existing or de facto standards, including but not limited to Ethernet, the IEEE 802.11 or IEEE 802.15 series, wireless USB, or telecommunication standards (including but not limited to GSM, CDMA2000, TD-SCDMA, WiMAX, 3GPP-LTE or TD-LTE), so as to transmit the detection features to devices other than the detection device 40 and to receive the sampling features transmitted from devices other than the detection device 40.
- FIG. 6 is a flowchart of a detection method 600 for detecting an object to be detected provided by the present invention.
- the detection method 600 shown in FIG. 6 is merely an example, because there are many ways to perform the detection method.
- Detection method 600 may be performed using the configurations shown in FIGS. 4A , 4B, and 5, and while describing detection method 600, please refer to the various elements in FIGS. 4A, 4B, and 5.
- Each step shown in FIG. 6 can represent one or more processes, methods or subroutines to be executed, and the sequence of the steps can be adjusted arbitrarily as long as the essence of the detection method 600 does not deviate from the scope of its technical solution.
- in step S610, the detection device 40 transmits a request to detect the object to be detected.
- before the detection device 40 captures the detection images, it needs to know which region of the object to be detected should be imaged, so that the captured images can be compared with the data stored in the online storage device 30 or the internal storage device 11 . Therefore, the detection device 40 can first transmit a request for the positioning point of the object to be detected, so as to obtain the positioning point of the object to be detected. In the embodiment, the detection device 40 may transmit the request to the server 10, and the request may include the object information of the object to be detected. Therefore, the server 10 can use the object information of the object to be detected to determine which reference object's corresponding positioning point and verification feature to retrieve as identification data for the object to be detected. In the embodiment, the corresponding positioning point of the reference object is the positioning point of the object to be detected.
- the server 10 transmits the corresponding positioning point of the reference object to the detection device 40 .
- the server 10 will first transmit a request for the positioning point of the object to be detected to the online storage device 30, so that the online storage device 30 can confirm which reference object's positioning point and verification feature should be retrieved as identification data for detecting the object to be detected; the online storage device 30 then returns the positioning point and the verification feature to the server 10, and the server 10 transmits the positioning point to the detection device 40.
- the online storage device 30 will also transmit the change information of the reference object to the server 10 .
- In step S620, the detection device 40 receives the information containing the positioning point of the object to be detected.
- the server 10 transmits the positioning point of the object to be detected to the detection device 40 for use by the detection device 40 for subsequent image capture.
- if the verification feature obtained by the server 10 is a plurality of sampled images of the corresponding positioning point of the reference object, the information transmitted by the server 10 will also provide the respective sampling directions of the plurality of sampled images, so that the detection device 40 can obtain the detection images along the same detection directions.
- the server 10 can also transmit the verification feature of the reference object to the detection device 40 .
- the server 10 will also transmit the change information of the reference object to the detection device 40 .
- the server 10 may not transmit the verification feature of the reference object to the detection device 40, but directly retain it in the server 10 for subsequent comparison.
- In step S630, the detection device 40 acquires a plurality of detection images along a plurality of detection directions on the positioning point.
- the detection device 40 can set the detection directions of the positioning point by itself. In the embodiment, when the detection device 40 starts to capture the detection images for the positioning point, the detection device 40 can select the plurality of detection directions for the positioning point by itself. In one embodiment, the detection device 40 further includes a positioning unit (not shown in the figure), and the positioning unit may include a positioning device such as a gyroscope. In one embodiment, when the image capturing unit 42 captures a detection image of the positioning point, the detection device 40 can also record the detection direction corresponding to that detection image by means of the positioning unit. In another embodiment, the detection device 40 may select the detection directions in advance based on a preset orientation method, and use the movement control unit 41 to make the image capturing unit 42 capture the detection images in those detection directions.
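- A minimal sketch of recording, for each captured image, the direction reported by a positioning unit such as a gyroscope is shown below. The `read_gyroscope()` and `capture_image()` callables are hypothetical hardware wrappers, since the disclosure does not specify a driver API.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class DetectionRecord:
    image: object                          # captured detection image
    direction: Tuple[float, float, float]  # yaw, pitch, roll from the gyroscope

def capture_with_directions(n_shots: int,
                            capture_image: Callable[[], object],
                            read_gyroscope: Callable[[], Tuple[float, float, float]]
                            ) -> List[DetectionRecord]:
    """Capture images and record the direction reported at each capture."""
    records = []
    for _ in range(n_shots):
        direction = read_gyroscope()   # hypothetical positioning-unit call
        image = capture_image()        # hypothetical image-capture call
        records.append(DetectionRecord(image=image, direction=direction))
    return records
```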
- the detection device 40 may receive the plurality of detection directions from devices other than the detection device 40. In the embodiment, the detection device 40 can obtain the plurality of detection directions through the receiving unit 46. In one embodiment, if the internal storage device 11 and the online storage device 30 do not store the plurality of sampling directions of the corresponding positioning point of the reference object, the server 10 may select the plurality of detection directions in advance based on a preset orientation method and send them to the receiving unit 46. In another embodiment, if the internal storage device 11 or the online storage device 30 stores the plurality of sampling directions of the corresponding positioning point of the reference object, the plurality of sampling directions can be sent back to the detection device 40 as the plurality of detection directions of the positioning point of the object to be detected.
- the server 10 may transmit the plurality of sampling directions to the receiving unit 46 as the plurality of detection directions.
- the online storage device 30 can directly transmit the plurality of sampling directions to the receiving unit 46 as the plurality of detection directions, or the plurality of sampling directions can be transmitted indirectly through the server 10 to the receiving unit 46 as the plurality of detection directions.
- the image capturing unit 42 obtains the plurality of detection images based on the plurality of detection directions.
- the image capturing unit 42 can obtain a plurality of first detection images based on a first detection direction among the plurality of detection directions, and obtain a plurality of second detection images based on a second detection direction among the plurality of detection directions.
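- The sketch below illustrates one way to acquire several detection images per preset detection direction while keeping the detection correspondence (which images belong to which direction). The `move_to()` and `capture_image()` callables stand in for the movement control unit and the image capturing unit and are assumptions.

```python
from typing import Callable, Dict, List, Sequence, Tuple

Direction = Tuple[float, float, float]

def acquire_detection_images(
    directions: Sequence[Direction],
    images_per_direction: int,
    move_to: Callable[[Direction], None],
    capture_image: Callable[[], object],
) -> Dict[Direction, List[object]]:
    """Return the detection correspondence: direction -> list of images."""
    correspondence: Dict[Direction, List[object]] = {}
    for direction in directions:
        move_to(direction)  # movement control unit positions the capture unit
        correspondence[direction] = [capture_image()
                                     for _ in range(images_per_direction)]
    return correspondence
```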
- In step S640, the detection device 40 obtains a detection result according to the plurality of detection images.
- the detection result is generated based on the comparison of the plurality of detection images and a reference object corresponding to the object to be detected.
- the detection device 40 can compare the plurality of detection images with the verification features of the reference object by itself to obtain the detection result.
- the detection device 40 may generate detection features according to the plurality of detection images, and transmit the detection features to the server 10 for the server 10 to compare the detection features with the verification features; finally, the server 10 transmits the detection result to the detection device 40.
- the detection device 40 compares the plurality of detection images with the received verification feature.
- the verification feature may be a sampling model of the corresponding positioning point of the reference object.
- the sampling model may be a three-dimensional sampling model.
- the detection device 40 may generate the detection feature based on the plurality of detection images, the plurality of detection directions, and a detection correspondence between the plurality of detection images and the plurality of detection directions.
- the processor 43 of the detection device 40 can obtain the detection feature by itself based on a preset image processing method.
- the detection feature is a detection model of the positioning point of the object to be detected.
- the detection model may be a three-dimensional detection model.
- the detection device 40 can directly compare the similarity between the three-dimensional sampling model and the three-dimensional detection model to generate the detection result.
- the detection device 40 may compare with the three-dimensional sampling model based on the plurality of detection images, the plurality of detection directions, and the detection correspondence.
- the detection device 40 can infer a plurality of verification images from the three-dimensional sampling model through the plurality of detection directions, and then compare, one by one according to the detection correspondence, the similarity between the plurality of verification images and the plurality of detection images to generate the detection result.
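- As a sketch of this comparison, assume the three-dimensional sampling model can be imaged from an arbitrary direction by a hypothetical `render(direction)` callable; the per-image normalized cross-correlation and the threshold below are only one possible choice of metric, not the one fixed by the disclosure.

```python
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def compare_against_sampling_model(detection_images, directions, render,
                                   threshold=0.8) -> bool:
    """Render one verification image per detection direction and compare one by one."""
    scores = []
    for image, direction in zip(detection_images, directions):
        verification_image = render(direction)   # inferred from the 3D model
        scores.append(ncc(image, verification_image))
    # Treat the object as matching when the average similarity is high.
    return float(np.mean(scores)) >= threshold
```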
- the verification feature may be the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence of the corresponding positioning point of the reference object.
- the plurality of sampling directions may be exactly the same as the plurality of detection directions, so the detection device 40 may compare, one by one according to the sampling correspondence and the detection correspondence, the similarity between the plurality of sampled images and the plurality of detection images to generate the detection result.
- when the detection result is generated by the server 10, the detection device 40 transmits the detection feature to the server 10.
- the transmitted detection feature may be the three-dimensional detection model or a combination of the plurality of detection images, the plurality of detection directions, and the detection correspondence.
- the detection device 40 may generate the three-dimensional detection model based on the plurality of detection images, the plurality of detection directions, and the detection correspondence.
- the processor 43 of the detection device 40 can obtain the three-dimensional detection model by itself based on a preset image processing method. Therefore, the detection device 40 can directly transmit the 3D detection model to the server 10 , and the server 10 compares the similarity between the 3D sampling model and the 3D detection model to provide the detection result to the detection device 40 .
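- One simple way to compare a three-dimensional sampling model with a three-dimensional detection model, assuming both are stored as height maps over the same grid, is a normalized surface error as sketched below. The height-map representation and the similarity formula are assumptions for illustration only.

```python
import numpy as np

def model_similarity(sampling_model: np.ndarray,
                     detection_model: np.ndarray) -> float:
    """Similarity in [0, 1] between two height maps of identical shape."""
    if sampling_model.shape != detection_model.shape:
        raise ValueError("models must be sampled on the same grid")
    diff = np.abs(sampling_model - detection_model)
    scale = np.abs(sampling_model).max() + 1e-9
    return float(1.0 - np.clip(diff.mean() / scale, 0.0, 1.0))

# A similarity above a chosen threshold (e.g. 0.9) would be reported as
# "the object to be detected is the reference object".
```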
- the detection device 40 may transmit the combination of the plurality of detection images, the plurality of detection directions, and the detection correspondence to the server 10 .
- the server 10 may generate the three-dimensional detection model based on the plurality of detection images, the plurality of detection directions, and the detection correspondence.
- the server 10 can obtain the three-dimensional detection model based on a preset image processing method.
- the server 10 can directly compare the similarity between the three-dimensional sampling model and the three-dimensional detection model, so as to provide the detection result to the detection device 40 .
- the server 10 may compare with the three-dimensional sampling model based on the plurality of detection images, the plurality of detection directions, and the detection correspondence.
- the server 10 can infer a plurality of verification images from the three-dimensional sampling model through the plurality of detection directions, and then compare, one by one according to the detection correspondence, the similarity between the plurality of verification images and the plurality of detection images, so as to provide the detection result to the detection device 40.
- the verification feature may be the plurality of sampled images, the plurality of sampling directions, and the sampling correspondence of the corresponding positioning point of the reference object.
- the plurality of sampling directions may be exactly the same as the plurality of detection directions, so the server 10 may compare, one by one according to the sampling correspondence and the detection correspondence, the similarity between the plurality of sampled images and the plurality of detection images, so as to provide the detection result to the detection device 40.
- the plurality of detection directions are the plurality of sampling directions obtained from the online storage device 30 or the internal storage device 11 .
- if the detection result shows that the similarity between the reference object and the object to be detected is high, the detection result obtained by the detection device 40 can determine that the object to be detected is the reference object. If the detection result shows that the similarity between the reference object and the object to be detected is low, the detection result obtained by the detection device 40 can determine that the object to be detected is different from the reference object. In another embodiment, if the detection result shows that the similarity between the reference object and the object to be detected is low, and the detection result determines that the object to be detected is not the same as the reference object, the detection device 40 or the server 10 can update the detection result by means of the change information of the reference object. In the embodiment, the detection device 40 or the server 10 can adjust the verification feature into an adjustment feature according to the change information of the reference object, and update the detection result according to the comparison of the adjustment feature with the plurality of detection images.
- the change information may be the material information, the object information or the deterioration information of the reference object. If the change information is the material information or the object information, the detection device 40 or the server 10 may search for the deterioration information of the reference object through the network or an internal database according to the material information or the object information. The detection device 40 or the server 10 can estimate the possible degree of deterioration of the reference object based on the deterioration information and the time difference between the detection time point and the sampling time point. Therefore, the detection device 40 or the server 10 can obtain the adjustment feature according to the verification feature and the deterioration degree.
- the deterioration information may be material aging information, and the material aging information may be related to color changes (e.g., fading). If the deterioration information is material aging information, the actual degree of material aging can be estimated by using the material aging information and the time difference. For example, the likely degree of fading can be estimated.
- the deterioration information may be material condition information, which may be related to material decomposition or breakage. If the deterioration information is material condition information, the actual degree of material damage can be estimated from the material condition information and the time difference. For example, the possible crack size or number of cracks can be estimated.
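- The sketch below shows one assumed way to turn deterioration information plus the time difference into an adjustment feature: a linear fading rate applied to the brightness of the sampled images. The rate, the linear model, and the image representation are illustrative assumptions, not the adjustment rule fixed by the disclosure.

```python
import numpy as np

def adjust_for_deterioration(sampled_images, sampling_time, detection_time,
                             fading_rate_per_year=0.02):
    """Return adjusted images approximating the expected fading.

    fading_rate_per_year: assumed fractional brightness loss per year.
    sampling_time / detection_time: datetime objects for the two time points.
    """
    years = (detection_time - sampling_time).days / 365.25
    factor = max(0.0, 1.0 - fading_rate_per_year * years)
    # Assumes 8-bit grayscale or RGB sampled images stored as numpy arrays.
    return [np.clip(img.astype(float) * factor, 0, 255).astype(np.uint8)
            for img in sampled_images]
```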
- if the detection result shows that the similarity between the reference object and the object to be detected is high, the detection result obtained by the detection device 40 can determine that the object to be detected is the reference object.
- the detection device 40 or the server 10 may store the detection features generated by the plurality of detection images in the internal storage device 11 or the online storage device 30 .
- the detection feature can directly replace the verification feature to serve as a basis for subsequent detection. If the change of the reference object were predicted only through the change information over a long period of time, then once the difference between the actual change state of the reference object and the predicted change state becomes too large, it would be difficult to correctly identify the object to be detected in the future.
- if the updated detection result shows that the similarity between the reference object and the object to be detected is high, the detection feature represents the actual change state of the reference object, namely the object to be detected, at the moment of detection.
- the detection device 40 or the server 10 can transmit and store the detection feature; in this way, the actual change state of the reference object can be tracked appropriately, so as to avoid the actual change state exceeding what the predicted change state expects, and to keep the verification feature or the detection feature correct for subsequent detections.
- the detection device 40 or the server 10 may store the detection feature generated from the plurality of detection images together with the verification feature.
- the online storage device 30 or the internal storage device 11 will then hold the characteristic information of the reference object at two different times. Therefore, if a new object to be detected is compared with the detection feature and the verification feature after a period of time (for example, one year), it is possible not only to check whether the new object to be detected is similar to the detection feature and the verification feature, but also to further check whether the deterioration degree of the new object to be detected relative to the verification feature is greater than the deterioration degree of the detection feature.
- if the deterioration degree of the new object to be detected relative to the verification feature is smaller than that of the detection feature, the detection system can still issue a warning for the new object to be detected even when the similarity is high. In this way, the detection accuracy can be further improved through the irreversibility of the degree of deterioration.
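- A sketch of this irreversibility check is given below: a new object should look at least as deteriorated relative to the original verification feature as the previously stored detection feature did, otherwise a warning is raised. The `similarity()` callable, the deterioration approximation, and the tolerance are assumptions.

```python
def check_deterioration_irreversibility(similarity,
                                        new_images,
                                        verification_feature,
                                        stored_detection_feature,
                                        tolerance=0.02):
    """Warn if the new object appears less deteriorated than the stored record.

    similarity(a, b) is assumed to return a value in [0, 1].
    Deterioration is approximated as 1 - similarity to the verification feature.
    """
    new_deterioration = 1.0 - similarity(new_images, verification_feature)
    recorded_deterioration = 1.0 - similarity(stored_detection_feature,
                                              verification_feature)
    if new_deterioration + tolerance < recorded_deterioration:
        return "warning: deterioration appears to have reversed"
    return "ok"
```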
- the detection method 600 of the present invention may include at least but not limited to all the following embodiments:
- the comparison between the plurality of detection images and the verification feature is performed by the detection device 40, and the verification feature includes the plurality of sampled images of the corresponding positioning point of the reference object corresponding to the object to be detected.
- the detection device 40 transmits a request to detect the object to be detected.
- the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30, and transmits the corresponding positioning point and the plurality of sampling directions to the detection device 40.
- the server 10 may simultaneously transmit the plurality of sampled images and the sampling correspondence to the detection device 40 at this time.
- after receiving the corresponding positioning point and the plurality of sampling directions, the detection device 40 directly sets them as the positioning point and the plurality of detection directions of the object to be detected, respectively, and obtains the plurality of detection images accordingly. In one embodiment, if the detection device 40 has acquired the plurality of sampled images before acquiring the plurality of detection images, the comparison can be started directly.
- in another embodiment, if the detection device 40 has not acquired the plurality of sampled images before acquiring the plurality of detection images, the detection device 40 may send a request for comparing the reference object to the server 10 again, and after receiving the request, the server 10 provides the plurality of sampled images and the sampling correspondence to the detection device 40.
- by requiring the request to be sent twice, the server 10 can prevent a user from arbitrarily sending the first request merely to fraudulently obtain all of the verification information.
- the request sent again by the detection device 40 may include the plurality of detection images for the server 10 to perform a preliminary verification. If the server 10 considers that the plurality of detection images are unrelated to the plurality of sampled images, the server 10 may refuse to provide the plurality of sampled images. If the server 10 considers that there is at least a slight correlation between the plurality of detection images and the plurality of sampled images, the server 10 may transmit the plurality of sampled images for the detection device 40 to compare.
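- A sketch of this two-request idea on the server side is shown below: the sampled images are only released after a coarse preliminary check of the detection images included in the second request. The `coarse_similarity()` callable and the release threshold are illustrative assumptions.

```python
def handle_second_request(detection_images, sampled_images,
                          coarse_similarity, release_threshold=0.3):
    """Decide whether to release the sampled images to the detection device.

    coarse_similarity(a, b) is an assumed cheap comparison in [0, 1],
    for example computed on heavily downscaled images.
    """
    score = coarse_similarity(detection_images, sampled_images)
    if score < release_threshold:
        # Apparently unrelated images: refuse, to avoid leaking verification data.
        return {"status": "refused"}
    return {"status": "ok", "sampled_images": sampled_images}
```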
- the detection device 40 compares the plurality of sampled images and the plurality of detection images through the sampling correspondence and the detection correspondence. If the similarity between the plurality of sampled images and the plurality of detection images is high, the detection device 40 can determine that the object to be detected is the reference object and transmit the detection result to the server 10. If the similarity between the plurality of sampled images and the plurality of detection images is low, the detection device 40 may determine that the object to be detected is not the reference object and transmit the detection result to the server 10. In another embodiment, if the similarity between the plurality of sampled images and the plurality of detection images is low, the detection device 40 may send a request for the change information of the reference object to the server 10.
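- The device-side decision flow can be sketched as below. The threshold, the result payload, and the helper callables are assumptions; `pairwise_similarity()` stands for the one-by-one comparison via the sampling and detection correspondences described above.

```python
def decide_and_report(pairwise_similarity, sampled, detected,
                      send_result, request_change_info, threshold=0.85):
    """Compare, report the result, and fall back to change information if needed."""
    score = pairwise_similarity(sampled, detected)
    if score >= threshold:
        send_result({"match": True, "score": score})
        return "reference object confirmed"
    # Low similarity: ask the server for the change information of the
    # reference object before giving a final negative answer.
    change_info = request_change_info()
    if change_info is None:
        send_result({"match": False, "score": score})
        return "object differs from reference object"
    return "re-compare after adjusting with change information"
```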
- the server 10 also performs another preliminary verification (eg, requesting the complete plurality of detection images) through the request for obtaining the change information, so as to confirm whether the change information is required by the detection device 40 .
- the detection device 40 adjusts the plurality of sampled images according to the change information to generate a plurality of adjustment images as part of the adjustment feature, and then compares the plurality of adjustment images with the plurality of detection images through the sampling correspondence and the detection correspondence. If the similarity between the plurality of adjustment images and the plurality of detection images is low, the detection device 40 may determine that the object to be detected is not the reference object and transmit the detection result to the server 10.
- if the similarity between the plurality of adjustment images and the plurality of detection images is high, the detection device 40 can update the previously negative detection result, re-identify the object to be detected as the reference object, and transmit the detection result to the server 10.
- the detection device 40 can upload the plurality of detection images to the server 10 for the server 10 to store the plurality of detection images in the internal storage device 11 or the online storage device 30 .
- the plurality of detection images and the plurality of sampled images can be simultaneously stored in the internal storage device 11 or the online storage device 30 corresponding to the reference object, so as to jointly serve as the verification feature of the reference object.
- the plurality of detection images can directly replace the plurality of sampled images as the verification features of the reference object.
- the server 10 may further check the plurality of detection images again to confirm the similarity between the plurality of detection images and the plurality of adjustment images, so as to avoid storing wrong verification features.
- the comparison of the plurality of detection images and the verification feature is performed by the detection device 40, and the verification feature includes a sampling model of the corresponding positioning point of the reference object corresponding to the object to be detected.
- the detection device 40 transmits a request to detect the object to be detected.
- the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30 , and transmits the corresponding positioning point to the detection device 40 .
- the server 10 may also transmit the sampling model to the detection device 40 at this time.
- after receiving the corresponding positioning point, the detection device 40 directly sets the corresponding positioning point as the positioning point of the object to be detected, selects the plurality of detection directions by itself, and thereby obtains the plurality of detection images. In one embodiment, if the detection device 40 has acquired the sampling model before acquiring the plurality of detection images, the comparison can be started directly. In another embodiment, if the detection device 40 has not acquired the sampling model before acquiring the plurality of detection images, the detection device 40 can send a request for comparing the reference object to the server 10 again, and after receiving the request, the server 10 provides the sampling model to the detection device 40.
- by requiring the request to be sent twice, the server 10 can prevent a user from arbitrarily sending the first request merely to fraudulently obtain all of the verification information.
- the request sent again by the detection device 40 may include the plurality of detection images for the server 10 to perform a preliminary verification. If the server 10 considers that the plurality of detection images are unrelated to the sampling model, the server 10 may refuse to provide the sampling model. If the server 10 considers that there is at least a slight correlation between the plurality of detection images and the sampling model, the server 10 can transmit the sampling model for the detection device 40 to compare.
- the detection device 40 may generate a detection model based on the plurality of detection images, the plurality of detection directions, and the detection correspondence, and generate the detection result by comparing the similarity between the detection model and the sampling model.
- the detection device 40 may infer a plurality of verification images based on the plurality of detection directions and the sampling model, and then compare, one by one according to the detection correspondence between the plurality of detection images and the plurality of detection directions, the similarity between the plurality of verification images and the plurality of detection images to generate the detection result.
- if the similarity between the plurality of verification images and the plurality of detection images is high or the similarity between the sampling model and the detection model is high, the detection device 40 may determine that the object to be detected is the reference object and transmit the detection result to the server 10. If the similarity between the plurality of verification images and the plurality of detection images is low or the similarity between the sampling model and the detection model is low, the detection device 40 may determine that the object to be detected is not the reference object and transmit the detection result to the server 10.
- in that case, the detection device 40 may transmit a request for the change information of the reference object to the server 10.
- the server 10 may also perform another preliminary verification on the request for the change information (e.g., obtaining the complete plurality of detection images or the detection model) to confirm whether the detection device 40 needs the change information.
- the detection device 40 adjusts the sampling model according to the change information to generate an adjustment model as the adjustment feature, and then compares the adjustment model with the plurality of detection images through the detection correspondence, or directly compares the adjustment model with the detection model.
- if the similarity between the adjustment model and the plurality of detection images is low, or the similarity between the adjustment model and the detection model is low, the detection device 40 may determine that the object to be detected is not the reference object and transmit the detection result to the server 10. If the similarity between the adjustment model and the plurality of detection images is high, or the similarity between the adjustment model and the detection model is high, the detection device 40 can update the previously negative detection result, re-identify the object to be detected as the reference object, and send the detection result to the server 10. In the embodiment, the detection device 40 can upload the plurality of detection images or the detection model to the server 10.
- if the detection device 40 uploads the plurality of detection images to the server 10, the detection device 40 must also upload the plurality of detection directions and the detection correspondence for the server 10 to generate the detection model.
- the server 10 stores the detection model in the internal storage device 11 or the online storage device 30 .
- the detection model and the sampling model can be simultaneously stored in the internal storage device 11 or the online storage device 30 in correspondence with the reference object, so as to jointly serve as the verification feature of the reference object.
- the detection model can directly replace the sampling model as the verification feature of the reference object.
- when the server 10 obtains the detection model, it can further check the detection model again to confirm the similarity between the detection model and the sampling model, so as to avoid storing wrong verification features.
- the comparison between the plurality of detection images and the verification feature is performed by the server 10, and the verification feature includes the plurality of sampled images of the corresponding positioning point of the reference object corresponding to the object to be detected.
- the detection device 40 transmits a request to detect the object to be detected.
- the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30, and transmits the corresponding positioning point and the plurality of sampling directions to the detection device 40.
- the server 10 does not need to transmit the plurality of sampled images and the corresponding relationship of the samples to the detection device 40, so there is no need to worry that the plurality of sampled images and the corresponding relationship of the samples are obtained in an improper manner.
- after receiving the corresponding positioning point and the plurality of sampling directions, the detection device 40 directly sets them as the positioning point and the plurality of detection directions of the object to be detected, respectively, and obtains the plurality of detection images accordingly. In one embodiment, the detection device 40 transmits the plurality of detection images and the detection correspondence between the plurality of detection images and the plurality of detection directions to the server 10, so that the server 10 can perform the comparison with the verification feature.
- the server 10 compares the plurality of sampled images and the plurality of detection images through the sampling correspondence and the detection correspondence, if the similarity between the plurality of sampled images and the plurality of detection images is high , the server 10 can determine that the object to be detected is the reference object, and return the detection result to the detection device 40 . If the similarity between the plurality of sampled images and the plurality of detection images is low, the server 10 may determine that the object to be detected is not the reference object, and transmit the detection result to the detection device 40 . In another embodiment, if the similarity between the plurality of sampled images and the plurality of detected images is low, the server 10 may obtain the change information of the reference object from the online storage device 30 or the internal storage device 11 . In the embodiment, the server 10 does not need to transmit the change information to the detection device 40, so there is no need to worry that the change information is obtained in an improper manner.
- the server 10 adjusts the plurality of sampled images according to the change information to generate a plurality of adjustment images as part of the adjustment feature, and then compares the plurality of adjustment images with the plurality of detection images through the sampling correspondence and the detection correspondence. If the similarity between the plurality of adjustment images and the plurality of detection images is low, the server 10 may determine that the object to be detected is not the reference object and transmit the detection result to the detection device 40. If the similarity between the plurality of adjustment images and the plurality of detection images is high, the server 10 may update the previously negative detection result, re-identify the object to be detected as the reference object, and transmit the detection result to the detection device 40.
- the server 10 may store the plurality of detection images in the internal storage device 11 or the online storage device 30 .
- the plurality of detection images and the plurality of sampled images can be simultaneously stored in the internal storage device 11 or the online storage device 30 corresponding to the reference object, so as to jointly serve as the verification feature of the reference object.
- the plurality of detection images can directly replace the plurality of sampled images as the verification features of the reference object.
- since the server 10 itself has already performed the complete comparison, there is no need to worry about storing wrong verification features.
- the comparison between the plurality of detection images and the verification feature is performed by the server 10, and the verification feature includes a sampling model of the corresponding positioning point of the reference object corresponding to the object to be detected.
- the detection device 40 transmits a request to detect the object to be detected.
- the server 10 obtains the corresponding positioning point and the verification feature of the reference object corresponding to the object to be detected from the internal storage device 11 or the online storage device 30 , and transmits the corresponding positioning point to the detection device 40 .
- the server 10 does not need to transmit the sampling model to the detection device 40, so there is no need to worry that the sampling model is obtained in an improper manner.
- after receiving the corresponding positioning point, the detection device 40 directly sets the corresponding positioning point as the positioning point of the object to be detected, selects the plurality of detection directions by itself, and thereby obtains the plurality of detection images. In one embodiment, the detection device 40 transmits the plurality of detection images, the plurality of detection directions, and the detection correspondence between the plurality of detection images and the plurality of detection directions to the server 10 for the server 10 to perform the comparison with the verification feature. In another embodiment, the detection device 40 can generate the detection model by itself according to the plurality of detection images, the plurality of detection directions, and the detection correspondence, and transmit the detection model to the server 10 for the server 10 to perform the comparison with the verification feature.
- the server 10 may generate the detection model based on the plurality of detection images, the plurality of detection directions, and the detection correspondence, and generate the detection result by comparing the similarity between the detection model and the sampling model. In another embodiment, the server 10 may generate the detection result by comparing the similarity between the sampling model and the received detection model. In yet another embodiment, the server 10 may infer a plurality of verification images based on the plurality of detection directions and the sampling model, and then compare, one by one according to the detection correspondence between the plurality of detection images and the plurality of detection directions, the similarity between the plurality of verification images and the plurality of detection images to generate the detection result.
- if the similarity between the plurality of verification images and the plurality of detection images is high or the similarity between the sampling model and the detection model is high, the server 10 may determine that the object to be detected is the reference object and transmit the detection result to the detection device 40. If the similarity between the plurality of verification images and the plurality of detection images is low or the similarity between the sampling model and the detection model is low, the server 10 may determine that the object to be detected is not the reference object and transmit the detection result to the detection device 40. In another embodiment, if the similarity between the plurality of verification images and the plurality of detection images is low or the similarity between the sampling model and the detection model is low, the server 10 may obtain the change information of the reference object from the online storage device 30 or the internal storage device 11. In the embodiment, the server 10 does not need to transmit the change information to the detection device 40, so there is no need to worry that the change information might be obtained in an improper manner.
- the server 10 adjusts the sampling model according to the change information to generate an adjustment model as the adjustment feature, and then compares the adjustment model with the plurality of detection images through the detection correspondence, or directly compares the adjustment model Adjust the model with the detection model. If the similarity between the adjustment model and the plurality of detection images is low or the similarity between the adjustment model and the detection model is low, the server 10 may determine that the object to be detected is not the reference object, and transmit the detection The result is given to the detection device 40 .
- if the similarity between the adjustment model and the plurality of detection images is high, or the similarity between the adjustment model and the detection model is high, the server 10 may update the previously negative detection result, re-identify the object to be detected as the reference object, and send the detection result to the detection device 40.
- the server 10 may store the detection model in the internal storage device 11 or the online storage device 30. If the server 10 only has the plurality of detection images, the plurality of detection directions, and the detection correspondence, the server 10 may generate the detection model from them and store it in the internal storage device 11 or the online storage device 30.
- the detection model and the sampling model can be simultaneously stored in the internal storage device 11 or the online storage device 30 corresponding to the reference object, so as to jointly serve as the verification feature of the reference object.
- the detection model can directly replace the sampling model as the verification feature of the reference object.
- FIG. 7A and FIG. 7B are schematic diagrams showing that the image capturing unit 22 captures sampled images in different sampling directions on the positioning point of the fiducial object according to an exemplary embodiment of the present invention.
- the image capturing unit 22 in FIG. 7A is located just above the positioning point of the fiducial object, and the image capturing unit 22 in FIG. 7B has a deflection angle compared with the image capturing unit 22 in FIG. 7A .
- the image capturing unit 22 faces the positioning point of the fiducial object, and shoots in a first capturing range 710 to obtain the first sampled image 760 .
- the sampling device 20 or the server 10 extracts a first sampling area 761 from the first sampled image 760, and further divides the first sampling area 761 into a plurality of first sampling blocks 7611-7616.
- the sampling device 20 or the server 10 can calculate a value for each of the plurality of first sampling blocks 7611-7616.
- the first sampling area 761 corresponds to an imaging area 711 in the first imaging range 710
- the plurality of first sampling blocks 7611-7616 correspond to the plurality of imaging blocks 7111-7116.
- in FIG. 7B, the image capturing unit 22 faces the positioning point of the reference object at the deflection angle and shoots in a second capturing range 720 to obtain a second sampling image 770.
- the sampling device 20 or the server 10 may extract a second sampling area 771 from the second sampling image 770, further divide the second sampling area 771 into a plurality of second sampling blocks 7711-7716, and calculate a value for each of the plurality of second sampling blocks 7711-7716.
- the sampling device 20 or the server 10 makes the first sampling area 761 and the second sampling area 771 correspond to the same region of the reference object.
- the sampling device 20 or the server 10 can find the second sampling area 771 by projecting the first sampling area 761 onto the second sampling image 770 through the deflection angle. Therefore, if the first sampling area 761 and the second sampling area 771 are projected back onto the first capturing range 710 and the second capturing range 720, the same imaging area 711 is obtained, and the plurality of second sampling blocks 7711-7716 also correspond to the plurality of imaging blocks 7111-7116.
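- A very rough sketch of this projection step is given below, assuming a pure rotation about the vertical image axis modeled as horizontal foreshortening by the cosine of the deflection angle; the real mapping would depend on the optics and camera intrinsics, so this is only an assumed first-order simplification.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float]

def project_area_through_deflection(corners: List[Point],
                                    deflection_deg: float,
                                    cx: float) -> List[Point]:
    """Approximate where a sampling area reappears after a yaw deflection.

    Assumes rotation about the vertical axis through image column cx,
    modeled as horizontal foreshortening by cos(deflection). Perspective
    and lens distortion are ignored in this sketch.
    """
    c = math.cos(math.radians(deflection_deg))
    return [((x - cx) * c + cx, y) for x, y in corners]

# The projected corners of the first sampling area locate the second sampling
# area, over which the same block grid (e.g. 2 x 3 blocks) is then laid.
```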
- the sampling device 20 or the server 10 can respectively calculate a value for each of the plurality of first sampling blocks 7611-7616 and the plurality of second sampling blocks 7711-7716, and the plurality of values are used to represent a plurality of A first sampling block 7611-7616 and a plurality of second sampling blocks 7711-7716.
- the plurality of values may be the average, mode, or median of a plurality of pixels in each block.
- the values of the plurality of first sampling blocks 7611-7616 may be a11, a12, a13, a14, a15 and a16, respectively, and the values of the plurality of second sampling blocks 7711-7716 may be a21, a22, a23, respectively , a24, a25, and a26.
- the sampling device 20 or the server 10 can obtain more values at other sampling angles, respectively. For example: a31, a32, a33, a34, a35, a36, ..., an1, an2, an3, an4, an5, and an6.
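- A minimal sketch of computing one value per sampling block, using the mean of the pixels in each block, is shown below. The 2 x 3 grid matches the six blocks per sampling area described above, but the grid shape and the choice of the mean (rather than mode or median) are assumptions.

```python
import numpy as np

def block_values(sampling_area: np.ndarray, rows: int = 2, cols: int = 3):
    """Split a grayscale sampling area into rows x cols blocks, one mean each."""
    h, w = sampling_area.shape[:2]
    bh, bw = h // rows, w // cols
    values = []
    for r in range(rows):
        for c in range(cols):
            block = sampling_area[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            values.append(float(block.mean()))  # could also use median or mode
    return values  # e.g. [a11, a12, a13, a14, a15, a16] for one direction
```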
- the sampling device 20 or the server 10 can obtain a plurality of training sets of image data as follows:
- A1 = [a11, a21, a31, ..., an1]
- the sampling device 20 or the server 10 can generate a sampling model by 3D modeling based on the plurality of image data training sets and through machine learning technology.
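- The sketch below shows one assumed way to arrange the per-direction block values into the per-block data sets (A1 = [a11, a21, ..., an1] and so on) that feed a model-fitting step; the simple per-block averaging used here as the "model" is only a stand-in for the machine-learning based 3D modeling named in the text.

```python
import numpy as np

def build_block_datasets(values_per_direction):
    """values_per_direction: list of rows [ai1, ..., ai6], one row per direction.

    Returns the column vectors A1..A6, i.e. one data set per sampling block
    collected across all sampling directions.
    """
    matrix = np.asarray(values_per_direction, dtype=float)  # shape (n, 6)
    return [matrix[:, j] for j in range(matrix.shape[1])]

def fit_placeholder_model(block_datasets):
    """Stand-in for the 3D modeling step: one summary value per block."""
    return np.array([ds.mean() for ds in block_datasets])

# Usage sketch (values are placeholders):
# rows = [[a11, a12, a13, a14, a15, a16], ..., [an1, an2, an3, an4, an5, an6]]
# A = build_block_datasets(rows)          # A[0] == A1, A[1] == A2, ...
# sampling_model = fit_placeholder_model(A)
```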
- the detection device 40 or the server 10 can also capture a large number of detection sample points through the plurality of detection images in the same manner, and establish a plurality of image data training sets.
- the detection model is then generated by 3D modeling from this large amount of detection sample point data.
- FIG. 8A-8E are photographs of different sampled images captured on a painting as a fiducial object according to an exemplary embodiment of the present invention.
- a plurality of sampled images are captured on the reference object by the image capture unit 22 , and the sampling device 20 or the server 10 can generate verification features by using the plurality of sampled images.
- the verification feature can be directly the plurality of sampled images.
- the verification feature can also be generated by analyzing and training on the plurality of sampled images with a machine learning technique to produce the verification feature required for verification.
- the verification feature may be a three-dimensionally modeled sample model.
- FIG. 8F-8J are photographs of different inspection images captured on a painting as an object to be inspected according to an exemplary embodiment of the present invention. Please refer to FIG. 4 and FIG. 5 together.
- a plurality of detection images are captured by the image capturing unit 42 on the object to be detected.
- the detection device 40 or the server 10 can use the plurality of detection images to perform the comparison with the verification feature.
- if the verification feature is the plurality of sampled images, the detection device 40 or the server 10 can directly compare FIGS. 8A-8E with FIGS. 8F-8J and recognize that the object to be detected and the reference object are different.
- if the verification feature is the sampling model, then according to FIGS. 8A-8E the sampling model should have no obvious unevenness and a relatively steep slope. Therefore, when the detection device 40 or the server 10 compares the detection images of FIGS. 8F-8J with the sampling model, a significant difference in slope between the two can be found, and it can be recognized that the object to be detected is different from the reference object.
- FIGS. 9A and 9B are sampling images of different gemstones with the same processing method in the same sampling direction according to an exemplary embodiment of the present invention. From FIG. 9A and FIG. 9B , even with the exact same sampling direction and the exact same processing process, different sampling images may still be generated between different gemstones due to factors such as the color or clarity of the gemstone itself. Therefore, as long as a gemstone is used as a reference object, a sampling image is captured in advance to generate verification features, and then the detection device and the detection method can be used to confirm whether the to-be-detected object is a previously sampled gemstone.
- FIG. 10A is a photograph of antique utensils that share the same pattern but are not the same object.
- FIGS. 10B and 10C are sampled images of the different antique utensils shown in FIG. 10A, sampled in the same sampling direction according to an exemplary embodiment of the present invention. As FIG. 10B and FIG. 10C show, even for antique utensils with the same pattern, different texture changes may still appear between different antique utensils due to slight differences in glaze distribution during the firing process.
- therefore, as long as an antique utensil is used as a reference object and a sampling image is captured in advance to generate the verification feature, the detection device and the detection method can then be used to confirm whether the object to be detected is an antique utensil that was sampled before.
- the method described in the present invention is not limited to use in works of art (including but not limited to paintings, carvings, etc.), precious stones (including but not limited to diamonds, sapphires, emeralds, etc.) or antique items (including but not limited to pottery, porcelain, etc.) , as long as the verification features of the reference object (including but not limited to color distribution, texture details, notch defects, etc.) are obtained in advance, it can be used as the basis for subsequent verification of the object to be detected.
- An aspect of the present disclosure provides a method for detecting an object to be detected by a detection device, the method comprising: transmitting a request to detect the object to be detected; receiving the positioning point of the object to be detected; acquiring a plurality of detection images along a plurality of detection directions on the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a corresponding positioning point of a reference object corresponding to the object to be detected.
- An aspect of the present disclosure provides a detection device for detecting an object to be detected, the detection device including: an image capture unit used to obtain a plurality of detection images; a movement control unit used to move the image capture unit when the object to be detected is detected; a processor coupled to the image capture unit and the movement control unit; a transmitting unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to: transmit a request for detecting the object to be detected through the transmitting unit; receive the positioning point of the object to be detected through the receiving unit; make the image capture unit, through the movement control unit, obtain the plurality of detection images along a plurality of detection directions on the positioning point of the object to be detected; and obtain a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a corresponding positioning point of a reference object corresponding to the object to be detected.
- In one embodiment, when the detection result indicates that the reference object and the object to be detected are not the same, an adjustment feature is generated according to change data of the reference object and a verification feature of the reference object, wherein: the comparison compares the plurality of detection images with the verification feature, and the change data of the reference object comes from an online storage device; and the detection result is updated according to the plurality of detection images and the adjustment feature.
- In one embodiment, when the updated detection result indicates that the reference object and the object to be detected are the same, a detection feature is transmitted to the online storage device so as to store the detection feature in the online storage device, wherein: the detection feature is generated based on the plurality of detection images, and the detection feature stored in the online storage device corresponds to the reference object.
- the detection feature replaces the verification feature of the reference object stored in the online storage device.
- In one embodiment, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from a plurality of sampled images; the plurality of sampled images are obtained along a plurality of sampling directions on the corresponding positioning point of the reference object; and the detection result is a similarity obtained by comparing the detection images with the sampling model according to the plurality of detection directions.
- In one embodiment, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from a plurality of sampled images; the plurality of sampled images are obtained along a plurality of sampling directions on the corresponding positioning point of the reference object; the plurality of detection images reconstruct a detection model according to the plurality of detection directions; and the detection result is a similarity obtained by comparing the detection model with the sampling model.
- In one embodiment, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a plurality of sampled images obtained along a plurality of sampling directions on the corresponding positioning point of the reference object; the detection result is a similarity between the plurality of detection images and the plurality of sampled images; and the plurality of detection directions are the plurality of sampling directions obtained from an online storage device.
- An aspect of the present disclosure provides a method for detecting an object to be detected by a server.
- the method includes: receiving a request to detect the object to be detected; sending the positioning point of the object to be detected; receiving a plurality of detection images obtained along a plurality of detection directions on the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a reference object corresponding to the object to be detected.
- An aspect of the present disclosure provides a server for detecting an object to be detected, the server including: a processor; a transmitting unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions which, when executed by the processor, cause the processor to: receive a request to detect the object to be detected through the receiving unit; send the positioning point of the object to be detected through the transmitting unit; receive, through the receiving unit, a plurality of detection images obtained along a plurality of detection directions on the positioning point of the object to be detected; and obtain a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison of the plurality of detection images with a reference object corresponding to the object to be detected.
- In one embodiment, when the detection result indicates that the reference object and the object to be detected are not the same, an adjustment feature is generated according to change data of the reference object and a verification feature of the reference object, wherein: the comparison compares the plurality of detection images with the verification feature, and the change data of the reference object comes from an online storage device; and the detection result is updated according to the plurality of detection images and the adjustment feature.
- In one embodiment, when the updated detection result indicates that the reference object and the object to be detected are the same, a detection feature is sent to the online storage device to store the detection feature in the online storage device, wherein: the detection feature is generated based on the plurality of detection images, and the detection feature stored in the online storage device corresponds to the reference object.
- the detection feature replaces the verification feature of the reference object stored in the online storage device.
- In one embodiment, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from the plurality of sampled images; the plurality of sampled images are obtained along a plurality of sampling directions on the corresponding positioning point of the reference object; and the detection result is a similarity obtained by comparing the detection images with the sampling model according to the plurality of detection directions.
- In one embodiment, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a sampling model reconstructed from the plurality of sampled images; the plurality of sampled images are obtained along a plurality of sampling directions on the corresponding positioning point of the reference object; the plurality of detection images reconstruct a detection model according to the plurality of detection directions; and the detection result is a similarity obtained by comparing the detection model with the sampling model.
- In one embodiment, the plurality of detection images are compared with a verification feature of the reference object to generate the detection result; the verification feature is a plurality of sampled images obtained along a plurality of sampling directions on the corresponding positioning point of the reference object; the detection result is a similarity between the plurality of detection images and the plurality of sampled images; and the plurality of detection directions are the plurality of sampling directions obtained from an online storage device.
- An aspect of the present disclosure provides a sampling method for establishing identification data for a reference object by a sampling device, the method comprising: obtaining a positioning point on the reference object; obtaining a plurality of sampled images along a plurality of sampling directions on the positioning point of the reference object; generating a sampling feature of the positioning point of the reference object according to the plurality of sampled images and the plurality of sampling directions; and transmitting the sampling feature for a network device to store.
- An aspect of the present disclosure provides a sampling device for sampling a reference object to establish identification data, the sampling device including: an image capturing unit configured to obtain a plurality of sampled images; a movement control unit used to move the image capturing unit when sampling the reference object; a processor coupled to the image capturing unit and the movement control unit; a transmission unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to: obtain a positioning point on the reference object; make the image capturing unit, through the movement control unit, acquire a plurality of sampled images along a plurality of sampling directions on the positioning point of the reference object; generate a sampling feature of the positioning point of the reference object according to the plurality of sampled images and the plurality of sampling directions; and transmit the sampling feature through the transmission unit for storage by a network device.
- In one embodiment, the positioning point set for the reference object is received from a server, wherein: when the network device is the server, the server stores the positioning point; and when the network device is an online storage device, the server transmits the positioning point to the online storage device.
- In one embodiment, when the positioning point is set by the sampling device, the positioning point is transmitted to the network device.
Claims (10)
- A detection method performed by a detection device on an object to be detected, the method comprising: transmitting a request to detect the object to be detected; receiving a positioning point of the object to be detected; obtaining a plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison between the plurality of detection images and a corresponding positioning point of a reference object corresponding to the object to be detected.
- The method according to claim 1, further comprising: when the detection result indicates that the reference object and the object to be detected are not the same, generating an adjusted feature according to change data of the reference object and a verification feature of the reference object, wherein: the comparison compares the plurality of detection images with the verification feature, and the change data of the reference object is obtained from an online storage device; and updating the detection result according to the plurality of detection images and the adjusted feature.
- The method according to claim 2, further comprising: when the updated detection result indicates that the reference object and the object to be detected are the same, transmitting a detection feature to the online storage device so that the detection feature is stored in the online storage device, wherein: the detection feature is generated based on the plurality of detection images, and the detection feature stored in the online storage device corresponds to the reference object.
- The method according to claim 3, wherein the detection feature replaces the verification feature of the reference object stored in the online storage device.
- The method according to claim 1, wherein the plurality of detection images are compared with a verification feature of the reference object to generate the detection result, the verification feature is a sampling model reconstructed from a plurality of sampled images, the plurality of sampled images are obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, and the detection result is a similarity obtained by comparing the detection images with the sampling model according to the plurality of detection directions.
- The method according to claim 1, wherein the plurality of detection images are compared with a verification feature of the reference object to generate the detection result, the verification feature is a sampling model reconstructed from a plurality of sampled images, the plurality of sampled images are obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, a detection model is reconstructed from the plurality of detection images according to the plurality of detection directions, and the detection result is a similarity obtained by comparing the detection model with the sampling model.
- The method according to claim 1, wherein the plurality of detection images are compared with a verification feature of the reference object to generate the detection result, the verification feature is a plurality of sampled images obtained along a plurality of sampling directions at the corresponding positioning point of the reference object, the detection result is a similarity obtained by comparing the plurality of detection images with the plurality of sampled images, and the plurality of detection directions are the plurality of sampling directions obtained from an online storage device.
- A detection device for detecting an object to be detected, the detection device comprising: an image capturing unit configured to obtain a plurality of detection images; a movement control unit configured to move the image capturing unit when detecting the object to be detected; a processor coupled to the image capturing unit and the movement control unit; a transmission unit coupled to the processor; a receiving unit coupled to the processor; and a storage device coupled to the processor and storing a plurality of instructions that, when executed by the processor, cause the processor to: transmit a request to detect the object to be detected through the transmission unit; receive a positioning point of the object to be detected through the receiving unit; cause the image capturing unit, through the movement control unit, to obtain the plurality of detection images along a plurality of detection directions at the positioning point of the object to be detected; and obtain a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison between the plurality of detection images and a corresponding positioning point of a reference object corresponding to the object to be detected.
- The detection device according to claim 8, wherein the plurality of instructions, when executed by the processor, further cause the processor to: when the detection result indicates that the reference object and the object to be detected are not the same, generate an adjusted feature according to change data of the reference object and a verification feature of the reference object, wherein: the comparison compares the plurality of detection images with the verification feature, and the change data of the reference object is obtained from an online storage device; update the detection result according to the plurality of detection images and the adjusted feature; and when the updated detection result indicates that the reference object and the object to be detected are the same, transmit a detection feature to the online storage device so that the detection feature is stored in the online storage device, wherein: the detection feature is generated based on the plurality of detection images, and the detection feature stored in the online storage device corresponds to the reference object.
- A detection method performed by a server on an object to be detected, the method comprising: receiving a request to detect the object to be detected; sending a positioning point of the object to be detected; receiving a plurality of detection images obtained along a plurality of detection directions at the positioning point of the object to be detected; and obtaining a detection result according to the plurality of detection images, wherein the detection result is generated based on a comparison between the plurality of detection images and a reference object corresponding to the object to be detected.
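Claims 2 to 4 add a second pass for the case where the first comparison fails: change data of the reference object (for example, an expected amount of fading or material ageing recorded on the online storage device) is applied to the stored verification feature to obtain an adjusted feature, the detection images are compared again, and if they now match, a detection feature derived from the detection images replaces the stored verification feature. The sketch below illustrates that loop under stated assumptions; the linear fading model, the mean-absolute-difference similarity, and the 0.9 threshold are hypothetical stand-ins rather than the claimed method.

```python
from typing import Dict, Tuple

import numpy as np

Feature = Dict[str, np.ndarray]   # direction -> image, as stored on the online storage device


def similarity(a: Feature, b: Feature) -> float:
    """Average per-direction similarity; 1.0 means identical images (intensities assumed in [0, 1])."""
    shared = sorted(set(a) & set(b))
    return 1.0 - float(np.mean([np.mean(np.abs(a[d] - b[d])) for d in shared]))


def apply_change_data(verification_feature: Feature, years: float, fade_per_year: float = 0.05) -> Feature:
    """Hypothetical change data: the object's contrast fades linearly toward mid-gray over the years."""
    factor = max(0.0, 1.0 - fade_per_year * years)
    return {d: 0.5 + (img - 0.5) * factor for d, img in verification_feature.items()}


def verify_with_update(
    detection_images: Feature,
    verification_feature: Feature,
    years_since_sampling: float,
    threshold: float = 0.9,
) -> Tuple[bool, Feature]:
    """Compare; on mismatch retry against the adjusted feature; on success store the new detection feature."""
    if similarity(detection_images, verification_feature) >= threshold:
        return True, verification_feature                  # same object, stored feature kept as-is
    adjusted = apply_change_data(verification_feature, years_since_sampling)
    if similarity(detection_images, adjusted) >= threshold:
        return True, detection_images                      # matched after adjustment: replace stored feature
    return False, verification_feature                     # still judged to be a different object


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    stored = {d: rng.random((64, 64)) for d in ["az000", "az045", "az090"]}
    # Simulate ten years of fading plus sensor noise on the genuine object.
    aged = {d: 0.5 + (img - 0.5) * 0.5 + rng.normal(0.0, 0.01, img.shape) for d, img in stored.items()}
    same, new_feature = verify_with_update(aged, stored, years_since_sampling=10.0)
    print(same, new_feature is aged)   # True True: the detection feature replaces the verification feature
```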
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280032539.8A CN117461051A (zh) | 2021-05-04 | 2022-04-29 | 一种对一待检测物进行的检测方法与检测装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202163183643P | 2021-05-04 | 2021-05-04 | |
US63/183,643 | 2021-05-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022233277A1 true WO2022233277A1 (zh) | 2022-11-10 |
Family
ID=83931983
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2022/090341 WO2022233277A1 (zh) | 2021-05-04 | 2022-04-29 | 一种对一待检测物进行的检测方法与检测装置 |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN117461051A (zh) |
TW (1) | TWI827030B (zh) |
WO (1) | WO2022233277A1 (zh) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101582162A (zh) * | 2008-05-14 | 2009-11-18 | 上海锦渡信息科技有限公司 | 基于纹理分析的艺术品鉴别方法 |
JP2014006840A (ja) * | 2012-06-27 | 2014-01-16 | Dainippon Printing Co Ltd | 個体識別方法、個体識別装置、プログラム |
CN104636733A (zh) * | 2015-02-12 | 2015-05-20 | 湖北华中文化产权交易所有限公司 | 一种基于图像特征的书画作品鉴定方法 |
CN106447361A (zh) * | 2016-10-28 | 2017-02-22 | 王友炎 | 一种纸张介质艺术品防伪鉴定和备案追溯系统及方法 |
CN107507090A (zh) * | 2017-08-23 | 2017-12-22 | 重庆艺邦动力科技有限公司 | 艺术品在线担保交易方法及实现该方法的存储设备和移动终端 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103164699B (zh) * | 2013-04-09 | 2016-06-15 | 北京盛世融宝国际艺术品投资有限公司 | 书画作品保真鉴定系统 |
US10019626B2 (en) * | 2013-12-02 | 2018-07-10 | Leonhard Kurz Stiftung & Co. Kg | Method for authenticating a security element, and optically variable security element |
CN108292456B (zh) * | 2015-11-30 | 2020-11-27 | 凸版印刷株式会社 | 识别方法以及识别介质 |
CN112446312A (zh) * | 2020-11-19 | 2021-03-05 | 深圳市中视典数字科技有限公司 | 三维模型识别方法、装置、电子设备及存储介质 |
- 2022-04-29 CN CN202280032539.8A patent/CN117461051A/zh active Pending
- 2022-04-29 TW TW111116530A patent/TWI827030B/zh active
- 2022-04-29 WO PCT/CN2022/090341 patent/WO2022233277A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN117461051A (zh) | 2024-01-26 |
TW202314228A (zh) | 2023-04-01 |
TW202413928A (zh) | 2024-04-01 |
TWI827030B (zh) | 2023-12-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10495756B2 (en) | Multi-camera laser scanner | |
CN109977770B (zh) | 一种自动跟踪拍摄方法、装置、系统及存储介质 | |
EP3248374B1 (en) | Method and apparatus for multiple technology depth map acquisition and fusion | |
CN108830906B (zh) | 一种基于虚拟双目视觉原理的摄像机参数自动标定方法 | |
US8666145B2 (en) | System and method for identifying a region of interest in a digital image | |
US9906783B2 (en) | Automated measurement of mobile device application performance | |
CN105100620B (zh) | 拍摄方法及装置 | |
WO2018121269A1 (zh) | 膜片的检测系统、检测方法和装置 | |
JP2018528388A (ja) | レンズの1つ以上の光学パラメータを決定するための機器、システム、および方法 | |
TW201118791A (en) | System and method for obtaining camera parameters from a plurality of images, and computer program products thereof | |
CN106570899B (zh) | 一种目标物体检测方法及装置 | |
JP2014102766A5 (zh) | ||
WO2016165379A1 (zh) | 一种投影方法、装置、设备及计算机存储介质 | |
CN101859371A (zh) | 摄像装置及其物体识别方法 | |
WO2019105315A1 (zh) | 视场角测试方法和系统 | |
CN113129383A (zh) | 手眼标定方法、装置、通信设备及存储介质 | |
CN112257713A (zh) | 图像处理方法、装置、电子设备和计算机可读存储介质 | |
TW201544995A (zh) | 物件辨識方法與裝置 | |
CN112104851A (zh) | 画面校正的检测方法、装置和检测系统 | |
WO2022233277A1 (zh) | 一种对一待检测物进行的检测方法与检测装置 | |
JP2000123186A5 (ja) | 被写体認識装置及び被写体認識方法 | |
TWI852856B (zh) | 一種對一待檢測物進行檢測的方法與檢測裝置 | |
CN111344554A (zh) | 外观缺陷检测方法及装置 | |
CN109489560A (zh) | 一种线性尺寸测量方法及装置、智能终端 | |
US20150116486A1 (en) | Terminal device, image measuring system and method of inspection of workpiece |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22798618; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 202280032539.8; Country of ref document: CN |
| NENP | Non-entry into the national phase | Ref country code: DE |
| WWE | Wipo information: entry into national phase | Ref document number: 11202308368X; Country of ref document: SG |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 22798618; Country of ref document: EP; Kind code of ref document: A1 |