CN115099834A - Article source code tracing system - Google Patents
Article source code tracing system
- Publication number
- CN115099834A (application number CN202210822026.9A)
- Authority
- CN
- China
- Prior art keywords
- video
- module
- stage
- product
- tracing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/018—Certifying business or products
- G06Q30/0185—Product, service or business identity fraud
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/46—Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/49—Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Multimedia (AREA)
- Evolutionary Computation (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Entrepreneurship & Innovation (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Software Systems (AREA)
- Accounting & Taxation (AREA)
- Development Economics (AREA)
- Economics (AREA)
- Finance (AREA)
- Marketing (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses an article source tracing code system, which belongs to the technical field of food safety and comprises an acquisition module, a storage module, a docking module, an encoding module and a server. The acquisition module is used for acquiring a live-action video of the product: it obtains product type information, matches a corresponding acquisition method according to the acquired product type information, acquires the live-action video according to the matched acquisition method, and inserts corresponding stage labels into the acquired video according to the acquisition method to obtain a source tracing video. The storage module is used for storing product process data. The docking module is used for data docking: it acquires the data interface of a third-party detection mechanism, marks it as a detection interface, and docks the storage module with the detection interface. The encoding module is used for generating a tracing code corresponding to the product process data. The invention realizes a dynamic identity card for the article, records the whole chain of product generation, and achieves evidence-based tracing with public credibility by means of real-time video recording.
Description
Technical Field
The invention belongs to the technical field of food safety, and particularly relates to an article source tracing code system.
Background
At present, "one thing and one code" and "one family and one code" are commonly used in society and are widely popularized; however, the content is realized by manual input or system manual operation, which only can make people know the product, but cannot ensure the authenticity and validity of the content; meanwhile, the process condition cannot be known by code scanning or the real condition of a certain time period cannot be supported; therefore, the invention provides an article source tracing code system.
Disclosure of Invention
In order to solve the problems in the existing scheme, the invention provides an article source tracing code system.
The purpose of the invention can be realized by the following technical scheme:
an article source tracing code system comprises an acquisition module, a storage module, a docking module, an encoding module and a server;
the acquisition module is used for acquiring live-action videos of products, acquiring product type information, matching corresponding acquisition methods according to the acquired product type information, acquiring the live-action videos according to the matched acquisition methods, and inserting corresponding stage labels into the acquired videos according to the acquisition methods to obtain source tracing videos;
the storage module is used for storing product process data;
the docking module is used for performing data docking, acquiring a docking data interface of a third-party detection mechanism, marking the docking data interface as a detection interface, and docking the storage module with the detection interface;
the encoding module is used for generating a source tracing code corresponding to the product process data.
Further, the method for matching the corresponding acquisition method according to the acquired product type information comprises the following steps:
acquiring the type of the target product, setting each process stage of the type of the target product, setting stage labels and stage node characteristics of corresponding stages, compiling a corresponding acquisition method according to the type of the target product, and establishing an acquisition method library; identifying product type information, inputting the product type information into an acquisition method library for matching, and acquiring a corresponding acquisition method.
Further, the method for inserting the corresponding stage labels into the collected video according to the acquisition method comprises the following steps:
a feature recognition model is set up; the collected video is recognized in real time through the feature recognition model according to the stage node features in the acquisition method, the corresponding stage label is matched when a stage node feature is recognized, and the matched stage label is inserted at the corresponding position in the collected video.
Further, the system also comprises a video processing module and an association module; the video processing module is used for generating a shortcut representative video according to the source tracing video; the association module is used for associating the shortcut representative video with the source tracing video.
Further, the working method of the video processing module comprises the following steps:
identifying the stage labels in the tracing video, dividing the tracing video into a plurality of video segments according to the identified stage labels, setting the time proportion of each video segment, processing the video segments to obtain corresponding stage representative videos, and integrating all the stage representative videos into a shortcut representative video according to the original stage sequence.
Further, the method for setting the time proportion of the corresponding video segment comprises the following steps:
the video segments are numbered i = 1, 2, …, n, where n is a positive integer; the process stage of the target product corresponding to each video segment is identified, the corresponding weight value is matched according to the identified process stage and marked as QZi, a content model is established, the video segment is analyzed through the content model to obtain the corresponding content value, marked as NRi, and the time proportion of the corresponding video segment is calculated according to a formula combining QZi and NRi, wherein b3 and b4 are both proportional coefficients with value ranges 0 < b3 ≤ 1 and 0 < b4 ≤ 1.
Further, the method of matching the corresponding weight value according to the identified process stage is:
the consumer attention proportion corresponding to each process stage of the target product is obtained based on big data and marked as BL, the importance of each process stage to the target product is obtained and the corresponding importance value is set as ZY, and the weight value is calculated according to the formula QZ = b1 × BL + b2 × ZY, wherein b1 and b2 are proportional coefficients with value ranges 0 < b1 ≤ 1 and 0 < b2 ≤ 1.
Further, the working method of the association module comprises the following steps:
the shortcut representative video and the tracing video are identified and placed in correspondence, and a positioning unit is set according to the correspondence between them; the positioning unit is used for quickly jumping to the corresponding position in the tracing video according to a mark made by the consumer in the shortcut representative video, and the shortcut representative video and the tracing video are integrated into the product process data.
Compared with the prior art, the invention has the following beneficial effects: the invention realizes a dynamic identity card for the article, records the whole chain of product generation, and achieves evidence-based tracing with public credibility by means of real-time video recording; the product identity card is genuine, valid and credible; the system can not only capture the operating scene at regular times for consumers to review, but also supports calling up the real state of the production and manufacturing process for a user-defined time period; the whole chain is transparent and open.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art from them without creative effort.
Fig. 1 is a schematic block diagram of the present invention.
Detailed Description
The technical solutions of the present invention will be described below clearly and completely in conjunction with the embodiments, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the invention provides an article source tracing code system. At present, "one item, one code" and "one household, one code" schemes are in common use and widely promoted. However, their content is entered manually or maintained by hand in a system, so they can only introduce the product and cannot guarantee that the content is authentic and valid. At the same time, scanning the code reveals nothing about the production process and cannot substantiate the actual conditions during a given time period.
The invention realizes a dynamic identity card for the article, records the whole chain of product generation, and achieves evidence-based tracing with public credibility by means of real-time video recording. The product identity card is genuine, valid and credible. The system can not only capture the operating scene at regular times for consumers to review, but also supports calling up the real state of the production and manufacturing process for a user-defined time period. The whole chain is transparent and open.
All tracing records are genuine and valid rather than fabricated. The key process-control points are recorded with real-time cameras for tracing, which carries a degree of public credibility. In addition, the detection data interface of the third party is connected automatically, avoiding manual operation, which makes the records more authentic and authoritative, strengthens public trust, and helps avoid the consequences of illegal or unethical behavior. Meanwhile, the system supports calling up and reviewing the on-site video of each manufacturing (or planting) link for a user-defined time period, so that the real-time environment and the operating standardization of production can be known.
The system specifically comprises an acquisition module, a video processing module, an association module, a storage module, a docking module, an encoding module and a server; the acquisition module, the storage module, the docking module and the encoding module are all in communication connection with the server;
the acquisition module is used for acquiring a live-action video of a product, and the specific method comprises the following steps:
the method comprises the steps of obtaining product type information, matching a corresponding acquisition method according to the obtained product type information, carrying out live-action video acquisition according to the matched acquisition method, and inserting a corresponding stage label into an acquired video according to the acquisition method to obtain a source tracing video.
The method for matching the corresponding acquisition method according to the acquired product type information comprises the following steps:
The type of the target product is obtained, the target product type being the type of food for which tracing acquisition is to be performed. Each process stage of the target product type is set, together with the stage label and the stage node feature of each stage, for example a planting stage, a growth stage, a harvesting stage, a manufacturing stage and an inspection stage; the stage node features are set manually according to the visual appearance of the corresponding target product at the node of the corresponding stage. A corresponding acquisition method is then compiled according to the target product type, and an acquisition method library is established;
identifying product type information, inputting the product type information into an acquisition method library for matching, and acquiring a corresponding acquisition method.
The corresponding acquisition method is compiled according to the target product type; the acquisition method is compiled manually, each target product forming one acquisition scheme, and when a new target product appears later, its acquisition scheme is added to the acquisition method library so that the library is updated in real time.
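For illustration, the acquisition method library described above can be pictured as a simple keyed lookup from product type to its manually compiled scheme. The following Python sketch is not part of the patent; the product type, stage labels, node features and camera points in it are invented placeholders.

```python
# Minimal sketch of an acquisition-method library keyed by product type.
# Product types, stage labels and node features below are illustrative only.

ACQUISITION_METHOD_LIBRARY = {
    "strawberry": {
        "stages": [
            # (stage label, stage node feature used to detect the stage start)
            ("planting",   "seedlings being placed into soil"),
            ("growth",     "flowering and fruit set visible"),
            ("harvesting", "workers picking ripe fruit"),
            ("inspection", "samples on the inspection bench"),
        ],
        "camera_points": ["field camera 1", "packing line camera"],
    },
}

def match_acquisition_method(product_type: str):
    """Return the acquisition method for a product type, or None if unknown."""
    return ACQUISITION_METHOD_LIBRARY.get(product_type)

def register_acquisition_method(product_type: str, method: dict) -> None:
    """Add a newly compiled acquisition scheme so the library stays up to date."""
    ACQUISITION_METHOD_LIBRARY[product_type] = method
```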
The method for inserting the corresponding stage labels into the collected video according to the acquisition method comprises the following steps:
A feature recognition model is set up; the collected video is recognized in real time through the feature recognition model according to the stage node features in the acquisition method, the corresponding stage label is matched when a stage node feature is recognized, and the matched stage label is inserted at the corresponding position in the collected video.
By inserting the corresponding stage labels into the captured video, consumers can conveniently and quickly locate the stage they want to watch; the labels give consumers a reference for each stage and avoid the situation in which a consumer faces a long video without knowing which time period corresponds to the stage of interest.
The feature recognition model is established based on a CNN network or a DNN network, and is trained by setting a corresponding training set.
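The publication does not give the architecture of the feature recognition model beyond naming a CNN or DNN; the sketch below shows one minimal CNN frame classifier in PyTorch, where the input resolution, layer widths and number of stage classes are all assumptions.

```python
# Minimal sketch of a CNN-based stage-node classifier (PyTorch).
# Input size, layer widths and the number of stage classes are assumptions.
import torch
import torch.nn as nn

class StageNodeClassifier(nn.Module):
    def __init__(self, num_stage_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 56 * 56, 128), nn.ReLU(),
            nn.Linear(128, num_stage_classes),   # one logit per stage node feature
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: a batch of video frames, shape (N, 3, 224, 224)
        return self.classifier(self.features(x))

model = StageNodeClassifier()
logits = model(torch.randn(1, 3, 224, 224))   # stage-node scores for one frame
```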
The video processing module is used for generating a shortcut representative video according to the source tracing video, and the specific method comprises the following steps:
identifying the stage labels in the tracing video, dividing the tracing video into a plurality of video segments according to the identified stage labels, each video segment corresponding to one stage, setting the time proportion of each video segment, processing the video segments to obtain corresponding stage representative videos, and integrating all the stage representative videos into a shortcut representative video according to the original stage sequence.
The stage representative video is obtained by extracting the main representative footage from the corresponding video segment and cutting and combining it according to the set time proportion of that segment, removing the long repetitive parts of the original segment. This simplifies the video and lets consumers grasp it quickly, while the tracing video of the corresponding part can still be watched when needed. Concretely, a corresponding neural network model is established for intelligent processing to obtain the stage representative video; its specific establishment and training are common knowledge in the field.
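The first step, cutting the tracing video into per-stage segments at the inserted stage labels, can be sketched as follows; representing a stage label as a (timestamp, label) pair is an assumption made purely for illustration.

```python
# Sketch: split a tracing video into per-stage segments using inserted stage labels.
# Stage labels are modelled here simply as (timestamp_seconds, label) pairs.

def split_by_stage_labels(video_duration: float, stage_labels: list[tuple[float, str]]):
    """Return a list of (label, start, end) segments covering the tracing video."""
    labels = sorted(stage_labels)
    segments = []
    for i, (start, label) in enumerate(labels):
        end = labels[i + 1][0] if i + 1 < len(labels) else video_duration
        segments.append((label, start, end))
    return segments

# Example: a 3600 s tracing video with four stage labels.
print(split_by_stage_labels(
    3600.0,
    [(0.0, "planting"), (900.0, "growth"), (2400.0, "harvesting"), (3300.0, "inspection")],
))
```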
The method for setting the time proportion of the corresponding video segment comprises the following steps:
The video segments are numbered i = 1, 2, …, n, where n is a positive integer; the process stage of the target product corresponding to each video segment is identified, the corresponding weight value is matched according to the identified process stage and marked as QZi, a content model is established, the video segment is analyzed through the content model to obtain the corresponding content value, marked as NRi, and the time proportion of the corresponding video segment is calculated according to a formula combining QZi and NRi, wherein b3 and b4 are both proportional coefficients with value ranges 0 < b3 ≤ 1 and 0 < b4 ≤ 1.
The content model is established based on a CNN or DNN and is trained with a corresponding training set; it analyzes the repetitiveness of the video content and assigns a corresponding content value, that is, the training set consists of video content labeled with content values set according to its repetitiveness.
The method for matching the corresponding weight value according to the identified process stage comprises the following steps:
The consumer attention proportion corresponding to each process stage of the target product is obtained based on big data, for example from reviews, questionnaires and posts using existing techniques, and marked as BL; the importance of each process stage to the target product is obtained and the corresponding importance value is set as ZY; the weight value is then calculated according to the formula QZ = b1 × BL + b2 × ZY, wherein b1 and b2 are proportional coefficients with value ranges 0 < b1 ≤ 1 and 0 < b2 ≤ 1.
The importance of a process stage to the target product expresses how much that stage matters to the safety and authenticity of the product. The corresponding importance value is set manually according to this importance, a table matching the stages of each target product to importance values is established, and the corresponding importance value is obtained by lookup.
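A small numerical sketch of the weighting step is given below. The QZ line follows the formula quoted above (QZ = b1 × BL + b2 × ZY); the exact time-proportion formula is not reproduced in this publication, so the normalized combination of QZ and NR used here, and all coefficient values, are assumptions.

```python
# Sketch of the weighting step. QZ follows the formula in the text;
# the time-proportion formula is not reproduced in this publication, so the
# normalized combination of QZ and NR below is an assumption.

def weight_value(BL: float, ZY: float, b1: float = 0.6, b2: float = 0.4) -> float:
    """Weight of a process stage from consumer attention BL and importance ZY."""
    assert 0 < b1 <= 1 and 0 < b2 <= 1
    return b1 * BL + b2 * ZY

def time_proportions(QZ: list[float], NR: list[float], b3: float = 0.5, b4: float = 0.5):
    """Assumed form: a segment's share grows with its weight and content value."""
    assert 0 < b3 <= 1 and 0 < b4 <= 1
    raw = [b3 * q + b4 * n for q, n in zip(QZ, NR)]
    total = sum(raw)
    return [r / total for r in raw]

# Example with three segments (all figures invented for illustration).
QZ = [weight_value(0.5, 0.8), weight_value(0.3, 0.4), weight_value(0.2, 0.9)]
NR = [0.7, 0.2, 0.5]
print(time_proportions(QZ, NR))
```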
The association module is used for associating the shortcut representative video with the source tracing video, and the specific method comprises the following steps:
The shortcut representative video and the tracing video are identified and placed in correspondence, and a positioning unit is set according to the correspondence between them; the positioning unit is used for quickly jumping to the corresponding position in the tracing video according to a mark made by the consumer in the shortcut representative video, and the shortcut representative video and the tracing video are integrated into the product process data.
Identifying and placing the shortcut representative video and the tracing video in correspondence means identifying, for each picture of the shortcut representative video, its position in the tracing video and recording that correspondence.
Illustratively, when a consumer watching the shortcut representative video wants to see a certain video segment in detail, the consumer marks the shortcut representative video through the positioning unit, and the positioning unit jumps directly to the corresponding video segment in the tracing video according to the mark and plays it.
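A minimal sketch of such a positioning unit is shown below; modelling the correspondence as a sorted list of (shortcut time, tracing time) anchor points is an assumption for illustration only.

```python
# Sketch of the positioning unit: map a consumer's mark in the shortcut
# representative video back to the corresponding position in the tracing video.
# The correspondence table (shortcut time -> tracing time) is illustrative.

class PositioningUnit:
    def __init__(self, correspondence: list[tuple[float, float]]):
        # (shortcut_start_seconds, tracing_start_seconds), sorted by shortcut time
        self.correspondence = sorted(correspondence)

    def locate(self, shortcut_time: float) -> float:
        """Return the tracing-video time for a mark made at shortcut_time."""
        tracing_time = self.correspondence[0][1]
        for s_start, t_start in self.correspondence:
            if shortcut_time >= s_start:
                tracing_time = t_start + (shortcut_time - s_start)
            else:
                break
        return tracing_time

unit = PositioningUnit([(0.0, 0.0), (30.0, 900.0), (60.0, 2400.0)])
print(unit.locate(45.0))   # -> 915.0 s into the tracing video
```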
The storage module is used for storing product process data. It can be configured as cloud storage as required, or blockchain-based storage can be adopted, so as to guarantee the integrity of the product process data and prevent it from being tampered with.
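The publication does not detail the storage mechanism; the sketch below only illustrates the tamper-evidence idea with a simple hash chain, which is an assumed stand-in for the cloud or blockchain storage mentioned above.

```python
# Minimal sketch of tamper-evident storage for product process data:
# each record stores the hash of the previous one, so later modification of any
# record breaks the chain. A real deployment would use a cloud or blockchain
# platform as the text suggests; this only illustrates the integrity idea.
import hashlib
import json

class ProcessDataLedger:
    def __init__(self):
        self.records = []

    def append(self, data: dict) -> None:
        prev_hash = self.records[-1]["hash"] if self.records else "0" * 64
        payload = json.dumps(data, sort_keys=True) + prev_hash
        self.records.append({
            "data": data,
            "prev_hash": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest(),
        })

    def verify(self) -> bool:
        prev_hash = "0" * 64
        for rec in self.records:
            payload = json.dumps(rec["data"], sort_keys=True) + prev_hash
            if rec["prev_hash"] != prev_hash or \
               rec["hash"] != hashlib.sha256(payload.encode()).hexdigest():
                return False
            prev_hash = rec["hash"]
        return True
```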
The docking module is used for data docking: it acquires the data interface of the third-party detection mechanism, marks it as a detection interface, and docks the storage module with the detection interface, so that the third-party detection mechanism can examine the product process data in the storage module.
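How the detection interface is exposed is not specified; the sketch below assumes a hypothetical REST endpoint and response format, and reuses the ledger object from the storage sketch above purely for illustration.

```python
# Sketch of the docking module pulling third-party detection results into the
# storage module. The endpoint URL and response fields are hypothetical.
import requests

def dock_detection_interface(ledger, api_base: str, product_id: str) -> None:
    """Fetch detection results for a product and append them to the ledger."""
    resp = requests.get(f"{api_base}/detections",
                        params={"product_id": product_id}, timeout=10)
    resp.raise_for_status()
    for result in resp.json():          # assumed: a list of detection records
        ledger.append({"type": "third_party_detection", **result})
```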
The encoding module is used for generating a tracing code, namely a two-dimensional code, corresponding to the product process data; a consumer can obtain the corresponding product process data by scanning the tracing code. The specific tracing-code generation process is common knowledge in the field and is not described in detail here.
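As an illustration of the encoding step, the sketch below generates a two-dimensional code that links to the product process data; the URL scheme is hypothetical, and the open-source qrcode package is used only as an example of how such a code could be produced.

```python
# Sketch of the encoding module: generate a two-dimensional tracing code that
# points at the product process data. The URL scheme is hypothetical; the
# third-party "qrcode" package is used for illustration (pip install qrcode[pil]).
import qrcode

def generate_tracing_code(product_id: str, out_path: str = "tracing_code.png") -> str:
    url = f"https://trace.example.com/products/{product_id}"   # hypothetical endpoint
    qrcode.make(url).save(out_path)
    return out_path

print(generate_tracing_code("CN-2022-000123"))
```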
The above formulas are all computed on dimensionless numerical values; each formula is obtained by collecting a large amount of data and running software simulations so as to approximate the real situation as closely as possible, and the preset parameters and preset thresholds in the formulas are set by those skilled in the art according to the actual situation or obtained by simulation over a large amount of data.
The working principle of the invention is as follows: the acquisition module acquires the live-action video of the product: product type information is obtained, the corresponding acquisition method is matched according to the acquired product type information, the live-action video is acquired according to the matched acquisition method, and corresponding stage labels are inserted into the acquired video according to the acquisition method to obtain the source tracing video; the storage module stores the acquired tracing video; the docking module performs data docking, acquiring the data interface of the third-party detection mechanism, marking it as a detection interface, and docking the storage module with the detection interface; and the encoding module generates the tracing code corresponding to the tracing video.
Although the present invention has been described in detail with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the present invention.
Claims (8)
1. An article source tracing code system is characterized by comprising an acquisition module, a storage module, a docking module, a coding module and a server, wherein the acquisition module, the storage module, the docking module and the coding module are all in communication connection with the server;
the acquisition module is used for acquiring the live-action video of the product, acquiring product type information, matching a corresponding acquisition method according to the acquired product type information, acquiring the live-action video according to the matched acquisition method, and inserting a corresponding stage label into the acquired video according to the acquisition method to obtain a source tracing video;
the storage module is used for storing product process data; the docking module is used for performing data docking, acquiring the data interface of a third-party detection mechanism, marking it as a detection interface, and docking the storage module with the detection interface;
the encoding module is used for generating a source tracing code corresponding to the product process data.
2. The article source tracing code system according to claim 1, wherein the method for matching the corresponding acquisition method according to the acquired product type information comprises:
acquiring the type of the target product, setting each process stage of the type of the target product, setting stage labels and stage node characteristics of corresponding stages, compiling a corresponding acquisition method according to the type of the target product, and establishing an acquisition method library; identifying product type information, inputting the product type information into an acquisition method library for matching, and acquiring a corresponding acquisition method.
3. The article source tracing code system according to claim 2, wherein the method for inserting the corresponding stage labels into the captured video according to the acquisition method is:
a feature recognition model is set up; the collected video is recognized in real time through the feature recognition model according to the stage node features in the acquisition method, the corresponding stage label is matched when a stage node feature is recognized, and the matched stage label is inserted at the corresponding position in the collected video.
4. The article source tracing code system according to claim 1, further comprising a video processing module and an association module; the video processing module is used for generating a shortcut representative video according to the source tracing video; the association module is used for associating the shortcut representative video with the source tracing video.
5. The article source tracing code system according to claim 4, wherein the working method of the video processing module comprises:
identifying the stage labels in the tracing video, dividing the tracing video into a plurality of video segments according to the identified stage labels, setting the time proportion of each video segment, processing the video segments to obtain corresponding stage representative videos, and integrating all the stage representative videos into a shortcut representative video according to the original stage sequence.
6. The article source tracing code system according to claim 5, wherein the method for setting the time proportion of the corresponding video segment comprises:
the video segments are numbered i = 1, 2, …, n, where n is a positive integer; the process stage of the target product corresponding to each video segment is identified, the corresponding weight value is matched according to the identified process stage and marked as QZi, a content model is established, the video segment is analyzed through the content model to obtain the corresponding content value, marked as NRi, and the time proportion of the corresponding video segment is calculated according to a formula combining QZi and NRi, wherein b3 and b4 are both proportional coefficients with value ranges 0 < b3 ≤ 1 and 0 < b4 ≤ 1.
7. The article source tracing code system according to claim 6, wherein the method of matching the corresponding weight value according to the identified process stage is:
the consumer attention proportion corresponding to each process stage of the target product is obtained based on big data and marked as BL, the importance of each process stage to the target product is obtained and the corresponding importance value is set as ZY, and the weight value is calculated according to the formula QZ = b1 × BL + b2 × ZY, wherein b1 and b2 are proportional coefficients with value ranges 0 < b1 ≤ 1 and 0 < b2 ≤ 1.
8. The article source tracing code system according to claim 4, wherein the working method of the association module comprises:
the shortcut representative video and the tracing video are identified and placed in correspondence, and a positioning unit is set according to the correspondence between them; the positioning unit is used for quickly jumping to the corresponding position in the tracing video according to a mark made by the consumer in the shortcut representative video, and the shortcut representative video and the tracing video are integrated into the product process data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210822026.9A CN115099834A (en) | 2022-07-12 | 2022-07-12 | Article source code tracing system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115099834A true CN115099834A (en) | 2022-09-23 |
Family
ID=83297280
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210822026.9A Pending CN115099834A (en) | 2022-07-12 | 2022-07-12 | Article source code tracing system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115099834A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116664152A (en) * | 2023-06-15 | 2023-08-29 | 浙江智飨科技有限公司 | Supply chain full-flow node tracing system and method |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111368138A (en) * | 2020-02-10 | 2020-07-03 | 北京达佳互联信息技术有限公司 | Method and device for sorting video category labels, electronic equipment and storage medium |
CN114548671A (en) * | 2022-01-14 | 2022-05-27 | 豆豆猫(海南)软件科技有限公司 | Agricultural product traceability system and agricultural product traceability method based on block chain |
- 2022-07-12: Application CN202210822026.9A filed in China; published as CN115099834A; status Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11095594B2 (en) | Location resolution of social media posts | |
CN111401314B (en) | Dressing information detection method, device, equipment and storage medium | |
US10410043B2 (en) | System and method for part identification using 3D imaging | |
US20220103505A1 (en) | Social media influence of geographic locations | |
CN108734184B (en) | Method and device for analyzing sensitive image | |
JP5857073B2 (en) | System and method for relevance of image texting and text imaging | |
CN109492604A (en) | Faceform's characteristic statistics analysis system | |
US20130343618A1 (en) | Searching for Events by Attendants | |
CN111241338B (en) | Depth feature fusion video copy detection method based on attention mechanism | |
CN105303449B (en) | The recognition methods and system of social network user based on camera fingerprint characteristic | |
CN112446715B (en) | Product tracing method based on industrial internet cloud platform | |
CN115099834A (en) | Article source code tracing system | |
CN112149690A (en) | Tracing method and tracing system based on biological image feature recognition | |
CN104050599A (en) | Edible agricultural product quality safety traceability system and method based on video playback | |
CN117668372B (en) | Virtual exhibition system on digital wisdom exhibition line | |
CN117392289A (en) | Method and system for automatically generating case field video based on AI (advanced technology attachment) voice | |
CN110309737A (en) | A kind of information processing method applied to cigarette sales counter, apparatus and system | |
WO2022246923A1 (en) | Method for screening potential customer | |
CN117216308B (en) | Searching method, system, equipment and medium based on large model | |
CN117216403B (en) | Web-based personalized service recommendation method | |
CN108921185A (en) | A kind of shelf sales promotion information recognition methods based on image recognition, device and system | |
CN113743382B (en) | Shelf display detection method, device and system | |
CN114170548B (en) | Deep learning-based oilfield on-site micro-target detection method and system | |
Thomas et al. | Computer vision supported pedestrian tracking: A demonstration on trail bridges in rural Rwanda | |
CN114676117A (en) | Post data storage method and device and post robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20220923 |