CN113269245B - Comparison method and system - Google Patents


Info

Publication number
CN113269245B
Authority
CN
China
Prior art keywords
data, comparison, standard, current, compared
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110565254.8A
Other languages
Chinese (zh)
Other versions
CN113269245A (en)
Inventor
吕英丽
徐小君
张红领
顾勇
张晔
李亚杰
杨晓晴
张静静
方彬
张晓峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Architecture
Original Assignee
Hebei University of Architecture
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Architecture
Priority to CN202110565254.8A
Publication of CN113269245A
Application granted
Publication of CN113269245B

Classifications

    • G06F18/22 — Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • G06F18/2415 — Pattern recognition; classification techniques relating to the classification model, based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G06V10/40 — Arrangements for image or video recognition or understanding; extraction of image or video features
    • Y02A90/10 — Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Probability & Statistics with Applications (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses a comparison method and system. The comparison system comprises at least one merchant terminal, at least one client terminal and a comparison server. The merchant terminal is used for receiving and executing an initial data acquisition instruction and feeding the acquired initial data of the object to be compared back to the comparison server. The client terminal is used for receiving and executing a current data acquisition instruction and feeding the acquired current data of the object to be compared back to the comparison server. The comparison server is used for executing the following steps: acquiring the initial data of the object to be compared; analyzing the initial data and determining a comparison model and standard comparison data; acquiring the current data of the object to be compared and the standard comparison data, and optimizing the current data using the standard comparison data to obtain the data to be compared; and processing the standard comparison data and the data to be compared with the comparison model to obtain a comparison result. The method and system make it convenient to determine whether a received article is the correct variety of the purchased article, and improve comparison accuracy.

Description

Comparison method and system
Technical Field
The present application relates to the field of computer technologies, and in particular, to a comparison method and system.
Background
With the rapid development of electronic commerce, more and more users purchase the goods they need online. Before purchase, however, a user cannot handle the goods directly to judge them, and so cannot confirm whether the received goods are consistent with those displayed electronically by the merchant, or whether they are the correct variety of the required goods. This is especially true for living goods with a growth period (such as plants and animals), whose growth is influenced by environmental factors such as region and climate, causing differences in appearance; a user who plants or raises a given variety for the first time finds it difficult to identify that variety across different regions, seasons and growth stages.
Disclosure of Invention
The present application provides a comparison method and system that make it convenient to determine whether a received article is the correct variety of the selected article, and effectively improve comparison accuracy.
To achieve the above object, the present application provides a comparison system, comprising: at least one merchant terminal, at least one client terminal and a comparison server. The merchant terminal is used for receiving and executing an initial data acquisition instruction and feeding the acquired initial data of the object to be compared back to the comparison server. The client terminal is used for receiving and executing a current data acquisition instruction and feeding the acquired current data of the object to be compared back to the comparison server. The comparison server is used for executing the following steps: acquiring the initial data of the object to be compared; analyzing the initial data and determining a comparison model and standard comparison data, wherein the comparison model at least comprises a living body comparison model and a non-living body comparison model; acquiring the current data of the object to be compared and the standard comparison data, and optimizing the current data by using the standard comparison data to obtain the data to be compared; and processing the standard comparison data and the data to be compared by using the comparison model to obtain a comparison result.
As above, the comparison server at least comprises a data acquisition unit and a data processing unit. The data acquisition unit is used for receiving a comparison request from the client terminal, issuing an initial data acquisition instruction to the merchant terminal, receiving the initial data of the object to be compared fed back by the merchant terminal, and sending it to the data processing unit; it is also used for issuing a current data acquisition instruction to the client terminal, receiving the current data of the object to be compared fed back by the client terminal, and sending it to the data processing unit. The data processing unit is used for processing the initial data and the current data to generate a comparison result.
As above, the data acquisition unit at least comprises a first acquisition subunit, a second acquisition subunit and a third acquisition subunit. The first acquisition subunit is used for issuing an initial data acquisition instruction to the merchant terminal to acquire the initial data of the object to be compared, the initial data at least comprising: name, initial image and article attributes. The second acquisition subunit is used for issuing a current data acquisition instruction to the client terminal to acquire the current data of the object to be compared, the current data at least comprising: a current image and current basic information. The third acquisition subunit is used for receiving and executing the standard comparison data acquisition instruction issued by the analysis subunit and sending the acquired standard comparison data to the optimization subunit.
As above, the data processing unit comprises an analysis subunit, a comparison subunit and an optimization subunit. The analysis subunit is used for analyzing the initial data, determining the comparison model and issuing a model calling instruction. The comparison subunit is used for receiving the model calling instruction and activating the required comparison model according to it; receiving the standard comparison data; issuing a data acquisition instruction to the data acquisition unit after activating the required comparison model and receiving the data to be compared; and processing the data to be compared by using the comparison model and the standard comparison data to generate a comparison result. The optimization subunit is used for receiving the current data fed back after the data acquisition unit executes the data acquisition instruction; receiving the standard comparison data; and optimizing the current data by using the standard comparison data to obtain the data to be compared, and uploading the data to be compared to the comparison subunit.
The application also provides a comparison method, comprising the following steps: acquiring the initial data of the object to be compared; analyzing the initial data and determining a comparison model and standard comparison data, wherein the comparison model at least comprises a living body comparison model and a non-living body comparison model; acquiring the current data of the object to be compared and the standard comparison data, and optimizing the current data by using the standard comparison data to obtain the data to be compared; and processing the standard comparison data and the data to be compared by using the comparison model to obtain a comparison result.
As above, the sub-steps of analyzing the initial data and determining the standard comparison data are as follows: reading the name and article attributes of the initial data; and determining the standard comparison data to be acquired according to the name and the characteristic attribute in the article attributes, and issuing an instruction to acquire the standard comparison data.
As above, the sub-steps of optimizing the current data by using the standard comparison data to obtain the data to be compared are as follows: receiving the current data of the object to be compared and the standard comparison data; analyzing the current basic information of the current data by using the standard comparison data to generate an adjustment analysis result; and optimizing the current image according to the adjustment analysis result to obtain the data to be compared.
As above, when the comparison model is the living body comparison model, the standard comparison data and the data to be compared are processed by using the comparison model, and the sub-step of obtaining the comparison result is as follows: screening all sub-standard data in the standard comparison data by using the current basic information of the current data to determine the optimal standard comparison data; performing feature extraction on the optimal standard comparison data to obtain standard comparison features; carrying out feature extraction on the data to be compared to obtain features to be compared; and comparing the characteristics to be compared with the standard comparison characteristics to obtain a comparison result.
As above, all sub-standard data in the standard comparison data are analyzed according to the current location information, current time information and current climate information in the current basic information to obtain an external factor similarity value for each item of sub-standard data; the sub-standard data whose external factor similarity value is greater than or equal to a preset external factor similarity threshold are taken as the optimal standard comparison data.
As above, the formula for calculating the external factor similarity value is as follows:
$S_{wy}=\mu_0\left(1-\frac{\sqrt{(Jd_{bz}-Jd_{dq})^2+(Wd_{bz}-Wd_{dq})^2+(gc_{bz}-gc_{dq})^2}}{\Delta dqy}\right)+\mu_1\left(1-\frac{\left|N_{bz}-N_{dq}\right|}{\Delta ntz}\right)+\mu_2\left(1-\frac{\left|t_{bz}-t_{dq}\right|}{\Delta t}\right)+\mu_3\left(1-\frac{\left|sjd_{bz}-sjd_{dq}\right|}{\Delta sjd}\right)$
where $S_{wy}$ is the external factor similarity value; $\mu_0$ is the importance parameter of the distance between regions; $\mu_1$ is the importance parameter of the regional soil property; $\mu_2$ is the importance parameter of the growth time; $\mu_3$ is the importance parameter of the growth stage of the living body; $Jd_{bz}$ and $Jd_{dq}$ are the longitudes of the living body growth site in the sub-standard data and in the current data, respectively; $Wd_{bz}$ and $Wd_{dq}$ are the corresponding latitudes; $gc_{bz}$ and $gc_{dq}$ are the corresponding elevations; $\Delta dqy$ is the allowable value of the distance between regions; $N_{bz}$ and $N_{dq}$ are the regional soil property comprehensive values in the sub-standard data and in the current data; $\Delta ntz$ is the allowable difference in soil texture; $t_{bz}$ and $t_{dq}$ are the growth times of the living body in the sub-standard data and in the current data; $\Delta t$ is the allowable error in growth time; $sjd_{bz}$ and $sjd_{dq}$ are the growth stage numbers of the living body in the sub-standard data and in the current data; and $\Delta sjd$ is the allowable growth stage span.
The method and system make it convenient to determine whether the received article is the correct variety of the purchased article, and effectively improve comparison accuracy.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments described in the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an embodiment of a comparison system;
FIG. 2 is a flowchart of an embodiment of a comparison method.
Detailed Description
The technical solutions in the embodiments of the present invention are clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the present application provides a comparison system, comprising: at least one merchant terminal 110, at least one client terminal 120 and a comparison server 130.
The merchant terminal 110 is used for receiving and executing an initial data acquisition instruction and feeding the acquired initial data of the object to be compared back to the comparison server.
The client terminal 120 is used for receiving and executing a current data acquisition instruction and feeding the acquired current data of the object to be compared back to the comparison server.
The comparison server 130 is configured to perform the following steps:
acquiring initial data of objects to be compared;
analyzing the initial data, and determining a comparison model and standard comparison data, wherein the comparison model at least comprises: a living body comparison model and a non-living body comparison model;
acquiring current data and standard comparison data of an object to be compared, and optimizing the current data by using the standard comparison data to obtain the data to be compared;
and processing the standard comparison data and the data to be compared by using the comparison model to obtain a comparison result.
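The four steps above can be sketched as a minimal pipeline. All function names and the dictionary-based model registry are assumptions for illustration; the patent does not specify an implementation:

```python
def select_model(basic_attribute: str) -> str:
    # Step 2: a living basic attribute selects the living body comparison
    # model; anything else selects the non-living body comparison model.
    return "living" if basic_attribute == "living" else "non-living"

def run_comparison(initial_data, current_data, standard_data, models):
    # Step 1 has already produced `initial_data`; step 2 picks the model.
    model = models[select_model(initial_data["basic_attribute"])]
    # Step 3: optimize the current data using the standard comparison data.
    data_to_compare = model["optimize"](current_data, standard_data)
    # Step 4: process both with the comparison model to get the result.
    return model["compare"](standard_data, data_to_compare)
```

Each model here is just a pair of callables, which keeps the server loop identical regardless of whether a living or non-living model is activated.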
Further, the comparison server 130 at least comprises a data acquisition unit and a data processing unit. The data acquisition unit is used for receiving a comparison request from the client terminal, issuing an initial data acquisition instruction to the merchant terminal, receiving the initial data of the object to be compared fed back by the merchant terminal, and sending it to the data processing unit; it is also used for issuing a current data acquisition instruction to the client terminal, receiving the current data of the object to be compared fed back by the client terminal, and sending it to the data processing unit. The data processing unit is used for processing the initial data and the current data to generate a comparison result.
Further, the data acquisition unit at least comprises a first acquisition subunit, a second acquisition subunit and a third acquisition subunit. The first acquisition subunit is used for issuing an initial data acquisition instruction to the merchant terminal to acquire the initial data of the object to be compared, the initial data at least comprising: name, initial image and article attributes. The second acquisition subunit is used for issuing a current data acquisition instruction to the client terminal to acquire the current data of the object to be compared, the current data at least comprising: a current image and current basic information. The third acquisition subunit is used for receiving and executing the standard comparison data acquisition instruction issued by the analysis subunit and sending the acquired standard comparison data to the optimization subunit.
Further, the data processing unit comprises an analysis subunit, a comparison subunit and an optimization subunit. The analysis subunit is used for analyzing the initial data, determining the comparison model and issuing a model calling instruction. The comparison subunit is used for receiving the model calling instruction and activating the required comparison model according to it; receiving the standard comparison data; issuing a data acquisition instruction to the data acquisition unit after activating the required comparison model and receiving the data to be compared; and processing the data to be compared by using the comparison model and the standard comparison data to generate a comparison result. The optimization subunit is used for receiving the current data fed back after the data acquisition unit executes the data acquisition instruction; receiving the standard comparison data; and optimizing the current data by using the standard comparison data to obtain the data to be compared, and uploading the data to be compared to the comparison subunit.
As shown in fig. 2, the present application provides a comparison method, comprising the following steps:
s210: and acquiring initial data of the objects to be compared.
Further, the substep of obtaining the initial data of the objects to be compared is as follows:
s2101: and receiving a comparison request, and issuing an initial data acquisition instruction.
Specifically, after receiving a comparison request sent by the client terminal, the data acquisition unit issues an initial data acquisition instruction to the merchant terminal through the first acquisition subunit.
S2102: and receiving initial data of the object to be compared, which is fed back after the initial data acquisition instruction is executed.
Specifically, after receiving the initial data acquisition instruction, the merchant terminal executes it to acquire the initial data of the object to be compared and feeds the initial data back to the data processing unit; after receiving the initial data, the data processing unit executes S220.
The object to be compared is an article purchased by the user through a non-physical store, such as an online store.
Wherein the initial data comprises at least: name of the article, initial image, attribute of the article and initial basic information.
The name of the article represents the name of the purchased article. The initial image represents an image taken by the merchant when listing the purchased article. The article attributes include a basic attribute and a characteristic attribute. The basic attribute is either living or non-living; the characteristic attribute is either a living-body characteristic attribute or a non-living-body characteristic attribute. Specifically, the non-living-body characteristic attribute indicates the material and size of the purchased article, and the living-body characteristic attribute indicates the variety of the purchased article.
Wherein the initial basic information includes: initial device information, initial location information, initial time information, and initial climate information. Specifically, the initial device information includes: the model of the device that captured the initial image, and the detailed information of the initial image (e.g., resolution, width, height, bit depth, etc.). The initial time information includes: the time at which the initial image was taken, and the production time of non-living goods or the growth stage of living goods at the time of shipment.
For example: if the user purchases a silk garment through a non-physical store, the name in the initial data of the object to be compared is: garment, and the article attributes are: non-living, silk product. If the user purchases a package of shredded squid through a non-physical store, the name is: food, and the article attributes are: non-living, shredded squid snack. If the user purchases a black bark rose through a non-physical store, the name is: plant, and the article attributes are: living, black bark rose. If the user purchases a puppy of the Labrador breed through a non-physical store, the name is: animal, and the article attributes are: living, Labrador puppy.
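The examples above can be held in a simple record type. This is a hypothetical structure for illustration only; the patent does not prescribe a schema:

```python
from dataclasses import dataclass

@dataclass
class InitialData:
    name: str               # article name, e.g. "plant"
    basic_attribute: str    # "living" or "non-living"
    feature_attribute: str  # e.g. "black bark rose"

# the four purchase examples from the text
examples = [
    InitialData("garment", "non-living", "silk product"),
    InitialData("food", "non-living", "shredded squid snack"),
    InitialData("plant", "living", "black bark rose"),
    InitialData("animal", "living", "Labrador puppy"),
]

# only living objects are routed to the living body comparison model
living = [e for e in examples if e.basic_attribute == "living"]
```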
S220: analyzing the initial data, and determining a comparison model and standard comparison data, wherein the comparison model at least comprises a living body comparison model and a non-living body comparison model.
Further, the initial data is analyzed, and the sub-steps of determining the comparison model are as follows:
s2201: the item attribute of the initial data is read.
Specifically, the data processing unit reads the article attributes of the initial data through the analysis subunit, and S2202 is executed.
S2202: and determining a comparison model to be called according to the basic attribute in the article attribute.
Specifically, if the basic attribute in the article attributes of the initial data is a living body, determining that the comparison model is a living body comparison model; and if the basic attribute in the article attributes of the initial data is a non-living body, determining that the comparison model is a non-living body comparison model.
S2203: and issuing a model calling instruction, wherein the model calling instruction comprises a comparison model needing to be called.
Specifically, after the analysis subunit determines the comparison model to be called, it issues a model calling instruction to the comparison subunit.
Further, the sub-steps of analyzing the initial data and determining the standard comparison data are as follows:
s2201': the name and item attributes of the initial data are read.
Specifically, the data processing unit reads the article name and article attributes of the initial data through the analysis subunit, and S2202' is executed.
S2202': and determining standard comparison data to be acquired according to the name and the characteristic attribute in the article attribute, and issuing an instruction for acquiring the standard comparison data.
Specifically, the analysis subunit acquires standard comparison data to be acquired according to the name and the characteristic attribute, and issues a standard comparison data acquisition instruction to the data acquisition unit.
S230: and acquiring current data and standard comparison data of the object to be compared, and optimizing the current data by using the standard comparison data to obtain the data to be compared.
Further, the current data is optimized by using the standard comparison data, and the substeps of obtaining the data to be compared are as follows:
s2301: and receiving the current data and the standard comparison data of the object to be compared.
Specifically, after the comparison subunit receives the model calling instruction, it activates the required comparison model according to the instruction. After the data acquisition unit receives the data acquisition instruction, it issues a current data acquisition instruction to the client terminal through the second acquisition subunit, acquires the current data of the object to be compared, and uploads the current data to the optimization subunit. The third acquisition subunit receives and executes the instruction to acquire the standard comparison data and sends the acquired standard comparison data to the optimization subunit.
Wherein the current data at least comprises: a current image and current base information. The current basic information includes: current device information, current location information, current time information, and current climate information. Specifically, the current device information includes: the model of the device that captured the current image and the detailed information of the current image (e.g., resolution, width, height, bit depth, etc.). The current time information includes: the time when the current image was taken, and the production time of non-living goods or the growth stage of living goods at the time of shipment.
The standard comparison data are image data of the article uploaded by the merchant terminal, or image data of the same variety acquired over the network. Specifically, if the object to be compared is a living body, the standard comparison data at least include image data of the object in different regions, time periods and growth stages, together with standard device information and standard climate information. If the object to be compared is a non-living body, the standard comparison data at least include the size, color and material of the object, together with standard device information and standard climate information. Because the device used to take an image, its shooting parameters and the weather at the time of shooting all affect the color of the image and introduce errors, all image data in the standard comparison data are obtained by optimizing the original image data according to preset standard device information and standard climate information.
Wherein the standard device information includes: the model of the device that obtains the standard alignment data and the detailed information (e.g., resolution, width, height, bit depth, etc.) of the image data.
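One way to realize the normalization of image data toward a common device and climate baseline is per-channel gain correction. This is an illustrative approach only; the patent does not disclose the concrete optimization algorithm, and the `normalize_image` name and gain representation are assumptions:

```python
def normalize_image(pixels, gains):
    """Apply per-channel gains to RGB pixels so that images captured with
    different device settings or in different weather are brought to a
    common baseline. `pixels` is a list of (r, g, b) tuples in 0-255;
    `gains` is a (gr, gg, gb) tuple derived from the device/climate info."""
    out = []
    for r, g, b in pixels:
        # scale each channel and clamp to the valid 8-bit range
        out.append((min(255, round(r * gains[0])),
                    min(255, round(g * gains[1])),
                    min(255, round(b * gains[2]))))
    return out
```

In practice the gains would be estimated from the standard device information and standard climate information, so that both the standard images and the current image are compared under the same color baseline.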
S2302: and analyzing the current basic information of the current data by using the standard comparison data to generate an adjustment analysis result.
Specifically, the sub-step of analyzing the current basic information of the current data and generating the adjustment analysis result is as follows:
s23021: and pre-judging the current equipment information in the current basic information by using the standard equipment information in the standard comparison data to generate a first pre-judgment result.
Specifically, the current device information is pre-judged using the standard device information in the standard comparison data. If the device model or any of the image details in the current device information differs from the standard device information, the color, size and so on of the acquired current image may contain errors, and the first pre-judgment result Ycy is generated. If the device model and the image details are both the same as in the standard device information, there is no error in the color, size and so on of the acquired current image, and the first pre-judgment result Wcy is generated.
S23022: and pre-judging the current climate information in the current basic information by using the standard climate information in the standard comparison data to generate a second pre-judgment result.
Specifically, the current climate information in the current basic information is pre-judged using the standard climate information in the standard comparison data. If the difference between the current climate information and the standard climate information is greater than or equal to a preset climate difference threshold, the color, size and so on of the acquired current image may contain errors, and the second pre-judgment result Ycy is generated; if the difference is smaller than the threshold, there is no error, and the second pre-judgment result Wcy is generated.
S23023: Generate an adjustment analysis result from the first pre-judgment result and the second pre-judgment result.
Specifically, if at least one of the first and second pre-judgment results is Ycy, the generated adjustment analysis result is Xtz, and the result must also include the adjustment reason. If both pre-judgment results are Wcy, the generated adjustment analysis result is Btz, with no adjustment reason.
The adjustment reason comprises a first factor and a second factor. When only the first pre-judgment result is Ycy, the adjustment reason is the first factor; when only the second pre-judgment result is Ycy, the adjustment reason is the second factor; when both pre-judgment results are Ycy, the adjustment reason is both the first factor and the second factor.
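As a minimal sketch, the pre-judgment and adjustment-analysis logic of S23021–S23023 could look like the following. The flag strings Ycy/Wcy and Xtz/Btz come from the text; the dictionary shapes, field names, and threshold handling are assumptions:

```python
def prejudge_device(current_dev: dict, standard_dev: dict) -> str:
    """First pre-judgment (S23021): any differing device field -> Ycy, else Wcy."""
    differs = any(current_dev.get(k) != v for k, v in standard_dev.items())
    return "Ycy" if differs else "Wcy"

def prejudge_climate(current_climate: float, standard_climate: float,
                     threshold: float) -> str:
    """Second pre-judgment (S23022): difference at/above the threshold -> Ycy."""
    return "Ycy" if abs(current_climate - standard_climate) >= threshold else "Wcy"

def adjustment_analysis(first: str, second: str) -> dict:
    """S23023: Xtz plus adjustment reasons if any pre-judgment is Ycy, else Btz."""
    reasons = []
    if first == "Ycy":
        reasons.append("first_factor")
    if second == "Ycy":
        reasons.append("second_factor")
    if reasons:
        return {"result": "Xtz", "reasons": reasons}
    return {"result": "Btz"}
```
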
S2303: Optimize the current image according to the adjustment analysis result to obtain the data to be compared.
Further, the sub-steps of optimizing the current image according to the adjustment analysis result to obtain the data to be compared are as follows:
S23031: Read the adjustment analysis result; if it is Xtz, execute S23032; if it is Btz, execute S23033.
Specifically, the adjustment analysis result is read. If it is Xtz, an error may exist in the current image and the current image needs to be adjusted, so S23032 is executed; if it is Btz, no error is likely and no adjustment is needed, so S23033 is executed.
S23032: Optimize the current image according to the adjustment reason to obtain an optimized image, then execute S23033.
Further, the current image is optimized according to the adjustment reason, and the sub-steps of obtaining the optimized image are as follows:
S230321: Read the adjustment reason and judge whether it includes the first factor; if so, execute S230322; if not, execute S230334.
S230322: and optimizing the current image according to the first factor to obtain a first processed image.
Further, the sub-steps of performing optimization processing on the current image according to the first factor to obtain a first processed image are as follows:
T1: Obtain a first adjustment parameter according to the first factor.
Specifically, the parameters in the current device information that differ from the standard device information (such as resolution, width, height, and bit depth) are analyzed, and the difference of each such parameter is taken as a first adjustment parameter:
T_cs1 = cbz_i − cdq_i
where T_cs1 is the first adjustment parameter, cbz_i is the value of the i-th parameter in the standard device information, and cdq_i is the value of the i-th parameter in the current device information.
T2: and adjusting the current image according to the first adjustment parameter to obtain a first processed image.
Specifically, after the first adjustment parameter is obtained, the current image is adjusted by using the first adjustment parameter to obtain a first processed image, and S230323 is executed.
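Steps T1–T2 reduce to computing per-parameter differences T_cs1 = cbz_i − cdq_i and applying them. A sketch of T1, under the assumption that device information is a flat mapping of numeric parameters (the parameter names are illustrative):

```python
def first_adjustment_parameters(standard_dev: dict, current_dev: dict) -> dict:
    """T1: return standard - current for every parameter present in both
    device profiles whose values differ (T_cs1 = cbz_i - cdq_i)."""
    return {k: standard_dev[k] - current_dev[k]
            for k in standard_dev
            if k in current_dev and standard_dev[k] != current_dev[k]}
```

In T2 the image would then be resampled or re-encoded toward the standard values; that part depends on the imaging library and is not specified by the text.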
S230323: Read the adjustment reason and judge whether it includes the second factor; if so, execute S230334; if not, execute S230335.
S230334: Optimize the current image or the first processed image according to the second factor to obtain a second processed image, then execute S230335.
Further, the sub-step of performing optimization processing on the current image or the first processed image according to a second factor to obtain a second processed image is as follows:
T1': Obtain a second adjustment parameter according to the second factor.
Specifically, the influence of the current climate on the light when the current image is shot is analyzed, and a second adjustment parameter is obtained.
T2': Adjust the current image or the first processed image according to the second adjustment parameter to obtain a second processed image.
Specifically, the current image or the first processed image is adjusted with the second adjustment parameter obtained in T1'; after the second processed image is obtained, S230335 is executed.
S230335: and taking the first processed image or the second processed image as an optimized image.
Specifically, if the adjustment reason does not include the second factor, the first processed image is used as the optimized image; and if the adjustment reason comprises a second factor, taking the second processed image as an optimized image.
S23033: Determine the data to be compared.
Specifically, if the adjustment analysis result is Btz, the current image is directly used as the data to be compared; if it is Xtz, the optimized image is used as the data to be compared. After the data to be compared is determined, S240 is executed.
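The selection in S23031–S23033 can be sketched as a small helper (the result labels Xtz/Btz come from the text; everything else is assumed):

```python
def select_data_to_compare(adjustment_result: str, current_image, optimized_image):
    """S23033: no adjustment needed -> use the current image directly;
    otherwise use the optimized image as the data to be compared."""
    return optimized_image if adjustment_result == "Xtz" else current_image
```
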
S240: and processing the standard comparison data and the data to be compared by using the comparison model to obtain a comparison result.
Further, as an embodiment, when the comparison model is the living body comparison model, the comparison model is used to process the standard comparison data and the data to be compared, and the sub-step of obtaining the comparison result is as follows:
S2401: Screen all sub-standard data in the standard comparison data using the current basic information of the current data to determine the optimal standard comparison data.
Further, all sub-standard data in the standard comparison data are analyzed according to the current location information, current time information, and current climate information in the current basic information to obtain an external factor similarity value for each sub-standard data. The sub-standard data whose external factor similarity value is the largest of all values and is at least the preset external factor similarity threshold is taken as the optimal standard comparison data. Here, the sub-standard data are image data of the objects to be compared in different areas, different time periods, and different growth stages.
Further, the external factor similarity value is calculated as follows:
(The formula appears only as an image in the original; the external factor similarity value is computed from the quantities defined below.)
wherein: μ0 is the importance parameter of the distance between the regions; μ1 is the importance parameter of the regional soil property; μ2 is the importance parameter of the growth time; μ3 is the importance parameter of the growth stage of the living body; Jd_bz is the longitude of the living-body growth site in the sub-standard data; Jd_dq is the longitude of the living-body growth site in the current data; Wd_bz is the latitude of the living-body growth site in the sub-standard data; Wd_dq is the latitude of the living-body growth site in the current data; gc_bz is the elevation of the living-body growth site in the sub-standard data; gc_dq is the elevation of the living-body growth site in the current data; Δdqy is the allowable spacing value for the distance between the regions; N_bz is the regional soil property comprehensive value of the sub-standard data; N_dq is the regional soil property comprehensive value of the current data; Δntz is the allowable difference value of soil texture; t_bz is the growth time of the living body in the sub-standard data; t_dq is the growth time of the living body in the current data; Δt is the allowable error value of the growth time; sjd_bz is the number of the growth stage of the living body in the sub-standard data; sjd_dq is the number of the growth stage of the living body in the current data; Δsjd is the allowable growth-stage span value.
Specifically, the growth time is expressed in days, months, and years and indicates how long the living body has been growing at the time of imaging; for example, if imaging occurs at 3 months and 28 days of growth, the growth time is 3 months and 28 days. The growth stage indicates the growth state of the living body. For a plant, the growth stages include the seed stage, germination stage, flowering stage, fruiting stage, and so on; for an animal, the neonatal, juvenile, adult, and elderly stages, and so on. Stage numbers increase from the earliest stage to the latest; for example, the seed stage is numbered 1 and the germination stage 2. A growth-stage allowable span value equal to 1 means that comparisons are allowed within the same growth stage and between adjacent growth stages.
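The screening rule of S2401 — keep the sub-standard data with the largest external factor similarity that also meets the preset threshold, subject to the growth-stage span constraint |sjd_bz − sjd_dq| ≤ Δsjd with Δsjd = 1 — can be sketched as follows. The similarity values are taken as given (the formula above survives only as an image); the tuple layout and default threshold are assumptions:

```python
def pick_optimal_standard(items, stage_dq, span=1, threshold=0.8):
    """items: list of (similarity, sjd_bz, payload) triples for each
    sub-standard data item. Returns the payload of the most similar
    eligible item, or None when nothing passes the filters."""
    eligible = [(s, p) for s, sjd_bz, p in items
                if abs(sjd_bz - stage_dq) <= span and s >= threshold]
    if not eligible:
        return None
    return max(eligible)[1]  # largest similarity wins
```
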
Further, the expression of the regional soil property comprehensive value is as follows:
(The expression appears only as an image in the original; its variables are defined below.)
wherein N is the regional soil property comprehensive value (N = N_dq when computed for the current data and N = N_bz when computed for the sub-standard data); τ is the total number of sampling points; y denotes a nutrient element in the soil; the concentration term is the concentration of nutrient element y dissolved in the solution at the α-th sampling point; V is the volume of the solution; G_α is the weight of the soil at the α-th sampling point.
S2402: Perform feature extraction on the optimal standard comparison data to obtain standard comparison features.
Specifically, feature extraction is performed on the optimal standard comparison data to obtain standard comparison features. The standard comparison features are used for expressing color, texture, shape and spatial relationship in the image, and can be color features, texture features, shape features, spatial relationship features and the like.
S2403: Perform feature extraction on the data to be compared to obtain the features to be compared.
Specifically, feature extraction is performed on the data to be compared to obtain features to be compared. The features to be compared are used for expressing color, texture, shape and spatial relationship in the image, and can be color features, texture features, shape features, spatial relationship features and the like.
S2404: Compare the features to be compared with the standard comparison features to obtain a comparison result.
Specifically, the features to be compared are compared with the standard comparison features. If the resulting similarity is greater than or equal to the preset feature similarity threshold, the generated comparison result is: version match; if it is smaller than the preset feature similarity threshold, the generated comparison result is: no version match.
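A sketch of the thresholding in S2404. The patent does not name a similarity measure; cosine similarity is used here purely as an illustrative stand-in:

```python
import math

def compare_features(f_cmp, f_std, threshold=0.9):
    """S2404: similarity >= threshold -> the version matches."""
    dot = sum(a * b for a, b in zip(f_cmp, f_std))
    norm = (math.sqrt(sum(a * a for a in f_cmp))
            * math.sqrt(sum(b * b for b in f_std)))
    similarity = dot / norm
    return "version match" if similarity >= threshold else "no version match"
```
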
S2405: If the comparison result is no version match, determine the optimal standard comparison data corresponding to the initial data and compare the initial data with it to obtain a new comparison result.
Specifically, if comparing the standard comparison data with the data to be compared yields no version match, the optimal standard comparison data corresponding to the initial data is determined, and the initial data is compared with it to obtain a new comparison result. If the new comparison result is also no version match, the article variety sent by the merchant is wrong; if the new comparison result is a version match, the variety is correct, and the current difference is caused by growth-environment factors at acquisition time.
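The fallback logic of S2405 amounts to a small decision table (the outcome strings are paraphrases of the text):

```python
def diagnose(first_match: bool, initial_match: bool) -> str:
    """S2405: when the optimized current data does not match, re-check the
    merchant's initial data against its own optimal standard data."""
    if first_match:
        return "version match"
    if initial_match:
        return "variety correct; difference due to growth environment"
    return "merchant sent the wrong variety"
```
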
Further, as an embodiment, when the comparison model is a non-living body comparison model, the comparison model is used to process the standard comparison data and the data to be compared, and the sub-step of obtaining the comparison result is as follows:
S2401': Perform overall analysis on the data to be compared using the initial data to obtain an overall analysis result.
Specifically, overall analysis is performed on the data to be compared using the non-living characteristic attributes in the initial data; the overall analysis includes sub-analyses of style, color, size, and the like. If one or more of these sub-analyses show a deviation larger than the preset overall deviation value, the generated overall analysis result is: error present, and S2403' is executed. If none of the sub-analyses show a deviation larger than the preset overall deviation value, the generated overall analysis result is: no error, and S2402' is executed.
S2402': Perform local analysis on the data to be compared using the standard comparison data to obtain a local analysis result.
Specifically, the sub-steps of performing local analysis on the data to be compared by using the standard comparison data to obtain a local analysis result are as follows:
S24021': Acquire the standard comparison local area of the standard comparison data and extract its features as the standard comparison features.
Specifically, a region of the standard comparison data made of the same material as the data to be compared is taken as the standard comparison local area. Features are extracted from this area and used as the standard comparison features, which express the image characteristics of the material; they may be, but are not limited to, texture features.
S24022': Acquire the local area to be compared of the data to be compared and extract its features as the features to be compared.
Specifically, the area of the data to be compared that requires material comparison is taken as the local area to be compared. Features are extracted from this area and used as the features to be compared, which express the image characteristics of the material; they may be, but are not limited to, texture features.
S24023': Compare the standard comparison feature with the feature to be compared to generate a local analysis result.
Specifically, the standard comparison feature is compared with the feature to be compared. If the deviation between them is greater than the preset local deviation value, the generated local analysis result is: error present; if it is less than or equal to the preset local deviation value, the generated local analysis result is: no error.
S2403': Generate a comparison result from the overall analysis result or the local analysis result.
Specifically, if the overall analysis result or the local analysis result contains one or more errors, the generated comparison result is: no version match. If both the overall analysis result and the local analysis result are error-free, the generated comparison result is: version match.
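The combination rule of S2403' is a simple disjunction (the labels are paraphrases of the text):

```python
def final_result(overall_error: bool, local_error: bool) -> str:
    """S2403': any error in the overall or local analysis -> no version match."""
    return "no version match" if (overall_error or local_error) else "version match"
```
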
The technical effect of the method and device is that it becomes convenient to determine whether a received article is the correct version of the purchased article, effectively improving comparison accuracy.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, the scope of protection of this application is intended to be construed as including the preferred embodiments and all variations and modifications that fall within the scope of the application. It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (7)

1. A comparison system, comprising: at least one merchant terminal, at least one client terminal, and a comparison server;
wherein the merchant terminal is configured to receive and execute an initial data acquisition instruction and feed back the acquired initial data of the object to be compared to the comparison server;
the client terminal is configured to receive and execute a current data acquisition instruction and feed back the acquired current data of the object to be compared to the comparison server;
the comparison server is used for executing the following steps:
acquiring initial data of the object to be compared, wherein the initial data at least comprises: a name, an initial image, an article attribute, and initial basic information;
analyzing the initial data, and determining a comparison model and standard comparison data, wherein the comparison model at least comprises: a living body comparison model and a non-living body comparison model;
acquiring current data and standard comparison data of an object to be compared, and optimizing the current data by using the standard comparison data to obtain the data to be compared;
processing the standard comparison data and the data to be compared by using a comparison model to obtain a comparison result;
when the comparison model is a living body comparison model, the comparison model is used for processing the standard comparison data and the data to be compared, and the substep of obtaining the comparison result is as follows:
screening all sub-standard data in the standard comparison data by using the current basic information of the current data to determine the optimal standard comparison data;
performing feature extraction on the optimal standard comparison data to obtain standard comparison features;
carrying out feature extraction on the data to be compared to obtain features to be compared;
comparing the features to be compared with the standard comparison features to obtain comparison results;
analyzing all sub-standard data in the standard comparison data according to the current location information, current time information, and current climate information in the current basic information to obtain an external factor similarity value for each sub-standard data; and taking the sub-standard data whose external factor similarity value is the largest of all values and is at least the preset external factor similarity threshold as the optimal standard comparison data;
wherein, the calculation formula of the external factor similarity value is as follows:
(The formula appears only as an image in the original; the external factor similarity value is computed from the quantities defined below.)
wherein: μ0 is the importance parameter of the distance between the regions; μ1 is the importance parameter of the regional soil property; μ2 is the importance parameter of the growth time; μ3 is the importance parameter of the growth stage of the living body; Jd_bz is the longitude of the living-body growth site in the sub-standard data; Jd_dq is the longitude of the living-body growth site in the current data; Wd_bz is the latitude of the living-body growth site in the sub-standard data; Wd_dq is the latitude of the living-body growth site in the current data; gc_bz is the elevation of the living-body growth site in the sub-standard data; gc_dq is the elevation of the living-body growth site in the current data; Δdqy is the allowable spacing value for the distance between the regions; N_bz is the regional soil property comprehensive value of the sub-standard data; N_dq is the regional soil property comprehensive value of the current data; Δntz is the allowable difference value of soil texture; t_bz is the growth time of the living body in the sub-standard data; t_dq is the growth time of the living body in the current data; Δt is the allowable error value of the growth time; sjd_bz is the number of the growth stage of the living body in the sub-standard data; sjd_dq is the number of the growth stage of the living body in the current data; Δsjd is the allowable growth-stage span value.
2. The comparison system of claim 1, wherein the comparison server comprises at least: a data acquisition unit and a data processing unit;
wherein the data acquisition unit is configured to: receive a comparison request from the client terminal, issue an initial data acquisition instruction to the merchant terminal, receive the initial data of the object to be compared fed back by the merchant terminal, and send the initial data to the data processing unit; and issue a current data acquisition instruction to the client terminal, receive the current data of the object to be compared fed back by the client terminal, and send the current data to the data processing unit;
and the data processing unit is configured to process the initial data and the current data to generate a comparison result.
3. The comparison system according to claim 2, wherein the data acquisition unit comprises at least a first acquisition subunit, a second acquisition subunit, and a third acquisition subunit;
wherein the first acquisition subunit is configured to issue an initial data acquisition instruction to the merchant terminal to acquire the initial data of the object to be compared;
the second acquisition subunit is configured to issue a current data acquisition instruction to the client terminal to acquire the current data of the object to be compared, the current data at least comprising: a current image and current basic information;
and the third acquisition subunit is configured to receive and execute a standard comparison data acquisition instruction issued by the analysis subunit and send the acquired standard comparison data to the optimization subunit.
4. The comparison system of claim 2, wherein the data processing unit comprises: an analysis subunit, a comparison subunit, and an optimization subunit;
wherein the analysis subunit is configured to analyze the initial data, determine the comparison model, and issue a model calling instruction;
the comparison subunit is configured to: receive the model calling instruction and activate the required comparison model according to it; receive the standard comparison data; after activating the required comparison model, issue a data acquisition instruction to the data acquisition unit and receive the data to be compared; and process the data to be compared with the comparison model and the standard comparison data to generate a comparison result;
and the optimization subunit is configured to: receive the current data fed back after the data acquisition unit executes the data acquisition instruction; receive the standard comparison data; optimize the current data using the standard comparison data to obtain the data to be compared; and upload the data to be compared to the comparison subunit.
5. A comparison method, comprising the steps of:
acquiring initial data of the object to be compared, wherein the initial data at least comprises: a name, an initial image, an article attribute, and initial basic information;
analyzing the initial data, and determining a comparison model and standard comparison data, wherein the comparison model at least comprises: a living body comparison model and a non-living body comparison model;
acquiring current data and standard comparison data of an object to be compared, and optimizing the current data by using the standard comparison data to obtain the data to be compared;
processing the standard comparison data and the data to be compared by using a comparison model to obtain a comparison result;
when the comparison model is a living body comparison model, the comparison model is used for processing the standard comparison data and the data to be compared, and the sub-step of obtaining the comparison result is as follows:
screening all sub-standard data in the standard comparison data by using the current basic information of the current data to determine the optimal standard comparison data;
performing feature extraction on the optimal standard comparison data to obtain standard comparison features;
carrying out feature extraction on the data to be compared to obtain features to be compared;
comparing the features to be compared with the standard comparison features to obtain comparison results;
analyzing all sub-standard data in the standard comparison data according to the current location information, current time information, and current climate information in the current basic information to obtain an external factor similarity value for each sub-standard data; and taking the sub-standard data whose external factor similarity value is the largest of all values and is at least the preset external factor similarity threshold as the optimal standard comparison data;
wherein, the calculation formula of the external factor similarity value is as follows:
(The formula appears only as an image in the original; the external factor similarity value is computed from the quantities defined below.)
wherein: μ0 is the importance parameter of the distance between the regions; μ1 is the importance parameter of the regional soil property; μ2 is the importance parameter of the growth time; μ3 is the importance parameter of the growth stage of the living body; Jd_bz is the longitude of the living-body growth site in the sub-standard data; Jd_dq is the longitude of the living-body growth site in the current data; Wd_bz is the latitude of the living-body growth site in the sub-standard data; Wd_dq is the latitude of the living-body growth site in the current data; gc_bz is the elevation of the living-body growth site in the sub-standard data; gc_dq is the elevation of the living-body growth site in the current data; Δdqy is the allowable spacing value for the distance between the regions; N_bz is the regional soil property comprehensive value of the sub-standard data; N_dq is the regional soil property comprehensive value of the current data; Δntz is the allowable difference value of soil texture; t_bz is the growth time of the living body in the sub-standard data; t_dq is the growth time of the living body in the current data; Δt is the allowable error value of the growth time; sjd_bz is the number of the growth stage of the living body in the sub-standard data; sjd_dq is the number of the growth stage of the living body in the current data; Δsjd is the allowable growth-stage span value.
6. The comparison method of claim 5, wherein the initial data is analyzed and the standard comparison data is determined by the following sub-steps:
reading the name and the article attribute of the initial data;
and determining the standard comparison data to be acquired according to the name and the characteristic attribute in the article attribute, and issuing an instruction to acquire the standard comparison data.
7. The comparison method according to claim 5, wherein the sub-step of obtaining the data to be compared by optimizing the current data using the standard comparison data is as follows:
receiving current data and standard comparison data of an object to be compared;
analyzing the current basic information of the current data by using the standard comparison data to generate an adjustment analysis result;
and optimizing the current image according to the adjustment analysis result to obtain the data to be compared.
CN202110565254.8A 2021-05-24 2021-05-24 Comparison method and system Active CN113269245B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110565254.8A CN113269245B (en) 2021-05-24 2021-05-24 Comparison method and system


Publications (2)

Publication Number Publication Date
CN113269245A CN113269245A (en) 2021-08-17
CN113269245B true CN113269245B (en) 2022-09-02

Family

ID=77232416



Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106570703A (en) * 2015-10-09 2017-04-19 阿里巴巴集团控股有限公司 Commodity object identification information processing method and device

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180006691A (en) * 2016-07-11 2018-01-19 주식회사 스토어카메라 Method and program for taking the image to upload on online-mall
CN109902737A (en) * 2019-02-25 2019-06-18 厦门商集网络科技有限责任公司 A kind of bill classification method and terminal
CN110738640B (en) * 2019-09-29 2022-11-18 万翼科技有限公司 Spatial data comparison method and related product
CN112418009B (en) * 2020-11-06 2024-03-22 中保车服科技服务股份有限公司 Image quality detection method, terminal equipment and storage medium



Similar Documents

Publication Publication Date Title
US11816120B2 (en) Extracting seasonal, level, and spike components from a time series of metrics data
CN107153971B (en) Method and device for identifying equipment cheating in APP popularization
US7437308B2 (en) Methods for estimating the seasonality of groups of similar items of commerce data sets based on historical sales date values and associated error information
CN105512949A (en) Culturing farm informationized management system based on social platform and method thereof
Anguraj et al. Crop recommendation on analyzing soil using machine learning
Liseune et al. Leveraging latent representations for milk yield prediction and interpolation using deep learning
Santa-Catarina et al. Image-based phenotyping of morpho-agronomic traits in papaya fruits (Carica papaya L. THB var.)
CN113269245B (en) Comparison method and system
CN110688513A (en) Crop survey method and device based on video and computer equipment
CN109190663A (en) A method of the identification cigarette case product rule based on depth learning technology
Xu et al. An automatic wheat ear counting model based on the minimum area intersection ratio algorithm and transfer learning
CN112966486A (en) Intelligent engineering quantity list generation method and device, terminal and storage medium
CN116168275A (en) Lightweight dual-attention mechanism identification method based on feature grouping and channel replacement
TWI804090B (en) Learning system, learning method and program product
Oury et al. Earbox, an open tool for high-throughput measurement of the spatial organization of maize ears and inference of novel traits
Diepeveen et al. Identifying key crop performance traits using data mining
Strunk et al. Stand validation of lidar forest inventory modeling for a managed southern pine forest
CN113688690A (en) Large-scale fruit counting method and system
CN114066132A (en) Quality evaluation method and evaluation device for agricultural products
Kumari et al. CertiMart: Use Computer Vision to Digitize and Automate Supermarket with Fruit Quality Measuring and Maintaining
KR20200041533A (en) Real-time breeding value and genetic parameter evaluation system
Miranda et al. Assessing automatic data processing algorithms for RGB-D cameras to predict fruit size and weight in apples
CN115033763B (en) Big data based storage method and system thereof
Dymond et al. Use of VEGETATION satellite imagery to map pasture quality for input to a methane budget of New Zealand
Sukati et al. Supply response of milk producers to economic and non-economic factors in Swaziland

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant