CN112541381A - Robot-based commodity purchasing method and robot - Google Patents


Info

Publication number
CN112541381A
CN112541381A (application CN202010278703.6A)
Authority
CN
China
Prior art keywords
commodity
information
robot
target
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010278703.6A
Other languages
Chinese (zh)
Inventor
顾震江
刘大志
孙其民
罗沛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Uditech Co Ltd
Original Assignee
Uditech Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Uditech Co Ltd
Priority to CN202010278703.6A
Publication of CN112541381A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00 Manipulators not otherwise provided for
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N3/00 Investigating strength properties of solid materials by application of mechanical stress
    • G01N3/40 Investigating hardness or rebound hardness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/02 Food
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G06Q10/0639 Performance analysis of employees; Performance analysis of enterprise or organisation operations
    • G06Q10/06395 Quality analysis or management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/018 Certifying business or products

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Resources & Organizations (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Development Economics (AREA)
  • Mechanical Engineering (AREA)
  • Robotics (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Marketing (AREA)
  • Food Science & Technology (AREA)
  • Educational Administration (AREA)
  • Data Mining & Analysis (AREA)
  • General Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Game Theory and Decision Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Accounting & Taxation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Finance (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • Artificial Intelligence (AREA)
  • Medicinal Chemistry (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)

Abstract

The application is applicable to the technical field of robots and provides a robot-based commodity purchasing method comprising the following steps: if a purchase instruction is detected, determining the target commodity indicated by the instruction; acquiring commodity information of the target commodity; determining a quality detection result for the target commodity based on a preset detection rule and the commodity information; and screening out, from the target commodities, those whose quality detection results meet a preset quality standard, and purchasing the screened commodities. In this method, the robot detects the purchase instruction, acquires the commodity information of the indicated target commodity, performs quality detection based on that information, and purchases only target commodities whose quality detection result is qualified. When a user purchases goods through the robot, the quality of the goods can therefore be judged and high-quality target commodities purchased, improving the user's shopping experience.

Description

Robot-based commodity purchasing method and robot
Technical Field
The application belongs to the technical field of robots, and particularly relates to a commodity purchasing method based on a robot and the robot.
Background
With the development of technology, robots can assist with work in many fields. Existing robot technology can help a user shop: the user remotely controls the robot to move and captures images of goods through it, and can thus purchase goods remotely. However, the existing method of purchasing goods through a robot only selects the type of goods and cannot judge their quality. The user may therefore buy goods of poor quality, which harms the shopping experience.
Disclosure of Invention
The embodiments of the application provide a robot-based commodity purchasing method and a robot, which address the problem that when a user purchases goods through a robot, the quality of the goods cannot be judged, so the user may buy poor-quality goods and have a poor shopping experience.
In a first aspect, an embodiment of the present application provides a commodity purchasing method based on a robot, which is applied to the robot, and the method includes:
if a purchasing instruction is detected, determining a target commodity pointed by the purchasing instruction;
acquiring commodity information of the target commodity;
determining a quality detection result of the target commodity based on a preset detection rule and the commodity information;
and screening the commodities of which the quality detection results meet the preset quality standard from the target commodities, and purchasing the screened commodities.
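The four steps above can be sketched as a single screening function. This is a minimal illustration, not the patented implementation: the function and field names (`purchase`, `detect_quality`, `"commodity"`) and the numeric quality standard are all assumptions.

```python
def purchase(purchase_instruction, inventory, detect_quality, quality_standard):
    """Screen the instructed goods by quality and return those worth buying."""
    # Step 1: determine the target commodity indicated by the instruction.
    target_name = purchase_instruction["commodity"]
    candidates = [g for g in inventory if g["name"] == target_name]
    # Steps 2-3: acquire commodity information and run quality detection on it.
    results = [(g, detect_quality(g)) for g in candidates]
    # Step 4: keep only goods whose detection result meets the preset standard.
    return [g for g, score in results if score >= quality_standard]
```

The later refinements (freshness, odor, hardness, pesticide residue) are different choices of `detect_quality`.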
Further, the target commodity is a fresh commodity, and the commodity information comprises a commodity image of the fresh commodity;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
inputting the commodity image into a pre-trained freshness identification model for processing, and identifying the freshness information of the fresh commodity;
and determining the quality detection result of the fresh commodity based on the freshness information.
Further, the training process of the freshness identification model comprises the following steps:
acquiring two-dimensional image samples, three-dimensional characteristic information and freshness labels of a plurality of preset commodities to obtain a sample data set consisting of the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities;
training a preset initial model based on the sample data set until a preset convergence condition is met, and finishing the training to obtain the trained freshness identification model.
Further, the target commodity is a fresh commodity, and the commodity information comprises smell information and surface texture information of the fresh commodity;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
determining rot degree information of the fresh commodity based on preset smell of the fresh commodity and the smell information;
determining a water content of the fresh good based on the surface texture information;
determining the quality detection result of the fresh commodity based on the rot degree information and the water content.
Further, the determining the moisture content of the fresh good based on the surface texture information includes:
and acquiring wrinkle texture information in the surface texture information, and determining the water content of the fresh commodity based on the ratio between the wrinkle texture information and the surface texture information.
Further, the commodity information includes a hardness value of the target commodity;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
and when the hardness value is greater than or equal to a preset hardness threshold value, judging that the quality detection result of the target commodity is qualified.
Further, the target commodity is a fruit and vegetable commodity, and the commodity information comprises a sample mass spectrogram of the fruit and vegetable commodity;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
acquiring difference information between a preset pesticide standard mass spectrogram and the sample mass spectrogram;
determining the pesticide residue of the fruit and vegetable commodity based on the difference information;
and determining the quality detection result of the target commodity based on the pesticide residue.
Further, after the determining the quality detection result of the target product based on the preset detection rule and the product information, the method further includes:
and sending the quality detection result to a user terminal.
In a second aspect, an embodiment of the present application provides a robot, including:
the first processing unit is used for determining a target commodity pointed by a purchasing instruction when the purchasing instruction is detected;
a first acquisition unit configured to acquire commodity information of the target commodity;
the first determining unit is used for determining the quality detection result of the target commodity based on a preset detection rule and the commodity information;
and the second processing unit is used for screening out the commodities of which the quality detection results meet the preset quality standard from the target commodities and purchasing the screened commodities.
Further, the target commodity is a fresh commodity, and the commodity information comprises a commodity image of the fresh commodity;
the first determining unit is specifically configured to:
inputting the commodity image into a pre-trained freshness identification model for processing, and identifying the freshness information of the fresh commodity;
and determining the quality detection result of the fresh commodity based on the freshness information.
In one aspect, the robot further includes:
the second acquisition unit is used for acquiring two-dimensional image samples, three-dimensional characteristic information and freshness labels of a plurality of preset commodities to obtain a sample data set consisting of the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities;
and the training unit is used for training a preset initial model based on the sample data set until a preset convergence condition is met, finishing the training and obtaining the trained freshness identification model.
Further, the target commodity is a fresh commodity, and the commodity information comprises smell information and surface texture information of the fresh commodity;
the first determination unit includes:
the second determining unit is used for determining the rotting degree information of the fresh commodity based on the preset smell of the fresh commodity and the smell information;
a third determining unit for determining the water content of the fresh commodity based on the surface texture information;
a fourth determination unit for determining the quality detection result of the fresh commodity based on the rotten degree information and the water content.
Further, the third determining unit is specifically configured to:
and acquiring wrinkle texture information in the surface texture information, and determining the water content of the fresh commodity based on the ratio between the wrinkle texture information and the surface texture information.
Further, the commodity information includes a hardness value of the target commodity;
the first determining unit is specifically configured to:
and when the hardness value is greater than or equal to a preset hardness threshold value, judging that the quality detection result of the target commodity is qualified.
Further, the target commodity is a fruit and vegetable commodity, and the commodity information comprises a sample mass spectrogram of the fruit and vegetable commodity;
the first determining unit is specifically configured to:
acquiring difference information between a preset pesticide standard mass spectrogram and the sample mass spectrogram;
determining the pesticide residue of the fruit and vegetable commodity based on the difference information;
and determining the quality detection result of the target commodity based on the pesticide residue.
In a third aspect, the present application provides a robot, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the robot-based commodity purchasing method as described in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the robot-based commodity purchasing method according to the first aspect.
In the embodiment of the application, if a purchase instruction is detected, the target commodity indicated by the instruction is determined; commodity information of the target commodity is acquired; a quality detection result of the target commodity is determined based on a preset detection rule and the commodity information; and the commodities whose quality detection results meet the preset quality standard are screened out from the target commodities and purchased. In this way, the robot detects the purchase instruction, acquires the commodity information of the indicated target commodity, performs quality detection based on that information, and purchases only target commodities whose detection result is qualified. When a user purchases goods through the robot, their quality can therefore be judged, high-quality target commodities are purchased, and the user's shopping experience is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those of ordinary skill in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic flowchart of a robot-based commodity purchasing method according to a first embodiment of the present application;
FIG. 2 is a schematic flowchart of a refinement of step S103 (S1031 to S1032) in the robot-based commodity purchasing method according to the first embodiment of the present application;
FIG. 3 is a schematic flowchart of the training process (S10311 to S10312) of the freshness identification model used in S1031, in the robot-based commodity purchasing method according to the first embodiment of the present application;
FIG. 4 is a schematic flowchart of another refinement of step S103 (S1033 to S1035) in the robot-based commodity purchasing method according to the first embodiment of the present application;
FIG. 5 is a schematic flowchart of a further refinement of step S103 in the robot-based commodity purchasing method according to the first embodiment of the present application;
FIG. 6 is a schematic view of a robot provided in a second embodiment of the present application;
FIG. 7 is a schematic view of a robot provided in a third embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrase "if it is determined" or "if [a described condition or event] is detected" may be interpreted, depending on the context, to mean "upon determining", "in response to determining", "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance.
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
Referring to fig. 1, fig. 1 is a schematic flow chart of a robot-based merchandise purchasing method according to a first embodiment of the present application. In this embodiment, an execution subject of the robot-based commodity purchasing method is a robot having a purchasing function. The robot-based merchandise procurement method as shown in fig. 1 may include:
s101: and if the purchasing instruction is detected, determining the target commodity pointed by the purchasing instruction.
In this embodiment, the robot has a purchasing function. The robot is in communication connection with a user terminal, and the user terminal can send a purchase instruction to the robot. And the robot receives a purchasing instruction sent by the user terminal and is triggered to execute the commodity purchasing method in the embodiment.
The robot detects a purchase instruction, i.e. an instruction that triggers the robot to make a purchase; the instruction may include information representing the user's needs, such as the name of the goods to be purchased and their location. The purchase instruction may be sent by the user terminal: the user enters the name of the goods to be purchased on the terminal and taps a confirm-purchase button, which generates the instruction. For example, a user who wants to buy an apple enters "apple" on the terminal and taps the confirm-purchase button to trigger generation of the purchase instruction, whereupon the user terminal sends the purchase instruction to the robot. The purchase instruction may also be sent via a server, which is not limited here.
The purchase instruction includes the commodity identification information of the target commodity to be purchased. The target commodity is the commodity pointed by the purchasing instruction, namely the commodity to be purchased. The robot obtains the target commodity pointed by the purchasing instruction based on the commodity identification information. In addition, the purchasing instruction may also include location information of the target product to be purchased in a shopping mall or a supermarket, and the like, which is not limited herein.
S102: and acquiring commodity information of the target commodity.
The robot acquires commodity information of the target commodity, that is, information related to the target commodity from which the quality of the target commodity can be judged.
For example, the merchandise information may include an image of the target merchandise. The robot can acquire the image of the target commodity through the image acquisition device of the robot, and the image can be a single-frame image or video information. The robot can judge whether the target commodity package is intact or not, whether damage exists or not and the like through the image of the target commodity, and therefore the quality of the target commodity is judged.
The information of the goods acquired by the robot may be different for different kinds of target goods. For example, when the target commodity is a vegetable, the commodity information may include an image of the target commodity for determining whether the vegetable is fresh, thereby determining the quality of the target commodity; when the target commodity is a fresh meat commodity, the commodity information may include odor information of the target commodity, and is used to determine whether the fresh meat commodity is fresh, thereby determining the quality of the target commodity.
The commodity information may also include odor information, humidity information, hardness information, temperature information, etc. of the target commodity, which is not limited herein.
S103: and determining the quality detection result of the target commodity based on the preset detection rule and the commodity information.
A preset detection rule is stored in the robot in advance and is used to determine the quality detection result of the target commodity from the commodity information. The robot determines the quality detection result of the target commodity based on the preset detection rule and the commodity information. The quality detection result identifies the quality of the target commodity and may be a quality score or a quality grade; for example, a quality detection result may simply be "qualified" or "unqualified".
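Since the quality detection result may be a score, a grade, or a simple pass/fail, a preset detection rule could reduce to a mapping like the following; the thresholds and grade labels are invented for illustration, as the patent leaves the concrete rule open.

```python
def quality_result(score: float) -> str:
    """Map a quality score in [0, 100] to a detection result.

    Illustrative thresholds standing in for the unspecified preset rule.
    """
    if score >= 90:
        return "grade A"
    if score >= 60:
        return "qualified"
    return "unqualified"
```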
Further, when the target product is a fresh product, the product information includes a product image of the fresh product, and in order to accurately obtain a quality detection result of the fresh product, so that the user can purchase a good-quality product through the robot, S103 may include S1031 to S1032, as shown in fig. 2, S1031 to S1032 are specifically as follows:
s1031: and inputting the commodity image into a pre-trained freshness identification model for processing, and identifying the freshness information of the fresh commodity.
In this embodiment, the target commodity is a fresh commodity, and the commodity information includes a commodity image of the fresh commodity. A pre-trained freshness identification model is stored on the robot in advance. During training, the model takes the sample image information of a training sample as input, is supervised by the freshness label corresponding to that sample image information, and outputs the freshness information corresponding to the image. The freshness information identifies the freshness of the fresh commodity and may be either a score or a grade.
It can be understood that the freshness identification model may be trained in advance by the robot itself, or trained in advance on other equipment and the corresponding model file then transplanted to the robot. Specifically, after the deep learning network has been trained, its model parameters are frozen, and the freshness identification model file corresponding to the frozen network is transplanted to the robot.
The robot acquires the commodity image, inputs the commodity image of the fresh commodity into a pre-trained freshness identification model for processing such as feature extraction, feature analysis and feature identification, and obtains the freshness information of the fresh commodity.
Further, the training process of the freshness identification model may include S10311 to S10312 as shown in fig. 3, where S10311 to S10312 are specifically as follows:
s10311: the method comprises the steps of obtaining two-dimensional image samples, three-dimensional characteristic information and freshness labels of a plurality of preset commodities, and obtaining a sample data set formed by the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities.
In this implementation, the robot obtains two-dimensional image samples, three-dimensional characteristic information and freshness labels of a plurality of preset commodities. A two-dimensional image sample is a two-dimensional view of a preset commodity, for example an RGB color image, and contains the two-dimensional characteristics of the commodity. During training, the robot may obtain an initial image of a preset commodity and input it into a preset two-dimensional image screening model for processing, thereby obtaining the two-dimensional image sample.
To obtain the three-dimensional characteristic information and the freshness label of a preset commodity, the robot acquires an initial color image of the commodity, segments the initial color image to obtain the target image area corresponding to the commodity, converts the target image area to grayscale to obtain its grayscale map, and derives the three-dimensional characteristic information of the preset commodity from the grayscale map of the target image area.
The robot generates a sample data set based on the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities, wherein the sample data set is composed of the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities.
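The sample-assembly step above can be sketched as follows. The grayscale conversion stands in for the described extraction of three-dimensional characteristic information from the target image area; the luminance weights are the standard Rec. 601 coefficients but are illustrative here, as are the data structures and function names.

```python
def three_d_features(color_region):
    """Grayscale an RGB target image area (a list of rows of (r, g, b)
    tuples); the resulting intensity map is used as a coarse shape cue."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in color_region]

def build_sample(image_2d, color_region, freshness_label):
    """Bundle one preset commodity into a sample-data-set entry: its
    two-dimensional image sample, three-dimensional characteristic
    information, and freshness label."""
    return {"image_2d": image_2d,
            "features_3d": three_d_features(color_region),
            "freshness_label": freshness_label}
```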
S10312: and training the preset initial model based on the sample data set until a preset convergence condition is met, and finishing the training to obtain a trained freshness identification model.
The robot trains the preset initial model based on the sample data set until the preset convergence condition is met, at which point training is complete and the trained freshness identification model is obtained. During training, the input of the initial model is a two-dimensional image sample and its three-dimensional characteristic information from the sample data set, the supervision label is the corresponding freshness label, and the output is freshness information. The robot inputs the two-dimensional image sample and the three-dimensional characteristic information into the initial model for processing, recognizes the corresponding freshness information, compares the obtained freshness information with the corresponding freshness label, and adjusts the initial model according to the comparison result, repeating until the preset convergence condition is met and the trained freshness identification model is obtained.
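A toy version of this compare-and-adjust loop is given below, assuming a one-parameter linear model so the "adjustment" step can be written concretely; the patent does not fix the model architecture, the optimizer, or the convergence condition, so all of those choices are assumptions.

```python
def train_until_converged(dataset, step=0.1, tol=1e-4, max_epochs=1000):
    """Fit w in predict(x) = w * x to (feature, freshness) pairs, stopping
    when the preset convergence condition (loss change below tol) is met.

    The linear model and gradient step stand in for the unspecified
    initial model and adjustment procedure.
    """
    w, prev_loss = 0.0, float("inf")
    for _ in range(max_epochs):
        # Compare model output with the freshness labels (mean squared error).
        loss = sum((w * x - y) ** 2 for x, y in dataset) / len(dataset)
        if abs(prev_loss - loss) < tol:   # preset convergence condition
            break
        # Adjust the model according to the comparison result.
        grad = sum(2 * (w * x - y) * x for x, y in dataset) / len(dataset)
        w -= step * grad
        prev_loss = loss
    return w
```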
S1032: and determining the quality detection result of the fresh commodity based on the freshness information.
The robot determines the quality detection result of the fresh commodity based on the freshness information. Specifically, a preset quality detection condition can be set in the robot: when the freshness information meets the preset quality detection condition, the quality detection result of the fresh commodity is qualified; when it does not, the quality detection result of the fresh commodity is unqualified.
Further, when the target commodity is a fresh commodity, the commodity information includes odor information and surface texture information of the fresh commodity. In order to accurately obtain the quality detection result of the fresh commodity, so that the user can purchase a good-quality commodity through the robot, S103 may include S1033 to S1035; as shown in fig. 4, S1033 to S1035 are specifically as follows:
S1033: and determining the rot degree information of the fresh commodity based on the preset smell of the fresh commodity and the odor information.
In this embodiment, the target commodity is a fresh commodity, and the commodity information includes odor information and surface texture information of the fresh commodity, where the odor information can be acquired by an odor sensor mounted on the robot and the surface texture information can be extracted from an image of the fresh commodity. The robot stores a preset smell in advance, acquires the odor information of the fresh commodity, and compares the preset smell with the acquired odor information to obtain a difference value between them; the rot degree information of the fresh commodity can then be determined from the difference value and the corresponding preset rot degree value.
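As a sketch of this comparison step, the odor difference can be mapped to a rot-degree level through a preset table. The function name `rot_degree`, the normalized scalar odor readings and the threshold values are all illustrative assumptions, not values from the patent.

```python
def rot_degree(preset_odor, measured_odor,
               thresholds=((0.1, "fresh"),
                           (0.3, "slightly rotten"),
                           (0.6, "rotten"))):
    """Return a rot-degree label for the difference between the preset odor
    reading and the measured odor reading (both assumed normalized scalars)."""
    diff = abs(measured_odor - preset_odor)
    for limit, label in thresholds:             # preset rot degree values
        if diff <= limit:
            return label
    return "severely rotten"
```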
S1034: the moisture content of the fresh good is determined based on the surface texture information.
The robot can acquire image information of the target commodity; the surface texture information can be extracted from the image of the fresh commodity, or acquired through a sensor on the robot's manipulator. By analyzing the surface texture information, the water content of the fresh commodity can be determined. The most intuitive analysis is by texture density: the denser the surface texture distribution, the lower the water content of the fresh commodity; the sparser the distribution, the higher the water content.
Further, in order to obtain the water content of the fresh commodity more accurately, so that the user can purchase good-quality commodities through the robot, S1034 may include: acquiring wrinkle texture information in the surface texture information, and determining the water content of the fresh commodity based on the ratio between the wrinkle texture information and the surface texture information.
The robot obtains the surface texture information and extracts wrinkle texture information from it, where the wrinkle texture information describes the lines formed on the fresh commodity due to water loss. The robot can set a judgment condition and, when part of the surface texture information satisfies the condition, identify it as wrinkle texture information. The robot then calculates the ratio between the wrinkle texture information and the surface texture information; the larger the ratio, the more wrinkle lines there are and the more dehydrated the fresh commodity is. Based on the correspondence between preset ratios and preset water contents, the water content of the fresh commodity can be determined.
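The ratio-based determination can be sketched as below; the mapping table from wrinkle ratio to water content, and the name `estimate_water_content`, are illustrative assumptions. A higher wrinkle ratio yields a lower estimated water content, matching the description above.

```python
def estimate_water_content(wrinkle_pixels, texture_pixels,
                           ratio_to_water=((0.1, 0.9),   # few wrinkles -> high water
                                           (0.3, 0.6),
                                           (0.6, 0.3))):
    """Map the wrinkle/texture ratio to a water-content estimate in [0, 1].

    wrinkle_pixels: amount of texture classified as wrinkle lines
    texture_pixels: total amount of surface texture detected
    """
    ratio = wrinkle_pixels / texture_pixels
    for max_ratio, water in ratio_to_water:     # preset ratio -> preset water content
        if ratio <= max_ratio:
            return water
    return 0.1                                  # heavily wrinkled: severely dehydrated
```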
S1035: and determining the quality detection result of the fresh commodity based on the rot degree information and the water content.
The robot determines the quality detection result of the fresh commodity based on the rot degree information and the water content. Specifically, a preset quality detection condition can be set in the robot, and the quality is judged comprehensively from the rot degree information and the water content: when both meet the preset quality detection condition, the quality detection result of the fresh commodity is qualified; otherwise, the quality detection result of the fresh commodity is unqualified.
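A minimal sketch of this comprehensive judgment, assuming numeric rot-degree and water-content values and hypothetical threshold parameters:

```python
def fresh_quality_qualified(rot_level, water_content,
                            max_rot=0.3, min_water=0.5):
    """Qualified only when both the rot degree and the water content meet the
    preset quality detection condition (threshold values are assumptions)."""
    return rot_level <= max_rot and water_content >= min_water
```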
Further, when the commodity information includes a hardness value of the target commodity, in order to accurately obtain the quality detection result so that the user can purchase a good-quality commodity through the robot, S103 may include: when the hardness value is greater than or equal to a preset hardness threshold, judging that the quality detection result of the target commodity is qualified. In this embodiment, the hardness value may be obtained by a sensor on the manipulator of the robot. For example, when the target commodity is a tomato or a cucumber, if its hardness value is greater than or equal to the preset hardness threshold, the quality detection result of the target commodity is determined to be qualified.
Further, when the target commodity is a fruit and vegetable commodity, the commodity information includes a sample mass spectrogram of the fruit and vegetable commodity, and in order to accurately obtain a quality detection result of the fresh commodity, so that the user can purchase a good-quality commodity through the robot, S103 may include S1036 to S1038, as shown in fig. 5, S1036 to S1038 are specifically as follows:
S1036: and acquiring difference information between a preset pesticide standard mass spectrogram and a sample mass spectrogram.
In this embodiment, the target commodity is a fruit and vegetable commodity, and the commodity information includes a sample mass spectrogram of the fruit and vegetable commodity. The sample mass spectrogram can be stored in the remote server, in which case the robot sends an acquisition request to the remote server to obtain it. Alternatively, the sample mass spectrogram can be acquired by the robot on site during purchasing: after the robot determines a target commodity, a leaf sample of the target commodity is put into a detection box arranged on the robot. The detection box contains a cutting and extruding device, a detachable filter screen device and a detection device; after the leaf or sample passes through the cutting and extruding device, its residues are blocked by the filter screen while the juice falls into the detection device, and the volatile gas of the juice is blown into a mass spectrometer in the detection device to acquire the mass spectrogram. The robot prestores a preset pesticide standard mass spectrogram, which marks the level at which pesticide residues do not harm the human body, and the robot acquires difference information between the preset pesticide standard mass spectrogram and the sample mass spectrogram.
S1037: and determining the pesticide residue of the fresh commodity based on the difference information.
The robot stores the correspondence between preset difference information and preset pesticide residue levels, and determines the pesticide residue of the fresh commodity according to this correspondence.
S1038: and determining the quality detection result of the target commodity based on the pesticide residue.
The robot determines the quality detection result of the target commodity based on the pesticide residue. Specifically, a preset pesticide residue threshold can be set in the robot: when the pesticide residue is less than or equal to the preset threshold, the quality detection result of the target commodity is qualified; when it is greater than the preset threshold, the quality detection result of the target commodity is unqualified.
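The difference computation and threshold decision can be sketched as follows. The spectra are represented as hypothetical m/z-to-intensity dictionaries, and the band table mapping spectral difference to a residue level is an illustrative assumption, not data from the patent.

```python
def spectrum_difference(standard, sample):
    """Sum of absolute intensity differences between the preset pesticide
    standard mass spectrum and the sample mass spectrum (dicts m/z -> intensity)."""
    keys = set(standard) | set(sample)
    return sum(abs(standard.get(k, 0.0) - sample.get(k, 0.0)) for k in keys)

def pesticide_residue(diff, bands=((0.05, 0.0), (0.2, 0.1), (0.5, 0.5))):
    """Map difference information to a residue level via an assumed preset table."""
    for max_diff, residue in bands:
        if diff <= max_diff:
            return residue
    return 1.0

def residue_qualified(standard, sample, threshold=0.2):
    """Qualified when the determined residue is at or below the preset threshold."""
    return pesticide_residue(spectrum_difference(standard, sample)) <= threshold
```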
Further, the robot can also send the quality detection result to the user terminal, so that the user can learn the quality of the commodity and the quality detection result in real time, and can evaluate the quality.
S104: and screening the commodities with the quality detection results meeting the preset quality standard from the target commodities, and purchasing the screened commodities.
The robot stores preset quality standards in advance, which are used to screen the commodities the robot will purchase from the target commodities. The robot judges whether each quality detection result meets the preset quality standard, takes the commodities whose quality detection results meet the standard as the commodities to be purchased, and purchases the screened commodities.
For example, when the quality detection result is the quality score of the target commodity, the preset quality standard may be a quality score threshold, and when the quality score of the target commodity is higher than the preset quality standard, the target commodity is purchased.
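Using the quality-score example above, the screening step might look like this sketch; the function name, data shapes and the default threshold are assumptions.

```python
def screen_commodities(candidates, quality_threshold=0.8):
    """Return the commodities whose quality score meets the preset quality
    standard. `candidates` maps commodity identifiers to quality scores."""
    return [cid for cid, score in candidates.items()
            if score >= quality_threshold]
```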
In addition, the robot can also be provided with a vegetable basket in which an electronic scale is arranged; the electronic scale can weigh purchased commodities so as to correct their weight. The basket can further be divided into a raw food area, a cooked food area and a cold storage area so that various foods are placed by category. In order to adapt the robot's viewing angle to goods racks of different heights, and to simulate the viewpoints of a person bending over or standing, the robot is also provided with a lifting system, so that the height of the viewing angle can be changed under the control of the user. Both the user side and the robot side are provided with holographic projection equipment; an image tracking system in the holographic projection equipment acquires projection source images of the market environment and of the user, and holographic projection technology is used to project the acquired source images, so that the user obtains a holographic projection image of the market environment and can establish more vivid communication and interaction with the dealer.
In the embodiment of the application, if a purchasing instruction is detected, the target commodity pointed to by the purchasing instruction is determined; the commodity information of the target commodity is acquired; the quality detection result of the target commodity is determined based on a preset detection rule and the commodity information; and the commodities whose quality detection results meet the preset quality standard are screened out from the target commodities and purchased. In this way, when the robot detects a purchasing instruction, it obtains the commodity information of the target commodity, performs quality detection based on that information, determines the quality detection result, and purchases only the target commodities whose quality detection result is qualified. When a user purchases commodities through the robot, the quality of the commodities can thus be judged and high-quality target commodities purchased, improving the user's shopping experience.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an execution order; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Referring to fig. 6, fig. 6 is a schematic view of a robot according to a second embodiment of the present application. The units included are used to perform the steps in the embodiments corresponding to figs. 1 to 5; for details, please refer to the related description of those embodiments. For convenience of explanation, only the portions related to the present embodiment are shown. Referring to fig. 6, the robot 6 includes:
the first processing unit 610 is used for determining a target commodity pointed by a purchase instruction when the purchase instruction is detected;
a first obtaining unit 620, configured to obtain commodity information of the target commodity;
a first determining unit 630, configured to determine a quality detection result of the target product based on a preset detection rule and the product information;
the second processing unit 640 is configured to screen out, from the target commodities, commodities whose quality detection results meet a preset quality standard, and purchase the screened commodities.
Further, the target commodity is a fresh commodity, and the commodity information comprises a commodity image of the fresh commodity;
the first determining unit 630 is specifically configured to:
inputting the commodity image into a pre-trained freshness identification model for processing, and identifying the freshness information of the fresh commodity;
and determining the quality detection result of the fresh commodity based on the freshness information.
In a further aspect, the robot 6 further includes:
the second acquisition unit is used for acquiring two-dimensional image samples, three-dimensional characteristic information and freshness labels of a plurality of preset commodities to obtain a sample data set consisting of the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities;
and the training unit is used for training a preset initial model based on the sample data set until a preset convergence condition is met, finishing the training and obtaining the trained freshness identification model.
Further, the target commodity is a fresh commodity, and the commodity information comprises smell information and surface texture information of the fresh commodity;
the first determining unit 630 includes:
the second determining unit is used for determining the rotting degree information of the fresh commodity based on the preset smell of the fresh commodity and the smell information;
a third determining unit for determining the water content of the fresh commodity based on the surface texture information;
a fourth determination unit for determining the quality detection result of the fresh commodity based on the rotten degree information and the water content.
Further, the third determining unit is specifically configured to:
and acquiring wrinkle texture information in the surface texture information, and determining the water content of the fresh commodity based on the ratio between the wrinkle texture information and the surface texture information.
Further, the commodity information includes a hardness value of the target commodity;
the first determining unit 630 is specifically configured to:
and when the hardness value is greater than or equal to a preset hardness threshold value, judging that the quality detection result of the target commodity is qualified.
Further, the target commodity is a fruit and vegetable commodity, and the commodity information comprises a sample mass spectrogram of the fruit and vegetable commodity;
the first determining unit 630 is specifically configured to:
acquiring difference information between a preset pesticide standard mass spectrogram and the sample mass spectrogram;
determining pesticide residue of the fresh commodity based on the difference information;
and determining the quality detection result of the target commodity based on the pesticide residue.
Fig. 7 is a schematic view of a robot according to a third embodiment of the present application. As shown in fig. 7, the robot 7 of this embodiment includes: a processor 70, a memory 71 and a computer program 72, such as a robot-based goods procurement program, stored in the memory 71 and executable on the processor 70. The processor 70, when executing the computer program 72, implements the steps of the various robot-based article procurement method embodiments described above, such as the steps 101-104 shown in fig. 1. Alternatively, the processor 70, when executing the computer program 72, implements the functions of the modules/units in the above-mentioned device embodiments, for example, the functions of the modules 610 to 640 shown in fig. 6.
Illustratively, the computer program 72 may be partitioned into one or more modules/units that are stored in the memory 71 and executed by the processor 70 to implement the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 72 in the robot 7. For example, the computer program 72 may be divided into a first processing unit, a first obtaining unit, a first determining unit, and a second processing unit, whose specific functions are as follows:
the first processing unit is used for determining a target commodity pointed by a purchasing instruction when the purchasing instruction is detected;
a first acquisition unit configured to acquire commodity information of the target commodity;
the first determining unit is used for determining the quality detection result of the target commodity based on a preset detection rule and the commodity information;
and the second processing unit is used for screening out the commodities of which the quality detection results meet the preset quality standard from the target commodities and purchasing the screened commodities.
The robot may include, but is not limited to, a processor 70, a memory 71. Those skilled in the art will appreciate that fig. 7 is merely an example of a robot 7 and does not constitute a limitation of robot 7 and may include more or fewer components than shown, or some components in combination, or different components, e.g., the robot may also include input output devices, network access devices, buses, etc.
The processor 70 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 71 may be an internal storage unit of the robot 7, such as a hard disk or memory of the robot 7. The memory 71 may also be an external storage device of the robot 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash card provided on the robot 7. Further, the memory 71 may include both an internal storage unit and an external storage device of the robot 7. The memory 71 is used for storing the computer program and other programs and data required by the robot, and may also be used to temporarily store data that has been output or is to be output.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/units, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and specific reference may be made to the part of the embodiment of the method, which is not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
An embodiment of the present application further provides a network device, where the network device includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, the processor implementing the steps of any of the various method embodiments described above when executing the computer program.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps in the above-mentioned method embodiments.
The embodiments of the present application also provide a computer program product which, when run on a mobile terminal, enables the mobile terminal to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer memory, Read-Only Memory (ROM), Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash disk, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunications signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A commodity purchasing method based on a robot is applied to the robot, and is characterized by comprising the following steps:
if a purchasing instruction is detected, determining a target commodity pointed by the purchasing instruction;
acquiring commodity information of the target commodity;
determining a quality detection result of the target commodity based on a preset detection rule and the commodity information;
and screening the commodities of which the quality detection results meet the preset quality standard from the target commodities, and purchasing the screened commodities.
2. The robot-based goods procurement method of claim 1 characterized in that the target goods are fresh goods, the goods information comprising goods images of the fresh goods;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
inputting the commodity image into a pre-trained freshness identification model for processing, and identifying the freshness information of the fresh commodity;
and determining the quality detection result of the fresh commodity based on the freshness information.
3. The robot-based merchandise procurement method of claim 2 characterized by, the training process for the freshness identification model, comprising:
acquiring two-dimensional image samples, three-dimensional characteristic information and freshness labels of a plurality of preset commodities to obtain a sample data set consisting of the two-dimensional image samples, the three-dimensional characteristic information and the freshness labels of the preset commodities;
training a preset initial model based on the sample data set until a preset convergence condition is met, and finishing the training to obtain the trained freshness identification model.
4. The robot-based goods procurement method of claim 1 characterized by, the target goods are fresh goods, the goods information comprises odor information and surface texture information of the fresh goods;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
determining rot degree information of the fresh commodity based on preset smell of the fresh commodity and the smell information;
determining a water content of the fresh good based on the surface texture information;
determining the quality detection result of the fresh commodity based on the rot degree information and the water content.
5. The robot-based merchandise procurement method of claim 4 wherein determining the moisture content of the fresh merchandise based on the surface texture information comprises:
and acquiring wrinkle texture information in the surface texture information, and determining the water content of the fresh commodity based on the ratio between the wrinkle texture information and the surface texture information.
6. The robot-based merchandise procurement method of claim 1 characterized by, the merchandise information comprises a hardness value of the target merchandise;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
and when the hardness value is greater than or equal to a preset hardness threshold value, judging that the quality detection result of the target commodity is qualified.
7. The robot-based commodity purchasing method according to claim 1, wherein the target commodity is a fruit and vegetable commodity, and the commodity information includes a sample mass spectrogram of the fruit and vegetable commodity;
the determining the quality detection result of the target commodity based on the preset detection rule and the commodity information comprises:
acquiring difference information between a preset pesticide standard mass spectrogram and the sample mass spectrogram;
determining pesticide residue of the fresh commodity based on the difference information;
and determining the quality detection result of the target commodity based on the pesticide residue.
8. A robot, comprising:
the first processing unit is used for determining a target commodity pointed by a purchasing instruction when the purchasing instruction is detected;
a first acquisition unit configured to acquire commodity information of the target commodity;
the first determining unit is used for determining the quality detection result of the target commodity based on a preset detection rule and the commodity information;
and the second processing unit is used for screening out the commodities of which the quality detection results meet the preset quality standard from the target commodities and purchasing the screened commodities.
9. A robot comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202010278703.6A (priority date 2020-04-10, filing date 2020-04-10): Robot-based commodity purchasing method and robot. Pending. Publication: CN112541381A.

Priority Applications (1)

CN202010278703.6A (priority date 2020-04-10, filing date 2020-04-10): Robot-based commodity purchasing method and robot

Publications (1)

Publication Number Publication Date
CN112541381A true CN112541381A (en) 2021-03-23

Family

ID=75013431

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010278703.6A Pending CN112541381A (en) 2020-04-10 2020-04-10 Robot-based commodity purchasing method and robot

Country Status (1)

Country Link
CN (1) CN112541381A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007109140A (en) * 2005-10-17 2007-04-26 Aruze Corp Merchandise purchase system
US20160300455A1 (en) * 2013-10-03 2016-10-13 Digitized Concepts, Llc Apparatus, System, and Method for Self-Service Shopping
CN107451859A (en) * 2017-07-26 2017-12-08 上海与德通讯技术有限公司 A kind of robot purchase method and device
CN109215010A (en) * 2017-06-29 2019-01-15 沈阳新松机器人自动化股份有限公司 A kind of method and robot face identification system of picture quality judgement
CN110020604A (en) * 2019-03-11 2019-07-16 潍坊学院 A kind of quality of vegetable detection method and system
CN110136129A (en) * 2019-05-22 2019-08-16 广东工业大学 A kind of commercial quality detection method, device and storage medium
CN110161194A (en) * 2019-05-29 2019-08-23 中北大学 It is a kind of based on odiferous information BP fuzzy neuron identification the recognition methods of fruit freshness, apparatus and system
CN110287824A (en) * 2019-06-10 2019-09-27 秒针信息技术有限公司 Identify the method and device of food
CN110942035A (en) * 2019-11-28 2020-03-31 浙江由由科技有限公司 Method, system, device and storage medium for acquiring commodity information

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113109240A (en) * 2021-04-08 2021-07-13 国家粮食和物资储备局标准质量中心 Method and system for determining imperfect grains of grains implemented by computer
CN113109240B (en) * 2021-04-08 2022-09-09 国家粮食和物资储备局标准质量中心 Method and system for determining imperfect grains of grains implemented by computer
CN117250322A (en) * 2023-09-12 2023-12-19 新疆绿丹食品有限责任公司 Red date food safety intelligent monitoring method and system based on big data
CN117250322B (en) * 2023-09-12 2024-04-12 新疆绿丹食品有限责任公司 Red date food safety intelligent monitoring method and system based on big data

Similar Documents

Publication Publication Date Title
Ivorra et al. Assessment of grape cluster yield components based on 3D descriptors using stereo vision
US20190220692A1 (en) Method and apparatus for checkout based on image identification technique of convolutional neural network
Cubero et al. A new method for pedicel/peduncle detection and size assessment of grapevine berries and other fruits by image analysis
CN108921645B (en) Commodity purchase judgment method and device and user terminal
CN109741144B (en) Commodity verification method and device, host and equipment
WO2018217280A1 (en) Automated inspection system
MXPA02001474A (en) Item recognition method and apparatus.
CN111582359B (en) Image identification method and device, electronic equipment and medium
CN112541381A (en) Robot-based commodity purchasing method and robot
CN108960132B (en) Method and device for purchasing commodities in open type vending machine
Sabzi et al. Non-destructive estimation of physicochemical properties and detection of ripeness level of apples using machine vision
CN114666670B (en) Data monitoring method, device, equipment and computer readable medium
Calixto et al. A computer vision model development for size and weight estimation of yellow melon in the Brazilian northeast
CN112381589A (en) Intelligent commodity evaluation management system of commodity transaction platform based on cloud computing
WO2020208540A1 (en) A system and method for grading agricultural commodity
Rafiq et al. Application of computer vision system in food processing
Khekare et al. Internet of things based best fruit segregation and taxonomy system for smart agriculture
JP2022528022A (en) Analysis method and system of products on supermarket product shelves
WO2021048813A1 (en) Scale and method for the automatic recognition of a product
CN110096946A (en) A kind of self-service system merged based on pressure sensitivity and vision
CN113709576B (en) Online live broadcast method and system for E-commerce based on Internet
CN109858448A (en) Item identification method and equipment under a kind of public safety
KR101771810B1 (en) Apparatus and method for differentiating species of plant
CN113553902A (en) Intelligent fruit and vegetable accurate identification method and system, computer equipment and application
CN110118735A (en) A kind of high light spectrum image-forming detection method and device detecting bergamot pear male and female

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination