CN116416522A - Plant species determination method, plant species determination device and computer readable storage medium

Info

Publication number
CN116416522A
Authority
CN
China
Prior art keywords
species, image, plant, user, distinguishing feature
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211656319.0A
Other languages
Chinese (zh)
Inventor
徐青松
罗欢
陈明权
何涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Ruisheng Software Co Ltd
Original Assignee
Hangzhou Ruisheng Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Ruisheng Software Co Ltd filed Critical Hangzhou Ruisheng Software Co Ltd
Priority to CN202211656319.0A
Publication of CN116416522A
Priority to PCT/CN2023/138022 (WO2024131589A1)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 20/188: Vegetation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/44: Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure provides a plant species determination method comprising: obtaining one or more first images of a plant from a user; determining, from the first image, one or more first species associated with the plant using an image recognition technique based on an image recognition model; searching a database, using an image matching technique, for one or more second images that match the first image, and determining one or more second species associated with the plant from the plant species corresponding to the second images; presenting the first image, the second image, the first species, and the second species to a particular person so that the particular person can determine a third species associated with the plant; and displaying the third species to the user. The present disclosure also relates to a plant species determination device and a computer readable storage medium.

Description

Plant species determination method, plant species determination device and computer readable storage medium
Technical Field
The disclosure relates to the field of computer technology, in particular to a plant species determination method, a plant species determination device and a computer readable storage medium.
Background
With the development of society, people place higher demands on their living environments, and plants appear more and more often in daily life. People frequently encounter plants they do not recognize and, driven by curiosity and a desire to learn, want to know what species they are. In such cases, a user typically installs an application with a plant identification function on a terminal such as a mobile phone and uses the application to identify the plant.
Disclosure of Invention
It is an object of one or more embodiments of the present disclosure to provide a plant species determination method, apparatus and computer readable storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a plant species determination method comprising: obtaining one or more first images of a plant from a user; determining, from the first image, one or more first species associated with the plant using an image recognition technique based on an image recognition model; searching a database, using an image matching technique, for one or more second images that match the first image, and determining one or more second species associated with the plant from the plant species corresponding to the second images; presenting the first image, the second image, the first species, and the second species to a particular person so that the particular person can determine a third species associated with the plant; and displaying the third species to the user.
In some embodiments, the method further comprises: acquiring a third image of the plant from the user prior to acquiring the first image from the user; determining a fourth species associated with the plant based on the third image using computer vision techniques, the computer vision techniques including the image recognition technique and/or the image matching technique; displaying, to the user, the fourth species and a first interface for re-determining the species of the plant; and acquiring the first image from the user in response to the user operating the first interface.
In some embodiments, the method further comprises: acquiring a third image of the plant from the user prior to acquiring the first image from the user; determining a fourth species associated with the plant and a confidence level corresponding to the fourth species based on the third image using computer vision techniques, the computer vision techniques including the image recognition technique and/or the image matching technique; and acquiring the first image from the user in response to the confidence level corresponding to the fourth species being less than a preset threshold.
In some embodiments, the method further comprises: after determining the first species, determining a first distinguishing feature between the plant in the first image and the plant corresponding to the first species, the first distinguishing feature comprising a difference in one or more of a leaf, flower, fruit, stem, or root; and/or, after determining the second species, determining a second distinguishing feature between the plant in the first image and the plant corresponding to the second species, the second distinguishing feature comprising a difference in one or more of a leaf, flower, fruit, stem, or root; and displaying the first distinguishing feature and/or the second distinguishing feature to a particular person so that the particular person determines the third species.
In some embodiments, the method further comprises: after determining at least one of the first distinguishing feature and the second distinguishing feature, labeling the at least one of the first distinguishing feature and the second distinguishing feature in the first image; wherein presenting the first distinguishing feature and/or the second distinguishing feature to a particular person comprises presenting, to the particular person, the first image labeled with at least one of the first distinguishing feature and the second distinguishing feature.
In some embodiments, the method further comprises: after at least one of the first distinguishing feature and the second distinguishing feature is determined, information related to the at least one of the first distinguishing feature and the second distinguishing feature is acquired from a user in a man-machine interaction mode; wherein presenting the first distinguishing feature and/or the second distinguishing feature to a particular person comprises: presenting information related to at least one of the first distinguishing feature and the second distinguishing feature to a particular person.
In some embodiments, the method further comprises: determining a confidence level corresponding to the first species by using the image recognition technology and/or determining a confidence level corresponding to the second species by using the image matching technology; and displaying the confidence corresponding to the first species and/or the confidence corresponding to the second species to a specific person so that the specific person can determine the third species.
In some embodiments, there are a plurality of first species, and the display order of the plurality of first species is based on the confidence level corresponding to each first species; and/or there are a plurality of second species, and the display order of the plurality of second species is based on the confidence level corresponding to each second species.
In some embodiments, the method further comprises: obtaining a description relating to the plant from the user; and presenting a description relating to the plant to a specific person so that the specific person determines the third species.
In some embodiments, the method further comprises: obtaining geographic location information from the user; and displaying the geographic location information to a particular person so that the particular person determines the third species.
In some embodiments, the method further comprises: while presenting the third species to the user, also presenting a second interface to the user for enabling interaction between the user and the particular person; receiving supplemental information related to the plant from a user via the second interface; and presenting the supplemental information to the particular person so that the particular person redetermines the third species.
In some embodiments, the method further comprises: presenting a third interface to the particular person for enabling interaction between the user and the particular person at least during the determination of the third species by the particular person; receiving first interaction information from the specific personnel through the third interface, and displaying the first interaction information to the user; and receiving second interaction information responding to the first interaction information from the user, and displaying the second interaction information to the specific personnel through the third interface.
In some embodiments, the method further comprises: in response to the third species being inconsistent with each of the one or more first species, requesting the particular person to confirm whether the third species is the species of the plant, and, in response to the third species being confirmed as the species of the plant, taking the correspondence of the first image with the third species as a sample to train the image recognition model on which the image recognition technique is based; and/or, in response to the third species not being included in the database on which the image matching technique is based, requesting the particular person to confirm whether the third species is the species of the plant, and, in response to the third species being confirmed as the species of the plant, adding the first image, labeled with its correspondence with the third species, to the database on which the image matching technique is based.
In some embodiments, the method further comprises: displaying, to the particular person, the distinguishing features between the plant of the third species and the plant in the first image, so that the particular person confirms whether the third species is the species of the plant.
In some embodiments, the method further comprises: before a first image of the plant is acquired from the user, a shooting instruction is presented to the user.
In some embodiments, the method further comprises: before determining the first species, evaluating whether the first image is satisfactory using an image evaluation model; and responsive to the first image not being satisfactory, re-acquiring the first image from the user.
In some embodiments, the specific person is specified by a system providing the method among a plurality of specific persons, the system making the specification based on one or more of an accuracy of determining plant species for each specific person among the plurality of specific persons, a degree of idleness of the specific person, a level of the specific person, a difficulty in determining the plant species.
In some embodiments, the method further comprises: determining a confidence level corresponding to the first species by using the image recognition technology; and/or determining a confidence level corresponding to the second species by using the image matching technology; the difficulty in determining the plant species is determined according to the confidence corresponding to the first species and/or the confidence corresponding to the second species.
According to a second aspect of embodiments of the present disclosure, there is provided a plant species determination device comprising: an acquisition module configured to acquire one or more first images of a plant from a user;
a first determination module configured to determine, based on the first image, one or more first species associated with the plant based on an image recognition model using an image recognition technique; a second determination module configured to search a database for one or more second images matching the first image using an image matching technique based on the first image, and determine one or more second species associated with the plant from plant species corresponding to the second image;
a first display module configured to display the first image, the second image, the first species, and the second species to a particular person so that the particular person determines a third species associated with the plant; and a second display module configured to display the third species to the user.
According to a third aspect of embodiments of the present disclosure, there is provided a plant species determination device comprising: a memory; and a processor coupled to the memory, the processor configured to perform the method of any of the embodiments described above based on instructions stored in the memory.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium comprising computer program instructions, wherein the computer program instructions, when executed by a processor, implement the method according to any one of the embodiments described above.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the method according to any one of the embodiments described above.
The technical scheme of the present disclosure is described in further detail below through the accompanying drawings and examples.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings may be obtained according to these drawings without inventive effort to a person of ordinary skill in the art.
Fig. 1 is a schematic flow diagram of a plant species determination method according to an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of a presentation page for a particular person of a plant species determination method according to an embodiment of the present disclosure.
Fig. 3 is a schematic structural view of a plant species determination device according to an embodiment of the present disclosure.
Fig. 4 is a schematic structural view of a plant species determination device according to an embodiment of the present disclosure.
Detailed Description
The following description of the technical solutions in the embodiments of the present disclosure will be made clearly and completely with reference to the accompanying drawings in the embodiments of the present disclosure, and it is apparent that the described embodiments are only some embodiments of the present disclosure, not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art based on the embodiments in this disclosure without inventive faculty, are intended to fall within the scope of this disclosure.
The relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless it is specifically stated otherwise.
Meanwhile, it should be understood that the sizes of the respective parts shown in the drawings are not drawn in actual scale for convenience of description.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
The inventors have noted that the accuracy of current plant species identification is not high enough; sometimes the user can even clearly tell that the identified species is wrong, which seriously affects the user experience.
Fig. 1 is a schematic flow diagram of a plant species determination method according to an embodiment of the present disclosure. The method may be performed by an application installed on an electronic device, such as a computer, a cell phone, etc., to determine the species of the plant from the plant image input by the user. Fig. 2 is a schematic diagram of a presentation page for a particular person of a plant species determination method according to an embodiment of the present disclosure. The plant species determination method according to the embodiment of the present disclosure is described below with reference to fig. 1 and 2. As shown in fig. 1, the plant species determination method may include steps S110 to S150 as described below.
In step S110, one or more first images of a plant are acquired from a user.
For example, the first image may be obtained from the user's cell phone album. As another example, the first image may be captured on site by the user and acquired from the user's camera. How the first image of the plant is acquired will be described in detail later with reference to some embodiments.
In step S120, one or more first species associated with the plant are determined from the first image using an image recognition technique based on an image recognition model.
For example, the image recognition model may be trained using pre-labeled training samples, whereby the first species may be determined from the trained image recognition model. The image recognition model may be a neural network model, such as a convolutional neural network model or a residual network model.
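As a hedged sketch only, the following shows how such a trained recognition model might be queried for the top candidate species and their confidence levels; the PyTorch-style model, the label list and the top-k value are illustrative assumptions rather than details taken from the disclosure.
```python
# Hedged sketch: querying a trained classification model for top candidate species.
# The model, label list and top-k value are illustrative placeholders.
import torch
from torchvision import transforms
from PIL import Image

PREPROCESS = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def recognize_first_species(image_path, model, species_labels, top_k=3):
    """Return up to top_k (species, confidence) pairs for one first image."""
    model.eval()                                     # model assumed to be already trained
    image = Image.open(image_path).convert("RGB")
    batch = PREPROCESS(image).unsqueeze(0)           # shape: (1, 3, 224, 224)
    with torch.no_grad():
        probs = torch.softmax(model(batch), dim=1)[0]
    confidences, indices = probs.topk(top_k)
    return [(species_labels[int(i)], float(c)) for i, c in zip(indices, confidences)]
```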
In step S130, based on the first image, one or more second images matching the first image are looked up from a database using an image matching technique, and one or more second species associated with the plant are determined from plant species corresponding to the second images.
For example, a database may be pre-established that contains one or more sample images for each of a plurality of plant species. The plurality of plant species may be all species enrolled in the database, for example the set of all plant species corresponding to all collected sample images. A sample image may be a representative image of a plant of the corresponding species, and each plant species may correspond to one or more sample images. The image matching technique may then be used, for example by applying a shortest-edit-distance match over the feature vectors of the images, to find one or more sample images in the database whose degree of matching with the first image meets a requirement (e.g., exceeds a predetermined threshold); these sample images are taken as the second images. It should be appreciated that, when a plurality of second images are found, any two of them may correspond to the same plant species or to different plant species, and the plurality of second images may be ranked from high to low by their degree of matching with the first image. One or more second species associated with the plant are then determined from the correspondence between sample images and plant species stored in advance in the database. When there are multiple second species, they may be ranked from high to low by the matching degree of their corresponding second images with the first image.
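As a rough stand-in for this matching step (not the patent's exact algorithm), the sketch below compares a feature vector extracted from the first image against pre-computed feature vectors of the sample images, keeps matches above a predetermined threshold and ranks them from high to low; cosine similarity is assumed here as the matching measure.
```python
# Illustrative matching sketch: cosine similarity over feature vectors, thresholded and ranked.
import numpy as np

def find_second_species(query_vec, sample_db, threshold=0.8):
    """sample_db: list of (species_name, feature_vector) pairs built from sample images."""
    q = query_vec / np.linalg.norm(query_vec)
    matches = []
    for species, vec in sample_db:
        score = float(np.dot(q, vec / np.linalg.norm(vec)))   # degree of matching
        if score > threshold:
            matches.append((species, score))
    matches.sort(key=lambda m: m[1], reverse=True)            # highest matching degree first
    return matches
```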
In step S140, the first image, the second image, the first species and the second species are presented to the particular person so that the particular person determines a third species associated with the plant. The particular person is, for example, a plant expert or another person with extensive experience in plant recognition/diagnosis. In some embodiments, the particular person is specified among a plurality of particular persons by the system providing the method, as will be further described below in connection with some embodiments. It is understood that the third species is the species of the plant determined by the particular person on the basis of the first image, the second image, the first species and the second species. In general, the third species may be the same as one of the one or more first species and the one or more second species, but may also be different from both the first species and the second species.
As shown in fig. 2, in some embodiments, the first image may be presented in region 1 and region 2. For example, in the case where the user provides only one first image, the first image may be presented only in the area 1. In case the user provides a plurality of first images, one of the images may be presented in area 1 and the other image in area 2. In some embodiments, the first image and the second image may be presented in region 1 and region 2, respectively. It should be appreciated that although region 1 shows only one image region and region 2 shows multiple image regions, either of regions 1 and 2 may include one or more image regions. In some embodiments, a first image, a second image, and a third image (to be described below) may be displayed in the region 1 and the region 2, for example, a third image may be displayed in the region 1, and the first image and the second image may be displayed in the region 2.
The first species and the second species may be displayed in region 5. In some implementations, in the event that a particular person performs a particular operation (e.g., a click operation) on a first species or a second species in the area 5, the particular person may be presented with an image of the first species or the second species (e.g., may include a representative image of a plant, as well as other images), a description of the plant, and/or a link for jumping to a plant detail page. After the specific person determines the third species, an entry corresponding to the third species may be selected in the area 5, and then the confirmation button shown in the area 3 is operated to submit the determination result. In some embodiments, in the event that a particular person operates the confirmation button shown in area 3, a box may be presented to the particular person so that the particular person enters some comments related to the plant or species determination. After the submission is completed, the name of the third species may be shown, for example, in the location shown in area 6, and comments related to the plant or species determination may be shown, for example, in the location shown in area 7. The name of the specific person and the time at which the specific person determines the third species can also be shown in the area 6.
In some embodiments, the area 6 may be set so that a particular person jumps to the plant detail page corresponding to a third species shown in the area 6 when clicking on the name of the third species.
As some implementations, in addition to the first image, the second image, the first species, and the second species, other information may be displayed to the specific person, such as a first distinguishing feature between the plant in the first image and the plant corresponding to the first species, the confidence level of the first species, a second distinguishing feature between the plant in the first image and the plant corresponding to the second species, and/or the confidence level of the second species. This will be further described in connection with some embodiments.
In step S150, a third species determined by a specific person is presented to the user. In some embodiments, in addition to presenting the third species to the user, a plant detail page including plant characteristics, plant standard images, plant maintenance methods, etc., may also be presented to the user.
In the above embodiment, the first species and the second species are determined with different methods from the image provided by the user, and the first image, the second image, the first species and the second species are all displayed to the specific person, so that the specific person can determine the third species of the plant by combining this information. This helps the specific person judge the species of the plant more accurately and thereby improves the user experience.
The following describes how one or more first images of a plant are obtained.
In some embodiments, shooting guidance may be presented to the user prior to acquiring the first image of the plant from the user. The shooting guidance may be presented as text, such as "please shoot images of the plant from at least three angles". Displaying shooting guidance instructs the user how to shoot, which helps acquire images that are easy to identify with the image recognition and/or image matching techniques and thereby improves the accuracy of plant species determination.
In some embodiments, before the first species is determined from the first image, an image evaluation model may be used to evaluate whether the first image meets requirements for image recognition and/or image matching, such as the brightness, sharpness and contrast of the image and the size of the subject to be identified in the image; in response to the first image not meeting the requirements, the first image is re-acquired from the user, which improves the accuracy of plant species determination and saves the user time. For example, after the image evaluation model determines that the first image does not meet the requirements for image recognition and/or image matching, a shooting prompt may be presented to the user to re-provide a first image that does meet them. The prompt may, for example, state the reason the first image does not meet the requirements and what the requirements for image recognition and/or image matching are.
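A minimal sketch of such a pre-check is given below, with simple brightness and sharpness heuristics standing in for a learned image evaluation model; the thresholds are illustrative only.
```python
# Minimal pre-recognition quality check; heuristics and thresholds are illustrative.
import cv2
import numpy as np

def image_meets_requirements(image_path, min_brightness=40, max_brightness=220,
                             min_sharpness=100.0):
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        return False, "the image could not be read; please re-shoot"
    brightness = float(np.mean(gray))
    sharpness = float(cv2.Laplacian(gray, cv2.CV_64F).var())  # variance-of-Laplacian blur measure
    if not (min_brightness <= brightness <= max_brightness):
        return False, "the image is too dark or too bright; please re-shoot"
    if sharpness < min_sharpness:
        return False, "the image is blurry; please hold the camera steady and re-shoot"
    return True, ""
```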
In some embodiments, a third image of the plant is acquired from the user before the first image is acquired from the user. A fourth species associated with the plant is determined based on the third image using computer vision techniques, where the computer vision techniques include the image recognition technique and/or the image matching technique. The fourth species and a first interface for re-determining the species of the plant are presented to the user. The first image is then acquired from the user in response to the user operating the first interface.
It will be appreciated that a user typically operates the first interface to re-determine the species of the plant because the user does not trust, or is not satisfied with, the fourth species and wants the species to be determined again. In this case, the first image of the plant is obtained from the user and the plant species determination method shown in fig. 1 is then performed, providing a more accurate and authoritative result for the user and improving the user experience.
As some implementations, the third image and the first image may be the same, i.e., the user provides the same image in both the earlier and the later identification. Although the third image is identical to the first image, only computer vision techniques are used when the fourth species is determined from the third image, whereas when the third species is determined from the first image, not only are computer vision techniques used, but a specific person further determines the third species on the basis of the computer vision results. The confidence of the third species is therefore higher than that of the fourth species.
As other implementations, the third image and the first image may be different; for example, the user may be asked to provide more images when the first image is acquired. When content such as the first image and the second image is presented to the specific person, the third image may also be presented for reference. As another example, shooting guidance may be displayed to the user when the first image is acquired, so that the user provides higher-quality images and the accuracy with which the specific person determines the third species is further improved.
In some embodiments, a third image of the plant is acquired from the user prior to acquiring the first image from the user; determining a fourth species associated with the plant and a confidence level corresponding to the fourth species based on the third image using computer vision techniques, the computer vision techniques including image recognition techniques and/or image matching techniques; and acquiring the first image from the user in response to the confidence level corresponding to the fourth species being less than the preset threshold.
It will be appreciated that a confidence level for the fourth species below the preset threshold indicates that the determination of the fourth species is not sufficiently reliable. In this case, the flow in which the species is determined by a specific person, that is, the plant species determination method shown in fig. 1, is performed automatically. This helps prevent the user from being misled by a low-confidence species determination result and improves the user experience.
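A hypothetical sketch of this confidence gate follows; the threshold value and the callback names are placeholders, not part of the disclosure.
```python
# Hypothetical confidence gate: route low-confidence results into the expert flow of Fig. 1.
PRESET_THRESHOLD = 0.6  # illustrative value

def handle_third_image(third_image, cv_identify, start_expert_flow, show_result):
    species, confidence = cv_identify(third_image)   # fourth species and its confidence
    if confidence < PRESET_THRESHOLD:
        start_expert_flow()                          # begin the flow of Fig. 1 automatically
    else:
        show_result(species, confidence)             # low risk of misleading the user
```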
The following describes, in connection with some embodiments, what may be presented to a particular person in addition to the first image, the second image, the first species, and the second species.
In some embodiments, after determining the first species, a first distinguishing feature between the plant in the first image and the plant corresponding to the first species may be determined, and/or, after determining the second species, a second distinguishing feature between the plant in the first image and the plant corresponding to the second species may be determined, and the first distinguishing feature and/or the second distinguishing feature may be displayed to a particular person so that the particular person determines the third species. Here, the first distinguishing feature may include a difference in one or more of a leaf, a flower, a fruit, a stem, and a root, and the second distinguishing feature may likewise include a difference in one or more of a leaf, a flower, a fruit, a stem, and a root. For example, where the first species is determined to be a peach tree and the second species is determined to be a plum tree, the distinguishing feature between the peach tree and the plant in the first image and the distinguishing feature between the plum tree and the plant in the first image may be displayed to the particular person.
In the above embodiments, displaying the first distinguishing feature between the plant in the first image and the plant corresponding to the first species and/or the second distinguishing feature between the plant in the first image and the plant corresponding to the second species helps the specific person determine the plant species more accurately, and in particular helps the specific person determine the species of easily confused plants.
As some implementations, after at least one of the first distinguishing feature and the second distinguishing feature is determined, the at least one of the first distinguishing feature and the second distinguishing feature is labeled in the first image. Accordingly, presenting the first distinguishing feature and/or the second distinguishing feature to the particular person includes presenting, to the particular person, the first image labeled with at least one of the first distinguishing feature and the second distinguishing feature. For example, where the first distinguishing feature is determined to be a difference in leaf shape or fruit color, the leaves and fruits of the plant may be labeled in the first image, for example by framing the areas of the leaves and fruits, and a specific description of the leaf shape difference or fruit color difference may also be given. In this way the first and/or second distinguishing feature is presented to the particular person more directly and visually, so that the particular person determines the third species more accurately.
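A small sketch of how a distinguishing feature might be labeled in the first image is shown below, assuming the feature comes with a bounding box and a short description; the drawing calls are illustrative, not the disclosed implementation.
```python
# Illustrative labeling of a distinguishing feature in the first image.
from PIL import Image, ImageDraw

def annotate_distinguishing_feature(image_path, box, description, out_path):
    """box: (left, top, right, bottom) around, e.g., a leaf or fruit region."""
    image = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(image)
    draw.rectangle(box, outline="red", width=3)                        # frame the region
    draw.text((box[0], max(0, box[1] - 15)), description, fill="red")  # e.g. "leaf shape differs"
    image.save(out_path)
    return out_path
```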
As some implementations, after at least one of the first distinguishing feature and the second distinguishing feature is determined, information related to the at least one of the first distinguishing feature and the second distinguishing feature may be obtained from the user through human-machine interaction. Accordingly, presenting the first distinguishing feature and/or the second distinguishing feature to a particular person comprises presenting, to the particular person, the information related to at least one of the first distinguishing feature and the second distinguishing feature. For example, where the first distinguishing feature is determined to concern the leaves, the user may be asked, through human-machine interaction, about features such as the leaf size and the color of the back of the leaves. Basic features such as the leaf color and shape of the target plant can be identified from the first image using computer vision techniques. However, it may be difficult to determine the leaf size of the target plant, for example in the absence of a reference object; where the user provides only a frontal image of a leaf, it is difficult to obtain features of the back of the leaf (such as its color, hairs and venation) from the image; and where the user provides an image that includes only a portion (particularly a small portion) of the foliage of the target plant, it is difficult to determine from the image the overall distribution of spots on the foliage (e.g., whether the spotted area exceeds 30% of the total leaf area of the whole plant). These features may be obtained from the user by way of human-machine interaction, such as outputting questions to the user and receiving answers from the user. The user's answers can be presented to the particular person as information related to the distinguishing feature. As shown in fig. 2, information related to at least one of the first distinguishing feature and the second distinguishing feature may be presented to the particular person in area 10.
In the above embodiment, obtaining information related to at least one of the first distinguishing feature and the second distinguishing feature from the user through human-machine interaction helps collect richer plant information and thereby improves the accuracy with which the specific person determines the third species.
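The question-and-answer step described above could, for example, be organized as sketched below; the feature keys and question texts are invented for illustration.
```python
# Invented sketch of collecting feature information from the user via questions.
FEATURE_QUESTIONS = {
    "leaf_back": "What color is the back of the leaf, and does it feel fuzzy?",
    "leaf_size": "Roughly how long is a typical leaf, in centimeters?",
    "spot_coverage": "Do spots cover more than about 30% of the leaves overall?",
}

def collect_feature_info(features_needing_input, ask_user):
    """ask_user: UI callback that shows a question and returns the user's answer."""
    answers = {}
    for feature in features_needing_input:
        question = FEATURE_QUESTIONS.get(feature)
        if question:
            answers[feature] = ask_user(question)   # later displayed in area 10 of Fig. 2
    return answers
```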
In some embodiments, the confidence level corresponding to each of the one or more first species is determined using the image recognition technique, and/or the confidence level corresponding to each of the one or more second species is determined using the image matching technique; the confidence level of each first species and/or the confidence level of each second species is presented to the particular person so that the particular person determines the third species. Displaying the confidence levels of the first species and/or the second species helps the particular person make a better judgment. For example, if the confidence levels of two first species are 40% and 30%, respectively, and the confidence levels of two second species are 65% and 25%, respectively, the particular person may, when determining the third species, first consider whether the second species with 65% confidence is the species of the target plant.
In some embodiments, there are a plurality of first species, and the display order of the plurality of first species is determined based on the confidence level of each first species. In some embodiments, there are a plurality of second species, and the display order of the plurality of second species is determined based on the confidence level corresponding to each second species.
In some embodiments, the first species and the second species may be displayed in the same region, as shown in fig. 2, and the first species and the second species may be displayed together in region 5. In this case, the order of presentation of the first species and the second species in the region 5 may also be determined based on the respective confidence levels.
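As an illustrative sketch (not the disclosed implementation), the candidates from recognition and matching could be merged and ordered by confidence before being shown in region 5:
```python
# Illustrative merging and ordering of candidate species for display in region 5.
def order_candidates(first_species, second_species):
    """Each argument: list of (species_name, confidence, source) tuples."""
    combined = list(first_species) + list(second_species)
    combined.sort(key=lambda c: c[1], reverse=True)   # highest confidence shown first
    return combined

# With the confidences from the example above (40% and 30% for the first species,
# 65% and 25% for the second species), the display order would be 65%, 40%, 30%, 25%.
```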
In some embodiments, a description related to the plant may also be obtained from the user and presented to the particular person so that the particular person determines the third species. As some implementations, the description related to the plant may be, for example, a question from the user, or information about the plant that the user wants to provide, such as the place where the plant grows or how the plant is maintained (e.g., how often it is watered or fertilized). As shown in fig. 2, the description related to the plant may be presented to the particular person in area 10.
In some embodiments, geographic location information may be obtained from the user and presented to the particular person so that the particular person determines the third species. Because geographic location information is critical to the determination of plant species, obtaining it helps the particular person determine the third species better. As some implementations, the geographic location information may be obtained automatically from the user terminal, for example by reading, through the operating system of the user terminal or directly, the positioning information of a GPS/BeiDou locator in the user terminal, or the base station identification code of its cellular communication module. It should be appreciated that the user terminal may be the terminal on which the user provides the first image. As other implementations, the user may be asked for the geographic location information by way of human-machine interaction.
In some embodiments, when a third species is presented to the user, a second interface is also presented to the user for enabling interaction between the user and the particular person; receiving supplemental information related to the plant from the user via the second interface; and presenting the supplemental information to the particular person so that the particular person redetermines the third species. As shown in fig. 2, supplemental information may be presented to a particular person in area 7 and/or area 10.
It will be appreciated that, after learning the third species, a user may continue to input supplemental information related to the plant via the second interface because the user is not satisfied with, or does not trust, the third species determined by the particular person and wants the particular person to re-determine it on the basis of the further information, or because some information the user considers critical for determining the plant species has not yet been provided. Providing the second interface helps the particular person better understand the condition of the plant and the user's needs, and thus make a better species determination, which improves the user experience.
In some embodiments, a third interface may be presented to the particular person for enabling interaction between the user and the particular person, at least while the particular person determines the third species; first interaction information is received from the particular person through the third interface and displayed to the user; and second interaction information responding to the first interaction information is received from the user and displayed to the particular person through the third interface. As shown in fig. 2, the first interaction information and the second interaction information may be displayed in area 7 or area 10. For example, the particular person may send a first interaction message such as "About how tall is the plant?" through the third interface, and after receiving this message the user may reply "about 1 meter high". Displaying the third interface to the particular person enables interaction between the user and the particular person, so that the particular person can obtain, in a targeted manner, the information needed during species determination and thus determine the species better. As some implementations, the first interaction information and the second interaction information shown in area 7 or area 10 may be arranged in chronological order, for example with the latest message at the bottom of area 7 or area 10, or in the logical order of questions and answers.
In some embodiments, a fourth interface for searching for plant information may also be presented to the particular person, as shown by area 8 in fig. 2. In response to the particular person operating the fourth interface, content such as images and textual descriptions of the plants searched for by the particular person may be presented. For example, while determining the third species, the particular person may find that both the peach tree and the plum tree are likely candidates; the particular person can then conveniently look up information on each of them through the fourth interface and thereby determine the species more accurately.
In some embodiments, in response to the third species being inconsistent with each of the one or more first species, the particular person is requested to confirm whether the third species is the species of the plant; in response to the third species being confirmed as the species of the plant, the correspondence between the first image of the plant and the third species is taken as a sample to train the image recognition model on which the image recognition technique is based. It will be appreciated that the third species being inconsistent with each of the one or more first species generally indicates that the first species determined with the image recognition technique based on the image recognition model is not sufficiently accurate. In this case, the first image of the plant needs to be used as a new training sample to further train the image recognition model on which the image recognition technique is based, so that the image recognition model can output a correct result next time.
In some embodiments, in response to the third species not being included in the database on which the image matching technique is based, the particular person is requested to confirm whether the third species is the species of the plant; in response to the third species being confirmed as the species of the plant, the first image, labeled with its correspondence with the third species, is added to the database on which the image matching technique is based. This helps output a correct result the next time matching second images are searched for with the image matching technique, thereby effectively improving the accuracy of species determination by image matching.
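Purely as an illustration of the feedback loop in the two preceding paragraphs, and under the assumption of simple list-based stores (in practice the matching database would hold sample images or their feature vectors, and training would run as a separate job), the bookkeeping might look like:
```python
# Illustrative bookkeeping for the expert-confirmed feedback loop.
def record_confirmed_result(first_image, third_species, first_species_list,
                            matching_db, training_queue):
    if all(third_species != s for s in first_species_list):
        # the recognition model missed the species: queue (image, label) for the next training run
        training_queue.append((first_image, third_species))
    if third_species not in {species for species, _ in matching_db}:
        # the species is absent from the matching database: add the labeled sample image
        matching_db.append((third_species, first_image))
```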
In some embodiments, in response to the third species being inconsistent with each of the one or more first species, or the third species not being included in the species database, the distinguishing feature between the plant of the third species and the plant in the first image is displayed to the particular person so that the particular person confirms whether the third species is the species of the plant. When the third species is inconsistent with each first species or is not included in the species database, subsequent modification of the image recognition model or the database will be involved, which has a large impact on species recognition and should therefore be handled carefully. It is advantageous to have the particular person confirm whether the third species is the species of the plant, i.e., to require the particular person to examine carefully and give a final confirmation, before the image recognition model and the database are modified.
In some embodiments, the specific person is specified, among a plurality of specific persons, by the system providing the method. The system may make the specification based on one or more of: the accuracy with which each specific person determines plant species, the degree of idleness of the specific person, the level of the specific person, and the difficulty of determining the plant species. This helps assign species identification tasks more reasonably, improving user satisfaction as well as the operating efficiency of the system for species identification.
As some implementations, as shown in fig. 2, the time of the plant species determination request submitted by the user, the status of the request, and the name or number of the specific person specified by the system among the plurality of specific persons may be presented to the specific person in area 9. The time at which the user submitted the plant species determination request is, for example, the time at which the user provided the first image. The status of the request includes, for example, whether it is a new request, or whether species determination has already been attempted by other specific persons but failed, and so on.
As some implementations, the confidence level corresponding to the first species may be determined using the image recognition technique, and/or the confidence level corresponding to the second species may be determined using the image matching technique. The difficulty of determining the plant species is then determined from the confidence level corresponding to the first species and/or the confidence level corresponding to the second species. For example, species identified with higher confidence are treated as easier tasks, and species identified with lower model confidence as more difficult tasks. Assuming two first species are determined using the image recognition technique with confidence levels of 80% and 60%, respectively, and one second species is determined using the image matching technique, the difficulty of determining the plant species may be determined to be low. As another example, if the image recognition technique determines a first species with 60% confidence and the image matching technique determines a second species with 50% confidence, determining the plant species may be judged to be difficult.
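A hedged sketch of such an assignment policy follows; the weights and the 0.7 cut-off are invented for illustration and are not taken from the disclosure.
```python
# Invented assignment policy: derive task difficulty from confidences and score candidates.
def task_difficulty(first_confidences, second_confidences):
    best = max(list(first_confidences) + list(second_confidences), default=0.0)
    return "low" if best >= 0.7 else "high"

def pick_specific_person(persons, difficulty):
    """persons: list of dicts with 'accuracy', 'idleness' and 'level' normalized to [0, 1]."""
    def score(p):
        w_acc = 0.6 if difficulty == "high" else 0.3   # favor accuracy for harder tasks
        return w_acc * p["accuracy"] + 0.3 * p["idleness"] + 0.1 * p["level"]
    return max(persons, key=score)
```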
In some embodiments, a particular person may need to determine the species of multiple plants submitted by multiple users. In this case, after the particular person has determined the third species of one plant, the "next" button, shown as area 4 in fig. 2, may be clicked on to begin the species determination of the next plant.
In this specification, each embodiment is described in a progressive manner, and each embodiment is mainly described in a different manner from other embodiments, so that the same or similar parts between the embodiments are mutually referred to. For the device embodiments, since they basically correspond to the method embodiments, the description is relatively simple, and the relevant points are referred to in the description of the method embodiments.
Fig. 3 is a schematic structural view of a plant species determination device according to some embodiments of the present disclosure.
As shown in fig. 3, the plant species determining device 300 includes an acquisition module 310, a first determination module 320, a second determination module 330, a first display module 340, and a second display module 350.
The acquisition module 310 is configured to acquire one or more first images of the plant from the user. The first determination module 320 is configured to determine one or more first species associated with the plant using image recognition techniques based on the first image. The second determination module 330 is configured to, based on the first image, look up one or more second images matching the first image from a database using an image matching technique, and determine one or more second species associated with the plant from plant species corresponding to the second image. The first display module 340 is configured to display the first image, the second image, the first species, and the second species to a particular person so that the particular person determines a third species associated with the plant. The second presentation module 350 is configured to present the third species to the user.
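As a loose illustration of how these modules could be wired together (the callables and method names are assumptions rather than the patent's API):
```python
# Loose illustration of the module structure of plant species determination device 300.
class PlantSpeciesDeterminationDevice:
    def __init__(self, recognize, match, show_to_expert, show_to_user):
        self.recognize = recognize            # first determination module 320
        self.match = match                    # second determination module 330
        self.show_to_expert = show_to_expert  # first display module 340
        self.show_to_user = show_to_user      # second display module 350

    def determine(self, first_images):
        first_species = self.recognize(first_images)
        second_images, second_species = self.match(first_images)
        third_species = self.show_to_expert(first_images, second_images,
                                            first_species, second_species)
        self.show_to_user(third_species)
        return third_species
```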
Fig. 4 is a schematic structural view of a plant species determination device according to still further embodiments of the present disclosure.
As shown in fig. 4, the plant species determining device 400 comprises a memory 410 and a processor 420 coupled to the memory 410, the processor 420 being configured to perform the method of any of the preceding embodiments based on instructions stored in the memory 410.
Memory 410 may include, for example, system memory, fixed nonvolatile storage media, and the like. The system memory may store, for example, an operating system, application programs, boot Loader (Boot Loader), and other programs.
Plant species determination device 400 may also include an input-output interface 430, a network interface 440, a storage interface 450, and the like. These interfaces 430, 440, 450, and between the memory 410 and the processor 420 may be connected, for example, by a bus 460. The input/output interface 430 provides a connection interface for input/output devices such as a display, mouse, keyboard, touch screen, etc. Network interface 440 provides a connection interface for various networking devices. Storage interface 450 provides a connection interface for external storage devices such as SD cards, U-discs, and the like.
The disclosed embodiments also provide a computer readable storage medium comprising computer program instructions which, when executed by a processor, implement the method of any of the above embodiments.
The disclosed embodiments also provide a computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the method of any of the above.
Thus, various embodiments of the present disclosure have been described in detail. In order to avoid obscuring the concepts of the present disclosure, some details known in the art are not described. How to implement the solutions disclosed herein will be fully apparent to those skilled in the art from the above description.
It will be appreciated by those skilled in the art that embodiments of the present disclosure may be provided as a method, system, or computer program product. Accordingly, the present disclosure may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present disclosure may take the form of a computer program product embodied on one or more computer-usable non-transitory storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present disclosure is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that functions specified in one or more of the flowcharts and/or one or more of the blocks in the block diagrams may be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Additionally, embodiments of the present disclosure may also include the following examples:
1. a plant species determination method comprising:
obtaining one or more first images of a plant from a user;
determining, from the first image, one or more first species associated with the plant using an image recognition technique based on an image recognition model;
searching a database, based on the first image and using an image matching technique, for one or more second images that match the first image, and determining one or more second species associated with the plant from the plant species corresponding to the second images;
presenting the first image, the second image, the first species, and the second species to a particular person so that the particular person determines a third species associated with the plant; and
displaying the third species to the user.
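For illustration only, the following Python sketch shows one way the two candidate lists of example 1 could be produced and queued for review by the particular person; the recognition_model, feature_db, and reviewer_queue interfaces are hypothetical placeholders and are not prescribed by this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    species: str        # candidate species name
    confidence: float   # score in [0, 1]
    source: str         # "recognition" or "matching"

def identify_candidates(first_images, recognition_model, feature_db, top_k=3):
    """Produce the first-species and second-species candidate lists of example 1."""
    # Branch 1: the image recognition model predicts species directly from the images.
    first_species = [
        Candidate(name, score, "recognition")
        for name, score in recognition_model.predict(first_images)[:top_k]
    ]
    # Branch 2: image matching retrieves visually similar reference images from a
    # database; the species labels of those second images become the second species.
    matches = feature_db.search(first_images, limit=top_k)
    second_species = [Candidate(m.species, m.similarity, "matching") for m in matches]
    second_images = [m.image_path for m in matches]
    return first_species, second_species, second_images

def submit_for_review(first_images, second_images, first_species, second_species,
                      reviewer_queue):
    """Package the images and both candidate lists so the reviewer can set the third species."""
    reviewer_queue.put({
        "user_images": first_images,
        "matched_images": second_images,
        "recognition_candidates": first_species,
        "matching_candidates": second_species,
    })
```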
2. The method according to 1, further comprising:
acquiring a third image of the plant from the user prior to acquiring the first image from the user;
determining a fourth species associated with the plant based on the third image using computer vision techniques, the computer vision techniques including the image recognition technique and/or the image matching technique;
displaying the fourth species to the user and a first interface for redefining the species of the plant; and
acquiring the first image from the user in response to the user's operation on the first interface.
3. The method according to 1, further comprising:
acquiring a third image of the plant from the user prior to acquiring the first image from the user;
determining a fourth species associated with the plant and a confidence level corresponding to the fourth species based on the third image using computer vision techniques, the computer vision techniques including the image recognition technique and/or the image matching technique; and
acquiring the first image from the user in response to the confidence level corresponding to the fourth species being less than a preset threshold.
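A minimal sketch of the gating step in example 3, assuming a hypothetical vision_model interface and an illustrative threshold value:

```python
CONFIDENCE_THRESHOLD = 0.6  # assumed preset threshold; no specific value is prescribed

def needs_more_images(third_image, vision_model) -> bool:
    """Return True when the initial prediction on the third image is too uncertain,
    in which case one or more first images should be requested from the user."""
    _, confidence = vision_model.predict_single(third_image)  # hypothetical API
    return confidence < CONFIDENCE_THRESHOLD
```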
4. The method according to 1, further comprising:
after determining the first species, determining a first distinguishing feature of the plant in the first image relative to a plant corresponding to the first species, the first distinguishing feature comprising a difference in one or more of the leaf, flower, fruit, stem, and root; and/or, after determining the second species, determining a second distinguishing feature of the plant in the first image relative to a plant corresponding to the second species, the second distinguishing feature comprising a difference in one or more of the leaf, flower, fruit, stem, and root; and
presenting the first distinguishing feature and/or the second distinguishing feature to the particular person so that the particular person determines the third species.
5. The method according to 4, further comprising:
marking at least one of the first distinguishing feature and the second distinguishing feature in the first image after determining the at least one of the first distinguishing feature and the second distinguishing feature;
wherein presenting the first distinguishing feature and/or the second distinguishing feature to the particular person comprises presenting, to the particular person, the first image marked with at least one of the first distinguishing feature and the second distinguishing feature.
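As one possible way to mark distinguishing features in the first image (example 5), the sketch below draws labelled boxes with Pillow; the region coordinates are assumed to come from an upstream feature-localization step not specified here.

```python
from PIL import Image, ImageDraw

def mark_distinguishing_features(image_path, regions, out_path):
    """Draw labelled boxes on the first image around regions (e.g. leaf margin, petal
    shape) that differ from the candidate species. `regions` is a list of
    (label, (x0, y0, x1, y1)) tuples produced by an unspecified localization step."""
    img = Image.open(image_path).convert("RGB")
    draw = ImageDraw.Draw(img)
    for label, box in regions:
        draw.rectangle(box, outline="red", width=3)
        draw.text((box[0], max(box[1] - 14, 0)), label, fill="red")
    img.save(out_path)
```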
6. The method according to 4, further comprising:
after at least one of the first distinguishing feature and the second distinguishing feature is determined, acquiring information related to the at least one of the first distinguishing feature and the second distinguishing feature from the user through human-computer interaction;
wherein presenting the first distinguishing feature and/or the second distinguishing feature to the particular person comprises: presenting, to the particular person, the information related to at least one of the first distinguishing feature and the second distinguishing feature.
7. The method according to 1, further comprising:
determining a confidence level corresponding to each of the one or more first species using the image recognition technique, and/or determining a confidence level corresponding to each of the one or more second species using the image matching technique; and
displaying the confidence level corresponding to each first species and/or the confidence level corresponding to each second species to the particular person so that the particular person determines the third species.
8. The method according to 7, wherein:
the first species comprise a plurality of first species, and the display order of the plurality of first species is based on the confidence level corresponding to each first species; and/or
the second species comprise a plurality of second species, and the display order of the plurality of second species is based on the confidence level corresponding to each second species.
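A small sketch of the confidence-based display order of examples 7 and 8, reusing the hypothetical Candidate objects from the earlier sketch:

```python
def order_for_display(candidates):
    """Sort candidate species so that the particular person sees the most
    confident candidates first; candidates only need a .confidence attribute."""
    return sorted(candidates, key=lambda c: c.confidence, reverse=True)

# The recognition candidates and the matching candidates can be ordered independently:
# first_species_sorted = order_for_display(first_species)
# second_species_sorted = order_for_display(second_species)
```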
9. The method according to 1, further comprising:
obtaining a description relating to the plant from the user; and
presenting the description relating to the plant to the particular person so that the particular person determines the third species.
10. The method according to 1, further comprising:
obtaining geographic location information from the user; and
presenting the geographic location information to the particular person so that the particular person determines the third species.
11. The method according to 1, further comprising:
while presenting the third species to the user, also presenting a second interface to the user for enabling interaction between the user and the particular person;
receiving supplemental information related to the plant from the user via the second interface; and
presenting the supplemental information to the particular person so that the particular person redefines the third species.
12. The method according to 1, further comprising:
presenting a third interface to the particular person for enabling interaction between the user and the particular person at least during the determination of the third species by the particular person;
receiving first interaction information from the particular person through the third interface, and displaying the first interaction information to the user; and
receiving, from the user, second interaction information in response to the first interaction information, and displaying the second interaction information to the particular person through the third interface.
13. The method according to 1, further comprising:
requesting the particular person to confirm whether the third species is the species of the plant in response to the third species being inconsistent with each of the one or more first species;
in response to the third species being confirmed as the species of the plant, taking the correspondence between the first image and the third species as a sample for training the image recognition model on which the image recognition technique is based; and/or
requesting the particular person to confirm whether the third species is the species of the plant in response to the third species not being included in the database on which the image recognition technique is based;
in response to the third species being confirmed as the species of the plant, adding the first image, labeled with its correspondence to the third species, to the database on which the image recognition technique is based.
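For example 13, one simple way to feed the expert-confirmed result back is to append the confirmed (image, species) pair to a training manifest; the JSONL file format and path below are illustrative assumptions only.

```python
import json
from pathlib import Path

def record_confirmed_sample(image_path, third_species, manifest="training_manifest.jsonl"):
    """Append the (first image, expert-confirmed third species) pair to a manifest so
    it can later be used to retrain the recognition model or to extend the database
    of reference images. The JSONL manifest is an illustrative format only."""
    entry = {"image": str(image_path), "label": third_species}
    with Path(manifest).open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")
```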
14. The method according to 13, further comprising:
displaying, to the particular person, the distinguishing features between the plant of the third species and the plant in the first image, so that the particular person confirms whether the third species is the species of the plant.
15. The method according to 1, further comprising:
presenting a shooting instruction to the user before acquiring the first image of the plant from the user.
16. The method according to 1, further comprising:
before determining the first species, evaluating whether the first image is satisfactory using an image evaluation model; and
re-acquiring the first image from the user in response to the first image being unsatisfactory.
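The image evaluation model of example 16 is not specified in detail; as a crude illustrative stand-in, the sketch below rejects images that are too small or too blurry using OpenCV, with threshold values that are assumptions only.

```python
import cv2

def image_is_satisfactory(image_path, blur_threshold=100.0, min_side=224):
    """Crude stand-in for the image evaluation model: reject unreadable, very small,
    or very blurry images (variance of the Laplacian below a threshold).
    Both threshold values are illustrative assumptions."""
    img = cv2.imread(image_path)
    if img is None:          # unreadable or missing file
        return False
    height, width = img.shape[:2]
    if min(height, width) < min_side:
        return False
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var() >= blur_threshold
```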
17. The method according to 1, wherein the particular person is selected from a plurality of particular persons by a system providing the method, the system performing the selection based on one or more of: the accuracy with which each of the plurality of particular persons determines plant species, the degree of idleness of each particular person, the class of each particular person, and the difficulty of determining the species of the plant.
18. The method according to 17, further comprising:
determining a confidence level corresponding to the first species using the image recognition technique, and/or determining a confidence level corresponding to the second species using the image matching technique;
wherein the difficulty of determining the species of the plant is determined according to the confidence level corresponding to the first species and/or the confidence level corresponding to the second species.
19. A plant species determination device comprising:
an acquisition module configured to acquire one or more first images of a plant from a user;
a first determination module configured to determine, based on the first image, one or more first species associated with the plant based on an image recognition model using an image recognition technique;
a second determination module configured to search a database for one or more second images matching the first image using an image matching technique based on the first image, and determine one or more second species associated with the plant from plant species corresponding to the second image;
a first display module configured to display the first image, the second image, the first species, and the second species to a particular person so that the particular person determines a third species associated with the plant; and
a second display module configured to display the third species to the user.
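Purely as an illustration of how the modules of example 19 could be composed, the sketch below wires them into one device object; the module interfaces and method names are assumptions, not a prescribed API.

```python
class PlantSpeciesDeterminationDevice:
    """Illustrative composition of the modules listed in example 19; the module
    objects and their method names are assumptions, not a prescribed API."""

    def __init__(self, acquisition, recognizer, matcher, reviewer_ui, user_ui):
        self.acquisition = acquisition  # acquisition module
        self.recognizer = recognizer    # first determination module (recognition model)
        self.matcher = matcher          # second determination module (database matching)
        self.reviewer_ui = reviewer_ui  # first display module (towards the particular person)
        self.user_ui = user_ui          # second display module (towards the user)

    def run(self, user_id):
        first_images = self.acquisition.get_first_images(user_id)
        first_species = self.recognizer.predict(first_images)
        second_images, second_species = self.matcher.search(first_images)
        third_species = self.reviewer_ui.review(first_images, second_images,
                                                first_species, second_species)
        self.user_ui.show(user_id, third_species)
        return third_species
```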
20. A plant species determination device comprising:
a memory; and
a processor coupled to the memory and configured to perform the method according to any one of 1-18 based on instructions stored in the memory.
21. A computer readable storage medium comprising computer program instructions, wherein the computer program instructions, when executed by a processor, implement the method according to any one of 1-18.
22. A computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the method according to any one of 1-18.
Although some specific embodiments of the present disclosure have been described in detail by way of example, it should be understood by those skilled in the art that the above examples are for illustration only and are not intended to limit the scope of the present disclosure. It will be understood by those skilled in the art that the foregoing embodiments may be modified and equivalents substituted for elements thereof without departing from the scope and spirit of the disclosure. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A plant species determination method comprising:
obtaining one or more first images of plants from a user;
determining, based on the first image, one or more first species associated with the plant using an image recognition technique based on an image recognition model;
searching a database, based on the first image and using an image matching technique, for one or more second images that match the first image, and determining one or more second species associated with the plant according to the plant species corresponding to the second images;
presenting the first image, the second image, the first species, and the second species to a particular person so that the particular person determines a third species associated with the plant; and
displaying the third species to the user.
2. The method of claim 1, further comprising:
acquiring a third image of the plant from the user prior to acquiring the first image from the user;
determining a fourth species associated with the plant based on the third image using computer vision techniques, the computer vision techniques including the image recognition technique and/or the image matching technique;
displaying the fourth species to the user and a first interface for redefining the species of the plant; and
acquiring the first image from the user in response to the user's operation on the first interface.
3. The method of claim 1, further comprising:
acquiring a third image of the plant from the user prior to acquiring the first image from the user;
determining a fourth species associated with the plant and a confidence level corresponding to the fourth species based on the third image using computer vision techniques, the computer vision techniques including the image recognition technique and/or the image matching technique; and
acquiring the first image from the user in response to the confidence level corresponding to the fourth species being less than a preset threshold.
4. The method of claim 1, further comprising:
after determining the first species, determining a first distinguishing feature of the plant in the first image relative to a plant corresponding to the first species, the first distinguishing feature comprising a difference in one or more of the leaf, flower, fruit, stem, and root; and/or, after determining the second species, determining a second distinguishing feature of the plant in the first image relative to a plant corresponding to the second species, the second distinguishing feature comprising a difference in one or more of the leaf, flower, fruit, stem, and root; and
presenting the first distinguishing feature and/or the second distinguishing feature to the particular person so that the particular person determines the third species.
5. The method of claim 4, further comprising:
marking at least one of the first distinguishing feature and the second distinguishing feature in the first image after determining the at least one of the first distinguishing feature and the second distinguishing feature;
wherein presenting the first distinguishing feature and/or the second distinguishing feature to the particular person comprises presenting, to the particular person, the first image marked with at least one of the first distinguishing feature and the second distinguishing feature.
6. The method of claim 4, further comprising:
after at least one of the first distinguishing feature and the second distinguishing feature is determined, acquiring information related to the at least one of the first distinguishing feature and the second distinguishing feature from the user through human-computer interaction;
wherein presenting the first distinguishing feature and/or the second distinguishing feature to the particular person comprises: presenting, to the particular person, the information related to at least one of the first distinguishing feature and the second distinguishing feature.
7. The method of claim 1, further comprising:
determining a confidence level corresponding to each of the one or more first species using the image recognition technique, and/or determining a confidence level corresponding to each of the one or more second species using the image matching technique; and
displaying the confidence level corresponding to each first species and/or the confidence level corresponding to each second species to the particular person so that the particular person determines the third species.
8. The method of claim 7, wherein:
the first species comprise a plurality of first species, and the display order of the plurality of first species is based on the confidence level corresponding to each first species; and/or
the second species comprise a plurality of second species, and the display order of the plurality of second species is based on the confidence level corresponding to each second species.
9. The method of claim 1, further comprising:
obtaining a description relating to the plant from the user; and
presenting the description relating to the plant to the particular person so that the particular person determines the third species.
10. The method of claim 1, further comprising:
obtaining geographic location information from the user; and
presenting the geographic location information to the particular person so that the particular person determines the third species.
CN202211656319.0A 2022-12-22 2022-12-22 Plant species determination method, plant species determination device and computer readable storage medium Pending CN116416522A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211656319.0A CN116416522A (en) 2022-12-22 2022-12-22 Plant species determination method, plant species determination device and computer readable storage medium
PCT/CN2023/138022 WO2024131589A1 (en) 2022-12-22 2023-12-12 Plant species determining method and device and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211656319.0A CN116416522A (en) 2022-12-22 2022-12-22 Plant species determination method, plant species determination device and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN116416522A true CN116416522A (en) 2023-07-11

Family

ID=87058721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211656319.0A Pending CN116416522A (en) 2022-12-22 2022-12-22 Plant species determination method, plant species determination device and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN116416522A (en)
WO (1) WO2024131589A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024131589A1 (en) * 2022-12-22 2024-06-27 杭州睿胜软件有限公司 Plant species determining method and device and computer readable storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018119684A1 (en) * 2016-12-27 2018-07-05 深圳前海达闼云端智能科技有限公司 Image recognition system and image recognition method
CN108460389B (en) * 2017-02-20 2021-12-03 阿里巴巴集团控股有限公司 Type prediction method and device for identifying object in image and electronic equipment
CN110059715A (en) * 2019-03-12 2019-07-26 平安科技(深圳)有限公司 Floristic recognition methods and device, storage medium, computer equipment
CN112036499A (en) * 2020-09-04 2020-12-04 西南民族大学 Traditional Chinese medicine identification method based on convolutional neural network
CN114550165B (en) * 2022-02-21 2024-06-18 杭州睿胜软件有限公司 Method, system and readable storage medium for identifying toxic approximate species
CN115471681A (en) * 2022-08-12 2022-12-13 北京结慧科技有限公司 Image recognition method, device and storage medium
CN116416522A (en) * 2022-12-22 2023-07-11 杭州睿胜软件有限公司 Plant species determination method, plant species determination device and computer readable storage medium

Also Published As

Publication number Publication date
WO2024131589A1 (en) 2024-06-27

Similar Documents

Publication Publication Date Title
JP2018106662A (en) Information processor, information processing method, and program
CN106203490A (en) Based on attribute study and the image ONLINE RECOGNITION of interaction feedback, search method under a kind of Android platform
CN107153844A (en) The accessory system being improved to flowers identifying system and the method being improved
CN105808637A (en) Personalized recommendation method and device
TW202009681A (en) Sample labeling method and device, and damage category identification method and device
WO2024131589A1 (en) Plant species determining method and device and computer readable storage medium
CN110490237B (en) Data processing method and device, storage medium and electronic equipment
CN109933650B (en) Method and system for understanding picture title in operation
CN111369294B (en) Software cost estimation method and device
CN112287227A (en) Online learning recommendation method and online learning system
JP2019152793A (en) Device, method, and program for processing information
CN111078870A (en) Evaluation data processing method, evaluation data processing device, evaluation data processing medium, and computer device
CN110209916B (en) Method and device for recommending point of interest images
CN116028702A (en) Learning resource recommendation method and system and electronic equipment
CN111767424B (en) Image processing method, image processing device, electronic equipment and computer storage medium
CN113822907A (en) Image processing method and device
CN113707304A (en) Triage data processing method, device, equipment and storage medium
CN111767923B (en) Image data detection method, device and computer readable storage medium
CN116166717B (en) Artificial intelligence information extraction method applied to resume
CN112766130A (en) Classroom teaching quality monitoring method, system, terminal and storage medium
CN111198960A (en) Method and device for determining user portrait data, electronic equipment and storage medium
CN114742522B (en) Method, system, device and storage medium for automatically comparing survey design drawings
JP2021015549A (en) Information processing method and information processing device
CN111143643B (en) Element identification method, element identification device, readable storage medium and electronic equipment
CN114048148A (en) Crowdsourcing test report recommendation method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication