CN110458082B - Urban management case classification and identification method - Google Patents

Urban management case classification and identification method

Info

Publication number
CN110458082B
CN110458082B (application CN201910718588.7A)
Authority
CN
China
Prior art keywords
classification
image
scene
urban management
case
Prior art date
Legal status
Active
Application number
CN201910718588.7A
Other languages
Chinese (zh)
Other versions
CN110458082A (en)
Inventor
王国梁
毛云青
Current Assignee
CCI China Co Ltd
Original Assignee
CCI China Co Ltd
Priority date
Filing date
Publication date
Application filed by CCI China Co Ltd filed Critical CCI China Co Ltd
Priority to CN201910718588.7A
Publication of CN110458082A
Application granted
Publication of CN110458082B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q 50/10 Services
    • G06Q 50/26 Government or public services
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10 Character recognition
    • G06V 30/19 Recognition using electronic means
    • G06V 30/192 Recognition using electronic means using simultaneous comparisons or correlations of the image signals with a plurality of references
    • G06V 30/194 References adjustable by an adaptive method, e.g. learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Tourism & Hospitality (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Economics (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Educational Administration (AREA)
  • Marketing (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Strategic Management (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Development Economics (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • Databases & Information Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a classification and recognition method for urban management cases, comprising the following steps: obtaining a large number of case image samples; manually classifying and labeling the case images, identifying the various target objects and their positions that relate to the content of urban management cases in the images; and inputting the manually labeled target objects, positions and images into a deep learning model for training. The method does not use a deep learning model to judge the classification of an urban management case directly; instead, it uses the deep learning model to obtain the classification and position of the various objects appearing in the case image, and then judges the classification of the specific urban management case with a scene classification judgment algorithm, reducing the misjudgment rate of recognition.

Description

Urban management case classification and identification method
Technical Field
The invention relates to the technical field of urban management case classification and identification, in particular to a method for classifying and identifying urban management cases.
Background
With the rapid pace of urbanization, public expectations of city management keep rising while city management faces an increasing variety of problems, so the intellectualization of city management has become a necessity: it is demanded both by the development of modern society and by the need to innovate on traditional city management methods. The smart city management system is a tool and means for intelligent city management that has taken shape alongside the rapid development of technologies such as computing, artificial intelligence and big data analysis. By applying advanced techniques such as image algorithms, data mining and scheduling algorithms, a smart city management system can improve the efficiency of solving city management problems, make the fullest use of information resources, and reduce labor costs.
Urban management image recognition based on deep learning is currently popular. It requires an algorithm to learn effective object features in an image, such as shape, color and texture, according to the different categories of urban management case events, and to classify the images accordingly.
A search of public records retrieves Chinese patent CN201811292068, an intelligent city violation identification method, which is an image category identification scheme based on deep learning.
However, this identification scheme has several problems. Images in urban management cases are shot with mobile phones or collected by cameras in outdoor places such as roads, communities and public venues; the background information is complex, the image quality is low and uneven, so case type identification with existing deep learning methods gives unsatisfactory results. Another problem is that this identification approach does not fit how urban management cases are actually classified: deep learning and neural network models use a recognition algorithm to learn features from a large number of samples and establish a relationship between the features and the urban management case class, but the essence of these recognition algorithms is object recognition. They can only judge objects, not the class definitions used in urban management cases, where a class is generally defined as a scene, that is, a combination of several objects appearing together.
Disclosure of Invention
The invention aims to solve the defects in the prior art and provides a method for classifying and identifying urban management cases.
In order to achieve this purpose, the invention adopts the following technical scheme. A classification and identification method for urban management cases comprises the following steps:
101: acquiring a large number of case image samples, manually classifying and labeling the case images, and identifying the various target objects and their positions that relate to the content of urban management cases in the images;
102: inputting the manually labeled target objects, positions and images into an image recognition model for training, to obtain an improved image recognition model;
103: acquiring a large amount of scene classification data and the object list data corresponding to each scene;
104: inputting the scene classification data and the object classification lists into a scene recognition model for learning, to obtain an improved scene recognition model;
105: acquiring image data through the business system of a city management department;
106: providing the image data as input to the image recognition model, which infers prediction results for the various objects in the image;
107: providing the classification and position of the various objects in the resulting image as input to the scene recognition model, which judges the classification of the specific urban management case by algorithm;
108: the urban management business system enters the corresponding case handling flow according to the case classification.
As a further description of the above technical solution:
the object classification in the step 101 includes: people, cars, tricycles, roll-up doors, stools, tables, leafy vegetables, fruits, trash cans, plastic bags, boxes, books, backgrounds, and the like, but are not limited thereto;
the image samples in the step 101 are various images generated in the process of processing the urban management case, including but not limited to various illegal parking, litter, store-out operation, and the like.
As a further description of the above technical solution:
the target recognition algorithm for learning training in the step 102 is a model for obtaining urban management object recognition classification through continuous learning, and the learning process includes repeating the training process through continuous parameter adjustment, wherein the target recognition algorithm includes object recognition algorithms such as SSD and YOLO.
As a further description of the above technical solution:
the scene classification in the step 103 includes various illegal parking, littering garbage, store outlet operation, mobile vendor, littering materials and the like but is not limited thereto;
the object list data in step 103 may be provided in the form of text data or database records, a scene classification corresponds to a plurality of object classifications, and a plurality of classification data may be provided in the form of a list or a plurality of table records.
As a further description of the above technical solution:
in the step 104, scene classification, identification and learning are continuously learned to obtain a model of urban management case scene classification, wherein the learning process comprises the step of repeatedly training by continuously adjusting parameters; the scene classification algorithm comprises a visual word packet model and deep learning.
As a further description of the above technical solution:
the receiving method of the image in the step 105 may be based on data stream, local file, network transmission and other modes, and the image in the step 105 is from various camera devices of urban management departments.
As a further description of the above technical solution:
and 106, the image recognition model receives an image file, the judgment probability of each object is obtained through the calculation of the image recognition model, the positions of the objects in the image and the positions of the objects are obtained after the objects are screened by a preset threshold value, and a plurality of objects on one image can form a list or a database record.
As a further description of the above technical solution:
and 107, the scene recognition model receives the object position information to judge the scene according to a preset judgment condition and outputs a classification result of the scene, wherein the scene comprises: parking violations, litter, store outing, mobile dealers, scrabbles, and the like, but are not limited thereto.
As a further description of the above technical solution:
in the step 108, the urban management service system receives the image classification and the position information parameters as input, introduces a specific service module, and receives means including local function call, remote rpc call and the like;
the case classification in the step 108 comprises: illegal parking, littering garbage, out-of-store operation, mobile stall dealer, littering materials and other urban management case categories.
As a further description of the above technical solution:
the image recognition model is used for distinguishing various objects in the image and obtaining the classification and the corresponding positions of the various objects appearing in the image, wherein the recognition means of the objects in the image comprise target recognition deep learning algorithms such as SSD, YOLO and the like;
the scene recognition model is used for judging the combination of the objects to be appeared and obtaining the scene classification of the urban management case to which the image possibly belongs, wherein the scene recognition means comprises a visual word bag model, deep learning and other means.
Advantageous effects
The invention provides a classification and identification method for urban management cases, which has the following beneficial effects:
(1): The method does not use a deep learning model to judge the urban management case classification directly; it uses the deep learning model to obtain the classification and position of the various objects appearing in an urban management case image, and then judges the classification of the specific urban management case with a scene classification judgment algorithm, reducing the misjudgment rate of recognition.
(2): By recognizing and judging image content, the method improves the degree of automation of case classification in urban management; by matching and judging the scenes in the images, it also improves the accuracy of case classification in urban management.
Drawings
Fig. 1 is a flow chart of a classification and identification method for city management cases according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
Referring to fig. 1, a method for classifying and identifying urban management cases includes the following steps:
101: acquiring a large number of case image samples, manually classifying and labeling the case images, and identifying the various target objects and their positions that relate to the content of urban management cases in the images;
102: inputting the manually labeled target objects, positions and images into an image recognition model for training, to obtain an improved image recognition model;
103: acquiring a large amount of scene classification data and the object list data corresponding to each scene;
104: inputting the scene classification data and the object classification lists into a scene recognition model for learning, to obtain an improved scene recognition model;
105: acquiring image data through the business system of a city management department;
106: providing the image data as input to the image recognition model, which infers prediction results for the various objects in the image;
107: providing the classification and position of the various objects in the resulting image as input to the scene recognition model, which judges the classification of the specific urban management case by algorithm;
108: the urban management business system enters the corresponding case handling flow according to the case classification.
The object classifications in step 101 include, but are not limited to: people, cars, tricycles, roll-up doors, stools, tables, leafy vegetables, fruits, trash cans, plastic bags, boxes, books and backgrounds;
the image samples in step 101 are the various images generated while handling urban management cases, including but not limited to illegal parking, littering garbage, out-of-store operation and the like.
The labeling process accepts an image file as input; the manually labeled content can be stored as a labeling record in the form of a database, a text file or the like, and a mapping relationship is established between the input image file and its labeling record.
The labeling results for a set of image files form a labeling record list, which is saved as output in the form of a data record list, such as a database or a text file.
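As a concrete illustration of the labeling records described above, the following sketch stores one manually labeled image per line of a text data record list; the field names (`image`, `objects`, `class`, `bbox`) are assumptions for illustration, not fixed by the invention.

```python
import json

# One labeling record per image: the manually labeled object classes and
# their positions (bounding boxes), keyed back to the input image file.
annotations = [
    {
        "image": "case_0001.jpg",
        "objects": [
            {"class": "car", "bbox": [34, 50, 410, 300]},
            {"class": "person", "bbox": [420, 60, 520, 310]},
        ],
    },
    {
        "image": "case_0002.jpg",
        "objects": [{"class": "plastic bag", "bbox": [12, 80, 90, 160]}],
    },
]

# The labeling record list saved as a text-file data record list:
# one JSON object per line, which can later be reloaded for training.
record_list = "\n".join(json.dumps(rec) for rec in annotations)
restored = [json.loads(line) for line in record_list.splitlines()]
print(restored[0]["objects"][0]["class"])  # car
```

The same records could equally be rows in a database table, as the description allows either storage form.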
The learning and training in step 102 use a target recognition algorithm to obtain, through continuous learning, a model for recognizing and classifying urban management objects; the learning process repeats training while continuously adjusting parameters, and the target recognition algorithms include object recognition algorithms such as SSD and YOLO.
The image recognition model receives the input images and the data record list of their corresponding labeling records; an improved model is formed by continuously adjusting internal parameters with an optimization algorithm based on the difference between the model's computed results and the labeled results.
The scene classifications in step 103 include, but are not limited to, illegal parking, littering garbage, out-of-store operation, mobile street vendors and material piling.
The object list data may be provided as text data or database records; one scene classification corresponds to a plurality of object classifications, and the plural classification data may be provided as a list or as multiple table records.
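The scene-to-object mapping described above can be sketched as a small table; the scene names and object classes below are illustrative assumptions, and each (scene, object) pair could equally be stored as one database table record.

```python
# One scene classification corresponds to a list of object classifications.
SCENE_OBJECTS = {
    "out-of-store operation": ["roll-up door", "table", "stool", "fruit"],
    "illegal parking": ["car", "tricycle"],
    "littering garbage": ["plastic bag", "trash can", "box"],
}

# Flattened into table records: (scene_classification, object_classification).
rows = [(scene, obj) for scene, objs in SCENE_OBJECTS.items() for obj in objs]
print(len(rows))  # 9 records, one per (scene, object) pair
```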
In step 104, a model for classifying urban management case scenes is obtained through continuous learning; the learning process repeats training while continuously adjusting parameters, and the scene classification algorithms include the bag-of-visual-words model and deep learning.
The scene classification recognition model receives the scene classifications and the data record list of the objects corresponding to each scene; an improved model is formed by repeatedly computing the difference between the inference result and the actual result and continuously adjusting internal parameters with an optimization algorithm.
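As a hedged sketch of the training loop just described (adjust internal parameters from the difference between the inference result and the actual label), the following minimal classifier learns per-scene weights over object-presence features with a perceptron-style update. A real implementation would use a bag-of-visual-words model or a deep network; all object classes, scene names and update rules here are assumptions for illustration.

```python
OBJECTS = ["car", "roll-up door", "table", "plastic bag", "trash can"]
SCENES = ["illegal parking", "out-of-store operation", "littering garbage"]

# Training data: (set of objects present in the image, labeled scene).
samples = [
    ({"car"}, "illegal parking"),
    ({"roll-up door", "table"}, "out-of-store operation"),
    ({"plastic bag", "trash can"}, "littering garbage"),
]

def featurize(present):
    # Multi-hot vector marking which object classes appear.
    return [1.0 if o in present else 0.0 for o in OBJECTS]

# One weight vector per scene: the "internal parameters" that get adjusted.
weights = {s: [0.0] * len(OBJECTS) for s in SCENES}

def predict(present):
    x = featurize(present)
    return max(SCENES, key=lambda s: sum(w * xi for w, xi in zip(weights[s], x)))

for _ in range(10):  # repeat training with continuous parameter adjustment
    for present, label in samples:
        pred = predict(present)
        if pred != label:  # the difference between inference and actual result
            for i, xi in enumerate(featurize(present)):
                weights[label][i] += xi  # reinforce the true scene
                weights[pred][i] -= xi   # penalize the wrongly inferred scene
```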
The images in step 105 may be received as data streams, local files, network transfers or by other means, and the images in step 105 come from the various camera devices of city management departments.
In step 106, the image recognition model receives an image file and computes a judgment probability for each object; after screening with a preset threshold, the objects and their positions in the image are obtained, and the multiple objects in one image may form a list or database records.
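The threshold screening in step 106 can be sketched as follows; the record fields and the 0.5 threshold value are illustrative assumptions.

```python
def screen_detections(detections, threshold=0.5):
    """Keep only objects whose judgment probability passes the preset threshold."""
    return [d for d in detections if d["score"] >= threshold]

# Hypothetical raw output of the image recognition model for one image.
raw = [
    {"class": "car", "box": (10, 20, 200, 150), "score": 0.92},
    {"class": "stool", "box": (5, 5, 40, 60), "score": 0.31},
    {"class": "plastic bag", "box": (80, 90, 120, 130), "score": 0.77},
]

kept = screen_detections(raw)  # the list/database records for this image
print([d["class"] for d in kept])  # ['car', 'plastic bag']
```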
In step 107, the scene recognition model receives the object position information, judges the scene according to predetermined judgment conditions, and outputs a scene classification result, wherein the scenes include, but are not limited to: illegal parking, littering garbage, out-of-store operation, mobile street vendors, material piling and the like.
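A minimal sketch of the predetermined judgment conditions in step 107: each scene requires a certain combination of detected objects to appear together. The patent does not spell out the exact conditions, so the rules below are assumptions for illustration.

```python
# Scene -> set of objects that must all appear for that scene to match.
SCENE_RULES = {
    "out-of-store operation": {"roll-up door", "table"},
    "littering garbage": {"plastic bag", "trash can"},
    "illegal parking": {"car"},
}

def judge_scene(detected_classes):
    """Judge the urban management case scene from the detected object set."""
    detected = set(detected_classes)
    for scene, required in SCENE_RULES.items():
        if required <= detected:  # all required objects were detected
            return scene
    return "unclassified"

print(judge_scene(["car", "person"]))          # illegal parking
print(judge_scene(["roll-up door", "table"]))  # out-of-store operation
```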
In step 108, the urban management business system receives the image classification and position information parameters as input and passes them to a specific business module; the receiving means include local function calls, remote RPC calls and the like.
The case classifications in step 108 include urban management case categories such as illegal parking, littering garbage, out-of-store operation, mobile street vendors and material piling.
The image recognition model is used to distinguish the various objects in an image and obtain the classification and corresponding position of each object appearing in the image, where the means of recognizing objects in images include target recognition deep learning algorithms such as SSD and YOLO.
The scene recognition model is used to judge the combination of objects that appear and obtain the urban management case scene classification to which the image likely belongs, where the scene recognition means include the bag-of-visual-words model, deep learning and other techniques.
The urban management case scene classifications include urban management case categories such as illegal parking, littering garbage, out-of-store operation, mobile street vendors and material piling.
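Putting the stages together, the overall flow of steps 105 to 108 can be sketched end to end; the detector below is a stub standing in for an SSD/YOLO model, and all names, scores and rules are illustrative assumptions.

```python
def detect_objects(image_path):
    # Stub: a real system would run an SSD/YOLO model on the image here.
    return [
        {"class": "roll-up door", "score": 0.90},
        {"class": "table", "score": 0.80},
        {"class": "fruit", "score": 0.40},
    ]

SCENE_RULES = {"out-of-store operation": {"roll-up door", "table"}}

def classify_case(image_path, threshold=0.5):
    # Step 106: detect objects and screen by the preset threshold.
    kept = {d["class"] for d in detect_objects(image_path)
            if d["score"] >= threshold}
    # Step 107: judge the scene from the combination of detected objects.
    for scene, required in SCENE_RULES.items():
        if required <= kept:
            return scene
    return "unclassified"

# Step 108: the business system would route the case by this classification.
print(classify_case("case.jpg"))  # out-of-store operation
```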
In the description herein, references to the description of "one embodiment," "an example," "a specific example" or the like are intended to mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
The above description covers only preferred embodiments of the present invention, but the protection scope of the present invention is not limited thereto; any equivalent substitution or change of the technical solutions and inventive concepts of the present invention that can be readily conceived by a person skilled in the art within the technical scope disclosed herein shall fall within the protection scope of the present invention.

Claims (8)

1. A classification and identification method for urban management cases, characterized by comprising the following steps:
101: acquiring a large number of case image samples, manually classifying and labeling the case images, and identifying the various target objects and their positions that relate to the content of urban management cases in the images;
102: inputting the manually labeled target objects, positions and images into an image recognition model for training, to obtain an improved image recognition model;
103: acquiring a large amount of scene classification data and the object list data corresponding to each scene;
104: inputting the scene classification data and the object classification lists into a scene recognition model for learning, to obtain an improved scene recognition model;
105: acquiring image data through the business system of a city management department;
106: providing the image data as input to the image recognition model, which infers prediction results for the various objects in the image;
107: providing the classification and position of the various objects in the resulting image as input to the scene recognition model, which judges the classification of the specific urban management case by algorithm;
108: the urban management business system enters the corresponding case handling flow according to the case classification,
wherein the urban management business system receives the image classification and position information parameters as input and passes them to a specific business module, the receiving means including local function calls, remote RPC calls and the like;
the case classifications include urban management case categories such as illegal parking, littering garbage, out-of-store operation, mobile street vendors and material piling;
the image recognition model is used to distinguish the various objects in an image and obtain the classification and corresponding position of each object appearing in the image, where the means of recognizing objects in images include target recognition deep learning algorithms such as SSD and YOLO;
the scene recognition model is used to judge the combination of objects that appear and obtain the urban management case scene classification to which the image likely belongs, where the scene recognition means include the bag-of-visual-words model, deep learning and other techniques.
2. The method according to claim 1, wherein the object classifications in step 101 include, but are not limited to: people, cars, tricycles, roll-up doors, stools, tables, leafy vegetables, fruits, trash cans, plastic bags, boxes, books and backgrounds;
the image samples in step 101 are the various images generated while handling urban management cases, including but not limited to illegal parking, littering garbage, out-of-store operation and the like.
3. The method of claim 1, wherein the learning and training in step 102 use a target recognition algorithm to obtain, through continuous learning, a model for recognizing and classifying urban management objects; the learning process repeats training while continuously adjusting parameters, and the target recognition algorithms include object recognition algorithms such as SSD and YOLO.
4. The method as claimed in claim 1, wherein the scene classifications in step 103 include, but are not limited to, illegal parking, littering garbage, out-of-store operation, mobile street vendors, material piling and the like;
the object list data in step 103 may be provided as text data or database records; one scene classification corresponds to a plurality of object classifications, and the plural classification data may be provided as a list or as multiple table records.
5. The method according to claim 1, wherein the scene classification recognition learning in step 104 obtains, through continuous learning, a model for classifying urban management case scenes; the learning process repeats training while continuously adjusting parameters, and the scene classification algorithms include the bag-of-visual-words model and deep learning.
6. The method as claimed in claim 1, wherein the images in step 105 may be received as data streams, local files, network transfers or by other means, and the images in step 105 come from the various camera devices of city management departments.
7. The method as claimed in claim 1, wherein in step 106 the image recognition model receives an image file and computes a judgment probability for each object; after screening with a predetermined threshold, the objects and their positions in the image are obtained, and the multiple objects in one image may form a list or database records.
8. The method according to claim 1, wherein the scene recognition model in step 107 receives the object position information, judges the scene according to predetermined judgment conditions, and outputs a scene classification result, wherein the scenes include, but are not limited to: illegal parking, littering garbage, out-of-store operation, mobile street vendors, material piling and the like.
CN201910718588.7A 2019-08-05 2019-08-05 Urban management case classification and identification method Active CN110458082B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910718588.7A CN110458082B (en) 2019-08-05 2019-08-05 Urban management case classification and identification method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910718588.7A CN110458082B (en) 2019-08-05 2019-08-05 Urban management case classification and identification method

Publications (2)

Publication Number | Publication Date
CN110458082A | 2019-11-15
CN110458082B | 2022-05-03

Family

ID=68484889

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910718588.7A Active CN110458082B (en) 2019-08-05 2019-08-05 Urban management case classification and identification method

Country Status (1)

Country Link
CN (1) CN110458082B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111126252B (en) * 2019-12-20 2023-08-18 浙江大华技术股份有限公司 Swing behavior detection method and related device
CN111539400A (en) * 2020-07-13 2020-08-14 追创科技(苏州)有限公司 Control method and device of self-moving equipment, storage medium and self-moving equipment
WO2022012471A1 (en) * 2020-07-13 2022-01-20 追觅创新科技(苏州)有限公司 Control method for self-moving device, apparatus, storage medium, and self-moving device
CN112214628A (en) * 2020-10-28 2021-01-12 城云科技(中国)有限公司 Information distribution method, system equipment and storage medium for city management
CN112329605B (en) * 2020-11-03 2022-05-17 中再云图技术有限公司 City appearance random pasting and random drawing behavior identification method, storage device and server
CN112651871A (en) * 2020-12-17 2021-04-13 北京无线电计量测试研究所 City management law enforcement system and method based on augmented reality technology
CN112733909A (en) * 2020-12-31 2021-04-30 北京软通智慧城市科技有限公司 Duplicate removal identification method, device, medium and electronic equipment for urban cases
CN113221804B (en) * 2021-05-25 2023-03-24 城云科技(中国)有限公司 Disordered material detection method and device based on monitoring video and application

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106250871A (en) * 2016-08-16 2016-12-21 桂林电子科技大学 City management case classification method and device
CN108133236A (en) * 2017-12-25 2018-06-08 城云科技(中国)有限公司 A kind of method, integrating device and the system of the classification of city management digital image recognition
CN108921083A (en) * 2018-06-28 2018-11-30 浙江工业大学 Illegal flowing street pedlar recognition methods based on deep learning target detection
CN109002744A (en) * 2017-06-06 2018-12-14 中兴通讯股份有限公司 Image-recognizing method, device and video monitoring equipment
CN109063612A (en) * 2018-07-19 2018-12-21 中智城信息技术有限公司 City intelligent red line management method and machine readable storage medium
CN109190608A (en) * 2018-10-30 2019-01-11 长威信息科技发展股份有限公司 A kind of city intelligent identification Method violating the regulations
CN109471922A (en) * 2018-09-29 2019-03-15 平安科技(深圳)有限公司 Case type recognition methods, device, equipment and medium based on deep learning model
CN109993047A (en) * 2017-12-28 2019-07-09 杭州海康威视系统技术有限公司 City huddles violation recognition methods, device and the electronic equipment of material
CN110020755A (en) * 2019-04-12 2019-07-16 城云科技(中国)有限公司 City management system based on man-machine coordination

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Image Classification Algorithms for Urban Management Cases; Yang Hao; Wanfang Data Knowledge Service Platform; 2018-03-21; main text pp. 1-50 *

Also Published As

Publication number Publication date
CN110458082A (en) 2019-11-15

Similar Documents

Publication Publication Date Title
CN110458082B (en) Urban management case classification and identification method
CN111444848A (en) Specific scene model upgrading method and system based on federal learning
CN110294236A (en) Garbage classification monitoring apparatus, method and server system
CN101923653B (en) Multilevel content description-based image classification method
CN108509954A (en) Dynamic multi-license-plate recognition method for real-time traffic scenes
CN110210635A (en) Intelligent classification and recycling system capable of identifying waste
CN106599925A (en) Plant leaf identification system and method based on deep learning
CN107480643B (en) Intelligent garbage classification processing robot
CN108171136A (en) Multi-task checkpoint vehicle image-to-image search system and method
CN110861851A (en) Community garbage classification system and method based on Internet of things
CN111186656A (en) Target garbage classification method and intelligent garbage can
US11335086B2 (en) Methods and electronic devices for automated waste management
CN206546593U (en) Household garbage intelligent classification and recycling cloud identification system
CN106331636A (en) Intelligent video surveillance system and method for oil pipelines based on behavior-event triggering
CN104268528A (en) Method and device for detecting crowd gathering regions
Gyawali et al. Comparative analysis of multiple deep CNN models for waste classification
CN116189099B (en) Method for detecting exposed and piled garbage based on improved YOLOv8
CN112488162A (en) Garbage classification method based on active learning
CN106960176A (en) Pedestrian gender identification method based on extreme learning machine and color feature fusion
CN111723772B (en) Perishable garbage identification method and device based on image recognition, and computer equipment
CN112298844A (en) Garbage classification monitoring method and device
CN113220878A (en) Knowledge graph-based OCR recognition result classification method
CN111582219A (en) Intelligent pet management system
CN113807347A (en) Kitchen waste impurity identification method based on object detection technology
CN111217062A (en) Garbage can garbage identification method based on edge calculation and deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant