CN111832750A - User registration method of garbage classification platform and related products - Google Patents


Info

Publication number
CN111832750A
Authority
CN
China
Prior art keywords: target, angle, preset, parameter, garbage
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910290177.2A
Other languages
Chinese (zh)
Other versions
CN111832750B (en)
Inventor
吕胜军
莫乙玟
蒋晓明
Current Assignee
Shenzhen Jiajia Classification Technology Co ltd
Original Assignee
Shenzhen Jiajia Classification Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Jiajia Classification Technology Co ltd filed Critical Shenzhen Jiajia Classification Technology Co ltd
Priority to CN201910290177.2A
Publication of CN111832750A
Application granted
Publication of CN111832750B
Legal status: Active

Classifications

    • G: PHYSICS
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
                • G06Q 10/00: Administration; Management
                    • G06Q 10/30: Administration of product recycling or disposal
                • G06Q 50/00: Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
                    • G06Q 50/10: Services
                        • G06Q 50/26: Government or public services
            • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
                • G06K 17/00: Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K 1/00-G06K 15/00, e.g. automatic card files incorporating conveying and reading operations
                    • G06K 17/0022: Arrangements or provisions for transferring data to distant stations, e.g. from a sensing device
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02W: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO WASTEWATER TREATMENT OR WASTE MANAGEMENT
                • Y02W 90/00: Enabling technologies or technologies with a potential or indirect contribution to greenhouse gas [GHG] emissions mitigation

Landscapes

  • Business, Economics & Management (AREA)
  • Engineering & Computer Science (AREA)
  • Tourism & Hospitality (AREA)
  • Human Resources & Organizations (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • General Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Educational Administration (AREA)
  • Development Economics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The embodiment of the application discloses a user registration method for a garbage classification platform and a related product. The method comprises the following steps: scanning a target two-dimensional code and entering a registration page of the garbage classification platform; acquiring target identity information of a target user on the garbage classification platform; and generating, according to the target identity information, identification information for tracing garbage bags for the target user, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry garbage bags and wet garbage bags. By adopting the embodiment of the application, garbage can be traced.

Description

User registration method of garbage classification platform and related products
Technical Field
The application relates to the technical field of electronics, in particular to a user registration method of a garbage classification platform and a related product.
Background
With the widespread use of electronic devices (such as mobile phones and tablet computers), electronic devices support ever more applications and ever more powerful functions. They are developing in the direction of diversification and personalization, and have become indispensable electronic products in users' daily lives.
In daily life, most people still handle garbage carelessly, for example, by discarding it anywhere, so the problem of how to trace garbage urgently needs to be solved.
Disclosure of Invention
The embodiment of the application provides a user registration method of a garbage classification platform and a related product, and can realize the tracing of garbage.
In a first aspect, an embodiment of the present application provides a user registration method for a garbage classification platform, including:
scanning a target two-dimensional code, and entering a registration page of a garbage classification platform;
acquiring target identity information of a target user on the garbage classification platform;
generating, according to the target identity information, identification information for tracing garbage bags for the target user, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry garbage bags and wet garbage bags.
In a second aspect, an embodiment of the present application provides a user registration apparatus for a garbage classification platform, where the apparatus includes:
the scanning unit is used for scanning the target two-dimensional code and entering a registration page of the garbage classification platform;
the acquisition unit is used for acquiring target identity information of a target user on the garbage classification platform;
the generating unit is used for generating, according to the target identity information, identification information for tracing garbage bags for the target user, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry garbage bags and wet garbage bags.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect of the embodiment of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the user registration method for the garbage classification platform and the related product described in the embodiment of the present application, the target two-dimensional code is scanned and the registration page of the garbage classification platform is entered; the target identity information of the target user is obtained on the garbage classification platform; and identification information for tracing garbage bags is generated for the target user according to the target identity information, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following: dry garbage bags and wet garbage bags. In this way, specific identification information can be generated for the garbage bags, the garbage bags can be traced, and ultimately the garbage itself can be traced.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application; for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1A is a schematic flowchart of a user registration method of a garbage classification platform disclosed in an embodiment of the present application;
FIG. 1B is a schematic illustration of an interface presentation of a garbage classification platform disclosed in an embodiment of the present application;
fig. 2 is a schematic flowchart of another user registration method of a garbage classification platform disclosed in an embodiment of the present application;
fig. 3 is a schematic structural diagram of another electronic device disclosed in the embodiments of the present application;
fig. 4A is a schematic structural diagram of a user registration apparatus of a garbage classification platform disclosed in an embodiment of the present application;
fig. 4B is a schematic structural diagram of another user registration apparatus of a garbage classification platform disclosed in an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The electronic device according to the embodiment of the present application may include various handheld devices (e.g., smart phones), vehicle-mounted devices, smart trash cans, biochemical degradation devices, Virtual Reality (VR)/Augmented Reality (AR) devices, wearable devices, computing devices, or other processing devices connected to wireless modems, and various forms of User Equipment (UE), Mobile Stations (MS), terminal devices (terminal device), research and development/test platforms, servers, and so on. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
The following describes embodiments of the present application in detail.
Referring to fig. 1A, fig. 1A is a schematic flowchart of a user registration method of a garbage classification platform disclosed in an embodiment of the present application, where the user registration method of the garbage classification platform includes the following steps 101-103.
101. And scanning the target two-dimension code, and entering a registration page of the garbage classification platform.
The target two-dimensional code may be the two-dimensional code corresponding to the garbage classification platform, and it may encode a website address or a page link. In a specific implementation, after the target two-dimensional code is scanned, the registration page of the garbage classification platform can be opened and the user can be prompted to complete the registration operation.
102. And acquiring target identity information of a target user on the garbage classification platform.
In this embodiment, the identity information may be at least one of the following: name, gender, age, identification number, mobile phone number, diploma number, degree certificate number, student number, driver's license number, home address, native place, work experience, bank card number, social account number, family relationship, and so on, which is not limited here. In a specific implementation, the target identity information of the target user may be obtained on the garbage classification platform: for example, it may be input manually, or identity information registered in advance in a pre-bound third-party APP may be read directly.
Optionally, the registration page includes a face input control; in the step 102, obtaining the target identity information of the target user on the garbage classification platform may include the following steps:
21. when the touch operation aiming at the face input control is detected, starting a camera, and shooting according to a first shooting parameter to obtain a preview image;
22. carrying out face detection on the preview image;
23. when a face is detected, determining a relative angle and a relative distance between the face and the camera;
24. adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter;
25. shooting according to the second shooting parameters to obtain a target face image;
26. searching the target face image in a preset database to obtain a preset face template successfully matched with the target face image;
27. and taking the identity information corresponding to the preset face template as the target identity information of the target user.
In this embodiment of the application, the camera of the electronic device may be a single camera, dual cameras, or multiple cameras; the camera may be an infrared camera or a visible-light camera; and, of course, the camera may also be a wide-angle camera or a rotatable camera. The first shooting parameter may be at least one of the following: focal length, exposure duration, aperture size, screen fill-light parameter, camera angle parameter, ISO, and so on, which is not limited here. The screen fill-light parameter may be at least one of the following: screen brightness, screen wallpaper, screen color temperature, screen light-emitting area, and so on, which is not limited here. The camera angle parameter may be at least one of the following: camera rotation angle, camera focusing angle, camera shooting-range angle (e.g., wide-angle mode or non-wide-angle mode), camera rotation direction, camera rotation speed, and so on, which is not limited here. The preset database may store a plurality of face templates.
As shown in Fig. 1B, the registration page of the garbage classification platform may include a face input control. In a specific implementation, when a touch operation on the face input control is detected, the camera can be started and a shot taken with the first shooting parameter to obtain a preview image, and face detection is performed on the preview image. When a face is detected, the relative angle and relative distance between the face and the camera are determined; specifically, this may be implemented with two cameras or with a distance sensor. The first shooting parameter can then be adjusted according to the relative angle and the relative distance to obtain a second shooting parameter, and a shot is taken with the second shooting parameter to obtain a target face image, so that a frontal face image can be captured as far as possible. The target face image is searched in a preset database to obtain a preset face template that successfully matches the target face image, so that face search can be realized accurately. The identity information corresponding to the preset face template can then be used as the target identity information of the target user, so that the identity information of the registering user can be acquired quickly without manual input.
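The flow of steps 21-27 can be sketched as a small pipeline. This is a minimal illustration, not the disclosed implementation: every callable passed in (`shoot`, `detect`, `pose_of`, `adjust`, `search`) is a hypothetical stand-in for camera hardware and recognition components that the application leaves unspecified.

```python
def register_via_face(shoot, detect, pose_of, adjust, search, first_params):
    """Steps 21-27: capture a preview, refine the shot, then look the face up.

    Returns the matched user's identity information, or None when no face is
    detected or no preset template matches.
    """
    preview = shoot(first_params)                          # step 21: preview shot
    face = detect(preview)                                 # step 22: face detection
    if face is None:
        return None
    angle, distance = pose_of(face)                        # step 23: relative pose
    second_params = adjust(first_params, angle, distance)  # step 24: adjust params
    target_image = shoot(second_params)                    # step 25: target image
    return search(target_image)                            # steps 26-27: template lookup
```

Wiring in trivial stubs (a `detect` that always finds a face, a `search` that returns a fixed identity) exercises the control flow without any hardware.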
Optionally, the first shooting parameter at least includes: a screen fill light parameter and a camera angle parameter; in the step 24, adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter may include the following steps:
241. when the relative angle is not in a preset angle range, determining a difference value between the relative angle and a preset angle, and determining a target camera angle parameter corresponding to the camera according to the difference value, wherein the target camera angle parameter comprises a rotation angle, after the camera rotates by the rotation angle, an angle between the camera and the face is in the preset angle range, and the preset angle is any angle in the preset angle range;
242. determining a target screen light supplement parameter corresponding to the relative distance according to a mapping relation between a preset distance and the screen light supplement parameter;
243. and taking the angle parameter of the target camera and the light supplement parameter of the target screen as the second shooting parameter.
The preset angle range and the preset angle can be set by the user or set by system default; the preset angle can be understood as a reference angle, namely a reference for the relative angle between the shooting object and the camera. The first shooting parameter at least includes a screen fill-light parameter and a camera angle parameter. In a specific implementation, when the relative angle is not within the preset angle range, the shooting angle is poor and needs adjustment, so the difference between the relative angle and the preset angle can be determined, and the target camera angle parameter corresponding to the camera can be determined according to that difference; specifically, the target camera angle parameter corresponding to the difference can be determined according to a mapping relation between preset difference values and camera angle parameters. The target camera angle parameter includes a rotation angle such that, after the camera rotates by that angle, the angle between the camera and the face falls within the preset angle range, where the preset angle is any angle in the preset angle range. In addition, a mapping relation between preset distances and screen fill-light parameters can be stored in the electronic device in advance, so the target screen fill-light parameter corresponding to the relative distance can be determined from that mapping relation. The target camera angle parameter and the target screen fill-light parameter are then used as the second shooting parameter. In this way, the best fill-light effect and an optimal shooting angle can be achieved, and a clear face image can be obtained.
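Steps 241-243 can be sketched as follows. The concrete angle range, the distance-to-fill-light mapping, and the parameter names are illustrative assumptions; the application only fixes the structure (rotate into a preset angle range, look the fill-light value up from the distance).

```python
def second_shooting_params(relative_angle, relative_distance,
                           preset_range=(-15.0, 15.0),
                           fill_light_map=((50.0, 1.0), (150.0, 0.6),
                                           (float("inf"), 0.3))):
    """Derive the second shooting parameter from the face's relative pose.

    preset_range: the preset angle range; its midpoint plays the role of the
    'preset angle' (any angle inside the range would do).
    fill_light_map: assumed mapping from distance upper bounds (cm) to screen
    fill-light brightness.
    """
    params = {"rotation_angle": 0.0}
    lo, hi = preset_range
    if not (lo <= relative_angle <= hi):
        # Step 241: rotate by the difference so that the camera-face angle
        # lands inside the preset angle range.
        preset_angle = (lo + hi) / 2.0
        params["rotation_angle"] = preset_angle - relative_angle
    # Step 242: look up the target screen fill-light parameter by distance.
    for max_distance, brightness in fill_light_map:
        if relative_distance <= max_distance:
            params["fill_light"] = brightness
            break
    return params  # step 243: both values together form the second parameter
```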
Optionally, in the step 26, searching the target face image in a preset database to obtain a preset face template successfully matched with the target face image may include the following steps:
261. acquiring target environment parameters corresponding to the target face image;
262. determining a target matching threshold corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and the matching threshold;
263. extracting the contour of the target face image to obtain a first contour;
264. extracting feature points of the target face image to obtain a first feature point set;
265. acquiring the brightness of a target environment;
266. determining a target weight distribution factor corresponding to the target ambient brightness according to a preset mapping relation between the ambient brightness and the weight distribution factor, wherein the target weight distribution factor comprises a target contour weight factor and a target characteristic point weight factor;
267. determining a contour matching threshold value according to the target contour weight factor and the target matching threshold value;
268. determining a feature point matching threshold according to the target feature point weight factor and the target matching threshold;
269. acquiring a second contour and a second feature point set corresponding to the preset face template, wherein the preset face template is any face template in the preset database;
2610. matching the first contour with the second contour to obtain a first matching value;
2611. matching the first characteristic point set and the second characteristic point set to obtain a second matching value;
2612. when the first matching value is larger than the contour matching threshold and the second matching value is larger than the feature point matching threshold, determining a target matching value according to the first matching value, the second matching value and the target weight distribution factor;
2613. and when the target matching value is larger than the target matching threshold value, confirming that the target face image is successfully matched with the preset face template.
The environmental parameter may be at least one of the following: ambient brightness, ambient color temperature, geographical location, weather, humidity, temperature, magnetic-field disturbance parameters, and so on, which is not limited here. The feature point extraction may use at least one of the following: Harris corner detection, Scale-Invariant Feature Transform (SIFT), the SURF feature extraction algorithm, and so on. The contour extraction may use at least one of the following: Hough transform, Fourier transform, pyramid transform, and so on, which is not limited here. The mapping relation between preset environmental parameters and matching thresholds can be stored in the electronic device in advance. Of course, the electronic device may also pre-store a mapping relation between preset ambient brightness and weight assignment factors, where a weight assignment factor includes a contour weight factor and a feature point weight factor, and the contour weight factor and the feature point weight factor sum to 1. Specifically:
Ambient brightness    Weight assignment factor
K1                    (A1, B1)
K2                    (A2, B2)
…                     …
Kn                    (An, Bn)

where K1, K2, …, Kn represent ambient brightness values, A1, A2, …, An represent contour weight factors, and B1, B2, …, Bn represent feature point weight factors.
In a specific implementation, the target environmental parameter corresponding to the target face image can be obtained, and the target matching threshold corresponding to the target environmental parameter is determined according to the mapping relation between preset environmental parameters and matching thresholds, so that the threshold suits the environment, which is more conducive to improving matching accuracy. Contour extraction is performed on the target face image to obtain a first contour, and feature point extraction is performed on the target face image to obtain a first feature point set. The target ambient brightness is obtained, and the target weight assignment factor corresponding to the target ambient brightness is determined according to the mapping relation between preset ambient brightness and weight assignment factors, where the target weight assignment factor includes a target contour weight factor and a target feature point weight factor. A contour matching threshold is determined from the target contour weight factor and the target matching threshold, namely contour matching threshold = target contour weight factor × target matching threshold; likewise, a feature point matching threshold is determined, namely feature point matching threshold = target feature point weight factor × target matching threshold. A second contour and a second feature point set corresponding to the preset face template are then obtained, where the preset face template is any face template in the preset database. The first contour is matched against the second contour to obtain a first matching value, and the first feature point set is matched against the second feature point set to obtain a second matching value. When the first matching value is greater than the contour matching threshold and the second matching value is greater than the feature point matching threshold, a target matching value is determined from the first matching value, the second matching value, and the target weight assignment factor, namely target matching value = first matching value × target contour weight factor + second matching value × target feature point weight factor. When the target matching value is greater than the target matching threshold, it is confirmed that the target face image successfully matches the preset face template. Conversely, when the first matching value is less than or equal to the contour matching threshold, or the second matching value is less than or equal to the feature point matching threshold, or the target matching value is less than or equal to the target matching threshold, the matching between the target face image and the preset face template fails. In this way, the face can be identified accurately.
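The decision in steps 261-2613 reduces to the comparisons below. The brightness-to-weight table and the example scores are made-up assumptions; only the threshold and weight arithmetic follows the text (per-component threshold = weight factor × target threshold, combined value = weighted sum of the two matching values).

```python
# (brightness upper bound, contour weight A, feature point weight B); A + B = 1.
# The table values here are assumed for illustration.
BRIGHTNESS_WEIGHTS = (
    (50.0, 0.7, 0.3),            # dim scenes: lean on the coarser contour
    (200.0, 0.5, 0.5),
    (float("inf"), 0.3, 0.7),    # bright scenes: feature points dominate
)

def weight_factors(ambient_brightness):
    """Step 266: look up the (contour, feature point) weight factors."""
    for upper, a, b in BRIGHTNESS_WEIGHTS:
        if ambient_brightness <= upper:
            return a, b

def faces_match(contour_score, point_score, ambient_brightness, target_threshold):
    """Steps 267-2613: decide whether a template matches the captured face."""
    a, b = weight_factors(ambient_brightness)
    contour_threshold = a * target_threshold            # step 267
    point_threshold = b * target_threshold              # step 268
    if contour_score <= contour_threshold or point_score <= point_threshold:
        return False                                    # either component too weak
    # Step 2612: the target value is the weighted sum of the two match values.
    target_value = a * contour_score + b * point_score
    return target_value > target_threshold              # step 2613
```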
103. According to the target identity information, setting identification information for tracing the garbage bags for the target user, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry trash bags and wet trash bags.
In a specific implementation, the registration operation can be completed according to the target identity information, and identification information for tracing garbage bags can be generated for the target user according to the target identity information. The identification information may be at least one of the following: a two-dimensional code, a character string, a watermark, a LOGO, a pattern, or another invisible mark, and so on, which is not limited here. The identification information can be printed on the garbage bag. In this embodiment of the application, the garbage bags may include dry garbage bags and wet garbage bags: a dry garbage bag is used to hold dry garbage, such as paper and packaging material, and a wet garbage bag is used to hold wet garbage, such as fruit peel, vegetable leaves, and leftovers, which is not limited here. The dry garbage bag and the wet garbage bag may differ in color and material.
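One hypothetical way to derive such identification information is to hash the registered identity information into a short per-bag-type code. The SHA-256 scheme and the "dry"/"wet" prefixes below are our assumptions; the application only requires that the mark uniquely identify the user.

```python
import hashlib

def bag_identifier(identity_info: dict, bag_type: str) -> str:
    """Derive a stable, user-unique mark for a dry or wet garbage bag."""
    if bag_type not in ("dry", "wet"):
        raise ValueError("bag_type must be 'dry' or 'wet'")
    # Canonicalise the identity fields so the same user always hashes alike.
    canonical = "|".join(f"{key}={identity_info[key]}"
                         for key in sorted(identity_info))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:12]
    return f"{bag_type}-{digest}"
```

The resulting string could then be encoded into the two-dimensional code or character string printed on the bag.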
Optionally, between the above step 102 and step 103, the following steps may be further included:
a1, sending the target identity information to a third party auditing agency;
a2, after the target identity information is approved by the third-party auditing mechanism, the step of formulating identification information for tracing the garbage bags for the target user according to the target identity information is executed.
Wherein, the third party audit organization can be at least one of the following: community administration, police, third party applications, and the like, without limitation. In specific implementation, the target identity information may be sent to a third-party auditing mechanism, and step 103 is executed after the third-party auditing mechanism passes auditing of the target identity information, so that authenticity of the user identity can be ensured, and management efficiency is improved.
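The audit gate in steps A1-A2 is essentially a conditional placed in front of step 103. A minimal sketch, assuming a hypothetical callable audit service:

```python
def register_with_audit(identity_info, audit, make_identification):
    """Step A1: send the identity information to a third-party auditor;
    step A2: only generate the traceable identification once the audit passes.

    audit: callable standing in for a community administration, police, or
    third-party application review (an assumption, not a disclosed API).
    """
    if not audit(identity_info):
        return None  # registration blocked pending approval
    return make_identification(identity_info)
```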
Based on this embodiment of the application, in a specific implementation, the user scans a code to register, submits identity information, and enters the garbage classification platform; the platform records the user's identity and can customize garbage bags for the user, including dry garbage bags and wet garbage bags, with marks such as two-dimensional codes attached to the bags. The purpose is traceability: the user's garbage delivery records are tracked, a garbage-delivery score is determined from those records, an environmental-protection fund is determined from the score, and other related affairs are handled according to the delivery records.
Optionally, after the step 103, the following steps may be further included:
b1, sending a garbage delivery request to a server, wherein the garbage delivery request carries target position information;
b2, receiving load state information of a plurality of garbage cans corresponding to the target position information sent by the server;
b3, acquiring the delivered garbage amount of the target user;
b4, selecting a target garbage can from the plurality of garbage cans according to the garbage amount;
and B5, generating a navigation route between the target position information and the target garbage can.
The garbage amount may be at least one of the following: a garbage volume, a garbage weight, a garbage type, and the like, which are not limited herein. In a specific implementation, a garbage delivery request carrying the target position information may be sent to the server, and load state information of a plurality of garbage cans near the target position information may then be received from the server. The load state information may be understood as a degree of garbage holding; for example, it may indicate that a garbage can is full, or that 50% of its space is occupied. Furthermore, the garbage amount delivered by the target user can be acquired, a garbage can among the plurality of garbage cans that can contain the garbage amount can be selected as the target garbage can, and a navigation route between the target position information and the target garbage can can be generated, so that the user can quickly find the garbage can.
Further optionally, the selecting, in step B4, a target trash can from the plurality of trash cans according to the trash amount includes:
b41, determining the cleaning time of each garbage can in the plurality of garbage cans to obtain a plurality of cleaning time points;
b42, determining a navigation route between the target position information and each garbage bin in the plurality of garbage bins, and determining a route corresponding to each navigation route to obtain a plurality of routes;
b43, determining the arrival time point of the target user to each garbage can according to the multiple routes to obtain multiple arrival time points;
b44, estimating target load state information of the target user reaching each of the plurality of garbage cans according to the plurality of cleaning time points and the plurality of arrival time points to obtain a plurality of target load state information;
and B45, selecting, from the plurality of pieces of target load state information, the garbage can that corresponds to target load state information able to contain the garbage amount and that has the closest arrival time point as the target garbage can.
In specific implementation, the cleaning time of each of the plurality of garbage cans can be determined to obtain a plurality of cleaning time points. In general, considering that garbage cans fill up, the garbage is cleaned at certain time intervals, and the corresponding time point can be understood as the cleaning time point. Further, a navigation route between the target position information and each of the plurality of garbage cans can be determined, and the route corresponding to each navigation route can be determined to obtain a plurality of routes. Since an average estimate of the user's walking speed can be obtained by the user's APP, the arrival time point of the target user at each garbage can can be determined according to the plurality of routes, obtaining a plurality of arrival time points. The target load state information of each garbage can at the moment the target user arrives can then be estimated according to the plurality of cleaning time points and the plurality of arrival time points, obtaining a plurality of pieces of target load state information; that is, it can easily be estimated whether each garbage can will have been cleaned by the time the user arrives, and thus what its load will be when the user delivers garbage to it. Finally, the garbage can corresponding to target load state information that can contain the garbage amount and that has the closest arrival time point can be selected from the plurality of pieces of target load state information as the target garbage can, so that the user can deliver garbage conveniently and quickly.
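Steps B41 to B45 can be sketched as follows; the units (litres, minutes), the constant fill rate, and all field and function names are illustrative assumptions, not taken from the application:

```python
from dataclasses import dataclass

@dataclass
class GarbageCan:
    can_id: str
    capacity: float      # litres
    current_load: float  # litres currently inside
    fill_rate: float     # litres deposited per minute by other users (estimate)
    clean_time: float    # minutes from now until the next scheduled cleaning
    travel_time: float   # minutes for the target user to walk there (from the route)

def load_at_arrival(can: GarbageCan) -> float:
    # If the can is cleaned before the user arrives, it restarts from empty.
    if can.clean_time <= can.travel_time:
        base, elapsed = 0.0, can.travel_time - can.clean_time
    else:
        base, elapsed = can.current_load, can.travel_time
    return min(can.capacity, base + can.fill_rate * elapsed)

def pick_target_can(cans, garbage_volume: float):
    # Keep cans that can still contain the delivered amount on arrival,
    # then pick the one with the closest arrival time point.
    fitting = [c for c in cans
               if c.capacity - load_at_arrival(c) >= garbage_volume]
    return min(fitting, key=lambda c: c.travel_time) if fitting else None
```

Note how a nearly full can that is scheduled for cleaning before the user arrives can still be a valid target, which is exactly the case the cleaning time points are introduced to capture.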
Optionally, after the step 103, the following steps may be further included:
c1, photographing the garbage to be delivered to obtain a target garbage image;
c2, carrying out image segmentation on the target garbage image to obtain a garbage area image;
c3, analyzing the garbage area image to obtain target garbage characteristic parameters;
c4, determining target control parameters corresponding to the target garbage characteristic parameters according to a mapping relation between preset garbage characteristic parameters and control parameters of garbage degradation equipment;
c5, sending the target control parameters to a server, and guiding the garbage degradation equipment to process the garbage to be delivered according to the target control parameters by the server.
A mapping relationship between preset garbage characteristic parameters and control parameters of the garbage degradation equipment can be prestored. The garbage characteristic parameter may be one of the following: a garbage type, a garbage volume, and the like, which are not limited herein; the garbage type may be dry garbage or wet garbage. The control parameter of the garbage degradation equipment may be at least one of the following: a degradation agent type, a degradation agent amount, a degradation temperature, a degradation power, a degradation stirring speed, a degradation mode, and the like, which are not limited herein. In specific implementation, the garbage to be delivered is photographed to obtain a target garbage image, image segmentation is performed on the target garbage image to obtain a garbage area image, and the garbage area image is analyzed; the analysis mode may be at least one of the following: feature point extraction, substance identification, and the like, finally obtaining the target garbage characteristic parameters. The target control parameters corresponding to the target garbage characteristic parameters are then determined according to the mapping relationship between the preset garbage characteristic parameters and the control parameters of the garbage degradation equipment, and the target control parameters are sent to the server, which guides the garbage degradation equipment to process the garbage to be delivered according to the target control parameters, so that the garbage can be processed efficiently.
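The prestored mapping relationship can be pictured as a lookup table keyed by garbage characteristic parameters. The table contents, the 20-litre size threshold, and all parameter names below are illustrative assumptions, not values from the application:

```python
# Hypothetical preset mapping: (garbage type, size class) -> control parameters
# for the garbage degradation equipment. All values are placeholders.
PRESET_CONTROL_MAP = {
    ("wet", "small"): {"agent": "enzyme-A", "agent_ml": 50,  "temp_c": 55, "stir_rpm": 30},
    ("wet", "large"): {"agent": "enzyme-A", "agent_ml": 120, "temp_c": 60, "stir_rpm": 45},
    ("dry", "small"): {"agent": "none",     "agent_ml": 0,   "temp_c": 25, "stir_rpm": 0},
    ("dry", "large"): {"agent": "none",     "agent_ml": 0,   "temp_c": 25, "stir_rpm": 10},
}

def target_control_parameters(garbage_type: str, volume_l: float):
    # Step C4: map the analysed garbage characteristic parameters to the
    # control parameters that will be sent to the server in step C5.
    size_class = "large" if volume_l > 20 else "small"
    return PRESET_CONTROL_MAP.get((garbage_type, size_class))
```

A table lookup keeps the image-analysis side decoupled from the equipment side: new degradation modes only require new table rows, not new analysis code.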
It can be seen that, in the user registration method of the garbage classification platform described in the embodiment of the present application, the target two-dimensional code is scanned, a registration page of the garbage classification platform is entered, target identity information of a target user is acquired on the garbage classification platform, and identification information for tracing the garbage bags is formulated for the target user according to the target identity information, where the identification information is used for uniquely identifying the target user and the garbage bags are at least one of the following: dry garbage bags and wet garbage bags. In this way, specific identification information can be generated for the garbage bags, the garbage bags can be traced, and ultimately the garbage itself can be traced.
Consistent with the above, fig. 2 is a schematic flowchart of a user registration method of a garbage classification platform disclosed in an embodiment of the present application. The user registration method of the garbage classification platform comprises the following steps 201 to 204.
201. And scanning the target two-dimension code, and entering a registration page of the garbage classification platform.
202. And acquiring target identity information of a target user on the garbage classification platform.
203. And sending the target identity information to a third-party auditing agency.
204. After the target identity information is approved by the third-party auditing mechanism, formulating identification information for tracing the garbage bags for the target user according to the target identity information, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry trash bags and wet trash bags.
The detailed description of the steps 201 to 204 may refer to the corresponding description of the user registration method of the garbage classification platform described in fig. 1A, and is not repeated herein.
It can be seen that, in the user registration method of the garbage classification platform described in the embodiment of the present application, the target two-dimensional code is scanned, a registration page of the garbage classification platform is entered, target identity information of a target user is acquired on the garbage classification platform, and, after a third-party auditing agency approves the target identity information, identification information for tracing the garbage bags is formulated for the target user according to the target identity information, where the identification information is used for uniquely identifying the target user and the garbage bags are at least one of the following: dry garbage bags and wet garbage bags. In this way, not only can specific identification information be generated for the garbage bags, but also, because the user's identity has been officially verified, the garbage bags can be traced accurately, and ultimately the garbage itself can be traced.
Referring to fig. 3, fig. 3 is a schematic structural diagram of another electronic device disclosed in the embodiment of the present application, and as shown in the drawing, the electronic device includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for performing the following steps:
scanning a target two-dimensional code, and entering a registration page of a garbage classification platform;
acquiring target identity information of a target user on the garbage classification platform;
according to the target identity information, setting identification information for tracing the garbage bags for the target user, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry trash bags and wet trash bags.
It can be seen that, in the electronic device described in this embodiment of the application, the target two-dimensional code is scanned, a registration page of the garbage classification platform is entered, the target identity information of the target user is acquired on the garbage classification platform, and identification information for tracing the garbage bags is formulated for the target user according to the target identity information, where the identification information is used for uniquely identifying the target user and the garbage bags are at least one of the following: dry garbage bags and wet garbage bags. In this way, specific identification information can be generated for the garbage bags, the garbage bags can be traced, and ultimately the garbage itself can be traced.
In one possible example, the registration page includes a face input control;
in the aspect of acquiring the target identity information of the target user by the garbage classification platform, the program includes instructions for executing the following steps:
when touch operation aiming at the face input control is detected, starting a camera, and shooting according to a first shooting parameter to obtain a preview image;
carrying out face detection on the preview image;
when a face is detected, determining a relative angle and a relative distance between the face and the camera;
adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter;
shooting according to the second shooting parameters to obtain a target face image;
searching the target face image in a preset database to obtain a preset face template successfully matched with the target face image;
and taking the identity information corresponding to the preset face template as the target identity information of the target user.
In one possible example, the first photographing parameters include at least: a screen fill light parameter and a camera angle parameter;
in the aspect that the first shooting parameter is adjusted according to the relative angle and the relative distance to obtain a second shooting parameter, the program includes instructions for executing the following steps:
when the relative angle is not within a preset angle range, determining a difference value between the relative angle and a preset angle, and determining a target camera angle parameter of the camera according to the difference value, wherein the target camera angle parameter comprises a rotation angle such that, after the camera rotates by the rotation angle, the angle between the camera and the face falls within the preset angle range, the preset angle being any angle within the preset angle range;
determining a target screen light supplement parameter corresponding to the relative distance according to a mapping relation between a preset distance and the screen light supplement parameter;
and taking the angle parameter of the target camera and the light supplement parameter of the target screen as the second shooting parameter.
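The derivation of the second shooting parameter can be sketched as follows; the preset angle range, the distance-to-fill-light table, and the choice of 0° as the preset angle are all assumptions for illustration, not values from the application:

```python
PRESET_ANGLE_RANGE = (-10.0, 10.0)  # degrees; assumed acceptable face-camera angle
# (max relative distance in metres, screen fill-light brightness in %)
FILL_LIGHT_MAP = [(0.3, 90), (0.6, 60), (1.0, 40), (2.0, 20)]

def second_shooting_params(relative_angle: float, relative_distance: float) -> dict:
    lo, hi = PRESET_ANGLE_RANGE
    if lo <= relative_angle <= hi:
        rotation = 0.0  # already within range: no camera rotation needed
    else:
        preset_angle = 0.0  # any angle inside the preset range would do
        rotation = preset_angle - relative_angle  # difference value -> rotation angle
    fill_light = FILL_LIGHT_MAP[-1][1]
    for max_dist, pct in FILL_LIGHT_MAP:  # preset distance -> fill-light mapping
        if relative_distance <= max_dist:
            fill_light = pct
            break
    return {"rotation_deg": rotation, "fill_light_pct": fill_light}
```

The nearer the face, the brighter the screen fill light in this sketch, on the assumption that a close face is lit mostly by the screen itself.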
In one possible example, in the aspect that the target face image is searched in a preset database to obtain a preset face template successfully matched with the target face image, the program includes instructions for executing the following steps:
acquiring target environment parameters corresponding to the target face image;
determining a target matching threshold corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and the matching threshold;
extracting the contour of the target face image to obtain a first contour;
extracting feature points of the target face image to obtain a first feature point set;
acquiring the brightness of a target environment;
determining a target weight distribution factor corresponding to the target ambient brightness according to a preset mapping relation between the ambient brightness and the weight distribution factor, wherein the target weight distribution factor comprises a target contour weight factor and a target characteristic point weight factor;
determining a contour matching threshold value according to the target contour weight factor and the target matching threshold value;
determining a feature point matching threshold according to the target feature point weight factor and the target matching threshold;
acquiring a second contour and a second feature point set corresponding to the preset face template, wherein the preset face template is any face template in the preset database;
matching the first contour with the second contour to obtain a first matching value;
matching the first characteristic point set and the second characteristic point set to obtain a second matching value;
when the first matching value is larger than the contour matching threshold and the second matching value is larger than the feature point matching threshold, determining a target matching value according to the first matching value, the second matching value and the target weight distribution factor;
and when the target matching value is larger than the target matching threshold value, confirming that the target face image is successfully matched with the preset face template.
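The weighted two-stage comparison of contour and feature-point matching can be sketched as follows; the 0–1 score scale and the way per-component thresholds are derived from the overall threshold and the weight factors are assumptions for illustration:

```python
def match_face(contour_score: float, feature_score: float,
               contour_weight: float, feature_weight: float,
               target_threshold: float):
    """Two-stage matching: each component must clear its own threshold,
    then the weighted sum must clear the overall target threshold."""
    # Per-component thresholds derived from the overall threshold and weights.
    contour_threshold = target_threshold * contour_weight
    feature_threshold = target_threshold * feature_weight
    if contour_score <= contour_threshold or feature_score <= feature_threshold:
        return False, 0.0
    # Target matching value as the weighted combination of both scores.
    target_match = contour_score * contour_weight + feature_score * feature_weight
    return target_match > target_threshold, target_match
```

In a dim environment, the weight distribution factor could shift weight toward the contour, which tends to be more robust to low brightness than fine feature points, which is presumably why the mapping from ambient brightness to weight factors exists.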
In one possible example, the program further includes instructions for performing the steps of:
sending the target identity information to a third-party auditing agency;
and after the third-party auditing mechanism passes the target identity information auditing, executing the step of formulating identification information for tracing the garbage bags for the target user according to the target identity information.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that, to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
Referring to fig. 4A, fig. 4A is a schematic structural diagram of a user registration apparatus of a garbage classification platform disclosed in an embodiment of the present application, where the user registration apparatus 400 of the garbage classification platform includes: a scanning unit 401, an acquisition unit 402, and a generation unit 403, wherein,
the scanning unit 401 is configured to scan a target two-dimensional code and enter a registration page of the garbage classification platform;
an obtaining unit 402, configured to obtain target identity information of a target user on the garbage classification platform;
a generating unit 403, configured to formulate, for the target user, identification information for tracing back a trash bag according to the target identity information, where the identification information is used to uniquely identify the target user, and the trash bag is at least one of the following: dry trash bags and wet trash bags.
It can be seen that, in the user registration apparatus of the garbage classification platform described in this embodiment of the application, the target two-dimensional code is scanned, a registration page of the garbage classification platform is entered, the target identity information of the target user is acquired on the garbage classification platform, and identification information for tracing the garbage bags is formulated for the target user according to the target identity information, where the identification information is used for uniquely identifying the target user and the garbage bags are at least one of the following: dry garbage bags and wet garbage bags. In this way, specific identification information can be generated for the garbage bags, the garbage bags can be traced, and ultimately the garbage itself can be traced.
In one possible example, the registration page includes a face input control;
in the aspect of acquiring, by the garbage classification platform, target identity information of a target user, the acquiring unit 402 is specifically configured to:
when touch operation aiming at the face input control is detected, starting a camera, and shooting according to a first shooting parameter to obtain a preview image;
carrying out face detection on the preview image;
when a face is detected, determining a relative angle and a relative distance between the face and the camera;
adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter;
shooting according to the second shooting parameters to obtain a target face image;
searching the target face image in a preset database to obtain a preset face template successfully matched with the target face image;
and taking the identity information corresponding to the preset face template as the target identity information of the target user.
In one possible example, the first photographing parameters include at least: a screen fill light parameter and a camera angle parameter;
in the aspect that the first shooting parameter is adjusted according to the relative angle and the relative distance to obtain a second shooting parameter, the obtaining unit 402 is specifically configured to:
when the relative angle is not within a preset angle range, determining a difference value between the relative angle and a preset angle, and determining a target camera angle parameter of the camera according to the difference value, wherein the target camera angle parameter comprises a rotation angle such that, after the camera rotates by the rotation angle, the angle between the camera and the face falls within the preset angle range, the preset angle being any angle within the preset angle range;
determining a target screen light supplement parameter corresponding to the relative distance according to a mapping relation between a preset distance and the screen light supplement parameter;
and taking the angle parameter of the target camera and the light supplement parameter of the target screen as the second shooting parameter.
In a possible example, in the aspect that the target face image is searched in a preset database to obtain a preset face template successfully matched with the target face image, the obtaining unit 402 is specifically configured to:
acquiring target environment parameters corresponding to the target face image;
determining a target matching threshold corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and the matching threshold;
extracting the contour of the target face image to obtain a first contour;
extracting feature points of the target face image to obtain a first feature point set;
acquiring the brightness of a target environment;
determining a target weight distribution factor corresponding to the target ambient brightness according to a preset mapping relation between the ambient brightness and the weight distribution factor, wherein the target weight distribution factor comprises a target contour weight factor and a target characteristic point weight factor;
determining a contour matching threshold value according to the target contour weight factor and the target matching threshold value;
determining a feature point matching threshold according to the target feature point weight factor and the target matching threshold;
acquiring a second contour and a second feature point set corresponding to the preset face template, wherein the preset face template is any face template in the preset database;
matching the first contour with the second contour to obtain a first matching value;
matching the first characteristic point set and the second characteristic point set to obtain a second matching value;
when the first matching value is larger than the contour matching threshold and the second matching value is larger than the feature point matching threshold, determining a target matching value according to the first matching value, the second matching value and the target weight distribution factor;
and when the target matching value is larger than the target matching threshold value, confirming that the target face image is successfully matched with the preset face template.
In one possible example, as shown in fig. 4B, fig. 4B is a further modified structure of the user registration apparatus of the garbage classification platform described in fig. 4A, which, compared with fig. 4A, may further include: the communication unit 404 is specifically as follows:
a communication unit 404, configured to send the target identity information to a third-party auditing agency;
after the third-party auditing agency approves the target identity information, the generating unit 403 executes the step of formulating identification information for tracing the garbage bags for the target user according to the target identity information.
It should be noted that the electronic device described in the embodiments of the present application is presented in the form of functional units. The term "unit" as used herein is to be understood in its broadest possible sense, and the objects used to implement the functions described by each "unit" may be, for example, an application-specific integrated circuit (ASIC), a single circuit, a processor (shared, dedicated, or chipset) and memory executing one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
Among them, the scanning unit 401, the acquisition unit 402, the generation unit 403, and the communication unit 404 may be a control circuit or a processor.
Embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to perform part or all of the steps of the user registration method of any one of the garbage classification platforms as described in the above method embodiments.
Embodiments of the present application also provide a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform part or all of the steps of any one of the user registration methods of the garbage classification platform as set forth in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implementing, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer readable memory. Based on such understanding, the technical solution of the present application may be substantially implemented or a part of or all or part of the technical solution contributing to the prior art may be embodied in the form of a software product stored in a memory, and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method described in the embodiments of the present application. And the aforementioned memory comprises: various media capable of storing program codes, such as a usb disk, a read-only memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and the like.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A user registration method of a garbage classification platform is characterized by comprising the following steps:
scanning a target two-dimensional code, and entering a registration page of a garbage classification platform;
acquiring target identity information of a target user on the garbage classification platform;
according to the target identity information, setting identification information for tracing the garbage bags for the target user, wherein the identification information is used for uniquely identifying the target user, and the garbage bags are at least one of the following types: dry trash bags and wet trash bags.
2. The method of claim 1, wherein the registration page includes a face input control;
wherein acquiring the target identity information of the target user on the garbage classification platform comprises:
when a touch operation on the face input control is detected, starting a camera and shooting according to a first shooting parameter to obtain a preview image;
performing face detection on the preview image;
when a face is detected, determining a relative angle and a relative distance between the face and the camera;
adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter;
shooting according to the second shooting parameter to obtain a target face image;
searching the target face image in a preset database to obtain a preset face template that successfully matches the target face image;
and taking the identity information corresponding to the preset face template as the target identity information of the target user.
3. The method according to claim 2, characterized in that the first shooting parameter comprises at least: a screen fill-light parameter and a camera angle parameter;
wherein adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter comprises:
when the relative angle is not within a preset angle range, determining a difference between the relative angle and a preset angle, and determining a target camera angle parameter of the camera according to the difference, wherein the target camera angle parameter comprises a rotation angle such that, after the camera rotates by the rotation angle, the angle between the camera and the face falls within the preset angle range, the preset angle being any angle within the preset angle range;
determining a target screen fill-light parameter corresponding to the relative distance according to a preset mapping relation between distance and screen fill-light parameter;
and taking the target camera angle parameter and the target screen fill-light parameter as the second shooting parameter.
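The adjustment in claim 3 can be sketched as follows. The preset angle range, the preset angle, and the distance-to-fill-light table are illustrative values assumed for the sketch; the patent does not specify them.

```python
# Hypothetical sketch of claim 3's parameter adjustment. All names, ranges,
# and the distance-to-fill-light mapping below are illustrative assumptions.

PRESET_ANGLE_RANGE = (-15.0, 15.0)   # degrees; assumed acceptable face/camera angle
PRESET_ANGLE = 0.0                   # any angle inside the preset range

# Assumed mapping between distance bands (metres) and screen fill-light levels.
DISTANCE_TO_FILL_LIGHT = [
    (0.3, 0.2),   # within 0.3 m -> weak fill light
    (0.6, 0.5),
    (1.0, 0.8),
]

def adjust_shooting_parameters(relative_angle, relative_distance):
    """Derive the second shooting parameter from the face's relative pose."""
    lo, hi = PRESET_ANGLE_RANGE
    rotation = 0.0
    if not (lo <= relative_angle <= hi):
        # Rotate the camera by the difference so the new angle falls in range.
        rotation = relative_angle - PRESET_ANGLE
    # Pick the fill-light level for the first distance band that applies.
    fill_light = DISTANCE_TO_FILL_LIGHT[-1][1]
    for max_dist, level in DISTANCE_TO_FILL_LIGHT:
        if relative_distance <= max_dist:
            fill_light = level
            break
    return {"camera_rotation_deg": rotation, "screen_fill_light": fill_light}
```

A face 20 degrees off-axis at 0.5 m would thus yield a 20-degree rotation and a mid-level fill light, while a face already inside the preset range leaves the camera angle unchanged.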
4. The method according to claim 2, wherein searching the target face image in a preset database to obtain a preset face template that successfully matches the target face image comprises:
acquiring target environment parameters corresponding to the target face image;
determining a target matching threshold corresponding to the target environment parameter according to a mapping relation between a preset environment parameter and the matching threshold;
extracting the contour of the target face image to obtain a first contour;
extracting feature points of the target face image to obtain a first feature point set;
acquiring a target ambient brightness;
determining a target weight distribution factor corresponding to the target ambient brightness according to a preset mapping relation between ambient brightness and weight distribution factor, wherein the target weight distribution factor comprises a target contour weight factor and a target feature point weight factor;
determining a contour matching threshold according to the target contour weight factor and the target matching threshold;
determining a feature point matching threshold according to the target feature point weight factor and the target matching threshold;
acquiring a second contour and a second feature point set corresponding to the preset face template, wherein the preset face template is any one face template in the preset database;
matching the first contour with the second contour to obtain a first matching value;
matching the first feature point set with the second feature point set to obtain a second matching value;
when the first matching value is greater than the contour matching threshold and the second matching value is greater than the feature point matching threshold, determining a target matching value according to the first matching value, the second matching value and the target weight distribution factor;
and when the target matching value is greater than the target matching threshold, confirming that the target face image successfully matches the preset face template.
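The two-stage check of claim 4 (per-component thresholds derived from the weight distribution factor, then a weighted fusion against the target matching threshold) can be sketched as below. The claim does not spell out how each component threshold is derived from its factor; multiplying the target threshold by each weight factor is one plausible reading, assumed here, and all numeric values are illustrative.

```python
# Illustrative sketch of the weighted matching in claim 4. The derivation of
# the per-component thresholds and all numeric values are assumptions.

def match_face(contour_score, feature_score,
               contour_weight, feature_weight, target_threshold):
    """Return (matched, target_score) following claim 4's two-stage check."""
    # Stage 1: each component must clear its own derived threshold.
    contour_threshold = contour_weight * target_threshold
    feature_threshold = feature_weight * target_threshold
    if contour_score <= contour_threshold or feature_score <= feature_threshold:
        return False, 0.0
    # Stage 2: fuse the component scores with the weight distribution factor
    # and compare the fused score against the target matching threshold.
    target_score = contour_weight * contour_score + feature_weight * feature_score
    return target_score > target_threshold, target_score
```

Making the weights depend on ambient brightness, as the claim does, lets the platform lean on contour matching when lighting degrades fine feature points, and vice versa.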
5. The method according to any one of claims 1-4, further comprising:
sending the target identity information to a third-party auditing agency;
and after the third-party auditing agency approves the target identity information, performing the step of formulating, for the target user according to the target identity information, identification information for tracing garbage bags.
6. A user registration apparatus of a garbage classification platform, the apparatus comprising:
the scanning unit is used for scanning the target two-dimensional code and entering a registration page of the garbage classification platform;
the acquisition unit is used for acquiring target identity information of a target user on the garbage classification platform;
the generating unit is used for formulating, for the target user according to the target identity information, identification information for tracing garbage bags, wherein the identification information uniquely identifies the target user, and the garbage bags are at least one of the following: dry garbage bags and wet garbage bags.
7. The apparatus of claim 6, wherein the registration page comprises a face input control;
in terms of acquiring the target identity information of the target user on the garbage classification platform, the acquiring unit is specifically configured to:
when a touch operation on the face input control is detected, start a camera and shoot according to a first shooting parameter to obtain a preview image;
perform face detection on the preview image;
when a face is detected, determine a relative angle and a relative distance between the face and the camera;
adjust the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter;
shoot according to the second shooting parameter to obtain a target face image;
search the target face image in a preset database to obtain a preset face template that successfully matches the target face image;
and take the identity information corresponding to the preset face template as the target identity information of the target user.
8. The apparatus according to claim 7, wherein the first shooting parameter comprises at least: a screen fill-light parameter and a camera angle parameter;
in terms of adjusting the first shooting parameter according to the relative angle and the relative distance to obtain a second shooting parameter, the acquiring unit is specifically configured to:
when the relative angle is not within a preset angle range, determine a difference between the relative angle and a preset angle, and determine a target camera angle parameter of the camera according to the difference, wherein the target camera angle parameter comprises a rotation angle such that, after the camera rotates by the rotation angle, the angle between the camera and the face falls within the preset angle range, the preset angle being any angle within the preset angle range;
determine a target screen fill-light parameter corresponding to the relative distance according to a preset mapping relation between distance and screen fill-light parameter;
and take the target camera angle parameter and the target screen fill-light parameter as the second shooting parameter.
9. An electronic device, comprising a processor, a memory, a communication interface, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, and the one or more programs comprise instructions for performing the steps in the method according to any one of claims 1-5.
10. A computer-readable storage medium, characterized in that it stores a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-5.
CN201910290177.2A 2019-04-11 2019-04-11 User registration method of garbage classification platform and related products Active CN111832750B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910290177.2A CN111832750B (en) 2019-04-11 2019-04-11 User registration method of garbage classification platform and related products


Publications (2)

Publication Number Publication Date
CN111832750A true CN111832750A (en) 2020-10-27
CN111832750B CN111832750B (en) 2021-10-22

Family

ID=72915054

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910290177.2A Active CN111832750B (en) 2019-04-11 2019-04-11 User registration method of garbage classification platform and related products

Country Status (1)

Country Link
CN (1) CN111832750B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102466945A (en) * 2010-11-19 2012-05-23 北京海鑫智圣技术有限公司 LED supplementary lighting and image clipping evaluation system in standard image acquisition device
CN104967776A (en) * 2015-06-11 2015-10-07 广东欧珀移动通信有限公司 Photographing setting method and user terminal
CN105173482A (en) * 2015-09-25 2015-12-23 蒋曙 Garbage classification method, device and system
CN108074206A (en) * 2017-12-29 2018-05-25 佛山市幻云科技有限公司 Campus monitoring method, device, server and system based on RFID
CN109241908A (en) * 2018-09-04 2019-01-18 深圳市宇墨科技有限公司 Face identification method and relevant apparatus


Also Published As

Publication number Publication date
CN111832750B (en) 2021-10-22

Similar Documents

Publication Publication Date Title
CN111814517B (en) Garbage delivery detection method and related product
CN111476227B (en) Target field identification method and device based on OCR and storage medium
CN112100461B (en) Questionnaire data processing method, device, server and medium based on data analysis
CN108520196B (en) Luxury discrimination method, electronic device, and storage medium
CN106886774A (en) The method and apparatus for recognizing ID card information
CN106303599B (en) Information processing method, system and server
CN105787133B (en) Advertisement information filtering method and device
CN107871011A (en) Image processing method, device, mobile terminal and computer-readable recording medium
CN104123485A (en) Real-name system information binding method and device
CN111178147B (en) Screen crushing and grading method, device, equipment and computer readable storage medium
CN112183296B (en) Simulated bill image generation and bill image recognition method and device
US9355338B2 (en) Image recognition device, image recognition method, and recording medium
CN108334797B (en) File scanning method, device and computer readable storage medium
CN109547748B (en) Object foot point determining method and device and storage medium
CN114170435A (en) Method and device for screening appearance images for recovery detection
CN112465517A (en) Anti-counterfeiting verification method and device and computer readable storage medium
CN105590113A (en) Information-acquiring method based on law enforcement recorder
KR20140066686A (en) Method for determining if business card about to be added is present in contact list
CN110490022A (en) A kind of bar code method and device in identification picture
Kim et al. m CLOVER: mobile content-based leaf image retrieval system
KR100985949B1 (en) System and method for providing product information service by mobile network system
CN109213397B (en) Data processing method and device and user side
CN112278636B (en) Garbage classification recycling method, device, system and storage medium
CN111832750B (en) User registration method of garbage classification platform and related products
WO2017069741A1 (en) Digitized document classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant