CN110809234A - Person category identification method and terminal device - Google Patents
- Publication number
- CN110809234A (application number CN201911085643.XA)
- Authority
- CN
- China
- Prior art keywords
- person
- target
- category
- wifi
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/52—Network services specially adapted for the location of the user terminal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/50—Network services
- H04L67/55—Push-based network services
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W48/00—Access restriction; Network selection; Access point selection
- H04W48/08—Access restriction or access information delivery, e.g. discovery data delivery
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W48/00—Access restriction; Network selection; Access point selection
- H04W48/16—Discovering, processing access restriction or access information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W8/00—Network data management
- H04W8/22—Processing or transfer of terminal data, e.g. status or physical capabilities
- H04W8/24—Transfer of terminal data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention discloses a person category identification method and a terminal device. The method includes: capturing WIFI characteristic information reported by the terminal of a target person when the terminal connects to WIFI; and determining, according to the WIFI characteristic information and a target neural network model, the target person category to which the target person belongs. In this technical scheme, the target person category is determined from multidimensional WIFI characteristic information and a supervised target neural network model, so the resulting classification is more reproducible and more accurate.
Description
Technical Field
The present invention relates to the field of data processing technologies, and in particular to a person category identification method and a terminal device.
Background
With the development of the internet, more and more people communicate and negotiate business over the network, which has greatly enriched the person information resources available online. Using these resources, people can be classified into categories, so that push services and the like can be provided according to each person's category.
In the related art, person categories are mainly distinguished by label clustering. However, because label clustering is an unsupervised algorithm, the resulting classification has poor reproducibility: the same person may be assigned different categories across runs.
Disclosure of Invention
In view of the above problems, the present invention provides a person category identification method and a terminal device that classify persons using a supervised model, so that the resulting classification is more reproducible.
According to a first aspect of the embodiments of the present invention, there is provided a person category identification method, including:
capturing WIFI characteristic information reported by a terminal of a target person when the terminal is connected with WIFI;
and determining, according to the WIFI characteristic information and a target neural network model, the target person category to which the target person belongs.
In one embodiment, preferably, the WIFI characteristic information includes at least one of the following: person tag information, location-based service information, WIFI list information, and terminal attribute information.
In one embodiment, preferably, the person tag information includes: the person's gender, places visited, occupation, and recent activities;
in one embodiment, preferably, the location based service information includes: the frequency, radius of activity, commute distance and work location of people visiting the city;
in one embodiment, preferably, the WIFI list information includes: the method comprises the following steps of (1) transmitting a WIFI signal, a transmitting position, a form and social attributes of the WIFI signal;
in one embodiment, preferably, the terminal attribute information includes: hardware attribute information of the terminal.
In one embodiment, preferably, the method further comprises:
adding the target person to the person set corresponding to the target person category.
In one embodiment, preferably, the method further comprises:
receiving an input query command for the target person category;
and displaying, according to the query command, the person set corresponding to the target person category.
In one embodiment, preferably, the method further comprises:
training with sample person categories and a preset neural network model to obtain the target neural network model.
According to a second aspect of the embodiments of the present invention, there is provided a terminal device, including:
a touch-sensitive display;
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method as described in the first aspect or any of the embodiments of the first aspect.
In the embodiment of the invention, the target person category to which the target person belongs is determined from multidimensional WIFI characteristic information (person tag information, location-based service information, WIFI list information, and terminal attribute information) together with a supervised target neural network model, so that the resulting classification is more reproducible and more accurate.
Drawings
To illustrate the technical solutions in the embodiments of the present invention more clearly, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a person category identification method according to an embodiment of the present invention.
Fig. 2 is a flowchart of a person category identification method according to another embodiment of the present invention.
Fig. 3 is a flowchart of a person category identification method according to another embodiment of the present invention.
Fig. 4 is a flowchart of a person category identification method according to another embodiment of the present invention.
Fig. 5 is a block diagram illustrating a partial structure of a mobile phone related to a terminal provided in an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
Some of the flows described in this specification, the claims, and the figures above include operations that occur in a particular order. It should be clearly understood that these operations may be performed out of the order given here or in parallel; the numbering of operations (101, 102, etc.) merely distinguishes them and does not by itself imply any order of execution. In addition, a flow may include more or fewer operations, and the operations may be performed sequentially or in parallel. Note also that the terms "first", "second", etc. are used here to distinguish different messages, devices, modules, and the like; they do not indicate a sequential order, nor do they require that a "first" and a "second" be of different types.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of a person category identification method according to an embodiment of the present invention.
As shown in fig. 1, the person category identification method includes:
Step S101: capturing WIFI characteristic information reported by the terminal of a target person when the terminal connects to WIFI.
The WIFI characteristic information includes at least one of the following: person tag information, location-based service information, WIFI list information, and terminal attribute information.
The person tag information can be determined from data such as the user's behavior habits during daily use of the terminal. Specifically, the person tag information includes the person's gender, places visited, occupation, recent activities, and the like.
The location-based service information can be determined from data such as the user's activity locations while using the terminal online. Specifically, the location-based service information includes the number of times the person visits a city, the activity radius, the commuting distance, the work location, and the like.
The WIFI list information may include the transmission mode of the WIFI signal, the transmission position, the form of the WIFI signal, its social attributes, and the like.
The terminal attribute information may be hardware attribute information of the terminal, such as the terminal's brand and hardware configuration.
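The four groups of WIFI characteristic information above can be flattened into a single numeric feature vector before being fed to the model. The following is a minimal sketch of that step; the field names, units, and categorical encodings are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch: flatten one reported WIFI record (person tags,
# location-based service data, WIFI list data, terminal attributes)
# into a numeric feature vector. All field names are assumptions.

def build_feature_vector(record):
    """Flatten one reported record into a list of floats."""
    gender = {"male": 0.0, "female": 1.0}.get(record["gender"], 0.5)
    # crude illustrative categorical encoding for occupation
    occupation = hash(record["occupation"]) % 100 / 100.0
    return [
        gender,
        occupation,
        float(record["city_visits"]),       # times the person visited the city
        float(record["activity_radius"]),   # assumed km
        float(record["commute_distance"]),  # assumed km
        float(len(record["wifi_list"])),    # number of WIFI networks seen
        float(record["terminal_ram_gb"]),   # hardware attribute of the terminal
    ]

record = {
    "gender": "female", "occupation": "engineer",
    "city_visits": 12, "activity_radius": 5.2, "commute_distance": 8.0,
    "wifi_list": ["home_ap", "office_ap", "cafe_ap"],
    "terminal_ram_gb": 8,
}
features = build_feature_vector(record)
```

In practice the encoding of categorical fields (occupation, WIFI social attributes) would be chosen together with the model; the hash trick here only stands in for a real vocabulary lookup.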
And S102, determining the target character category to which the target character belongs according to the WIFI characteristic information and the target neural network model.
In the embodiment of the invention, the target person category to which the target person belongs is determined from multidimensional WIFI characteristic information (person tag information, location-based service information, WIFI list information, and terminal attribute information) together with a supervised target neural network model, so that the resulting classification is more reproducible and more accurate.
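Step S102 amounts to a forward pass of the trained model over the feature vector. The patent does not specify the network architecture, so the sketch below uses a single linear layer plus softmax as a stand-in for the target neural network model; the categories and weights are placeholders.

```python
import math

# Minimal sketch of step S102: a trained network maps the WIFI feature
# vector to a person-category score. A single linear layer + softmax
# stands in for the unspecified target neural network model.

def softmax(scores):
    exps = [math.exp(s - max(scores)) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(features, weights, biases, categories):
    # one score per category: dot(row, features) + bias
    scores = [sum(w * x for w, x in zip(row, features)) + b
              for row, b in zip(weights, biases)]
    probs = softmax(scores)
    return categories[probs.index(max(probs))]

categories = ["commuter", "student", "tourist"]   # illustrative categories
weights = [[0.2, 0.1], [0.5, -0.3], [-0.4, 0.6]]  # 3 categories x 2 features
biases = [0.0, 0.1, -0.1]
label = classify([1.0, 2.0], weights, biases, categories)
```

A real implementation would load weights produced by the training step (S401) rather than hard-coding them.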
Fig. 2 is a flowchart of a person category identification method according to another embodiment of the present invention.
As shown in fig. 2, in an embodiment, preferably, after step S102, the method further includes step S201:
step S201, adding the target person to the person set corresponding to the target person category.
In this embodiment, each person category corresponds to one person set. After the target person category is determined from the multidimensional WIFI characteristic information (person tag information, location-based service information, WIFI list information, and terminal attribute information) and the supervised target neural network model, the target person is added to the person set corresponding to that category, which makes it convenient for a user to later look up persons by category.
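The category-to-person-set mapping of step S201 can be sketched as a simple dictionary of sets; the identifiers and category names below are illustrative.

```python
from collections import defaultdict

# Sketch of step S201: each person category maps to one person set,
# and a newly classified target person is added to the set for
# their category. Names are illustrative assumptions.

person_sets = defaultdict(set)

def add_person(person_id, category):
    person_sets[category].add(person_id)

add_person("user_001", "commuter")
add_person("user_002", "commuter")
add_person("user_003", "tourist")
```

Using a set rather than a list keeps the operation idempotent: reclassifying the same person into the same category does not create duplicates.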
Fig. 3 is a flowchart of a person category identification method according to another embodiment of the present invention.
As shown in fig. 3, in one embodiment, preferably, after step S201, the method further includes steps S301 to S302:
step S301, receiving an input query command for the target person category;
step S302, according to the query command, displaying the character set corresponding to the target character category.
In this embodiment, the user can query a group of people by person category. For example, to push information to young women between 20 and 30 years old, the user can input a query command that returns the matching person set. In this way the user can find the target group conveniently and accurately, which improves the user experience.
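The query flow of steps S301 and S302 can be sketched as a filter over person records. The query parameters and record fields below are assumptions chosen to mirror the "young women between 20 and 30" example; the patent does not define a query syntax.

```python
# Sketch of steps S301-S302: filter person records by query criteria
# and return the matching person set for display. Record fields and
# query parameters are illustrative assumptions.

people = [
    {"id": "user_001", "gender": "female", "age": 25},
    {"id": "user_002", "gender": "female", "age": 34},
    {"id": "user_003", "gender": "male", "age": 27},
]

def query(gender=None, age_min=None, age_max=None):
    result = []
    for p in people:
        if gender is not None and p["gender"] != gender:
            continue
        if age_min is not None and p["age"] < age_min:
            continue
        if age_max is not None and p["age"] > age_max:
            continue
        result.append(p["id"])
    return result

# the "young women between 20 and 30" example from the text
matches = query(gender="female", age_min=20, age_max=30)
```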
Fig. 4 is a flowchart of a person category identification method according to another embodiment of the present invention.
As shown in fig. 4, in an embodiment, preferably before step S102, the method further includes:
Step S401: training with sample person categories and a preset neural network model to obtain the target neural network model.
In this embodiment, sample person categories can be collected or labelled in advance, and the preset neural network model is then trained on this sample data of known person categories to obtain a target neural network model capable of classifying persons. Because the resulting target neural network model is supervised, the person classification is more reproducible.
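The supervised training of step S401 can be sketched with a tiny gradient-descent loop. A one-layer logistic model stands in for the preset neural network model, since the patent fixes neither the architecture nor the optimiser; the sample data is invented for illustration.

```python
import math

# Sketch of step S401: supervised training on labelled sample person
# categories. A single logistic unit stands in for the unspecified
# preset neural network model; data and hyperparameters are assumptions.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=200):
    """samples: feature vectors; labels: 0/1 category indicators."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = pred - y  # gradient of log-loss w.r.t. the pre-activation
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

# labelled sample person categories (invented, linearly separable)
samples = [[0.0, 1.0], [0.1, 0.9], [1.0, 0.0], [0.9, 0.2]]
labels = [0, 0, 1, 1]
w, b = train(samples, labels)
pred = sigmoid(sum(wi * xi for wi, xi in zip(w, [0.95, 0.1])) + b)
```

Because the labels are fixed in advance, retraining on the same samples yields the same model, which is the reproducibility property the patent contrasts with unsupervised label clustering.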
According to a second aspect of the embodiments of the present invention, there is provided a terminal device, including:
a touch-sensitive display;
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method as in the first aspect or any embodiment of the first aspect.
As shown in fig. 5, for convenience of description, only the parts related to the embodiment of the present invention are shown; for technical details not disclosed here, please refer to the method embodiments above. The terminal may be any terminal device, including a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sale) terminal, a vehicle-mounted computer, and the like. The following takes a mobile phone as an example:
fig. 5 is a block diagram illustrating a partial structure of a mobile phone related to a terminal provided in an embodiment of the present invention. Referring to fig. 5, the handset includes: radio Frequency (RF) circuitry 510, memory 520, input unit 530, display unit 540, sensor 550, audio circuitry 560, wireless-fidelity (Wi-Fi) module 570, processor 580, and power supply 590. Those skilled in the art will appreciate that the handset configuration shown in fig. 5 is not intended to be limiting and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile phone in detail with reference to fig. 5:
RF circuit 510 may be used for receiving and transmitting signals during information transmission and reception or during a call. In particular, after receiving downlink information from a base station, it delivers that information to the processor 580 for processing; it also transmits uplink data to the base station. In general, RF circuit 510 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a Low Noise Amplifier (LNA), a duplexer, and the like. In addition, RF circuit 510 may communicate with networks and other devices via wireless communication, which may use any communication standard or protocol, including but not limited to Global System for Mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), email, Short Messaging Service (SMS), and the like.
The memory 520 may be used to store software programs and modules, and the processor 580 executes the various functional applications and data processing of the mobile phone by running the software programs and modules stored in the memory 520. The memory 520 may mainly include a program storage area and a data storage area: the program storage area may store the operating system and the application programs required by at least one function (such as a sound playing function or an image playing function), while the data storage area may store data created during use of the mobile phone (such as audio data and a phonebook). Further, the memory 520 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The input unit 530 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the cellular phone. Specifically, the input unit 530 may include a touch panel 531 and other input devices 532. The touch panel 531, also called a touch screen, can collect touch operations of a user on or near the touch panel 531 (for example, operations of the user on or near the touch panel 531 by using any suitable object or accessory such as a finger or a stylus pen), and drive the corresponding connection device according to a preset program. Alternatively, the touch panel 531 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, and sends the touch point coordinates to the processor 580, and can receive and execute commands sent by the processor 580. In addition, the touch panel 531 may be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The input unit 530 may include other input devices 532 in addition to the touch panel 531. In particular, other input devices 532 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a trackball, a mouse, a joystick, and the like.
The display unit 540 may be used to display information input by the user, information provided to the user, and the various menus of the mobile phone. The display unit 540 may include a display panel 541, which may optionally be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like. Further, the touch panel 531 may cover the display panel 541; when the touch panel 531 detects a touch operation on or near it, the operation is transmitted to the processor 580 to determine the type of the touch event, and the processor 580 then provides a corresponding visual output on the display panel 541 according to that type. Although the touch panel 531 and the display panel 541 are shown in fig. 5 as two separate components implementing the input and output functions of the mobile phone, in some embodiments they may be integrated to implement both functions.
The handset may also include at least one sensor 550, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor, wherein the ambient light sensor may adjust the brightness of the display panel 541 according to the brightness of ambient light, and the proximity sensor may turn off the display panel 541 and/or the backlight when the mobile phone is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally, three axes), can detect the magnitude and direction of gravity when stationary, and can be used for applications of recognizing the posture of a mobile phone (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration recognition related functions (such as pedometer and tapping), and the like; as for other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which can be configured on the mobile phone, further description is omitted here.
Audio circuitry 560, speaker 561, and microphone 562 may provide an audio interface between a user and the mobile phone. The audio circuit 560 may transmit the electrical signal converted from received audio data to the speaker 561, which converts it into a sound signal for output. Conversely, the microphone 562 converts collected sound signals into electrical signals, which the audio circuit 560 receives and converts into audio data; the audio data is then processed by the processor 580 and either sent through the RF circuit 510 to, for example, another mobile phone, or output to the memory 520 for further processing.
WiFi is a short-range wireless transmission technology. Through the WiFi module 570, the mobile phone can help the user send and receive e-mail, browse web pages, access streaming media, and so on, providing the user with wireless broadband internet access. Although fig. 5 shows the WiFi module 570, it is not an essential component of the mobile phone and may be omitted as needed without changing the essence of the invention.
The processor 580 is a control center of the mobile phone, connects various parts of the entire mobile phone by using various interfaces and lines, and performs various functions of the mobile phone and processes data by operating or executing software programs and/or modules stored in the memory 520 and calling data stored in the memory 520, thereby performing overall monitoring of the mobile phone. Alternatively, processor 580 may include one or more processing units; preferably, the processor 580 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 580.
The handset also includes a power supply 590 (e.g., a battery) for powering the various components, which may preferably be logically coupled to the processor 580 via a power management system, such that the power management system may be used to manage charging, discharging, and power consumption.
Although not shown, the mobile phone may further include a camera, a bluetooth module, etc., which are not described herein.
In the embodiment of the present invention, the processor 580 included in the terminal further has the following functions:
receiving an input query request;
according to the query request, target WIFI list data corresponding to each target mobile terminal are searched in a WIFI data table;
determining relationship information between different target mobile terminals according to the target WIFI list data;
and outputting the relation information.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, or the like.
It will be understood by those skilled in the art that all or part of the steps in the method for implementing the above embodiments may be implemented by hardware that is instructed to implement by a program, and the program may be stored in a computer-readable storage medium, where the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The terminal device provided by the present invention has been described in detail above. Those skilled in the art may vary the specific implementation and scope of application based on the idea of the embodiments of the present invention. In summary, the content of this description should not be construed as limiting the present invention.
Claims (10)
1. A person category identification method is characterized by comprising the following steps:
capturing WIFI characteristic information reported by a terminal of a target person when the terminal is connected with WIFI;
and determining, according to the WIFI characteristic information and a target neural network model, the target person category to which the target person belongs.
2. The person category identification method according to claim 1, wherein, preferably, the WIFI characteristic information includes at least one of the following: person tag information, location-based service information, WIFI list information, and terminal attribute information.
3. The person category identification method according to claim 2, characterized in that, preferably,
the person tag information includes: the person's gender, places visited, occupation, and recent activities.
4. The person category identification method according to claim 2, characterized in that, preferably,
the location-based service information includes: the number of times the person visits a city, the activity radius, the commuting distance, and the work location.
5. The personal category identifying method of claim 2, wherein,
the WIFI list information includes: the transmission mode of the WIFI signal, the transmission position, the form of the WIFI signal, and its social attributes.
6. The personal category identifying method of claim 2, wherein,
the terminal attribute information includes: hardware attribute information of the terminal.
7. The person category identification method according to claim 1, characterized in that the method further comprises:
and adding the target person into the person set corresponding to the target person category.
8. The person category identification method according to claim 7, characterized in that the method further comprises:
receiving an input query command for the target person category;
and displaying, according to the query command, the person set corresponding to the target person category.
9. The person category identification method according to any one of claims 1 to 8, characterized in that the method further comprises:
and training with sample person categories and a preset neural network model to obtain the target neural network model.
10. A terminal device, comprising:
a touch-sensitive display;
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more programs configured to perform the method of any of claims 1-9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911085643.XA CN110809234A (en) | 2019-11-08 | 2019-11-08 | Person category identification method and terminal device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110809234A true CN110809234A (en) | 2020-02-18 |
Family
ID=69501608
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911085643.XA Pending CN110809234A (en) | 2019-11-08 | 2019-11-08 | Figure category identification method and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110809234A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111368701A (en) * | 2020-02-28 | 2020-07-03 | 上海商汤智能科技有限公司 | Character management and control method and device, electronic equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102625084A (en) * | 2012-03-07 | 2012-08-01 | 深圳创维-Rgb电子有限公司 | Intelligent television monitoring system |
CN106910092A (en) * | 2017-02-28 | 2017-06-30 | 成都瑞小博科技有限公司 | Active marketing method and system based on business WIFI industry attributes
CN107451607A (en) * | 2017-07-13 | 2017-12-08 | 山东中磁视讯股份有限公司 | Identity recognition method for typical persons based on deep learning
US20190138919A1 (en) * | 2017-11-06 | 2019-05-09 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Methods and systems for preloading applications and generating prediction models |
CN110086874A (en) * | 2019-04-30 | 2019-08-02 | 清华大学 | Expressway service area user classification method, system, equipment and medium
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108156508B (en) | Barrage information processing method and device, mobile terminal, server and system | |
CN106303070B (en) | Notification message prompting method and device, and mobile terminal | |
CN104852885B (en) | Method, device and system for verifying verification code | |
US20170109756A1 (en) | User Unsubscription Prediction Method and Apparatus | |
US20190200203A1 (en) | Data Sharing Method and Terminal | |
CN107633051A (en) | Desktop searching method, mobile terminal and computer-readable recording medium | |
CN110597793A (en) | Data management method and device, electronic equipment and computer readable storage medium | |
CN108984066B (en) | Application icon display method and mobile terminal | |
JP6915074B2 (en) | Message notification method and terminal | |
CN107765954B (en) | Application icon updating method, mobile terminal and server | |
CN110602766B (en) | Personal hotspot identification method and method for determining association relationship between terminals | |
CN106934003B (en) | File processing method and mobile terminal | |
CN106293407B (en) | Picture display method and terminal equipment | |
CN109521916B (en) | File processing method based on flexible screen, mobile terminal and storage medium | |
CN108304709B (en) | Face unlocking method and related product | |
CN110809234A (en) | Figure category identification method and terminal equipment | |
CN115904179A (en) | Multi-screen desktop split-screen display method, device, equipment and storage medium | |
CN111027406B (en) | Picture identification method and device, storage medium and electronic equipment | |
CN109976610B (en) | Application program identifier classification method and terminal equipment | |
CN109799994B (en) | Terminal component generation method and device | |
CN112101215A (en) | Face input method, terminal equipment and computer readable storage medium | |
CN106657278B (en) | Data transmission method and device and computer equipment | |
CN109242479A (en) | Method and apparatus for receiving red packets based on a points system | |
CN111887174B (en) | Method, device and system for collecting favorite position information of pet | |
CN112199049B (en) | Fingerprint storage method, fingerprint storage device and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200218 |