CN110609933A - Image processing method and device, electronic equipment and storage medium
- Publication number
- CN110609933A (application CN201910842126.6A)
- Authority
- CN
- China
- Prior art keywords
- check
- image
- data
- user
- display interface
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/90—Details of database functions independent of the retrieved data types
- G06F16/904—Browsing; Visualisation therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C1/00—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people
- G07C1/10—Registering, indicating or recording the time of events or elapsed time, e.g. time-recorders for work people together with the recording, indicating or registering of other data, e.g. of signs of identity
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present disclosure relates to an image processing method and apparatus, an electronic device, and a storage medium, wherein the method includes: obtaining check-in data in response to a check-in operation of a user, and displaying the check-in data on a first display interface of a first terminal; obtaining check-in processing results respectively corresponding to the check-in data according to the check-in data; and outputting the check-in processing results to a second display interface of a second terminal. With the method and the apparatus, check-in can be completed at the first terminal, and diversified check-in processing results can be displayed at the second terminal.
Description
Technical Field
The present disclosure relates to the field of computer vision processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
In image processing scenarios such as face recognition, target detection, and security monitoring, statistics on related data can be collected after a face is recognized. For example, in daily work, a user performs a check-in operation in the access control system of an office area or in an attendance system, the check-in data is then statistically processed on a background server, and a dedicated display terminal may further be used to output the statistical result. However, statistical processing on a background server is not intuitive, the statistical result cannot be grasped at a glance, and using a dedicated display terminal increases hardware cost. The related art provides no effective solution to these problems.
Disclosure of Invention
The present disclosure proposes a technical solution of image processing.
According to an aspect of the present disclosure, there is provided an image processing method, the method including:
responding to the check-in operation of a user, obtaining check-in data, and displaying the check-in data on a first display interface of a first terminal;
obtaining check-in processing results respectively corresponding to the check-in data, according to the check-in data;
and outputting the check-in processing result to a second display interface of a second terminal.
In a possible implementation manner, the obtaining the check-in data in response to the check-in operation of the user includes:
carrying out face recognition on the user to obtain a recognition result;
and obtaining the check-in data according to the identification result.
In a possible implementation manner, the performing face recognition on the user to obtain a recognition result includes:
acquiring a face image of the user after the check-in operation of the user is triggered, to obtain an acquisition result;
and comparing the acquisition result with a face image in a face recognition library to obtain the recognition result.
In a possible implementation manner, the outputting the check-in processing result to the second display interface of the second terminal includes:
performing image rendering on the check-in processing result to obtain a check-in image;
and outputting the check-in image, and controlling the second display interface to display the check-in image.
In a possible implementation manner, the check-in image is displayed on the second display interface in an image display form including at least one of floating, stacking, or embedding.
In a possible implementation manner, the check-in image at least includes face data.
In a possible implementation manner, before the controlling of the second display interface to display the check-in image, the method further includes:
displaying the check-in image in response to the number of check-in images not reaching a preset threshold.
In a possible implementation manner, before the controlling of the second display interface to display the check-in image, the method further includes:
deleting a predetermined number of first check-in images in response to the number of check-in images reaching a preset threshold;
displaying a second check-in image on the second display interface, wherein the check-in images include a first check-in image and a second check-in image, and the first check-in image is generated earlier than the second check-in image.
In a possible implementation manner, the controlling of the second display interface to display the check-in image includes:
starting timing after the check-in image is triggered for display;
and closing the displayed check-in image in response to the timed duration reaching a preset time.
According to an aspect of the present disclosure, there is provided a check-in apparatus, the apparatus including:
the response unit is used for responding to the check-in operation of the user, obtaining check-in data and displaying the check-in data on a first display interface of the first terminal;
the check-in unit is used for obtaining check-in processing results respectively corresponding to the check-in data according to the check-in data;
and the control output unit is used for outputting the check-in processing result to a second display interface of a second terminal.
In a possible implementation manner, the response unit is configured to:
carrying out face recognition on the user to obtain a recognition result;
and obtaining the check-in data according to the identification result.
In a possible implementation manner, the response unit is configured to:
acquiring a face image of the user after the check-in operation of the user is triggered, to obtain an acquisition result;
and comparing the acquisition result with a face image in a face recognition library to obtain the recognition result.
In a possible implementation manner, the control output unit is configured to:
performing image rendering on the check-in processing result to obtain a check-in image;
and outputting the check-in image, and controlling the second display interface to display the check-in image.
In a possible implementation manner, the check-in image is displayed on the second display interface in an image display form including at least one of floating, stacking, or embedding.
In a possible implementation manner, the check-in image at least includes face data.
In a possible implementation manner, the apparatus further includes a first processing unit, configured to:
displaying the check-in image in response to the number of check-in images not reaching a preset threshold.
In a possible implementation manner, the apparatus further includes a first processing unit, configured to:
deleting a predetermined number of first check-in images in response to the number of check-in images reaching a preset threshold;
displaying a second check-in image on the second display interface, wherein the check-in images include a first check-in image and a second check-in image, and the first check-in image is generated earlier than the second check-in image.
In a possible implementation manner, the apparatus further includes a first processing unit, configured to:
starting timing after the check-in image is triggered for display;
and closing the displayed check-in image in response to the timed duration reaching a preset time.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the above-described image processing method.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described image processing method.
In the embodiments of the present disclosure, check-in data is obtained in response to a user check-in operation and displayed on a first display interface of a first terminal; a plurality of check-in processing results are obtained according to the check-in data; and the plurality of check-in processing results are output to a second display interface of a second terminal and displayed respectively in a plurality of display frames of the second display interface. With the method and the apparatus, the check-in status of participants can be counted, check-in can be completed and the corresponding check-in data displayed at the first terminal, and diversified check-in statistics can be presented on the conference interface of the second terminal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure.
Fig. 2 shows a schematic diagram of a first display interface according to an embodiment of the disclosure.
FIG. 3 shows a schematic diagram of a second display interface according to an embodiment of the disclosure.
FIG. 4 illustrates a flow diagram of a check-in application instance in accordance with an embodiment of the disclosure.
Fig. 5 illustrates a block diagram of an image processing apparatus according to an embodiment of the present disclosure.
Fig. 6 illustrates a block diagram of an electronic device in accordance with an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the group consisting of A, B, and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
In image processing scenarios such as face recognition, target detection, and security monitoring, statistics on related data can be collected after a face is recognized; the image processing scenario is not limited to a check-in scenario. For example, a first terminal (such as a face recognition machine) may be arranged in the access control system of an office area or in an attendance check-in system. In addition, in a conference check-in scenario or a business welcome scenario, the status of the attendees/persons present (such as VIP guests) needs to be counted or displayed.
Taking a business welcome scenario as an example, there are two terminal devices, each provided with its own screen. After the first terminal executes the user check-in operation, if face recognition succeeds, check-in is completed and check-in data is obtained; the check-in data is displayed on the display of the first terminal, so that on-site staff and the person checking in can conveniently view it. Meanwhile, after image rendering is performed at the first terminal according to the check-in data, a check-in processing result is obtained; the check-in processing result is output to a second terminal (such as a TV in a conference room) and displayed on the display/screen of the second terminal, so that on-site personnel can conveniently view it. In this example, the display of the first terminal and/or the display/screen of the second terminal may be a touch display screen or another display device, which is not limited herein. If one person checks in, one piece of check-in data is obtained, corresponding to one check-in processing result; if multiple persons check in together at the same time, multiple pieces of check-in data are obtained, each corresponding to its own check-in processing result.
In the above check-in process, the two displays show the check-in data and the check-in processing result respectively at the same time: the check-in data is displayed on the screen of the first terminal, and the check-in processing result (the result obtained after statistics on the check-in data) is displayed on the screen of the second terminal. In this way, different content can be displayed on different screens, the check-in statistics requirements of the welcome scenario are better met, computing resources are saved, the check-in result is displayed in a diversified manner, and the computation does not affect the memory and resource usage of the conference system, so that the conference system can operate normally.
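As an illustration of this dual-screen split of work, the following is a minimal sketch of how a first terminal might push a check-in processing result to the second terminal's display service; the HTTP transport, endpoint path, and payload fields are assumptions made for the example and are not specified by the disclosure.

```python
# Minimal sketch (assumed transport): the first terminal pushes one check-in
# processing result to the second terminal's display service over HTTP.
# The endpoint URL and payload fields are illustrative, not from the disclosure.
import json
import urllib.request


def push_check_in_result(result: dict, second_terminal_url: str) -> None:
    """Send one check-in processing result to the second terminal for display."""
    payload = json.dumps(result).encode("utf-8")
    request = urllib.request.Request(
        second_terminal_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()  # the second terminal acknowledges and renders the result


# Hypothetical usage with an assumed address of the conference-room display:
# push_check_in_result(
#     {"name": "XX", "department": "R&D", "status": "checked_in"},
#     "http://192.168.1.20:8080/check-in",
# )
```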
Fig. 1 is a flowchart illustrating an image processing method according to an embodiment of the present disclosure. The method is applied to a check-in apparatus; for example, the apparatus may be deployed on a terminal device or another processing device, and is used for performing check-in and check-in statistics based on face recognition in scenarios such as an access control system or a conference system for office attendance. The terminal side includes a first terminal and a second terminal whose two screens display different content (dual-screen different display); the first terminal may be a face recognition machine, and the second terminal may be a display device running a conference system, or the like. In some possible implementations, the image processing method may be implemented by a processor calling computer-readable instructions stored in a memory. As shown in fig. 1, the process includes:
and S101, responding to the check-in operation of the user, obtaining check-in data, and displaying the check-in data on a first display interface of the first terminal.
In an example, after the user check-in operation is triggered, a face image of the user is acquired to obtain an acquisition result, and face recognition is performed on the user using the acquisition result and a face recognition library (which may be located in the face recognition machine of the first terminal or in a background server) to obtain a recognition result. The check-in data is obtained according to the recognition result. The check-in data may include: the user has checked in successfully; or the user has failed to check in; and so on.
During face recognition, a feature to be recognized can be obtained from the acquisition result, and the features stored in the face recognition library can be extracted. The feature to be recognized is then compared with the stored features to obtain the recognition result. Whether the user is a stored, verified user is confirmed according to the recognition result; if the user is a verified user, the user check-in operation is an operation initiated by that user, and the user's identity verification succeeds.
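The comparison step described above can be illustrated with a short sketch that matches the feature to be recognized against the features stored in the face recognition library; the cosine-similarity metric and the threshold value are assumptions, since the disclosure does not fix a particular comparison method.

```python
# Sketch of the comparison step: match the feature extracted from the captured
# face against the features stored in the face recognition library.
# Cosine similarity and the 0.6 threshold are assumptions for illustration.
import numpy as np


def recognize(captured: np.ndarray,
              library: dict[str, np.ndarray],
              threshold: float = 0.6) -> str | None:
    """Return the user id of the best match, or None if no match clears the threshold."""
    best_id, best_score = None, -1.0
    for user_id, stored in library.items():
        # cosine similarity between the captured feature and a stored feature
        score = float(np.dot(captured, stored) /
                      (np.linalg.norm(captured) * np.linalg.norm(stored)))
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= threshold else None
```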
As for the face recognition library, taking a conference check-in scenario as an example, different conferences have different participants and are held at different times. Therefore, the relevant data of the conference guests, such as face images and conference time, can be entered into the face recognition library before the conference begins, so that when a guest checks in, comparison against the library, counting, and display of the number of participants can be performed. In a business welcome scenario, guests need to be checked in, counted, and displayed. Guests are divided into ordinary persons and VIP guests: for ordinary persons, no advance entry into the face recognition library is required; for VIP guests who need special attention, their relevant data, such as face images, can be entered into the face recognition library in advance, so that whether each VIP guest has checked in, and whether all VIP guests have checked in, can be tracked.
Step S102: obtain check-in processing results respectively corresponding to the check-in data, according to the check-in data.
In one example, the check-in processing result may be a check-in statistical result obtained from the check-in data, such as the total number of people who have checked in, the user's face image, and user data such as the user's name, department, and title.
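One possible shape for such a check-in processing result is sketched below as a plain data record; the field names are assumptions chosen to mirror the items mentioned in the text (total count, face image, name/department/title) and are not mandated by the disclosure.

```python
# Assumed shape of one check-in processing result; field names are illustrative.
from dataclasses import dataclass


@dataclass
class CheckInResult:
    user_name: str
    department: str
    title: str
    face_image_path: str   # path to the user's face image / avatar
    total_checked_in: int  # running total of users who have checked in


result = CheckInResult(
    user_name="XX",
    department="R&D",
    title="Engineer",
    face_image_path="/data/faces/xx.jpg",
    total_checked_in=100,
)
```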
Step S103: output the check-in processing result to a second display interface of a second terminal.
In an example, image rendering may be performed on the check-in processing result to obtain a check-in image. The check-in image is output, and the second display interface is controlled to display it (the check-in image is displayed on the second display interface in an image display form including at least one of floating, stacking, or embedding). The check-in image includes at least face data and user data.
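As a rough illustration of the rendering step, the following sketch composes a greeting-card-style check-in image from a face image and user data using Pillow; the card size, layout, and the choice of Pillow are assumptions for the example, not part of the disclosure.

```python
# Sketch of rendering a check-in processing result into a check-in image
# (a greeting card). Card size, layout, and the use of Pillow are assumptions.
from PIL import Image, ImageDraw


def render_greeting_card(face_image_path: str, user_name: str, department: str) -> Image.Image:
    card = Image.new("RGB", (400, 220), color=(255, 255, 255))  # blank card background
    face = Image.open(face_image_path).resize((160, 160))
    card.paste(face, (20, 30))                                  # user avatar on the left
    draw = ImageDraw.Draw(card)
    draw.text((200, 60), user_name, fill=(0, 0, 0))             # user data on the right
    draw.text((200, 100), department, fill=(80, 80, 80))
    return card


# card = render_greeting_card("/data/faces/xx.jpg", "XX", "R&D")
# card.save("greeting_card.png")  # the image is then output to the second display interface
```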
With the method and the apparatus, check-in data is obtained in response to the check-in operation of the user and displayed on a first display interface of a first terminal; check-in processing results respectively corresponding to the check-in data are obtained according to the check-in data; and the check-in processing results are output to a second display interface of a second terminal. Through this dual-screen different-display mode, and taking the conference check-in scenario as an example, the check-in status of participants can be counted from the check-in processing results, check-in can be completed and the corresponding check-in data displayed at the first terminal, and diversified check-in images can be displayed on the conference interface of the second terminal.
In an example, the check-in processing results may be divided into two types; that is, image rendering may be performed on the two types of check-in processing results to obtain two types of check-in images, which are output and displayed on the second display interface of the second terminal. For example, the first type of check-in image may be a greeting card, and the second type of check-in image may be a face image displayed on a photo wall. The greeting card may be displayed on the photo wall in a stacked or floating manner. As shown in fig. 3, the photo wall may be formed by a plurality of user photos 252 filled into the grid 25, and the greeting card 24 includes at least a user photo 241 and user data 242.
In an example, according to the check-in data, the user data corresponding to the user (such as the user's name, department, position, and the like) may be queried and used as a first type check-in processing result; for example, the first type check-in processing result is rendered in a first display form (such as a greeting card), filled into the greeting card, and then output to the second terminal. Face data corresponding to the user (such as a face avatar) may likewise be queried according to the check-in data and used as a second type check-in processing result; for example, the second type check-in processing result is rendered in a second display form (such as a photo wall), filled into the corresponding photo of the photo wall, and then output to the second terminal.
The first type check-in processing result and the second type check-in processing result may be correlated, and correspondingly, the two types of check-in images obtained by rendering them may also be correlated. For example, the first type check-in image may be displayed on the second display interface in the form of a greeting card, and the second type check-in image may be displayed on the second display interface in the form of a photo wall. One example of the correlation is as follows: as the user shown in the greeting card changes, the corresponding user avatar on the photo wall also changes in real time; when a user checks in, the corresponding avatar on the photo wall is highlighted, and users who have not checked in are shown in a display state different from the highlighted state, such as a grayed-out state.
With this method, a possible implementation of the dual-screen different display is as follows: the check-in data is displayed on the first display interface of the first terminal (such as a face recognition machine or check-in equipment with a face recognition function), and the check-in processing result (such as a check-in statistical result obtained by counting the check-in data) is displayed on the second display interface of the second terminal (such as a display device provided with a conference system, e.g., a TV). There may be one or more check-in processing results. A check-in image is obtained after image rendering of each check-in processing result, and diversified image displays are presented on the second display interface of the second terminal. For example, the check-in image is displayed on the second display interface in an image display form including at least one of floating, stacking, or embedding.
Stacking means a display form in which one layer is superimposed on another; the two layers may be of the same type, such as two windows.
Floating means a pop-up-window-like display form; the two layers need not be of the same type.
In a possible implementation manner, before the controlling of the second display interface to display the check-in image, the method further includes: displaying the check-in image in response to the number of check-in images not reaching a preset threshold.
In a possible implementation manner, before the controlling of the second display interface to display the check-in image, the method further includes: deleting a predetermined number of first check-in images in response to the number of check-in images reaching a preset threshold, for example deleting one or more first check-in images, where the number deleted may be determined according to the number of second check-in images currently to be displayed; and displaying the second check-in image on the second display interface, wherein the check-in images include a first check-in image and a second check-in image, and the first check-in image is generated earlier than the second check-in image. For example, if 3 check-in images (image A, image B, and image C) are already displayed and a new check-in image D is added, any one of image A, image B, and image C is deleted and replaced with image D, or, following the chronological order of check-in and display, image A is deleted and replaced with image D; as new check-in images continue to be added, the images continue to be updated in the same way.
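The eviction rule above can be captured in a few lines; the sketch below keeps the displayed check-in images in a bounded queue and deletes the earliest image once the preset threshold is reached. The threshold of 3 follows the example in the text; the class and method names are assumed.

```python
# Sketch of the eviction rule: displayed check-in images live in a bounded
# queue; once the preset threshold is reached, the earliest image is deleted
# to make room for a new one. Names and the threshold of 3 are assumptions.
from collections import deque


class CheckInImageBoard:
    def __init__(self, max_images: int = 3):
        self.max_images = max_images
        self.images = deque()  # earliest check-in image at the left, newest at the right

    def add(self, image_id: str) -> list:
        """Add a new check-in image, deleting the earliest one if the board is full."""
        if len(self.images) >= self.max_images:
            self.images.popleft()  # delete the first (earliest-generated) check-in image
        self.images.append(image_id)
        return list(self.images)   # current contents to show on the second display interface


board = CheckInImageBoard()
for image in ["image_A", "image_B", "image_C", "image_D"]:
    shown = board.add(image)
# shown == ["image_B", "image_C", "image_D"]: image A was deleted and replaced by image D
```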
In a possible implementation manner, the controlling of the second display interface to display the check-in image includes: starting timing after the check-in image is triggered for display; and closing the displayed check-in image in response to the timed duration reaching a preset time.
In one example, the controlling of the second display interface to display the check-in image includes: in a case where a first check-in image (such as a first greeting card) is already displayed on the second display interface, adding an m-th first check-in image according to an update of the m-th piece of check-in data, and adding the m-th first check-in image to the second display interface, where m is a positive integer greater than or equal to 2. If m is greater than n, the adding of further first check-in images is stopped, and the data content of the n-1 previously displayed first check-in images is updated according to the updated n-th piece of check-in data, where n is the upper threshold on the number of first check-in images that can be displayed and is a positive integer.
For example, after the first user checks in, the first greeting card is displayed; when the second user checks in, the display interface is updated, the data related to the second user is displayed, a second greeting card is added, and so on. If the data is updated and the screen is full, the earliest greeting card is cleared.
In one example, the controlling of the second display interface to display the check-in images includes: in a case where a first check-in image is already displayed on the second display interface, cancelling the timing of the timer after the check-in data is updated. In this case, the timer's countdown is not triggered further; the timing is cancelled.
In one example, after the adding of further first check-in images is stopped, the method further includes: restarting the timing of the timer; and removing all of the n-1 displayed first check-in images when the timed duration reaches the preset time. That is, after the timing ends, all the greeting cards are removed.
In one example, the controlling of the second display interface to display the check-in images includes: obtaining the check-in status of the user according to the update of the check-in data; and updating the display of a second check-in image (e.g., the current user's photo displayed on a photo wall) based on the check-in status of the user.
In one example, updating the display of the corresponding photo in the second check-in image according to the check-in status of the user includes: if the check-in status of the user is checked-in and the user is already displayed in the second check-in image set (such as a photo wall), highlighting the photo corresponding to the user in the set; and if the check-in status of the user is checked-in but the user is not displayed in the set, adding the photo corresponding to the user to the set and refreshing the set.
Fig. 2 is a schematic diagram of a first display interface according to an embodiment of the present disclosure. As shown in fig. 2, at the entrance 11 of a conference office area, a user attending the conference performs face recognition through a face recognition machine 12 provided in the access control system at the entrance 11, to obtain a recognition result. Check-in data (one or more pieces of check-in data) is obtained according to the recognition result. The check-in data is displayed on a first display interface of the first terminal, i.e., the face recognition machine 12. The check-in data displayed on the first display interface may be "XX, check-in successful" or "XX, check-in failed", as shown in data 131.
Fig. 3 is a schematic diagram of a second display interface according to an embodiment of the present disclosure. As shown in fig. 3, at a conference site formed by multi-person discussion groups, group discussion is carried out through a display device 23 (e.g., a TV) connected to the conference system. The multi-person discussion groups are shown as group 21 to group 22. The access control system connected to the face recognition machine 12 shown in fig. 2 performs face acquisition, face recognition, and welcome processing (including the processing logic for obtaining the check-in data and the image rendering), and then controls the output result to be displayed on the display device 23. The resulting second display interface displays multi-layer content on the same interface (the greeting card and the photo wall, with the photo wall located in a display layer below the greeting card).
It should be noted that, in the dual-screen different-display mode of the present disclosure, the check-in data is displayed on the face recognition machine 12, and the check-in processing results, presented in the form of a greeting card and a photo wall, are displayed on the display device 23 (e.g., a TV); processing results with different content are displayed in their corresponding specific forms (the greeting card form or the photo wall form). After the user checks in, the face recognition machine 12 first sends its output result to the second terminal and controls the display device 23 to pop up the greeting card 24 on the second display interface of the second terminal. The greeting card 24 includes at least a user avatar 241, and user data options such as name, department, and position are formed from at least one piece of user data 242. Then, the face recognition machine 12 sends its output result to the second terminal and controls the display device 23 to display the corresponding user photo on the photo wall of the second display interface of the second terminal. The photo wall is a background wall formed by a plurality of grid areas 25 in the second display interface. The grid areas are used to hold user photos (e.g., user avatars), such as the user photo in the greeting card. A user photo in a grid area may be non-highlighted, such as photo 252, or highlighted, such as photo 253; a highlighted photo differs from the background color of the photo wall or from the other, non-highlighted photos. The second display interface further includes a text box 251, through which the check-in statistics are displayed, for example "100 people checked in" for the conference participants.
In one example, after the greeting card is triggered to display, the timer starts timing. The displayed greeting card is closed when the timed duration reaches the preset time. For example, after the first user checks in, the first greeting card is displayed, and the timer is started and begins timing. If no second user checks in, the timed duration reaches the preset time (e.g., 3 to 20 seconds) and the first greeting card is closed.
In an example, when a greeting card is displayed on the second display interface and an update of the check-in data is obtained, a new greeting card is added to the second display interface. When the number of greeting cards reaches a preset threshold, the adding of new greeting cards is stopped, and the content of the previously displayed greeting cards is updated according to the check-in data update. For example, after the first user checks in, the first greeting card is displayed; when the second user checks in, the data is updated and a second greeting card is added, and so on until the preset threshold is reached (e.g., a threshold of 3); after the third greeting card, if the data is updated again, the earliest and/or previous greeting card is removed.
In an example, in a case where a greeting card is already displayed on the second display interface, after an update of the check-in data is obtained, the method further includes: cancelling the timing of the timer. For example, after the first user checks in, the first greeting card is displayed and the timer starts timing. If a second user checks in, the timing of the timer is cancelled. The system continues to monitor whether other users check in; if no other user checks in and the timed duration reaches a preset time (e.g., 3 seconds to 1 minute), the displayed greeting card or cards are closed.
In an example, after the adding of new greeting cards is stopped, the method further includes: restarting the timing of the timer, and removing all displayed greeting cards when the timed duration reaches the preset time. For example, after the first user checks in, the first greeting card is displayed and the timer starts timing. If a second user checks in, the timing of the timer is cancelled. The system continues to monitor whether other users check in; if other users check in, the timing is restarted, and when the timed duration reaches a preset time (e.g., 3 seconds to 1 minute), the displayed greeting card or cards are closed. When all users have finished checking in, the timing is stopped and all greeting cards are removed.
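The timer behaviour described in these examples (start timing when a greeting card is shown, cancel and restart when another user checks in, close the cards when the time is up) might be sketched as follows; the use of threading.Timer and the 10-second duration are assumptions for illustration.

```python
# Sketch of the greeting-card timer: start timing when a card is shown,
# cancel when another user checks in, close the cards when the time is up.
# threading.Timer and the 10-second duration are assumptions.
import threading


class GreetingCardTimer:
    def __init__(self, display_seconds: float, on_expire):
        self.display_seconds = display_seconds
        self.on_expire = on_expire  # callback that removes the displayed greeting cards
        self._timer = None

    def start(self) -> None:
        """(Re)start timing after a greeting card is triggered to display."""
        self.cancel()
        self._timer = threading.Timer(self.display_seconds, self.on_expire)
        self._timer.start()

    def cancel(self) -> None:
        """Cancel timing, e.g. because another user has just checked in."""
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None


# timer = GreetingCardTimer(10.0, on_expire=lambda: print("close all greeting cards"))
# timer.start()   # first user checks in: card shown, timing begins
# timer.cancel()  # second user checks in before expiry: timing is cancelled
# timer.start()   # timing restarts; the cards are cleared when it expires
```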
In one example, in a case where the check-in status of the user is checked-in and the user is already displayed on the photo wall, the photo corresponding to the user is highlighted on the photo wall; for example, if the base color of the photo wall is white or black, the photo corresponding to the user may be shown in yellow or otherwise highlighted. The specific form of highlighting is not limited; any display form that can be distinguished from the background color of the photo wall falls within the scope of the present disclosure. In this case, the photo wall does not need to be refreshed, and the current ordering of the photos on the photo wall does not need to be rearranged or shuffled.
In one example, when the check-in status of the user is checked-in but the user is not displayed on the photo wall, a photo corresponding to the user is added to the photo wall, and the photo wall is refreshed. For example, a photo corresponding to another user on the photo wall may be deleted, and the photo corresponding to the current user is filled into that position. In another example, the photo corresponding to the user may be added to an unfilled photo position on the photo wall. In this case, the photo wall needs to be refreshed, and the current ordering of the photos on the photo wall is rearranged or shuffled.
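The photo-wall update rule can be summarized in a small sketch: highlight the user's existing photo without reshuffling the wall, or, if the user is not yet on the wall, replace another user's photo at random and refresh. The class name, capacity, and data layout are assumptions.

```python
# Sketch of the photo-wall update rule: if the user is already on the wall,
# only highlight that photo (no refresh); otherwise randomly replace another
# user's photo and refresh the wall. Names and capacity are assumptions.
import random


class PhotoWall:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cells: dict = {}  # user_id -> highlighted flag (True once checked in)

    def on_check_in(self, user_id: str) -> bool:
        """Update the wall for a check-in; return True if the wall must be refreshed."""
        if user_id in self.cells:
            self.cells[user_id] = True  # highlight the existing photo; no reshuffle needed
            return False
        if len(self.cells) >= self.capacity:
            evicted = random.choice(list(self.cells))  # delete another user's photo at random
            del self.cells[evicted]
        self.cells[user_id] = True      # add the current user's photo, highlighted
        return True                     # ordering changed, so the wall is refreshed


# wall = PhotoWall(capacity=100)
# needs_refresh = wall.on_check_in("user_42")
```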
Application example:
Hardware facilities for implementing the present disclosure, such as a face recognition machine and a display device, are deployed in the aisles of a designated office area. The face recognition machine is started in a face-recognition conference check-in system; face recognition is performed on the person checking in by the face recognition machine, and after the face recognition machine successfully matches the person, the check-in data is pushed to the welcome processing logic of the face recognition machine of the first terminal (this logic may be implemented by an internal welcome module). The check-in data is processed by the welcome processing logic of the face recognition machine, the processing result and a control instruction are sent to the display device of the second terminal, and the control instruction instructs the display device to display the greeting card. The number of checked-in persons is refreshed; a timer is started after the greeting card is shown, and the greeting card is closed after the timer reaches the preset time. If a new push for a checked-in person is received within the duration of the timer, the timer is cancelled, the greeting card continues to be displayed, and the checked-in count is refreshed. When all participants have finished checking in, the greeting card is closed, the photos of the photo wall are refreshed, and the total number of checked-in persons is indicated.
Taking a check-in scenario as an example, and taking a welcome module as an example of the module that implements check-in processing in the first terminal, fig. 4 shows a flowchart of a check-in application example according to an embodiment of the present disclosure. As shown in fig. 4, the flow includes the following:
the method comprises the steps that a guest greeting module of a first terminal receives sign-in data, sign-in processing results corresponding to the sign-in data are obtained according to the sign-in data, image rendering is carried out on the sign-in processing results, a sign-in image containing a guest greeting card is obtained, and for example, the sign-in processing results obtained based on the sign-in data are filled in an interface of the guest greeting card. At this time, the check-in image may be a greeting card.
The face recognition method comprises the steps that a face of a user can be collected in response to the sign-in operation of the user through a collection module and a face recognition module of a first terminal, a collection result is obtained, the collection result is compared with a face image of a face recognition library, and a recognition result is obtained. And obtaining the check-in data according to the identification result.
And secondly, setting a timer (the timer can be one), and if the timer is not manually cancelled in the preset time, closing the greeting card.
And thirdly, if one greeting card is displayed on the interface after the greeting module of the first terminal receives the data update, canceling the timer, adding another greeting card to the interface, and repeating the processing steps of filling and displaying the sign-in processing result obtained based on the sign-in data on the interface of the greeting card.
And fourthly, if the number of the greeting cards on the interface reaches the set maximum value, removing the earliest greeting card, and repeating the processing steps of filling and displaying the sign-in processing result obtained based on the sign-in data on the interface of the greeting card.
And fifthly, after the timing is finished, clearing all the greeting cards.
And sixthly, adding the user of the last greeting card into the background photo wall. If the user is in the background photo wall, only the personal check-in state is refreshed, and the photo of the whole photo wall is not refreshed; if the current user is not in the background photo wall, one user is randomly deleted from the photo wall, and the current user is added into the background wall, so that the sequence of the photo wall is disturbed. And refreshing the photo wall.
The user photo displayed in the photo wall is another expression form of the check-in image, and after the greeting card displays the user photo, the user photo in the photo wall can be correspondingly displayed in real time.
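Tying the numbered steps together, the following condensed sketch shows how a welcome module might orchestrate the greeting cards, the timer, and the photo wall, reusing the illustrative classes from the earlier sketches; the display proxy and its methods are hypothetical and not part of the disclosure.

```python
# Condensed sketch tying the steps together, reusing the illustrative
# CheckInImageBoard, PhotoWall and GreetingCardTimer classes from the earlier
# sketches. `display` is a hypothetical proxy for the second terminal's screen.
class WelcomeModule:
    def __init__(self, display, card_timeout: float = 10.0, max_cards: int = 3):
        self.display = display                        # assumed second-terminal display proxy
        self.cards = CheckInImageBoard(max_cards)     # steps 3/4: add cards, evict the earliest
        self.wall = PhotoWall(capacity=100)           # step 6: background photo wall
        self.timer = GreetingCardTimer(card_timeout, self._close_cards)  # steps 2/5

    def on_check_in(self, user_id: str, card_id: str) -> None:
        self.timer.cancel()                               # step 3: new data cancels the timer
        self.display.show_cards(self.cards.add(card_id))  # fill and display the greeting card
        if self.wall.on_check_in(user_id):                # step 6: highlight or add to the wall
            self.display.refresh_photo_wall(self.wall.cells)
        self.timer.start()                                # restart timing for auto-dismissal

    def _close_cards(self) -> None:
        self.cards.images.clear()                         # step 5: timing ended, clear all cards
        self.display.show_cards([])
```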
It will be understood by those skilled in the art that in the method of the present invention, the order of writing the steps does not imply a strict order of execution and any limitations on the implementation, and the specific order of execution of the steps should be determined by their function and possible inherent logic.
The above-mentioned method embodiments can be combined with each other to form combined embodiments without departing from the underlying principles and logic; due to space limitations, these combinations are not described in detail in this disclosure.
In addition, the present disclosure also provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program, all of which can be used to implement the image processing method provided by the present disclosure; for the corresponding technical solutions and descriptions, reference may be made to the corresponding descriptions in the method section, which are not repeated here.
Fig. 5 illustrates a block diagram of an image processing apparatus according to an embodiment of the present disclosure, which, as illustrated in fig. 5, includes: the response unit 41 is configured to obtain check-in data in response to a check-in operation of a user, and display the check-in data on a first display interface of the first terminal; the sign-in unit 42 is configured to obtain sign-in processing results respectively corresponding to the sign-in data according to the sign-in data; and a control output unit 43, configured to output the check-in processing result to a second display interface of the second terminal.
In a possible implementation manner, the response unit is configured to: carrying out face recognition on the user to obtain a recognition result; and obtaining the check-in data according to the identification result.
In a possible implementation manner, the response unit is configured to: acquiring a face image of the user after the check-in operation of the user is triggered, to obtain an acquisition result; and comparing the acquisition result with a face image in a face recognition library to obtain the recognition result.
In a possible implementation manner, the control output unit is configured to: performing image rendering on the check-in processing result to obtain a check-in image; and outputting the check-in image, and controlling the second display interface to display the check-in image.
In a possible implementation manner, the check-in image is displayed on the second display interface in an image display form including at least one of floating, stacking, or embedding.
In a possible implementation manner, the check-in image at least includes face data.
In a possible implementation manner, the apparatus further includes a first processing unit, configured to: displaying the check-in image in response to the number of check-in images not reaching a preset threshold.
In a possible implementation manner, the apparatus further includes a first processing unit, configured to: deleting a predetermined number of first check-in images in response to the number of check-in images reaching a preset threshold; and displaying a second check-in image on the second display interface, wherein the check-in images include a first check-in image and a second check-in image, and the first check-in image is generated earlier than the second check-in image.
In a possible implementation manner, the apparatus further includes a first processing unit, configured to: starting timing after the check-in image is triggered for display; and closing the displayed check-in image in response to the timed duration reaching a preset time.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 6 is a block diagram illustrating an electronic device 800 in accordance with an example embodiment. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or the like terminal.
Referring to fig. 6, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800, the relative positioning of components, such as a display and keypad of the electronic device 800, the sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component of the electronic device 800, the presence or absence of user contact with the electronic device 800, orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. Sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), may execute the computer-readable program instructions by utilizing the state information of the computer-readable program instructions to personalize the electronic circuitry, in order to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Different embodiments of the present application may be combined with one another without departing from their logic. Each embodiment description emphasizes the aspects that distinguish it; for aspects not described in detail in one embodiment, reference may be made to the descriptions of the other embodiments.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (10)
1. An image processing method, characterized in that the method comprises:
responding to the check-in operation of a user, obtaining check-in data, and displaying the check-in data on a first display interface of a first terminal;
obtaining check-in processing results respectively corresponding to the check-in data according to the check-in data;
and outputting the check-in processing result to a second display interface of a second terminal.
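As an illustration only (not part of the claims), the following Python sketch shows one way the flow of claim 1 could be wired together; the `CheckInData` record, the `first_display`/`second_display` objects and the derived status field are assumptions introduced for the example.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CheckInData:
    """Illustrative check-in record (field names are assumptions)."""
    user_id: str
    name: str
    timestamp: datetime

def handle_check_in(user_id: str, name: str, first_display, second_display) -> dict:
    # Obtain check-in data in response to the user's check-in operation.
    data = CheckInData(user_id=user_id, name=name, timestamp=datetime.now())

    # Display the check-in data on the first display interface (first terminal).
    first_display.show(f"{data.name} checked in at {data.timestamp:%H:%M:%S}")

    # Derive a check-in processing result from the data (a status dict here,
    # purely as an assumption) and output it to the second display interface.
    result = {"user_id": data.user_id, "name": data.name, "status": "checked_in"}
    second_display.show(result)
    return result
```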
2. The method of claim 1, wherein obtaining check-in data in response to a check-in operation by a user comprises:
carrying out face recognition on the user to obtain a recognition result;
and obtaining the check-in data according to the identification result.
3. The method according to claim 2, wherein the performing face recognition on the user to obtain a recognition result comprises:
acquiring a face image of the user after the check-in operation of the user is triggered, to obtain an acquisition result;
and comparing the acquisition result with a face image in a face recognition library to obtain the recognition result.
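By way of a hedged illustration of claims 2 and 3, the sketch below compares a captured face embedding against a face recognition library using a plain Euclidean distance; the embedding representation, the 0.6 threshold and the dictionary-based library are assumptions and are not taken from the patent.

```python
from typing import Dict, Optional
import numpy as np

def recognize_user(captured_embedding: np.ndarray,
                   face_library: Dict[str, np.ndarray],
                   threshold: float = 0.6) -> Optional[str]:
    """Return the user_id whose stored embedding is closest to the captured
    one, or None if no entry is within the distance threshold."""
    best_id: Optional[str] = None
    best_dist = float("inf")
    for user_id, stored in face_library.items():
        dist = float(np.linalg.norm(captured_embedding - stored))
        if dist < best_dist:
            best_id, best_dist = user_id, dist
    return best_id if best_dist <= threshold else None

def check_in_data_from_recognition(user_id: Optional[str]) -> Optional[dict]:
    # Turn the recognition result into check-in data (the shape is an assumption).
    if user_id is None:
        return None
    return {"user_id": user_id, "status": "recognized"}
```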
4. The method according to any one of claims 1-3, wherein the outputting the check-in processing result to a second display interface of a second terminal comprises:
performing image rendering on the check-in processing result to obtain a check-in image;
and outputting the check-in image, and controlling the second display interface to display the check-in image.
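As one possible reading of claim 4, the sketch below renders a check-in processing result into a small card image with Pillow; the card size, layout and colours are assumptions made for the example.

```python
from PIL import Image, ImageDraw

def render_check_in_image(name: str, timestamp: str, face_crop: Image.Image) -> Image.Image:
    """Render a check-in result into an image the second display interface can show."""
    card = Image.new("RGB", (320, 120), color=(245, 245, 245))   # blank card background
    card.paste(face_crop.resize((100, 100)), (10, 10))           # face thumbnail
    draw = ImageDraw.Draw(card)
    draw.text((120, 35), name, fill=(0, 0, 0))                   # user name (default font)
    draw.text((120, 65), timestamp, fill=(90, 90, 90))           # check-in time
    return card
```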
5. The method of claim 4, wherein the check-in image is displayed on the second display interface in an image display format comprising at least one of floating, cascading, or embedding.
6. The method of claim 4, wherein the check-in image includes at least face data.
7. The method of any one of claims 4-6, wherein, prior to displaying the check-in image on the second display interface, the controlling further comprises:
displaying the check-in image in response to the number of the check-in images not reaching a preset threshold.
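Claims 5-7 constrain how the check-in image is shown; the sketch below is one hedged interpretation in which the display mode is a simple flag and the preset threshold caps how many images are on screen. The mode names map to the floating/cascading/embedded formats of claim 5, while the cap value of 12 and the `second_display` interface are assumptions.

```python
from enum import Enum
from typing import List

class DisplayMode(Enum):
    FLOATING = "floating"
    CASCADING = "cascading"
    EMBEDDED = "embedded"

def maybe_display(displayed: List[object], new_image, second_display,
                  mode: DisplayMode = DisplayMode.FLOATING,
                  preset_threshold: int = 12) -> bool:
    """Display the new check-in image only while the number of images already
    shown has not reached the preset threshold; returns True if displayed."""
    if len(displayed) >= preset_threshold:
        return False
    displayed.append(new_image)
    second_display.show(new_image, mode=mode.value)  # second_display is an assumed interface
    return True
```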
8. An image processing apparatus, characterized in that the apparatus comprises:
the response unit is used for responding to the check-in operation of the user, obtaining check-in data and displaying the check-in data on a first display interface of the first terminal;
the check-in unit is used for obtaining check-in processing results respectively corresponding to the check-in data according to the check-in data;
and the control output unit is used for outputting the check-in processing result to a second display interface of a second terminal.
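Purely as an illustration of how the three units of claim 8 might cooperate, the sketch below wires them together; the unit interfaces (`handle`, `process`, `output`) are assumptions, not defined by the patent.

```python
class ImageProcessingApparatus:
    """Sketch of the claimed apparatus composed of three cooperating units."""

    def __init__(self, response_unit, check_in_unit, control_output_unit):
        self.response_unit = response_unit              # obtains/displays check-in data (first terminal)
        self.check_in_unit = check_in_unit              # derives check-in processing results
        self.control_output_unit = control_output_unit  # outputs results to the second terminal

    def on_check_in(self, check_in_operation):
        data = self.response_unit.handle(check_in_operation)
        result = self.check_in_unit.process(data)
        self.control_output_unit.output(result)
        return result
```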
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: perform the method of any one of claims 1 to 7.
10. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910842126.6A CN110609933A (en) | 2019-09-06 | 2019-09-06 | Image processing method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110609933A true CN110609933A (en) | 2019-12-24 |
Family
ID=68892236
Family Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910842126.6A Pending CN110609933A (en) | 2019-09-06 | 2019-09-06 | Image processing method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110609933A (en) |
- 2019-09-06 CN CN201910842126.6A patent/CN110609933A/en active Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016191952A1 (en) * | 2015-05-29 | 2016-12-08 | 华为技术有限公司 | Method and device for displaying photo |
CN107315555A (en) * | 2016-04-27 | 2017-11-03 | 腾讯科技(北京)有限公司 | Register method for information display and device |
CN106127876A (en) * | 2016-06-23 | 2016-11-16 | 深圳市商汤科技有限公司 | A kind of multi-screen is registered system |
CN108090981A (en) * | 2017-11-17 | 2018-05-29 | 克立司帝控制系统(上海)有限公司 | Meeting signature system and method based on face recognition technology |
CN108492393A (en) * | 2018-03-16 | 2018-09-04 | 百度在线网络技术(北京)有限公司 | Method and apparatus for registering |
CN109559399A (en) * | 2018-12-18 | 2019-04-02 | 深圳市致善科技有限公司 | User registers method, system and storage medium and terminal device |
CN109903411A (en) * | 2019-01-26 | 2019-06-18 | 北方民族大学 | A kind of meeting signature device and application method based on recognition of face |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111563972A (en) * | 2020-06-01 | 2020-08-21 | 上海商汤智能科技有限公司 | Sign-in method, sign-in device, computer equipment and storage medium |
CN111627125A (en) * | 2020-06-02 | 2020-09-04 | 上海商汤智能科技有限公司 | Sign-in method, device, computer equipment and storage medium |
CN111640215A (en) * | 2020-06-04 | 2020-09-08 | 上海商汤智能科技有限公司 | Sign-in method, device, computer equipment and storage medium |
Similar Documents
Publication | Title |
---|---|
US10123196B2 | Method and device for alarm triggering |
CN105843615B | Notification message processing method and device |
CN107948708B | Bullet screen display method and device |
US10509540B2 | Method and device for displaying a message |
CN110675539B | Identity verification method and device, electronic equipment and storage medium |
CN110942036B | Person identification method and device, electronic equipment and storage medium |
CN106325674B | Message prompting method and device |
CN106919629B | Method and device for realizing information screening in group chat |
CN110992562A | Access control method and device, electronic equipment and storage medium |
CN110555930B | Door lock control method and device, electronic equipment and storage medium |
EP3754959A1 | Quick access to an application in the lock screen |
CN110609933A | Image processing method and device, electronic equipment and storage medium |
CN108924644B | Video clip extraction method and device |
CN110569777A | Image processing method and device, electronic equipment and storage medium |
CN107147815B | Call processing method and device based on taxi taking |
CN106331328B | Information prompting method and device |
CN113240702A | Image processing method and device, electronic equipment and storage medium |
CN105323152A | Message processing method, device and equipment |
CN114519441A | Conference room management method, conference room management apparatus, electronic device, storage medium, and program product |
CN111625671A | Data processing method and device, electronic equipment and storage medium |
CN107908522B | Information display method and device and computer readable storage medium |
CN107132908B | Image updating method and device |
CN106919302B | Operation control method and device of mobile terminal |
CN109151544B | Multimedia playing and displaying method and device |
US20170147593A1 | Contact managing method and apparatus, and storage medium |
Legal Events
Code | Title | Description |
---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20191224 |