CN111223220B - Image correlation method and device and server - Google Patents

Image correlation method and device and server

Info

Publication number
CN111223220B
CN111223220B (application CN201811420110.8A)
Authority
CN
China
Prior art keywords
user
image
acquiring
authentication
user account
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811420110.8A
Other languages
Chinese (zh)
Other versions
CN111223220A (en)
Inventor
王琨
李新伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201811420110.8A
Publication of CN111223220A
Application granted
Publication of CN111223220B
Legal status: Active
Anticipated expiration


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07B: TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
    • G07B11/00: Apparatus for validating or cancelling issued tickets

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses an image correlation method, an image correlation device and a server, wherein the method comprises the following steps: acquiring a field image acquired by a camera device in a venue; identifying character features in the live image to obtain a user account corresponding to the character features; and associating the live image with the corresponding user account.

Description

Image correlation method and device and server
Technical Field
The present invention relates to the field of face recognition technologies, and in particular, to an image association method, an image association apparatus, and a server.
Background
At present, in competition and performance events in industries such as culture, entertainment, and sports, spectators often show high enthusiasm and strong spending power, and have a keen interest in, and willingness to purchase, official, commemorative, customized, and personalized photographs and videos.
However, at such events it is difficult for spectators to take good photographs themselves because of photography restrictions, crowded seating, limited space, and so on. Moreover, although the event venue is a public place, no photographing service exists that meets spectators' need to be photographed themselves; for example, even if the cameras of a monitoring system are used, confirming a spectator's identity and exact position remains a problem. Therefore, it is necessary to invent a method that distributes images by recognizing the identity of a person such as a spectator.
Disclosure of Invention
An object of an embodiment of the present invention is to provide an image association method, so as to achieve the purpose of identifying the identity of a person and distributing an image.
According to a first aspect of the present invention, there is provided an image association method implemented by a server, comprising:
acquiring a field image acquired by a camera device in a venue;
identifying character features in the live image to obtain a user account corresponding to the character features;
and associating the live image with the corresponding user account.
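The three steps above can be sketched as server-side Python. This is an illustrative sketch only: the function names, the dictionary-based user database, and the exact-match lookup (a stand-in for real face recognition) are assumptions, not part of the patented implementation.

```python
def extract_features(live_image):
    # Stand-in for face recognition (step S3200): in this sketch a live
    # image is a dict carrying a precomputed feature vector.
    return live_image["features"]

def find_account(features, user_db):
    # user_db maps user_account -> enrolled feature vector.
    # Real matching would use a similarity threshold; exact match here.
    for account, enrolled in user_db.items():
        if enrolled == features:
            return account
    return None

def associate(live_image, user_db, album):
    # Steps S3100-S3300: take the acquired image, resolve the account
    # from its character features, and associate the image with it.
    features = extract_features(live_image)
    account = find_account(features, user_db)
    if account is not None:
        album.setdefault(account, []).append(live_image["id"])
    return account

user_db = {"user_42": [0.1, 0.9]}
album = {}
matched = associate({"id": "img_001", "features": [0.1, 0.9]}, user_db, album)
```

After the call, the image identifier is filed under the matched account, which is exactly the association the first aspect describes.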
Optionally, the step of identifying the character features in the live image to obtain the user account corresponding to the character features includes:
identifying character features in the live image;
searching a user database for a user image matching the character features, according to the identified character features;
and acquiring the user account bound to that user image as the user account corresponding to the character features.
Optionally, the step of searching the user database for the user image matching the character feature includes:
acquiring a shooting area corresponding to the camera device;
screening user images corresponding to the shooting area from the user database under the condition that the shooting area is a seat area of a venue to form an image candidate set;
and searching the image candidate set for the user image matched with the character characteristic.
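A minimal sketch of the candidate-set narrowing in the steps above, assuming (hypothetically) that each enrolled record stores the seat region captured at entry:

```python
def candidate_set(user_db, shooting_region):
    # Screen the user database down to users whose recorded seat region
    # matches the camera's shooting area: the image candidate set.
    return {acct: rec for acct, rec in user_db.items()
            if rec["region"] == shooting_region}

db = {
    "acct_a": {"region": 123, "features": [1.0, 0.0]},
    "acct_b": {"region": 124, "features": [0.0, 1.0]},
}
candidates = candidate_set(db, 123)
```

Feature matching then runs only against `candidates`, which both speeds up the search and reduces the chance of a false match.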
Optionally, before acquiring the live image collected by the camera device in the venue, the method further includes:
acquiring a user image of an entering user, and storing the user image of the entering user into the user database;
and establishing a binding relationship between the user image of the entering user and the corresponding user account.
Optionally, before the establishing the binding relationship between the user image of the entering user and the corresponding user account, the method further includes:
according to the user image of the entry user, creating a user account corresponding to the user image of the entry user; alternatively,
obtaining entrance voucher information of an entrance user;
and creating a user account corresponding to the entrance user according to the entrance voucher information.
Optionally, after associating the live image with the corresponding user account, the method further includes:
performing user identity authentication in response to request information, sent by a user terminal, for acquiring the live image;
acquiring, when the authentication passes, the user account corresponding to the request information;
and acquiring the live images added to the user account corresponding to the request information, and sending them to the user terminal for download.
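The request/response steps above might look like the following sketch. The token-based session check is a placeholder assumption standing in for the two concrete authentication schemes the description gives next.

```python
def authenticate(request, valid_tokens):
    # Placeholder identity check; the patent describes two concrete
    # authentication schemes (image selection and image matching).
    return request.get("token") in valid_tokens

def handle_download_request(request, valid_tokens, albums):
    # Authenticate, resolve the account from the request information,
    # and return the live images added to that account for download.
    if not authenticate(request, valid_tokens):
        return None
    return albums.get(request["account"], [])

albums = {"user_42": ["img_001", "img_007"]}
granted = handle_download_request(
    {"token": "t1", "account": "user_42"}, {"t1"}, albums)
denied = handle_download_request(
    {"token": "bad", "account": "user_42"}, {"t1"}, albums)
```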
Optionally, the step of performing user identity authentication includes:
obtaining entrance certificate information carried in the request information;
according to the entrance voucher information, searching a user image bound with the entrance voucher information in a user database as a target image;
providing a plurality of user images including the target image to the user terminal for selection; and
and performing user identity authentication according to the selection result sent by the user terminal.
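The selection-based authentication above can be sketched as a picture-picking challenge. Everything here (function names, the number of options) is an assumed illustration, not the patented procedure.

```python
import random

def make_challenge(target_image, decoy_images, k=4, rng=random):
    # Present the user's own (target) image among k-1 decoys,
    # in a shuffled order.
    options = rng.sample(decoy_images, k - 1) + [target_image]
    rng.shuffle(options)
    return options

def check_selection(options, selected_index, target_image):
    # Authentication passes only if the user picked their own image.
    return options[selected_index] == target_image

rng = random.Random(0)
options = make_challenge("me.jpg", ["d1.jpg", "d2.jpg", "d3.jpg", "d4.jpg"],
                         rng=rng)
```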
Optionally, the step of performing user identity authentication includes:
configuring the user terminal to start a camera of the user terminal to acquire a user image as an authentication image;
acquiring the authentication image provided by the user terminal;
searching a user database for a user image matched with the authentication image; and
and under the condition that the user image matched with the authentication image is found, the authentication is passed.
According to a second aspect of the present invention, there is provided an image correlation apparatus implemented by a server, including:
the on-site image acquisition module is used for acquiring an on-site image acquired by a camera device in a venue;
the account locking module is used for identifying character features in the live image to obtain a user account corresponding to the character features; and
and the image adding module is used for associating the on-site image with the corresponding user account.
According to a third aspect of the present invention, there is also provided a server, which includes the image correlation apparatus according to the second aspect of the present invention; alternatively, it comprises a processor and a memory, the memory storing instructions for controlling the processor to perform the image correlation method according to the first aspect of the invention.
The beneficial effect is that the method identifies the character features in the live image, obtains the user account corresponding to those features, and associates the live image with that account; because character features correspond to user accounts, the live image is associated with the correct user account, achieving the purposes of identifying user identities and distributing images.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a schematic diagram illustrating an exemplary embodiment of an image distribution system;
fig. 2 is a schematic diagram of a hardware structure of a ticket checking device according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a hardware configuration of an image pickup apparatus according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating a hardware configuration of a server according to an embodiment of the present invention;
fig. 5 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present invention;
FIG. 6 is a schematic flow chart of an image correlation method according to an embodiment of the present invention;
fig. 7 is a schematic flow chart of an interaction process among a ticket checking device, a camera device, a server and a user terminal according to an embodiment of the present invention;
FIG. 8 is a first schematic structural diagram of an image correlation apparatus according to an embodiment of the present invention;
FIG. 9 is a second schematic structural diagram of an image correlation apparatus according to an embodiment of the present invention;
FIG. 10 is a functional block diagram of a server according to an embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as exemplary only and not as limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be discussed further in subsequent figures.
< architecture of the entire image distribution System >
Fig. 1 is a schematic diagram illustrating an overall architecture of an image distribution system according to an embodiment of the present invention.
As shown in fig. 1, the image distribution system of the present embodiment includes a ticket checking device 1000; at least one camera device 2000 (see fig. 3), namely camera devices 2000A, 2000B, 2000C, and 2000D; a server 3000; and a terminal device 4000, all of which can communicate with one another through a wireless network 5000.
It should be understood that although fig. 1 shows only one ticket checking device 1000, four camera devices 2000, one server 3000, and one terminal device 4000, this does not limit the corresponding numbers; the image distribution system may include other numbers of each.
In this embodiment, the ticket checking device 1000 may be a device having a ticket checking function and a face recognition function, and the ticket checking device 1000 may be a face recognition machine or a face recognition gate, for example.
Fig. 2 is a schematic diagram of a hardware configuration of the ticket checking apparatus 1000 according to the present embodiment.
As shown in fig. 2, the ticket gate apparatus 1000 may include at least: a processor 1010, a face information acquisition device 1020 and a communication device 1030.
The processor 1010 is configured to detect entrance credential information of an entrance user, and send the detected entrance credential information of the entrance user to the communication device 1030.
The face information collecting device 1020 is configured to collect a user image of the entering user, and send the collected user image of the entering user to the communication device 1030 through the processor 1010.
The communication means 1030 is configured to transmit the entrance credential information of the entrance user and the user image of the entrance user to the server 3000 via the network 5000.
In the present embodiment, the image pickup device 2000 may be a video camera.
Fig. 3 is a schematic diagram of the hardware configuration of the image pickup apparatus 2000 according to the present embodiment.
As shown in fig. 3, the image pickup apparatus 2000 includes at least a front end module 2010, a rear end module 2020, a peripheral circuit module 2030, and a transfer operation module 2040. The front end module 2010 is configured to capture an external image and send the external image to the back end module 2020. The back end module 2020 is configured to perform image processing on the external image acquired by the front end module 2010. The peripheral circuit module 2030 is mainly responsible for communication, memory read/write, video output, and interface operation. The transmission operation module 2040 is mainly responsible for operation control of peripheral components.
In one embodiment, the front-end module 2010 may include at least a lens group 2011, a filter 2012, an image sensor 2013, a Timing Generator (TG) 2014, a Correlated Double Sampling (CDS) controller 2015, an Automatic Gain Control (AGC) 2016, and an Analog-to-Digital (A/D) converter 2017.
The image sensor 2013 includes at least structures such as on-chip micro-lenses, color filters, and internal lenses (not shown in the figure).
The image sensor 2013 may be a Complementary Metal Oxide Semiconductor (CMOS) sensor. Although CMOS has the advantages of high speed, low power consumption, a simple process, and low cost, it suffers from higher noise and poorer imaging quality.
The image sensor 2013 may instead be a Charge Coupled Device (CCD). Although a CCD has the disadvantages of higher power consumption, a complex manufacturing process, and higher cost, it offers high resolution, good imaging quality, low noise, and a high signal-to-noise ratio, so a CCD is the more suitable choice for the image sensor 2013.
The lens group 2011 includes at least lenses 2011A, 2011B, and 2011C, with an aperture provided inside the lens. The image capturing device 2000 employs a long-focal-length lens with a small aperture coefficient, where the aperture coefficient is the ratio of the lens focal length to its effective aperture.
The angle of view (FOV) of the long-focus lens is 10° to 30° and its focal length is 8 mm to 30 mm, so that the image pickup device 2000 can clearly photograph an area 10 to 50 meters away.
The aperture controls the luminous flux passing through the lens. Because the aperture coefficient of the image pickup device 2000 is small, its aperture is large, so more light reaches the image sensor 2013, which helps when shooting dark scenes.
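The aperture-coefficient relation used in the two paragraphs above is just a ratio; the numeric values below are illustrative only, not taken from the patent:

```python
def aperture_coefficient(focal_length_mm, effective_aperture_mm):
    # Aperture coefficient (f-number) = lens focal length / effective aperture.
    return focal_length_mm / effective_aperture_mm

# For a fixed focal length, a larger effective aperture gives a smaller
# coefficient, i.e. more light reaches the image sensor.
wide_open = aperture_coefficient(28.0, 20.0)
stopped_down = aperture_coefficient(28.0, 10.0)
```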
In one embodiment, the back-end module 2020 is a System-on-a-Chip (SoC) built around a Digital Signal Processing (DSP) chip 2021 and a control chip 2022. The DSP chip 2021 is responsible for digital image processing, such as, but not limited to, image color processing, image resolution conversion, JPEG (Joint Photographic Experts Group) encoding/decoding, and MPEG-4 (Moving Picture Experts Group 4) encoding/decoding, while the control chip 2022 controls the operation of the entire image capturing device 2000.
In one embodiment, the peripheral circuit module 2030 is mainly responsible for communication, memory read/write, video output, and interface operations, and after the image pickup apparatus 2000 is connected to the server 3000, information in the server 3000 can be read through the peripheral circuit module.
In one embodiment, the transmission operating module 2040 is controlled by a Micro Controller Unit (MCU), and the transmission operating module 2040 is mainly responsible for operating and controlling peripheral components such as, but not limited to, a lens group motor driving circuit 2041, a zoom motor 2042, a focus motor 2043, a flash lamp 2044, an infrared fill lamp 2045, and a base steering engine 2046.
The image capturing apparatus 2000 may adopt a Back-Light Compensation (BLC) mode, i.e., backlight compensation and backlight correction, to adjust the overall brightness of the image and thereby improve the imaging of the main subject area.
The image capturing device 2000 may also adopt a Wide Dynamic Range (WDR) mode during shooting. The WDR mode is suitable for scenes whose contrast is too large; with it, the image capturing device 2000 can capture both foreground and background objects clearly. Generally, there are two methods for implementing the WDR mode: one is to use a double-speed CCD to expose different parts of the scene for different durations; the other is to use a DSP to control the exposure time of each pixel independently.
For special scenes in which the seating area is dimly lit, the camera device 2000 may also use a night mode. In night mode, the infrared fill lamp 2045 of the camera device 2000 is activated; because the infrared fill lamp 2045 has its own lens (not shown in the figure), the camera device 2000 can focus according to the distance between the spectators' seats and the camera, concentrating the infrared light on the shooting area as much as possible for a better shooting effect. In addition, because the wavelength of the infrared light is beyond the visible range of the human eye, users cannot perceive it, so the spectators' viewing experience is not affected.
Concerts generally involve various stage lighting equipment. When a bright lighting effect enters the shooting range, the camera device 2000 can automatically enable its highlight-suppression function, reducing the automatic iris's sensitivity to highlights so that the picture darkens quickly, achieving the highlight-suppression effect.
In this embodiment, the server 3000 may be a unitary server or a distributed server across multiple computers or computer data centers. The server may be of various types, such as, but not limited to, a web server, a news server, a mail server, a message server, an advertisement server, a file server, an application server, an interaction server, a database server, or a proxy server. In some embodiments, the server may include hardware, software, or embedded logic components or a combination of two or more such components for performing the appropriate functions supported or implemented by the server.
Fig. 4 is a schematic diagram of the hardware configuration of the server 3000 according to the present embodiment.
As shown in fig. 4, the server 3000 may include one or more memories 3400 and one or more processors 3300 and a communications device 3200.
The memory 3400 may include, for example, a ROM (read only memory), a RAM (random access memory), a nonvolatile memory such as a hard disk, or the like.
The processor 3300 may be a desktop processor, a server processor, a mobile version processor, or the like.
The communication device 3200 is used for the server 3000 to realize communication connection with the image pickup devices 2000A, 2000B, 2000C, and 2000D and the terminal device 4000 via the network 5000.
In one embodiment, the image correlation apparatus 3100 of any embodiment of the invention is implemented by the processor 3300, wherein the image correlation apparatus 3100 is configured to implement the image correlation method of any embodiment of the invention.
The memory 3400 is used for storing instructions for controlling the processor 3300 to operate so as to execute the image correlation method according to the embodiment of the present invention, and those skilled in the art may design the instructions according to the technical solutions disclosed in the present invention. How the instructions control the operation of the processor 3300 is well known in the art, and thus embodiments of the present invention are not described in detail herein.
In this embodiment, the terminal device 4000 is an electronic device having a communication function and a service processing function. The terminal device 4000 may be a mobile terminal such as a mobile phone, a laptop, a tablet, a palmtop, etc.
Fig. 5 is a schematic diagram of a hardware configuration of the terminal device 4000 according to the present embodiment.
As shown in fig. 5, the terminal equipment 4000 may comprise a processor 4010, a memory 4020, an interface device 4030, a communication device 4040, a display device 4050, an input device 4060, a speaker 4070, a microphone 4080, and so on. The processor 4010 may be a central processing unit (CPU), a microcontroller (MCU), or the like. The memory 4020 includes, for example, a ROM (read-only memory), a RAM (random access memory), and nonvolatile memory such as a hard disk. The interface device 4030 includes, for example, a USB interface, a headphone interface, and the like. The communication device 4040 can perform wired or wireless communication. The display device 4050 is, for example, a liquid crystal display panel or a touch panel. The input device 4060 may include, for example, a touch screen or a keyboard. The user can input and output voice information through the speaker 4070 and the microphone 4080.
The terminal equipment shown in fig. 5 is only illustrative and in no way implies any limitation of the invention, its application or use. In the embodiment of the present invention, the memory 4020 of the terminal device 4000 is configured to store an instruction for controlling the processor 4010 to perform the above operation. It should be understood by those skilled in the art that although a plurality of devices are shown for the terminal apparatus 4000 in fig. 5, the present invention may only relate to some of the devices, for example, the processor 4010 and the memory 4020.
In this embodiment, the network 5000 encompasses any suitable wireless network, such as but not limited to 4G networks, 3G networks, GPRS, Wi-Fi, and the like. The network coupling the ticket checker 1000 and the server 3000 and the network coupling the server 3000 and the camera 2000 may be the same network or different networks.
< method examples >
FIG. 6 is a schematic flow chart diagram of an image correlation method according to one embodiment.
Referring to fig. 6, the image association method of the present embodiment is implemented by the image association apparatus 3100, and the image association method of the present embodiment may include steps S3100 to S3300:
in step S3100, the image correlation device 3100 acquires a live image captured by the imaging device 2000 in the venue.
The venue includes at least a seating area and a non-seating area, and the seating area includes at least one seat region. A corresponding camera device 2000 may be allocated to each seat region, or one camera device 2000 may be allocated to several seat regions. The non-seating area also includes at least one camera device 2000.
The live image includes at least one of a still image and a video.
In this embodiment, a camera device 2000 in the seating area or in the non-seating area may capture a live image or live video and transmit it to the image association apparatus 3100 through wireless communication.
In step S3200, the image correlation apparatus 3100 identifies a character feature in the live image, and obtains a user account corresponding to the character feature.
The character features include at least one of a character face feature and a character dress feature.
In this embodiment, the image correlation apparatus 3100 may identify human features in live images by using a face recognition technique.
In this embodiment, the image correlation apparatus 3100 recognizing the human feature in the live image in step S3200, and obtaining the user account corresponding to the human feature may include the following steps:
in step S3210, the video correlation apparatus 3100 identifies a character in the live video.
In one example, where the live video is an image, the video correlation device 3100 may identify human features in the single frame image.
In another example, when the live video is a video, since the video is composed of a plurality of single-frame images, the video correlation apparatus 3100 may extract only one of the single-frame images in the video and recognize the character of the single-frame image, or may extract a plurality of frame images in the video and recognize the character of each of the plurality of frame images.
In step S3220, the image association apparatus 3100 searches the user database, according to the identified character features, for a user image matching those features.
The user database stores at least the user images of all entering users.
In this embodiment, a similarity threshold may be set, and by comparing the similarity between the character feature and the user image in the user database, if the similarity is greater than or equal to the similarity threshold, it indicates that the user image in the user database is the user image matched with the character feature.
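The threshold comparison described above can be sketched with a cosine similarity over feature vectors. The metric and the toy vectors are assumptions for illustration; the patent does not fix a particular similarity measure.

```python
def cosine_similarity(a, b):
    # Standard cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def match_user_images(features, user_db, threshold):
    # Return accounts whose enrolled user image is at least
    # `threshold` similar to the recognized character features.
    return [acct for acct, enrolled in user_db.items()
            if cosine_similarity(features, enrolled) >= threshold]

db = {"u1": [1.0, 0.0], "u2": [0.0, 1.0]}
hits = match_user_images([0.9, 0.1], db, threshold=0.9)
```

Raising the threshold trades missed matches for fewer recognition errors, which is exactly the trade-off behind the two thresholds discussed next.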
(1) In one example, when the live image is captured by a camera device 2000 in the non-seating area of the venue, the similarity threshold may be a first threshold. In step S3220, the image association apparatus 3100 may search the user database for a user image matching the character features as follows: based on the identified character features, it searches all user images stored in the user database directly for a user image whose similarity to the character features is greater than or equal to the first threshold.
Since, in the non-seating area, it is difficult for the camera device 2000 to obtain identifying features of a user other than facial and clothing features, the first threshold may be set to a relatively high value to avoid recognition errors.
In this example, when the live image is a still image, the apparatus 3100 identifies character features in the single frame, matches them against the user images in the user database, and finds the user image whose similarity to the character features is greater than or equal to the first threshold.
When the live image is a video, the apparatus 3100 may extract one frame, identify its character features, and match them against the user database in the same way; or it may extract multiple frames, identify the character features of each, match each set of features against the user database, and find the user images whose similarity to the corresponding character features is greater than or equal to the first threshold.
(2) In another example, when the live image is captured by a camera device 2000 in the seating area of the venue, the similarity threshold may be a second threshold, and step S3220, in which the image association apparatus 3100 searches the user database for a user image matching the character features, may further include the following steps:
in step S3221, the video relating device 3100 acquires an imaging area corresponding to the imaging device 2000.
In one example, the step S3221 of acquiring, by the video association apparatus 3100, the photographing region corresponding to the image capturing apparatus 2000 may include the steps of:
in step S3221-1, the image distribution device 3100 acquires position information of the imaging device 2000.
In an actual performance event, the seating area in the venue is fixed. Therefore, the stage or playing field of the venue can be taken manually as the origin. Considering that the X axis of a Cartesian coordinate system conventionally points to the right, and that maps are drawn with north up and east to the right, the due-east direction is taken as the X axis, the due-north direction as the Y axis, and the vertical upward direction as the Z axis. The position coordinates C(X, Y, Z) of each camera device 2000 and the position coordinates S(x, y, z) of each seat can then be determined and recorded into the image association apparatus 3100. Of course, rather than relying on manual entry, the apparatus 3100 may also obtain the position coordinates of each camera device 2000 and each seat automatically.
For example, suppose there are two camera devices 2000 in total, and the position coordinates of camera device No. 1, C01, are represented as C01(X01, Y01, Z01). If camera device C01 lies 50.00 meters east, 60.00 meters north, and 31.23 meters above the origin of the Cartesian coordinate system, and position coordinates are recorded to centimeter precision, then C01(X01, Y01, Z01) = (5000, 6000, 3123).
For another example, suppose the seat-area number in a venue has at most 3 digits, the row number has at most 2 digits, and the seat number has at most 3 digits. The position coordinates of seat No. 056 in row 04 of seat area 123, denoted S12304056, can be represented as S12304056(x12304056, y12304056, z12304056). If the position of seat S12304056 relative to the origin of the Cartesian coordinate system is 34.56 meters to the east, 23.45 meters to the north, and 12.34 meters up, and the position coordinates are accurate to centimeters, then the specific position coordinates of seat S12304056 are S12304056(x12304056, y12304056, z12304056) = (3456, 2345, 1234).
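The meter-to-centimeter conversion in the two examples above can be sketched as follows; the function name and tuple representation are illustrative assumptions, not part of the patent.

```python
# Illustrative sketch: convert surveyed offsets from the stage origin
# (meters east / north / up) into the centimeter-precision integer
# coordinates described above. X = due east, Y = due north, Z = vertical.
def to_cm_coords(east_m: float, north_m: float, up_m: float) -> tuple[int, int, int]:
    """Round each axis to the nearest centimeter."""
    return (round(east_m * 100), round(north_m * 100), round(up_m * 100))

c01 = to_cm_coords(50.00, 60.00, 31.23)      # camera C01 -> (5000, 6000, 3123)
s_seat = to_cm_coords(34.56, 23.45, 12.34)   # seat S12304056 -> (3456, 2345, 1234)
```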
In step S3221-2, the image correlation apparatus 3100 obtains the shooting area corresponding to the image capturing apparatus 2000 based on the position information.
Since the shooting area of each image capturing apparatus 2000 is determined once the apparatuses are installed in the venue, the image correlation apparatus 3100 can, after acquiring the position information of an image capturing apparatus 2000, obtain the shooting area covered by that apparatus based on its position information.
In this embodiment, in the case that the shooting area is a seating area of the venue, the obtaining, by the image correlation apparatus 3100 in step S3221-2, of the shooting area corresponding to the image capturing apparatus 2000 based on the position information may further include: the image correlation apparatus 3100 obtains the seat number information included in the shooting area corresponding to the image capturing apparatus 2000 based on the position information.
In another example, for an image capturing apparatus 2000 mounted on a pan/tilt head, the shooting direction can be adjusted at any time via the pan/tilt head. In this case, the image correlation apparatus 3100 may acquire the shooting area corresponding to the image capturing apparatus 2000 as follows: the image correlation apparatus 3100 acquires a shooting parameter of the image capturing apparatus 2000, the shooting parameter indicating the shooting angle of the apparatus, and then obtains the shooting area corresponding to the apparatus based on the shooting angle and the position information of the image capturing apparatus 2000.
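For the pan/tilt case, the angle-based selection might look like the following sketch; the 2-D bearing test, the field-of-view parameter, and all names are assumptions rather than the patent's exact geometry.

```python
import math

# Illustrative sketch for a pan/tilt camera: from the camera's position,
# its current pan angle, and an assumed horizontal field of view, select
# the seats whose bearing from the camera lies inside the shooting area.
def seats_in_view(cam_xy, pan_deg, fov_deg, seats):
    """seats maps seat number -> (x, y) position; returns visible seat numbers."""
    visible = []
    for seat_no, (sx, sy) in seats.items():
        bearing = math.degrees(math.atan2(sy - cam_xy[1], sx - cam_xy[0]))
        diff = (bearing - pan_deg + 180) % 360 - 180   # smallest signed angle
        if abs(diff) <= fov_deg / 2:
            visible.append(seat_no)
    return visible

seats = {"123-04-056": (3456, 2345), "123-04-057": (3556, 2345)}
visible = seats_in_view((5000, 6000), pan_deg=-110.0, fov_deg=60.0, seats=seats)
```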
In step S3222, when the shooting area is a seating area of the venue, the image correlation apparatus 3100 screens the user images corresponding to the shooting area from the user database to form an image candidate set.
Users in a seating area of the venue hold admission credentials. An admission credential may be an admission ticket, and the admission ticket includes at least seat number information; the admission credential information can therefore be the seat number information, and a binding relationship is established between the admission credential information and the user image.
In this embodiment, when the shooting area is a seating area of the venue, the image correlation apparatus 3100 screens, from the user database, the user images corresponding to the acquired seat number information included in the shooting area, to form the image candidate set.
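The screening of step S3222 amounts to a simple filter over the user database; the dict-based database and all names below are assumptions for illustration.

```python
# Illustrative sketch of step S3222: keep only the user images whose bound
# seat number lies inside the camera's shooting area.
def build_candidate_set(user_db, area_seats):
    """user_db maps seat number -> user image; area_seats is the shooting area."""
    return {seat: img for seat, img in user_db.items() if seat in area_seats}

user_db = {"123-04-056": "img_a.jpg", "123-04-057": "img_b.jpg", "200-01-001": "img_c.jpg"}
candidates = build_candidate_set(user_db, {"123-04-056", "123-04-057"})
```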
In step S3223, the video correlation apparatus 3100 searches the image candidate set for a user image matching the human feature.
The user image that matches the character feature may be a user image whose degree of similarity is greater than or equal to a second threshold value.
Since the position of a user in the seating area is relatively fixed and therefore easier to determine, the second threshold may be set to a lower value, generally smaller than the first threshold.
In one example, when the live footage is a single image, the image correlation apparatus 3100 identifies the character features in that single-frame image and matches them against the user images in the image candidate set to find a user image whose similarity to the character features is greater than or equal to the second threshold.
In another example, when the live footage is a video, the image correlation apparatus 3100 may extract one frame from the video, identify the character features of that single frame, match them against the user images in the image candidate set, and find the user image whose similarity is greater than or equal to the second threshold. Alternatively, it may extract multiple frames from the video, identify the character features of each frame, match the character features of each frame against the user images in the image candidate set, and find the user images whose similarity to the corresponding character features is greater than or equal to the second threshold.
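The threshold-based matching of step S3223 can be sketched as follows; cosine similarity is an assumed metric, since the patent does not prescribe one, and all names are illustrative.

```python
# Illustrative sketch of step S3223: compare a character feature vector
# against each candidate user image's feature vector and return the seats
# whose similarity reaches the second threshold.
def match_features(char_feat, candidates, second_threshold):
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb)
    return [seat for seat, feat in candidates.items()
            if cosine(char_feat, feat) >= second_threshold]

matches = match_features((1.0, 0.0), {"123-04-056": (1.0, 0.1), "123-04-057": (0.0, 1.0)}, 0.9)
```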
In step S3230, the video correlation apparatus 3100 acquires a user account bound to the user image as a user account corresponding to the character feature.
In this embodiment, one user image corresponds to one user account, and the user image and the corresponding user account are bound, and when the user image in the user database matches the character features, the video association apparatus 3100 determines that the user image and the character features are images of the same person, so that the video association apparatus 3100 acquires the user account to which the user image is bound, and uses the user account as the user account corresponding to the character features.
In step S3300, the image correlation device 3100 correlates the live image with the corresponding user account.
In this embodiment, the image correlation device 3100 correlates the live image with the corresponding user account.
Therefore, after the live image acquired by the camera device 2000 in the venue is acquired, the method of the embodiment can identify the character features in the live image, obtain the user account corresponding to the character features, and associate the live image with the corresponding user account.
In one embodiment, before the image correlation device 3100 acquires the live image captured by the camera 2000 in the venue, the image correlation method of the present invention may further include the steps of:
in step S3400, the video correlation apparatus 3100 acquires a user image of the incoming user, and stores the user image of the incoming user in the user database.
The user images of the incoming users are provided with numbers in the user database, which may be the order in which the user images are stored in the user database.
In this embodiment, when the entering user passes through the ticket checking device 1000, the face information collecting device 1020 in the ticket checking device 1000 collects a user image of the entering user and sends it to the image correlation apparatus 3100 via wireless communication. The image correlation apparatus 3100 stores the received user image in the user database, so that the user database can hold the user images of all entering users.
In step S3500, the video correlation apparatus 3100 establishes a binding relationship between the user image of the entry user and the corresponding user account.
In this embodiment, the video correlation apparatus 3100 is capable of storing the user image of the incoming user in the corresponding user account, so that only the user image of the corresponding incoming user is stored in each user account.
According to the method of the embodiment, the image correlation device can establish a user database according to the user image of the incoming user, establish the binding relationship between the user image and the corresponding account, and further implement the image correlation method according to any embodiment based on the established user database and the binding relationship between the user image and the corresponding account.
In an embodiment, before the video association apparatus 3100 establishes the binding relationship between the user image of the entering user and the corresponding user account, the video association apparatus 3100 may create the user account of the entering user according to the following step S3600, or may create the user account of the corresponding entering user according to the following steps S3700 to S3800, and therefore, the video association method of the present invention may further include the following steps:
in step S3600, the video correlation apparatus 3100 creates a user account corresponding to the user image of the user who enters the venue, based on the user image of the user who enters the venue.
In this embodiment, the user account may be a combination of the event name and the number from step S3400, with the event name and the number separated by a special character to distinguish them; the special character is, for example but not limited to, "+". In step S3600, the creating, by the image correlation apparatus 3100, of a user account corresponding to the user image of the entering user may be: when the entering user has no admission credential, the image correlation apparatus 3100 creates a user account corresponding to the user image of the entering user according to the user image acquired in step S3400.
In another embodiment, the user account of the entry user may be the same as the entry credential information. The admission voucher can be an admission ticket, and the admission ticket at least comprises seat number information, so that the admission voucher information can be the seat number information. For example, when the admission ticket is an admission ticket, the seat number information is: number 056 row 04 of 123 seats, and at this time, the user account is: 123 seats area 04 row No. 056.
In this embodiment, when the admission user has admission credentials, a user account corresponding to the admission user may be created according to the following steps S3700 to S3800.
In step S3700, the video correlation apparatus 3100 acquires entry credential information of the entry user.
In this embodiment, when the entering user passes through the ticket checking device 1000, the processor 1010 in the ticket checking device 1000 detects the entrance ticket of the entering user, acquires the seat number information in the entrance ticket, and transmits the seat number information to the image correlation device 3100 through a wireless communication method.
In step S3800, the video correlation apparatus 3100 creates a user account corresponding to the entry user, according to the entry credential information.
In this embodiment, when the admission user has an admission ticket, the video correlation apparatus 3100 creates a user account corresponding to the admission user according to the admission ticket information of the admission user acquired in step S3700.
According to the method of the embodiment, when the admission user does not have the admission voucher, the image correlation device 3100 can create a corresponding user account for the admission user according to the user image of the admission user, and when the admission user has the admission voucher, the image correlation device 3100 can create a corresponding user account for the admission user according to the admission voucher information of the admission user, so that different account creating methods are provided for the user, and the user requirements are met.
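The two account-naming schemes above can be sketched as follows; the function names and the event name are assumptions, while the "+" separator and the seat-number format follow the examples given earlier.

```python
# Illustrative sketch of the two account-creation paths: event name plus
# storage number separated by a special character when there is no admission
# credential (steps S3400/S3600), or the seat number information itself when
# there is one (steps S3700-S3800).
def account_from_image(event_name: str, image_number: int, sep: str = "+") -> str:
    """Account for a user without an admission credential."""
    return f"{event_name}{sep}{image_number}"

def account_from_ticket(area: str, row: str, seat: str) -> str:
    """Account for a user whose ticket carries seat number information."""
    return f"{area} seats area {row} row No. {seat}"

acct_a = account_from_image("SpringConcert", 17)   # "SpringConcert+17"
acct_b = account_from_ticket("123", "04", "056")   # "123 seats area 04 row No. 056"
```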
In an embodiment, after the image association apparatus 3100 associates the live image with the corresponding user account, the image association method of the present invention may further include the following steps:
in step S3900, the image correlation apparatus 3100 performs user identity authentication in response to request information for acquiring a live image sent by the user terminal 4000.
The request information may carry the admission voucher information, specifically, the seat number information.
In this embodiment, the event host may, for example, use live advertising, an official account, a service account, or an application to guide the entering user to input, at the user terminal 4000, the seat number information of the user's seat in the venue. After receiving the seat number information, the user terminal 4000 sends request information for acquiring the live image to the image correlation apparatus 3100, and the image correlation apparatus 3100 performs user identity authentication after receiving the request information.
(1) In step S3900, the performing, by the image correlation apparatus 3100, of the user identity authentication may include the following steps:
in step S3910, the video correlation apparatus 3100 acquires the entry certificate information carried in the request information.
The admission ticket information may be seat number information.
In step S3920, the video correlation apparatus 3100 searches a user database for a user image bound to the admission voucher information as a target image according to the admission voucher information.
In step S3930, the video correlation apparatus 3100 provides a plurality of user images including the target image to the user terminal 4000 for selection.
In step S3940, the video relating apparatus 3100 performs user authentication based on the selection result sent from the user terminal 4000.
In step S3940, the plurality of user images including the target image may be displayed on the display interface of the user terminal 4000, and the user selects one image from them. If the user selects the target image, this indicates that the user is the holder of the admission credential, and the user identity authentication passes.
The user identity authentication method of steps S3910 to S3940 applies when an admission credential exists and the image correlation apparatus 3100 created the user account for the entering user according to the admission credential information. After the entering user sends request information for acquiring the live image to the image correlation apparatus 3100 through the user terminal 4000, the image correlation apparatus 3100 can acquire the admission credential information in the request information, provide the user image bound to that admission credential information together with other user images to the user terminal, and pass the identity authentication when the target image is selected.
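The selection-based authentication of steps S3910 to S3940 can be sketched as a challenge built from the bound image plus decoys; all names and the decoy count are assumptions.

```python
import random

# Illustrative sketch of steps S3910-S3940: look up the user image bound to
# the seat number carried in the request, mix it with decoy images, and pass
# authentication only if the terminal user selects the bound target image.
def build_challenge(user_db, seat_no, decoys, k=3):
    """Return the target image and a shuffled option list containing it."""
    target = user_db[seat_no]
    options = [target] + random.sample(decoys, k)
    random.shuffle(options)
    return target, options

def selection_authenticates(target, selected):
    """Authentication passes only when the selected image is the target."""
    return selected == target
```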
(2) In step S3900, the performing, by the image correlation apparatus 3100, of the user identity authentication may alternatively include the following steps:
in step S3950, the video correlation apparatus 3100 configures the user terminal 4000 to start a camera of the user terminal 4000 to capture a user image, which is used as an authentication image.
The user terminal 4000 is provided with a camera.
In step S3950, after the image correlation apparatus 3100 responds to the request information for acquiring the live image sent by the user terminal 4000, the user terminal 4000 is configured to start the camera of the user terminal 4000, and the camera of the user terminal 4000 takes a picture of the user to obtain the user image.
In step S3960, the video correlation apparatus 3100 acquires an authentication image provided by the user terminal 4000.
In step S3960, after the user terminal 4000 collects the user image, the user image is used as an authentication image and the authentication image is transmitted to the video relating apparatus 3100 through a wireless communication method.
In step S3970, the video correlation apparatus 3100 searches the user database for a user image matching the authentication image.
In step S3980, the video correlation apparatus 3100 passes the authentication when finding the user image matching the authentication image.
The user identity authentication method of steps S3950 to S3980 applies when no admission credential exists and the image correlation apparatus 3100 created the user account according to the user image of the entering user. After the entering user sends request information for acquiring the live image to the image correlation apparatus 3100 through the user terminal 4000 (for example, by inputting seat number information at the user terminal 4000, where the correspondence between the user account and the seat number information is unknown), the image correlation apparatus 3100 invokes the camera of the user terminal 4000 to photograph the user and obtain an authentication image, and then searches the user database for a user image matching the authentication image; if such a user image is found, the identity authentication passes.
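The image-based path of steps S3950 to S3980 can be sketched as a search over stored feature vectors; the unit-norm dot-product similarity and all names are assumptions, and a real system would extract the vectors with a face-embedding model.

```python
# Illustrative sketch of steps S3950-S3980: compare the captured
# authentication image's feature vector against every stored user image
# feature and pass the authentication if any similarity reaches the
# threshold. Feature vectors are assumed unit-normalized, so a dot
# product serves as the similarity.
def authenticate_by_image(auth_feat, user_db, threshold):
    for account, feat in user_db.items():
        similarity = sum(a * b for a, b in zip(auth_feat, feat))
        if similarity >= threshold:
            return account   # matching user image found: authentication passes
    return None              # no matching user image: authentication fails

db = {"SpringConcert+17": (1.0, 0.0), "SpringConcert+18": (0.0, 1.0)}
result = authenticate_by_image((0.95, 0.05), db, 0.9)
```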
In step S4100, the video correlation apparatus 3100 acquires the user account corresponding to the request information when the authentication is passed.
In step S4200, the image correlation apparatus 3100 acquires the live image added to the user account corresponding to the request information and transmits the acquired live image to the user terminal 4000 for downloading.
According to the method of the embodiment, after the image correlation device 3100 correlates the live image with the corresponding user account, the image correlation device 3100 can respond to the request information for acquiring the live image sent by the user terminal 4000, and after the user identity authentication is passed, acquire the live image added to the user account corresponding to the request information and send the live image to the user terminal 4000 for downloading.
Fig. 7 is a schematic flowchart of an interaction procedure among the ticket gate apparatus 1000, the camera 2000, the image association apparatus 3100, and the user terminal 4000 according to an embodiment of the present invention, in which the image association apparatus 3100 is implemented by the server 3000.
As shown in fig. 1 and fig. 7, the interaction process of the present embodiment may include the following steps:
in step S4211, the ticket gate apparatus 1000 acquires a user image of the entering user.
In step S4212, the ticket gate apparatus 1000 transmits the acquired user image of the user who enters the house to the server 3000.
In step S4220, the server 3000 stores the user image of the user who enters the office into the user database after receiving the user image of the user who enters the office.
In step S4230, the server 3000 creates a user account corresponding to the user image of the entering user, based on the user image of the entering user.
In step S4240, the ticket gate apparatus 1000 detects entrance credential information of the entrance user and transmits the entrance credential information of the entrance user to the server 3000.
The admission ticket may be a ticket and the admission ticket information may be seat number information.
In step S4250, the server 3000 creates a user account corresponding to the entry user according to the entry credential information.
In this embodiment, step S4230 and steps S4240 to S4250 respectively show two different methods for creating a user account. In step S4230, when no admission credential exists, a user account is created for the entering user according to the user image of the entering user. In steps S4240 to S4250, when an admission credential exists, a user account is created for the entering user according to the admission credential information.
In this embodiment, there is no fixed order between the execution of steps S4211 to S4230 and the execution of steps S4240 to S4250; steps S4240 to S4250 may be executed first, followed by steps S4211 to S4230.
In step S4260, the server 3000 establishes a binding relationship between the user image of the entry user and the user account.
Step S4271, the camera 2000 in the venue collects the live image.
In step S4272, the camera 2000 in the venue transmits the captured live image to the server 3000.
The venue includes at least a seating area and a non-seating area.
In step S4272, the live image may be a live image captured by an image capturing apparatus 2000 in the seating area or a live image captured by an image capturing apparatus 2000 in the non-seating area.
In step S4280, the server 3000 receives the live image captured by the camera 2000 in the venue.
In step S4290, the server 3000 identifies character features in the live image.
In step S4300, the server 3000 directly searches the user database for a user image matching the character feature according to the identified character feature when the shooting area is the non-seat area of the venue.
In step S4310, the server 3000 acquires position information of the image pickup device 2000.
In step S4320, the server 3000 obtains a shooting area corresponding to the imaging device 2000 based on the position information.
In step S4330, when the shooting area is a seating area of the venue, the server 3000 screens user images corresponding to the shooting area from the user database to form an image candidate set.
In step S4340, the server 3000 searches the image candidate set for a user image matching the character feature.
In this embodiment, step S4300 and steps S4310 to S4340 respectively show two different methods for searching the user database for a user image matching the character features. In step S4300, when the shooting area is a non-seating area of the venue, the user database is searched directly for a user image matching the character features. In steps S4310 to S4340, when the shooting area is a seating area of the venue, the image candidate set is obtained from the user database in combination with the position information of the image capturing apparatus 2000, and the user image matching the character features is searched for within the image candidate set, which narrows the query range of the user database and improves query efficiency.
In step S4350, the server 3000 obtains the user account bound to the user image as the user account corresponding to the character feature.
In step S4360, the server 3000 associates the live image with the corresponding user account.
In step S4371, the user terminal 4000 obtains request information of the live image.
In step S4372, the user terminal 4000 sends a request message for acquiring the live image to the server 3000.
In step S4380, the server 3000 responds to the request message, and obtains the admission voucher information carried in the request message.
The admission ticket information may be seat number information.
Step S4390, the server 3000 searches, according to the admission ticket information, the user image bound to the admission ticket information in the user database as a target image.
In step S4400, the server 3000 provides a plurality of user images including the target image to the user terminal 4000 for selection.
In step S4410, the server 3000 performs user identity authentication according to the selection result sent by the user terminal 4000.
In step S4420, the server 3000 configures the user terminal 4000 to start a camera of the user terminal 4000 to capture a user image as an authentication image.
In step S4430, the server 3000 acquires an authentication image provided by the user terminal 4000.
In step S4440, the server 3000 searches the user database for a user image matching the authentication image.
In step S4450, the server 3000 passes the authentication when finding the user image matching the authentication image.
In this embodiment, steps S4380 to S4410 and steps S4420 to S4450 respectively show two different methods for authenticating the user identity, and steps S4380 to S4410 should be performed when the user account is created for the entering user according to the entering credential information of the entering user. When a user account is created for the incoming user based on the user image of the incoming user, steps S4420 to S4450 should be performed.
In step S4460, the server 3000 acquires the user account corresponding to the request information when the authentication is passed.
In step S4470, the server 3000 obtains the live image added to the user account corresponding to the request information and sends the live image to the user terminal 4000 for downloading.
< Image correlation apparatus embodiment >
Fig. 8 is a schematic block diagram of the image correlation apparatus 3100 according to the present embodiment; the image correlation apparatus 3100 is implemented by the server 3000.
As shown in fig. 8, the image correlation apparatus 3100 of the present embodiment may include a live image obtaining module 3010, an account locking module 3020, and an image adding module 3030.
The live image acquiring module 3010 is configured to acquire a live image acquired by a camera in a venue.
The account locking module 3020 is configured to identify character features in the live image, and obtain a user account corresponding to the character features.
The image adding module 3030 is configured to associate the live image with the corresponding user account.
In one embodiment, as shown in fig. 9, the account locking module 3020 may include a feature recognition unit 3021 and an acquisition unit 3022.
The feature recognition unit 3021 is configured to identify the character features in the live image.
The image matching unit 3023 is configured to search the user database for a user image matching the character features according to the identified character features.
The acquiring unit 3022 is configured to acquire a user account bound with the user image as a user account corresponding to the character feature.
In one embodiment, as shown in fig. 9, the account locking module 3020 may further include an image matching unit 3023.
The acquiring unit 3022 is also configured to acquire a shooting area corresponding to the image pickup apparatus.
The image matching unit 3023 is also configured to, in a case where the shooting area is a seat area of a venue, screen a user image corresponding to the shooting area from the user database to form an image candidate set.
The image matching unit 3023 is also configured to find a user image matching the personal characteristics in the image candidate set.
In one embodiment, as shown in fig. 9, the video association apparatus 3100 may further include an account binding module 3040.
The acquiring unit 3022 is further configured to acquire a user image of the incoming user and store the user image of the incoming user in the user database.
The account binding module 3040 is configured to establish a binding relationship between a user image of an entering user and a corresponding user account.
In one embodiment, the imagery association device 3100 may also include an account creation module 3050.
The account creation module 3050 is configured to create a user account corresponding to the user image of the entry user according to the user image of the entry user.
The acquiring unit 3022 is also configured to acquire entrance credential information of an entrance user.
The account creation module 3050 is further configured to create a user account corresponding to the entry user according to the entry credential information.
In one embodiment, the imagery association device 3100 may also include an identity authentication module 3060.
The identity authentication module 3060 is configured to perform user identity authentication in response to request information sent by the user terminal to obtain the live image.
The acquiring unit 3022 is further configured to acquire the user account corresponding to the request information if the authentication is passed.
The acquiring unit 3022 is further configured to acquire the live image added to the user account corresponding to the request information and send the live image to the user terminal for downloading.
In one embodiment, the obtaining unit 3022 is further configured to obtain the admission voucher information carried in the request information.
The image matching unit 3023 is further configured to search the user database for the user image bound to the entrance voucher information as a target image according to the entrance voucher information.
The image matching unit 3023 is also configured to provide a plurality of user images including the target image to the user terminal for selection.
The identity authentication module 3060 is further configured to perform user identity authentication according to a selection result sent by the user terminal.
In one embodiment, the image association device 3100 may also include a configuration module 3070.
The configuration module 3070 is used for configuring the user terminal to start a camera of the user terminal to collect a user image as an authentication image.
The acquiring unit 3022 is also configured to acquire an authentication image provided by the user terminal.
The image matching unit 3023 is also configured to search the user database for a user image matching the authentication image.
The identity authentication module 3060 is also used to authenticate if a user image matching the authentication image is found.
< Server embodiment >
Fig. 10 is a functional block diagram of the server 3000 according to the present embodiment.
As shown in fig. 10, the server 3000 may include a video association apparatus 3100 according to any embodiment of the invention.
The image association apparatus 3100 is configured to implement the image association method according to any one of the embodiments.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special-purpose hardware-based systems that perform the specified functions or acts, or by combinations of special-purpose hardware and computer instructions. It is well known to those skilled in the art that implementations by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or technical improvements to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (9)

1. An image association method implemented by a server side, comprising:
acquiring a live image captured by a camera device in a venue;
identifying character features in the live image to obtain a user account corresponding to the character features;
associating the live image with the corresponding user account;
wherein, after the live image is associated with the corresponding user account, the method further comprises:
performing user identity authentication in response to request information, sent by a user terminal, for acquiring the live image;
acquiring, when the authentication is passed, the user account corresponding to the request information; and
acquiring the live image added to the user account corresponding to the request information, and sending the live image to the user terminal for downloading.
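The association and download steps of claim 1 can be sketched in a few lines of server-side pseudologic. This is a minimal illustration, not the patented implementation: the names `user_albums`, `associate_live_image`, `download_live_images`, and the injected `match_account`/`authenticate` callables are all hypothetical, and the face-matching and transport layers are stubbed out.

```python
# Hypothetical in-memory store: user account -> list of associated live images.
user_albums = {}


def associate_live_image(image, match_account):
    """Identify the person in a live image and file the image under
    the matching user account (the association steps of claim 1)."""
    account = match_account(image)  # stand-in for feature recognition
    if account is not None:
        user_albums.setdefault(account, []).append(image)
    return account


def download_live_images(request, authenticate):
    """Authenticate the requesting terminal, then return the live
    images added to its account (the download steps of claim 1)."""
    account = authenticate(request)  # stand-in for identity authentication
    if account is None:
        return None  # authentication failed: nothing is released
    return list(user_albums.get(account, []))
```

The point of the sketch is the ordering the claim imposes: images are filed at capture time, but are only released to a terminal after identity authentication succeeds.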
2. The method of claim 1, wherein the step of identifying the character features in the live image to obtain the user account corresponding to the character features comprises:
identifying the character features in the live image;
searching a user database, according to the identified character features, for a user image matching the character features; and
acquiring the user account bound to the user image as the user account corresponding to the character features.
3. The method of claim 2, wherein the step of searching the user database for the user image matching the character features comprises:
acquiring a shooting area corresponding to the camera device;
screening, when the shooting area is a seat area of the venue, user images corresponding to the shooting area from the user database to form an image candidate set; and
searching the image candidate set for the user image matching the character features.
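The candidate-set screening of claim 3 can be sketched as a filter-then-match search. This is an illustrative sketch only: the record layout (`seat_area`, `image`, `account`), the `similarity` callable, and the threshold are invented stand-ins for a real face-recognition pipeline.

```python
def find_matching_user(feature, user_db, shooting_area, similarity, threshold=0.8):
    """Search user_db for the user image matching `feature`,
    restricting the search to `shooting_area` when one is known."""
    # Screen the candidate set by seat area (claim 3) when available.
    if shooting_area is not None:
        candidates = [u for u in user_db if u["seat_area"] == shooting_area]
    else:
        candidates = list(user_db)
    # Search the (possibly reduced) candidate set for the best match.
    best = max(candidates, key=lambda u: similarity(feature, u["image"]),
               default=None)
    if best is not None and similarity(feature, best["image"]) >= threshold:
        return best
    return None
```

The design rationale implied by the claim is cost: matching against only the images of ticket holders seated in the camera's coverage area is much cheaper than matching against the whole venue database.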
4. The method of claim 2, further comprising, before acquiring the live image captured by the camera device in the venue:
acquiring a user image of an entering user, and storing the user image of the entering user into the user database; and
establishing a binding relationship between the user image of the entering user and the corresponding user account.
5. The method of claim 4, wherein before establishing the binding relationship between the user image of the entering user and the corresponding user account, the method further comprises:
creating, according to the user image of the entering user, a user account corresponding to the user image of the entering user; or,
acquiring entrance voucher information of the entering user, and creating a user account corresponding to the entering user according to the entrance voucher information.
6. The method of claim 1, wherein the step of authenticating the user identity comprises:
acquiring the entrance voucher information carried in the request information;
searching, according to the entrance voucher information, a user database for a user image bound to the entrance voucher information as a target image;
providing a plurality of user images including the target image to the user terminal for selection; and
performing user identity authentication according to the selection result returned by the user terminal.
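The selection-based authentication of claim 6 amounts to a simple challenge: mix the image bound to the entrance voucher in with decoys, and pass authentication only if the terminal picks the bound image. A hedged sketch, with invented names (`build_challenge`, `verify_selection`) and no claim about how many decoys the real system would use:

```python
import random


def build_challenge(target_image, decoy_images, k=3, rng=random):
    """Return a shuffled selection list: k decoys plus the target image
    bound to the user's entrance voucher."""
    choices = list(rng.sample(decoy_images, k)) + [target_image]
    rng.shuffle(choices)
    return choices


def verify_selection(target_image, selected_image):
    """Authentication passes only when the terminal selected the
    image actually bound to the entrance voucher."""
    return selected_image == target_image
</imports>```

One property worth noting: the server never has to ship a face-recognition model to the terminal for this variant; the voucher lookup plus the user's own recognition of their photo carries the check.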
7. The method of claim 1, wherein the step of authenticating the user identity comprises:
configuring the user terminal to start a camera of the user terminal to capture a user image as an authentication image;
acquiring the authentication image provided by the user terminal;
searching a user database for a user image matching the authentication image; and
determining that the authentication is passed when a user image matching the authentication image is found.
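The camera-based variant of claim 7 reduces to a nearest-match lookup against the stored user images. A minimal sketch, assuming a hypothetical `distance` callable standing in for a real face-embedding comparison; the threshold and record layout are illustrative only:

```python
def authenticate_by_image(auth_image, user_db, distance, max_distance=0.4):
    """Return the account whose stored user image matches the freshly
    captured authentication image, or None when authentication fails."""
    for record in user_db:
        # A real system would compare face embeddings; here `distance`
        # is an injected placeholder for that comparison.
        if distance(auth_image, record["image"]) <= max_distance:
            return record["account"]
    return None
```

In this variant the authentication image is captured live by the terminal's camera at request time, so possession of someone else's voucher alone is not enough to pull their photos.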
8. An image association apparatus implemented by a server side, comprising:
a live image acquisition module configured to acquire a live image captured by a camera device in a venue;
an account locking module configured to identify character features in the live image to obtain a user account corresponding to the character features;
an image adding module configured to associate the live image with the corresponding user account; and
an identity authentication module configured to perform user identity authentication in response to request information, sent by a user terminal, for acquiring the live image,
wherein the account locking module is further configured to acquire, when the authentication is passed, the user account corresponding to the request information, acquire the live image added to the user account corresponding to the request information, and send the live image to the user terminal for downloading.
9. A server comprising the image association apparatus of claim 8; or, a server comprising a memory and a processor, wherein the memory stores instructions for controlling the processor to operate so as to perform the image association method according to any one of claims 1 to 7.
CN201811420110.8A 2018-11-26 2018-11-26 Image correlation method and device and server Active CN111223220B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811420110.8A CN111223220B (en) 2018-11-26 2018-11-26 Image correlation method and device and server

Publications (2)

Publication Number Publication Date
CN111223220A CN111223220A (en) 2020-06-02
CN111223220B true CN111223220B (en) 2022-07-12

Family

ID=70828788

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811420110.8A Active CN111223220B (en) 2018-11-26 2018-11-26 Image correlation method and device and server

Country Status (1)

Country Link
CN (1) CN111223220B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112667918A (en) * 2020-12-24 2021-04-16 郑贤良 Communication method based on social communication tool
CN114741557B (en) * 2022-03-31 2022-11-15 慧之安信息技术股份有限公司 View database management and classification method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106095465A (en) * 2016-06-23 2016-11-09 北京小米移动软件有限公司 The method and device of identity image is set
CN107071143A (en) * 2017-01-17 2017-08-18 广东欧珀移动通信有限公司 Management method and device, the terminal of a kind of image
CN108280649A (en) * 2018-02-24 2018-07-13 广州逗号智能零售有限公司 Method of payment and device
CN108551519A (en) * 2018-03-05 2018-09-18 腾讯科技(深圳)有限公司 A kind of information processing method, device, storage medium and system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780256A (en) * 2016-12-16 2017-05-31 北京华鼎新铭智能科技发展有限公司 A kind of comprehensive safeguard information management system
CN107341885A (en) * 2017-07-03 2017-11-10 杭州复恒科技有限公司 A kind of intelligent door lock management system
CN108156161A (en) * 2017-12-27 2018-06-12 百度在线网络技术(北京)有限公司 Verification method and device
CN108198315A (en) * 2018-01-31 2018-06-22 深圳正品创想科技有限公司 A kind of auth method and authentication means
CN108470392B (en) * 2018-03-26 2021-03-02 广州津虹网络传媒有限公司 Video data processing method
CN108881728B (en) * 2018-07-26 2021-03-30 北京京东尚科信息技术有限公司 Offline cross-device image shooting method and system and shooting device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant