WO2018092378A1 - Information processing device, information processing method, and program - Google Patents

Publication number
WO2018092378A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
person
information processing
unit
control unit
Prior art date
Application number
PCT/JP2017/030514
Other languages
English (en)
Japanese (ja)
Inventor
温子 國定
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Publication of WO2018092378A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • Some embodiments of the present invention relate to an information processing apparatus, an information processing method, and a program.
  • Patent Literature 1 discloses an image distribution server that recognizes a person appearing in an image uploaded by a user and distributes the image to that person.
  • For each user, the server registers in a user database the face image of each person to whom the user wants person images distributed, together with that person's destination e-mail address; person images are then uploaded from the user terminal.
  • The distribution target is identified by comparing the face of the person appearing in the received person image with the recognition-target face images registered in the user database, and the person image is distributed to the corresponding e-mail address.
  • Some aspects of the present invention have been made in view of the above points, and provide an information processing apparatus, an information processing method, and a program that protect the privacy of a person included in a provided image.
  • One aspect of the present invention, made to solve the above problem, is an information processing apparatus that includes a communication unit and a control unit. The control unit extracts, from an acquired image, an image area including a part or all of a person, and determines whether the image area corresponds to a predetermined person based on specific information for specifying that person.
  • When it is determined that the image area does not correspond to the predetermined person, the information processing apparatus generates a processed image by processing the image area and provides the processed image to the predetermined person via the communication unit.
  • Another aspect of the present invention is an information processing method performed by an information processing device, comprising the steps of: extracting an image area including a part or all of a person from an acquired image; determining whether the image area corresponds to a predetermined person based on specific information for specifying that person; generating a processed image by processing the image area when it is determined that the image area does not correspond to the predetermined person; and providing the processed image to the predetermined person.
  • Another aspect of the present invention is a program for causing a computer to execute: a procedure for extracting an image area including a part or all of a person from an acquired image; a procedure for determining whether the image area corresponds to a predetermined person based on specific information for specifying that person; a procedure for generating a processed image by processing the image area when it is determined that the image area does not correspond to the predetermined person; and a procedure for providing the processed image to the predetermined person.
  • According to the above aspects, the privacy of a person included in a provided image can be protected.
  • FIG. 1 is a configuration diagram showing a configuration of an information processing system 1 according to the present embodiment.
  • the information processing system 1 includes an information processing device 10 and a server device 20, and the information processing device 10 and the server device 20 are connected by a wired or wireless network.
  • the information processing apparatus 10 is an apparatus having an imaging function, such as a digital camera, a smartphone, or a tablet; in the present embodiment, it is a fixed camera installed in a theme park.
  • the information processing apparatus 10 includes a control unit 11, a determination unit 111, an image processing unit 112, a provision control unit 113, an imaging unit 12, a storage unit 13, and a communication unit 14. The function of each part will be described later.
  • the server device 20 is a server device installed on a network, and is, for example, a personal computer or a notebook computer.
  • the server device 20 includes a control unit 21, a provision destination specifying unit 211, a storage unit 22, and a communication unit 23. The function of each part will be described later.
  • FIGS. 2A and 2B are explanatory diagrams illustrating an example of image processing by the information processing system 1.
  • FIG. 2A is an example of an image captured by the imaging unit 12 of the information processing apparatus 10.
  • the image shown in FIG. 2A includes five persons A, B, C, D, and E from the left.
  • C and D are registered in advance in the user database of the information processing apparatus 10 as users of the captured image distribution service.
  • the determination unit 111 of the information processing apparatus 10 extracts the persons (A, B, C, D, E) included in the captured image and determines whether each matches a person registered in the database. As a result, the determination unit 111 determines that A, B, and E do not match any person registered in the database.
  • the image processing unit 112 processes the image regions of A, B, and E, who were determined not to match any registered person. Specifically, it blurs the face portions of A, B, and E so that they cannot be identified by anyone.
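The blurring step described above can be sketched as a simple mosaic (pixelation) filter; this is an illustrative sketch, not the patent's implementation, and the block size `k` stands in for the "grain size" discussed later.

```python
import numpy as np

def mosaic_region(image, box, k=8):
    """Pixelate a rectangular region (top, left, bottom, right) of a
    grayscale image so the face inside can no longer be identified."""
    top, left, bottom, right = box
    out = image.copy()
    for y in range(top, bottom, k):
        for x in range(left, right, k):
            # Replace each k x k block with its mean intensity.
            block = out[y:min(y + k, bottom), x:min(x + k, right)]
            block[:] = block.mean()
    return out
```

Pixels outside the given box are left untouched, mirroring the processed image in which only the faces of A, B, and E are obscured.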
  • FIG. 2B is an example of the image after processing. As shown in FIG. 2B, the face portions of A, B, and E are blurred so that the individuals cannot be identified. Thereafter, the provision control unit 113 of the information processing apparatus 10 requests the server device 20 for the transmission destination information for the processed image.
  • the provision destination specifying unit 211 of the server device 20 refers to the provision destination database, extracts the transmission destination information (e-mail addresses and the like) of C and D, and notifies the information processing apparatus 10.
  • the provision control unit 113 of the information processing apparatus 10 transmits the processed image to C and D based on the acquired transmission destination information.
  • the imaging unit 12 includes a lens, a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) image sensor, a signal processing unit, and the like.
  • the imaging unit 12 periodically outputs the captured image to the storage unit 13.
  • the storage unit 13 includes, for example, an HDD (Hard Disk Drive), a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), a ROM (Read Only Memory), or a RAM (Random Access Memory).
  • It stores various programs executed by a CPU (Central Processing Unit, not shown) of the information processing apparatus 10, as well as the results of processing executed by the CPU.
  • the storage unit 13 stores a user database D1.
  • the user database D1 will be described with reference to FIG.
  • FIG. 3 is a diagram illustrating an example of the configuration of the user database D1 stored in the storage unit 13.
  • the user database D1 has a plurality of records representing information of users registered in the image transmission service. Each record has specific information for specifying a user.
  • the specific information includes a face feature parameter, a clothing parameter, an accessory parameter, a QR (Quick Response) code (registered trademark) parameter, and the like.
  • the face feature parameter is information indicating the feature of the user's face.
  • the face feature parameter is, for example, a feature value obtained by quantifying the shape, size, and positional relationship of each part of the face (eyes, nose, mouth, etc.); by comparing such feature values, it can be determined whether two faces belong to the same person.
  • the face feature parameter may be image information of a face or each part of the face.
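A minimal sketch of this matching step follows, assuming face features are plain numeric vectors compared with cosine similarity; the actual feature extraction and the threshold value are not specified by the document and are illustrative.

```python
import math

def match_registered_user(candidate, registered, threshold=0.9):
    """Return the ID of the registered user whose feature vector best
    matches the candidate vector, or None if no similarity reaches the
    threshold. Vector layout and threshold are illustrative."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb)
    best_id, best_score = None, threshold
    for user_id, vec in registered.items():
        score = cosine(candidate, vec)
        if score >= best_score:
            best_id, best_score = user_id, score
    return best_id
```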
  • the clothing parameter is a parameter representing the clothing worn by the user on the day.
  • the clothing parameter may be acquired from an image taken at the time of entering the theme park, or may be obtained by receiving an image of the clothing of the day from the user.
  • the accessory parameter is a parameter representing the accessories worn by the user on the day. The method for obtaining the accessory parameter is the same as for the clothing parameter.
  • the QR code parameter is an identifier assigned to the user and encoded in a QR code. For example, a sticker printed with a QR code encoding the identifier is attached to the user's clothes when entering the theme park; the imaging unit 12 reads the QR code, the control unit 11 decodes it, and the user can be identified by comparing the decoded value with each user's QR code identifier.
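Decoding the sticker itself would be handled by a QR library; only the subsequent identifier comparison is sketched below, with a hypothetical `qr_id` field standing in for the stored QR code parameter.

```python
def identify_by_qr(decoded_id, user_db):
    """Match a decoded QR identifier against each user's registered QR
    code identifier (the 'qr_id' field name is illustrative)."""
    for user_id, record in user_db.items():
        if record.get("qr_id") == decoded_id:
            return user_id
    return None  # no registered user carries this identifier
```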
  • the user database D1 has group information.
  • the group information is information indicating a group to which the user belongs.
  • Members of the same group are assigned the same group identifier, based on information provided at the time of user registration.
  • the group information need not be held in the user database D1; information in another database (for example, a user's phone book or friend list) may be referenced instead.
  • the communication unit 14 includes a communication interface for performing communication between apparatuses via a wired or wireless network, and communicates with the server apparatus 20.
  • the network is, for example, a mobile phone network, a VPN (Virtual Private Network), a dedicated communication line network, a WAN (Wide Area Network), a LAN (Local Area Network), a PSTN (Public Switched Telephone Network), another information communication network, or a combination thereof.
  • the control unit 11 controls various configurations included in the information processing apparatus 10. A part or all of the functions of the control unit 11 may be realized, for example, by executing a program stored in the storage unit 13 by a CPU included in the information processing apparatus 10.
  • the control unit 11 includes a determination unit 111, an image processing unit 112, and a provision control unit 113.
  • the determination unit 111 extracts a person included in the image captured by the imaging unit 12.
  • Various methods can be used to extract a person. For example, background subtraction may be used, in which a person is detected from the difference between a pre-captured background-only image and the input image. Alternatively, a pattern image cut out by a detection window of a set size may be classified as to whether it shows a person.
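The background-difference method mentioned above can be sketched per pixel as follows; the threshold value is an assumption, and a real system would add noise filtering and connected-component grouping on top of this mask.

```python
import numpy as np

def foreground_mask(background, frame, threshold=30):
    """Background subtraction: mark pixels whose absolute difference
    from a pre-captured background-only image exceeds a threshold."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > threshold
```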
  • the determination unit 111 may extract a part of the person (for example, a face) instead of the whole person and use it for the determination.
  • the determination unit 111 determines whether the extracted person matches a predetermined condition.
  • the predetermined condition is, for example, that the extracted person matches a person registered in the user database D1 of the storage unit 13. In that case, the determination unit 111 compares the parameters calculated from the extracted person with each parameter of the persons registered in the user database D1; if any parameter matches, or its degree of coincidence exceeds a predetermined threshold, the determination unit 111 determines that the extracted person matches the registered person. The determination unit 111 may instead determine a match only when the degree of coincidence of multiple parameters exceeds a predetermined threshold.
  • the determination unit 111 outputs the determined result to the image processing unit 112 and the provision control unit 113.
  • the image processing unit 112 processes a person image included in the image captured by the imaging unit 12 based on the determination result input from the determination unit 111, and generates a processed image.
  • the image processing unit 112 processes an image of a person who has been determined not to meet a pre-registered condition so that an individual cannot be identified. For example, the image processing unit 112 performs a process of blurring (mosaic) the face portion of the target person. The grain size of the blurring process can be set as appropriate.
  • the image processing unit 112 may fill in the face portion of the target person, or may superimpose another image over it.
  • the portion to be processed is not limited to the face portion of the target person, but may be a part or all of the body of the target person or a region including the body.
  • the image processing unit 112 may perform processing so that the target person does not exist by using a background image stored in advance.
  • for example, the image processing unit 112 may make the target person appear absent by superimposing, over the person (on the near side), the background image that should exist behind the target person.
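This superimposition amounts to compositing the stored background over the masked person region, which can be sketched in one line; the person mask is assumed to come from the extraction step.

```python
import numpy as np

def erase_person(frame, background, person_mask):
    """Superimpose the stored background over the masked region so the
    target person appears absent from the processed image."""
    return np.where(person_mask, background, frame)
```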
  • the provision control unit 113 acquires transmission destination information of the person in the image determined to meet a predetermined condition based on the determination result input from the determination unit 111.
  • the destination information is, for example, an e-mail address or an address for sending an SNS (Social Networking Service) message.
  • the provision control unit 113 acquires the transmission destination information of the target person from the server device 20 via the communication unit 14.
  • the acquisition destination of the transmission destination information is not limited to the server device 20 and may be acquired from another external device or the storage unit 13.
  • the providing control unit 113 transmits the processed image processed by the image processing unit 112 via the communication unit 14 based on the acquired transmission destination information.
  • the storage unit 22 includes, for example, an HDD, a flash memory, an EEPROM, a ROM, or a RAM, and stores various programs executed by a CPU (not shown) of the server device 20, such as firmware and application programs, as well as the results of processing executed by the CPU.
  • the storage unit 22 stores a provision destination database D2 (not shown).
  • the provision destination database D2 has a record for each registered user, and each record has each value of an image transmission method and transmission destination information.
  • the transmission method indicates a method (means) for transmitting an image such as an electronic mail or an SNS message, and the destination information includes an electronic mail address corresponding to the transmission method, an address for transmitting an SNS message, or the like.
  • the communication unit 23 includes a communication interface for performing communication between apparatuses via a wired or wireless network, and communicates with the information processing apparatus 10.
  • the network is, for example, an information communication network configured by a mobile phone network, a VPN network, a dedicated communication line network, WAN, LAN, PSTN, or the like, or a combination thereof.
  • the control unit 21 controls various configurations included in the server device 20. A part or all of the functions of the control unit 21 may be realized, for example, by executing a program stored in the storage unit 22 by a CPU included in the server device 20.
  • the control unit 21 includes a provision destination specifying unit 211.
  • the provision destination specifying unit 211 refers to the provision destination database D2 in response to a request from the information processing apparatus 10 and extracts the transmission destination information corresponding to the designated user's transmission method.
  • the provision destination specifying unit 211 notifies the information processing apparatus 10 of the extracted transmission destination information via the communication unit 23.
  • FIG. 4 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment.
  • the imaging unit 12 of the information processing apparatus 10 captures an image included in the imaging range based on an instruction from the control unit 11.
  • the trigger for capture may be an external capture instruction, or images may be captured periodically.
  • In this example, the image of FIG. 2A is captured. Thereafter, the process proceeds to step S102.
  • Step S102 The determination unit 111 extracts a person included in the captured image. In this example, five persons A, B, C, D, and E are extracted. Thereafter, the process proceeds to step S103.
  • Step S103 The determination unit 111 determines whether each of the extracted persons A, B, C, D, and E is registered in the user database.
  • the determination unit 111 performs registration determination by comparing each feature amount (parameter) extracted from A, B, C, D, and E with each determination parameter in the user database D1. Thereafter, the process proceeds to step S104.
  • Step S104 If none of the parameters of the extracted persons match any determination parameter in the user database D1, the determination unit 111 determines that no extracted person is registered in the user database D1 (step S104 / NO), and the process ends. If any parameter of an extracted person matches a determination parameter in the user database D1, the determination unit 111 determines that the person is registered in the user database D1 (step S104 / YES), and the process proceeds to step S105. In this example, since the degree of coincidence between the face feature parameters extracted from C and D and the face feature parameters of users C and D in the user database D1 exceeds the predetermined threshold, C and D are determined to be registered in the user database D1. The determination unit 111 may also base the determination on the degree of coincidence of the clothing parameter, the accessory parameter, or the QR code parameter.
  • Step S105 The determination unit 111 determines whether there is a person in the image other than the person registered in the user database D1. When it is determined that there is no person other than the person registered in the user database D1 in the image (step S105 / NO), the process proceeds to step S107. When it is determined that there is a person other than the person registered in the user database D1 in the image (step S105 / YES), the process proceeds to step S106. In this example, since A, B, and E that are not registered in the user database D1 exist in the image, the process proceeds to step S106.
  • Step S106 The image processing unit 112 processes an image of a person who is not registered in the user database D1.
  • blurring is performed on the face portions of A, B, and E that are not registered in the user database D1. Thereafter, the process proceeds to step S107.
  • Step S107 The providing control unit 113 requests and acquires the transmission destination information of the person registered in the user database D1 from the server device 20.
  • the provision control unit 113 requests from the server device 20 the transmission destination information of C and D, the image transmission targets, and acquires the corresponding e-mail addresses.
  • Step S108 The providing control unit 113 transmits an image based on the acquired transmission destination information.
  • the processed image is sent as an e-mail attachment to the e-mail addresses of C and D.
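The routing logic of steps S102 through S108 can be sketched as follows, with plain Python collections standing in for the user database D1 and the provision destination database; this illustrates the flow only, not the patent's implementation.

```python
def route_captured_image(persons, registered_users, destinations):
    """Split detected persons into those to blur (unregistered) and
    collect the e-mail addresses of the registered ones."""
    to_blur = [p for p in persons if p not in registered_users]
    recipients = [destinations[p] for p in persons if p in registered_users]
    return to_blur, recipients
```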
  • This concludes the description of the operation of the information processing system 1 according to the present embodiment.
  • As described above, the information processing apparatus 10 includes the communication unit 14 and the control unit 11. The control unit 11 extracts, from an acquired image, an image area including a part or all of a person, and determines whether the image area corresponds to a predetermined person based on specific information for specifying that person.
  • When it is determined that the image area does not correspond to the predetermined person, the control unit 11 generates a processed image by processing the image area and provides the processed image to the predetermined person via the communication unit 14.
  • This configuration makes it possible to identify persons other than registered service users appearing in a captured image and to process their image regions so that the individuals cannot be identified. Therefore, the privacy of persons included in the provided image is protected. In addition, images can be provided to registered service users simply and quickly.
  • FIG. 5 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment. Description of the same steps as those in FIG. 4 will be omitted, and differences will be described.
  • the determination unit 111 determines whether the users determined to be registered in the user database D1 belong to the same group, based on the value of the group information in the user database D1. For example, in the example of FIG. 2A, assume that A and B are registered as group 1 and C and D as group 2 in the user database D1. When making the determination for C, the determination unit 111 determines that A and B belong to another group, that D belongs to the same group, and that E is not registered. Since persons (A, B, E) other than the person (D) belonging to the same group exist in the image (step S205 / YES), the process proceeds to step S206, and the images of those persons (A, B, E) are processed.
  • In step S207, the provision control unit 113 acquires transmission destination information from the server device 20.
  • the provision control unit 113 may acquire only the transmission destination information of the transmission target (C), or may additionally acquire the transmission destination information of a person (D) belonging to the same group as the transmission target (C).
  • the control unit 11 does not process an image region corresponding to a person belonging to the same group as a predetermined person. According to this configuration, since an image of a person who does not belong to the same group is processed so that an individual cannot be specified, the privacy of the person included in the provided image can be protected.
  • the control unit 11 provides the processed image to at least one person who belongs to the same group as the predetermined person. With this configuration, an image can be provided to members belonging to the group simply and quickly.
  • In the present embodiment, the information processing apparatus 10 is an apparatus having an imaging function, such as a smartphone, tablet, or digital camera owned by the user; an example in which the user captures a landscape using the information processing apparatus 10 will be described.
  • FIG. 6 is an explanatory diagram illustrating an example of the operation of the information processing system 1 according to the present embodiment.
  • Step S301 The imaging unit 12 of the information processing apparatus 10 captures a landscape image in the imaging range based on a user operation.
  • the control unit 11 may recognize that the user is capturing a landscape when the setting is switched to a landscape mode or the like before capture. Thereafter, the process proceeds to step S302.
  • Step S302 The determination unit 111 extracts a person included in the captured image. Thereafter, the process proceeds to step S303.
  • Step S303 The determination unit 111 determines whether or not a person is included in the captured image. If a person is included in the captured image (step S303 / YES), the process proceeds to step S304. If no person is included in the captured image (step S303 / NO), the process proceeds to step S307.
  • Step S304 The determination unit 111 determines whether the extracted person is registered in the user database D1. The determination unit 111 further determines whether or not the extracted person belongs to the same group as the user. Thereafter, the process proceeds to step S305.
  • Step S305 When there is no person belonging to the same group as the user (step S305 / NO), the process proceeds to step S306. In this example, since the user is photographing a landscape, it is assumed that no person belonging to the same group as the user exists in the captured image. When such a registered person does exist (step S305 / YES), the process proceeds to step S105 of the first embodiment or step S205 of the second embodiment.
  • Step S306 The image processing unit 112 performs a blurring process on the extracted person image.
  • a person captured when the user photographs a landscape is either not registered in the image transmission service, or is registered but belongs to a group different from the user's. Therefore, by processing the extracted person's image so that the individual cannot be identified, that person's privacy can be protected. Thereafter, the process proceeds to step S307.
  • Step S307 The provision control unit 113 requests the server device 20 for transmission destination information of a person belonging to the same group as the user.
  • the server device 20 refers to the user database D1, extracts the transmission destination information of a person belonging to the same group as the requesting user, and notifies the user. Thereafter, the process proceeds to step S308.
  • Step S308 The providing control unit 113 transmits an image to persons belonging to the same group based on the acquired transmission destination information.
  • This concludes the description of the operation of the information processing system 1 according to the present embodiment.
  • the information processing system 1 can be applied even when neither the user nor a person in the same group as the user appears in the image.
  • FIG. 7 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment. Description of the same steps as those in FIG. 4 will be omitted, and differences will be described.
  • the image processing unit 112 determines whether to trim the image. It may make this determination based on trimming-necessity information registered in the user database D1, or based on information in another database held in the storage unit 13. Alternatively, the image processing unit 112 may accept a user operation on an input unit (not shown) and determine whether to perform trimming. If it is determined that trimming is to be performed (step S407 / YES), the process proceeds to step S408. If it is determined that trimming is not to be performed (step S407 / NO), the process proceeds to step S409.
  • In step S408, the image processing unit 112 trims the image.
  • An example of the trimming process will be described with reference to FIGS. 8A and 8B.
  • FIGS. 8A and 8B are explanatory diagrams illustrating an example of trimming processing according to the present embodiment.
  • an example is described in which the image subjected to the blurring process in step S406 is the blurred image described above.
  • the image processing unit 112 selects C as a trimming target.
  • the image processing unit 112 may perform the determination based on information registered in the user database D1, or may perform the determination based on information in another database held by the storage unit 13.
  • the image processing unit 112 may select a user who transmits an image as a trimming target. Further, the image processing unit 112 may receive a user operation on an input unit (not shown) and determine whether to perform trimming.
  • the image processing unit 112 performs trimming so that the face of C, the trimming target, is positioned at the center of the image.
  • FIG. 8A is an example of an image after trimming is performed for C. As shown in FIG. 8A, the image is cropped so that the face of C is positioned at the center. Here, the center portion refers to a certain area near the middle of the image; after trimming, the face of the target person need only be closer to the center than in the original captured image.
  • the image processing unit 112 may perform image enlargement processing together with trimming.
  • FIG. 8B is an example of an image after the image enlargement process is performed. The magnification of the enlargement process can be set as appropriate.
  • the trimming target is not limited to the face of the person; for example, the image processing unit 112 may obtain the center coordinates of the target person's body and trim the image so that the center of the body is positioned at the center of the image. The image processing unit 112 may also trim so that the central part of the faces or bodies of the members belonging to the same group (C and D in FIGS. 8A and 8B) is positioned at the center of the image. As a result, an image in which the group members are centered can be generated.
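Trimming so that a target point sits at the image center, clamped to the image bounds, can be sketched as computing a crop rectangle; the coordinate conventions here are illustrative.

```python
def centered_crop_box(img_w, img_h, cx, cy, crop_w, crop_h):
    """Compute a crop_w x crop_h crop rectangle whose center is as close
    to the target center (cx, cy) as the image bounds allow."""
    left = min(max(cx - crop_w // 2, 0), img_w - crop_w)
    top = min(max(cy - crop_h // 2, 0), img_h - crop_h)
    return left, top, left + crop_w, top + crop_h
```

When the face is near an edge, the box is clamped so the crop stays inside the image, which is why the trimmed face only needs to be closer to the center than in the original captured image.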
  • As described above, the control unit 11 generates (by trimming) a processed image that includes the image region determined to correspond to the predetermined person and excludes at least a part of the original image. With this configuration, the image is processed so that the user is positioned near the center of the image, so an image suited to each user can be provided simply and quickly.
  • FIG. 9 is a block diagram illustrating a functional configuration of the information processing apparatus 30 according to the present embodiment.
  • the information processing apparatus 30 is an information processing apparatus having an imaging function such as a smartphone, a tablet, or a digital camera.
  • the information processing apparatus 30 includes a control unit 11, a determination unit 111, an image processing unit 112, a provision control unit 313, an imaging unit 12, a storage unit 33, and a communication unit 14.
  • the storage unit 33 has the same function as the storage unit 13 in the first embodiment, and further stores a provision destination database D2.
  • The provision destination database D2 has a record for each registered user, and each record holds the values of an image transmission method and transmission destination information.
  • The transmission method indicates a means for transmitting an image, such as electronic mail or an SNS message, and the transmission destination information includes an e-mail address, an SNS message address, or the like corresponding to the transmission method.
  • the storage unit 33 may hold the user database D1 and the provision destination database D2 as a single database.
  • The provision control unit 313 acquires the transmission destination information of a person registered in the user database D1 by referring to the provision destination database D2 stored in the storage unit 33.
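The local lookup performed by the provision control unit 313 can be sketched as follows. The in-memory dictionaries standing in for the user database D1 and the provision destination database D2, and all field names, are hypothetical; the embodiment specifies only that each D2 record holds a transmission method and transmission destination information.

```python
# Hypothetical stand-ins for the user database D1 and the
# provision destination database D2 described above.
D1 = {"user_c": {"name": "C", "group": "group1"}}
D2 = {"user_c": {"method": "email", "destination": "c@example.com"}}


def destination_for(user_id):
    """Return (method, destination) for a registered user, as the
    provision control unit 313 would when referring to D2 locally
    instead of requesting the information from the server device."""
    record = D2.get(user_id)
    if record is None:
        raise KeyError(f"no provision destination registered for {user_id}")
    return record["method"], record["destination"]
```

Holding D1 and D2 as a single database, as the storage unit 33 may do, would simply merge the two records under one key.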
  • In step S107 of the first embodiment, the provision control unit 113 requests and obtains the transmission destination information of the person registered in the user database D1 from the server device 20. In the present embodiment, as described above, the provision control unit 313 instead acquires the transmission destination information of the person registered in the user database D1 by referring to the provision destination database D2 stored in the storage unit 33.
  • Other operations are the same as those in the above-described embodiment.
  • As described above, the information processing apparatus 30 includes the communication unit 14 and the control unit 11. The control unit 11 extracts an image area including a part or all of a person from the acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit 14.
  • This configuration makes it possible to determine persons other than service-registered users who appear in a captured image and to process those image regions so that the individuals cannot be identified. The privacy of persons included in the provided image can therefore be protected, and an image can be provided to service-registered users simply and quickly.
  • FIG. 10 is a configuration diagram showing a configuration of the information processing system 2 according to the present embodiment.
  • the information processing system 2 includes an information processing device 40 and a server device 50.
  • the information processing apparatus 40 includes a control unit 41, an imaging unit 12, a storage unit 43, and a communication unit 14.
  • the control unit 41 controls various configurations included in the information processing apparatus 40.
  • the determination unit 111, the image processing unit 112, and the provision control unit 113 are not provided.
  • the storage unit 43 stores various programs to be executed by the CPU included in the information processing apparatus 40, results of processes executed by the CPU, and the like.
  • the user database D1 is not stored.
  • the server device 50 includes a control unit 51, a storage unit 52, and a communication unit 53.
  • the storage unit 52 stores various programs to be executed by the CPU included in the server device 50, results of processing executed by the CPU, and the like.
  • the storage unit 52 further stores a user database D1 and a provision destination database D2. The contents of each database are the same as in the above embodiment. Since the communication unit 53 is the same as the communication unit 23 according to the first embodiment, a description thereof will be omitted.
  • the control unit 51 includes a determination unit 511, an image processing unit 512, a provision destination specifying unit 513, and a provision control unit 514. Since each operation is the same as that of the above-described embodiment, description thereof is omitted.
  • As described above, the server device 50 includes the communication unit 53 and the control unit 51. The control unit 51 extracts an image area including a part or all of a person from the acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit 53.
  • This configuration makes it possible to determine persons other than service-registered users who appear in a captured image and to process those image regions so that the individuals cannot be identified. The privacy of persons included in the provided image can therefore be protected, and an image can be provided to service-registered users simply and quickly.
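The server-side flow just summarized — extract person regions, match each against registered identification information, anonymize the non-matching regions, and deliver the result to registered persons — can be sketched with placeholder data structures. Region extraction and face matching are stubbed out as upstream inputs; a real implementation would use a person detector and a face-recognition model, neither of which the embodiment prescribes.

```python
def process_and_route(regions, registered_ids):
    """regions: list of (matched_person_id_or_None, region_payload)
    pairs, as an upstream detector/matcher might produce.  Regions
    whose person is not registered are replaced by an anonymized
    payload; every matched registered person becomes a recipient."""
    processed, recipients = [], set()
    for person_id, payload in regions:
        if person_id in registered_ids:
            # Registered user: keep the region and deliver the image.
            processed.append((person_id, payload))
            recipients.add(person_id)
        else:
            # Unregistered person: make the individual unidentifiable
            # (e.g. blur or pixelate in a real implementation).
            processed.append((None, "anonymized"))
    return processed, sorted(recipients)
```

For example, with one registered user "c" and one stranger in the frame, the stranger's region is anonymized and only "c" receives the processed image.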
  • A part of the information processing apparatus and the server apparatus in the above-described embodiments, for example the determination unit, the image processing unit, and the provision control unit, may be realized by a computer.
  • a program for realizing this function may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed.
  • Here, the “computer system” is a computer system built into the information processing apparatus or the server apparatus, and includes an OS (Operating System) and hardware such as peripheral devices.
  • The “computer-readable recording medium” refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system.
  • Further, the “computer-readable recording medium” may also include a medium that dynamically holds a program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a certain period of time, such as a volatile memory inside a computer system serving as a server or a client in that case.
  • The program may realize a part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
  • part or all of the information processing apparatus and the server apparatus in the above-described embodiment may be realized as an integrated circuit such as an LSI (Large Scale Integration).
  • Each functional unit of the information processing apparatus and the server apparatus may be individually implemented as a processor, or a part or all of them may be integrated into a single processor.
  • the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
  • Further, if an integrated circuit technology that replaces LSI emerges as a result of advances in semiconductor technology, an integrated circuit based on that technology may be used.
  • the image to be processed has been described as a still image, but the processing target may be a set of a plurality of continuous images, that is, a moving image.
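Extending the still-image processing to moving images, as noted above, amounts to applying the same per-frame pipeline to each image in the sequence. The sketch below assumes a caller-supplied `process_frame` callable (e.g. the determination, anonymization, and trimming steps combined); nothing here is prescribed by the embodiment.

```python
def process_video(frames, process_frame):
    """Treat a moving image as a set of continuous still images and
    apply the same processing (determination, anonymization,
    trimming) to each frame in order."""
    return [process_frame(frame) for frame in frames]


# Illustrative use with a trivial stand-in per-frame function:
processed = process_video(["f0", "f1"], lambda f: f + "_processed")
```

In practice, per-frame face matching could be accelerated by tracking a person across frames rather than re-identifying them in every frame, but that optimization is outside what the embodiment describes.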
  • the image transmission may be performed by either the information processing apparatus or the server apparatus in the information processing system.
  • the information processing apparatus and the server apparatus may perform transmission processing in a distributed manner.
  • the provision of an image is not limited to a form in which an image is directly transmitted, and may be realized by instructing transmission to another device.
  • an image may be uploaded on the Internet, and a URL (Uniform Resource Locator) indicating the upload location may be notified.
  • an image may be posted (uploaded) on an SNS site used by the user.
  • The functions of the information processing system in the above-described embodiments are not limited to those shown in the drawings, and may be functionally or physically distributed or combined as appropriate.
  • the object of the present invention may be realized by concentrating and deploying each function in the information processing apparatus or the server apparatus.
  • An information processing apparatus comprising a communication unit and a control unit, wherein the control unit extracts an image area including a part or all of a person from an acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
  • The control unit provides the processed image to at least one person belonging to the same group as the predetermined person.
  • The information processing apparatus according to claim 1, wherein the control unit generates the processed image that includes the image area determined to correspond to the predetermined person and does not include at least a part of the image.
  • An information processing method performed by an information processing apparatus, the method comprising: a step of extracting an image area including a part or all of a person from an acquired image; a step of determining whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person; a step of processing the image area to generate a processed image when it is determined that the image area does not correspond to the predetermined person; and a step of providing the processed image to the predetermined person.
  • A program for causing a computer to execute: a procedure of extracting an image area including a part or all of a person from an acquired image; a procedure of determining whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person; a procedure of processing the image area to generate a processed image when it is determined that the image area does not correspond to the predetermined person; and a procedure of providing the processed image to the predetermined person.
  • Some aspects of the present invention can be applied to an information processing apparatus, an information processing method, a program, and the like that are required to protect the privacy of a person included in a provided image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to an information processing device comprising a communication unit and a control unit. The control unit extracts, from an acquired image, an image region that includes a part or all of a person, determines whether or not the image region corresponds to a prescribed person on the basis of identification information for identifying the prescribed person, and, when it is determined that the image region does not correspond to the prescribed person, processes the image region to generate a processed image and provides the processed image to the prescribed person via the communication unit.
PCT/JP2017/030514 2016-11-17 2017-08-25 Information processing device, information processing method, and program WO2018092378A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016224092 2016-11-17
JP2016-224092 2016-11-17

Publications (1)

Publication Number Publication Date
WO2018092378A1 true WO2018092378A1 (fr) 2018-05-24

Family

ID=62145480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/030514 WO2018092378A1 (fr) 2016-11-17 2017-08-25 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2018092378A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008040938A * 2006-08-09 2008-02-21 Nikon Corp In-facility monitoring system
JP2010021921A * 2008-07-14 2010-01-28 Nikon Corp Electronic camera and image processing program
JP2013171311A * 2012-02-17 2013-09-02 Nec Casio Mobile Communications Ltd Image processing device, terminal device, image processing method, and program
JP2015222460A * 2014-05-22 2015-12-10 Nippon Telegraph and Telephone Corp Public image editing method and public image editing device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11347978B2 2018-02-07 2022-05-31 Sony Corporation Image processing apparatus, image processing method, and image processing system
JP2020086504A * 2018-11-15 2020-06-04 NEC Corporation Image processing device, image processing method, and program
JP7387981B2 2018-11-15 2023-11-29 NEC Corporation Image processing device, image processing method, and program
WO2021250805A1 * 2020-06-10 2021-12-16 Maxell, Ltd. Observation monitoring device and observation monitoring method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17871059

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17871059

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP