WO2018092378A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program

Info

Publication number
WO2018092378A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
person
information processing
unit
control unit
Prior art date
Application number
PCT/JP2017/030514
Other languages
French (fr)
Japanese (ja)
Inventor
温子 國定
Original Assignee
シャープ株式会社 (Sharp Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 (Sharp Corporation)
Publication of WO2018092378A1 publication Critical patent/WO2018092378A1/en


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 General purpose image data processing
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • Some embodiments of the present invention relate to an information processing apparatus, an information processing method, and a program.
  • Patent Literature 1 discloses an image distribution server that recognizes a person appearing in an image requested by a user and distributes the image to that person.
  • For each user, the server registers in a user database the face image of each person to whom the user wants person images distributed, together with that person's destination e-mail address; person images are then uploaded from the user terminal.
  • The person to whom the image should be distributed is identified by comparing the face of the person appearing in the received person image with the registered face images in the user database, and the person image is distributed to the corresponding e-mail address.
  • Some aspects of the present invention have been made in view of the above points, and provide an information processing apparatus, an information processing method, and a program for protecting the privacy of a person included in an image to be provided.
  • One aspect of the present invention is made to solve the above-described problem. One aspect of the present invention is an information processing apparatus including a communication unit and a control unit, wherein the control unit extracts, from an acquired image, an image area including part or all of a person, and determines whether or not the image area corresponds to a predetermined person based on specific information for specifying the predetermined person.
  • When it is determined that the image area does not correspond to the predetermined person, the information processing apparatus generates a processed image by processing the image area and provides the processed image to the predetermined person via the communication unit.
  • Another aspect of the present invention is an information processing method performed by an information processing device, including: a step of extracting, from an acquired image, an image region including part or all of a person; a step of determining whether or not the image region corresponds to a predetermined person based on specific information for specifying the predetermined person; a step of generating a processed image by processing the image region when it is determined that the image region does not correspond to the predetermined person; and a step of providing the processed image to the predetermined person.
  • Another aspect of the present invention is a program for causing a computer to execute: a procedure for extracting, from an acquired image, an image area including part or all of a person; a procedure for determining whether or not the image area corresponds to a predetermined person based on specific information for specifying the predetermined person; a procedure for generating a processed image by processing the image area when it is determined that the image area does not correspond to the predetermined person; and a procedure for providing the processed image to the predetermined person.
  • the privacy of a person included in an image to be provided can be protected.
  • FIG. 1 is a configuration diagram showing a configuration of an information processing system 1 according to the present embodiment.
  • the information processing system 1 includes an information processing device 10 and a server device 20, and the information processing device 10 and the server device 20 are connected by a wired or wireless network.
  • the information processing apparatus 10 is an information processing apparatus such as a digital camera, a smartphone, or a tablet having an imaging function, and in the present embodiment, is a fixed camera installed in a theme park.
  • the information processing apparatus 10 includes a control unit 11, a determination unit 111, an image processing unit 112, a provision control unit 113, an imaging unit 12, a storage unit 13, and a communication unit 14. The function of each part will be described later.
  • the server device 20 is a server device installed on a network, and is, for example, a personal computer or a notebook computer.
  • the server device 20 includes a control unit 21, a provision destination specifying unit 211, a storage unit 22, and a communication unit 23. The function of each part will be described later.
  • FIGS. 2A and 2B are explanatory diagrams illustrating an example of image processing by the information processing system 1.
  • FIG. 2A is an example of an image captured by the imaging unit 12 of the information processing apparatus 10.
  • the image shown in FIG. 2A includes five persons A, B, C, D, and E from the left.
  • C and D are registered in advance in the user database of the information processing apparatus 10 as users of the captured image distribution service.
  • The determination unit 111 of the information processing apparatus 10 extracts the persons (A, B, C, D, E) included in the captured image and determines whether each person matches a person registered in the database. As a result, the determination unit 111 determines that A, B, and E do not match any person registered in the database.
  • The image processing unit 112 processes the image regions of A, B, and E, who are determined not to match any person registered in the database. Specifically, a process of blurring the face portions of A, B, and E is performed so that no one can identify them.
  • FIG. 2B is an example of the image after processing. As shown in FIG. 2B, the face portions of A, B, and E are blurred so that the individuals cannot be identified. Thereafter, the provision control unit 113 of the information processing apparatus 10 requests the server device 20 for the transmission destination information for the processed image.
  • The provision destination specifying unit 211 of the server device 20 refers to the provision destination database, extracts the transmission destination information (e-mail addresses and the like) of C and D, and notifies the information processing apparatus 10.
  • the provision control unit 113 of the information processing apparatus 10 transmits the processed image to C and D based on the acquired transmission destination information.
  • The imaging unit 12 includes a lens, a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) imaging device, a signal processing unit, and the like.
  • the imaging unit 12 periodically outputs the captured image to the storage unit 13.
  • The storage unit 13 includes, for example, an HDD (Hard Disk Drive), a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), a ROM (Read Only Memory), or a RAM (Random Access Memory), and stores various programs to be executed by a CPU (Central Processing Unit, not shown) included in the information processing apparatus 10, as well as the results of processing executed by the CPU.
  • the storage unit 13 stores a user database D1.
  • the user database D1 will be described with reference to FIG.
  • FIG. 3 is a diagram illustrating an example of the configuration of the user database D1 stored in the storage unit 13.
  • the user database D1 has a plurality of records representing information of users registered in the image transmission service. Each record has specific information for specifying a user.
  • The specific information includes a face feature parameter, a clothing parameter, an ornament (accessory) parameter, a QR (Quick Response) code (registered trademark) parameter, and the like.
  • the face feature parameter is information indicating the feature of the user's face.
  • The face feature parameter is, for example, a feature value obtained by quantifying the shape, size, positional relationship, and the like of each part of the face (eyes, nose, mouth, etc.); by comparing such feature values, it can be determined whether or not two faces belong to the same person.
  • the face feature parameter may be image information of a face or each part of the face.
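As an illustrative sketch only (not part of the patent disclosure), comparing quantified face feature values as described above might look like the following; the feature layout, distance measure, and threshold are all assumptions:

```python
import math

# Hypothetical face feature vector: quantified shape/size/position values
# for facial parts (eyes, nose, mouth), as the face feature parameter describes.
def coincidence_degree(features_a, features_b):
    """Return a similarity in [0, 1]; 1.0 means identical feature values."""
    dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(features_a, features_b)))
    return 1.0 / (1.0 + dist)

def is_same_person(features_a, features_b, threshold=0.8):
    # A match is declared when the degree of coincidence exceeds a threshold.
    return coincidence_degree(features_a, features_b) > threshold

registered = [0.42, 0.31, 0.77, 0.12]   # stored in the user database
extracted  = [0.41, 0.32, 0.76, 0.12]   # computed from the captured image
```

A nearly identical extracted vector then yields a coincidence degree close to 1.0 and is judged to be the same person, while a dissimilar one falls below the threshold.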
  • the clothing parameter is a parameter representing the clothing worn by the user on the day.
  • the clothing parameter may be acquired from an image taken at the time of entering the theme park, or may be obtained by receiving an image of the clothing of the day from the user.
  • The ornament parameter is a parameter representing the ornaments (accessories) worn by the user on the day. The method for obtaining the ornament parameter is the same as for the clothing parameter.
  • The QR code parameter is information representing a QR code identifier assigned to the user. For example, a sticker printed with a QR code encoding the identifier is attached to the user's clothing when entering the theme park; the imaging unit 12 reads the QR code, the control unit 11 decodes it, and the user can be identified by comparing the decoded identifier with each user's registered QR code identifier.
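A minimal sketch of the identifier comparison just described might look as follows; the record layout and identifier strings are hypothetical, and decoding the QR image itself (performed by the imaging unit 12 and control unit 11) is outside this sketch:

```python
# Hypothetical user records keyed by QR code identifier.
USER_DB = {
    "QR-0001": {"name": "C", "group": 2},
    "QR-0002": {"name": "D", "group": 2},
}

def identify_by_qr(decoded_identifier):
    """Compare a decoded QR identifier against each user's registered one."""
    return USER_DB.get(decoded_identifier)  # None when no user matches
```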
  • the user database D1 has group information.
  • the group information is information indicating a group to which the user belongs.
  • the same identifier is assigned to the same group based on information at the time of user registration.
  • the group information may not be held in the user database D1, and may be used by referring to information in another database (for example, a user's phone book, friend list, etc.).
  • the communication unit 14 includes a communication interface for performing communication between apparatuses via a wired or wireless network, and communicates with the server apparatus 20.
  • The network is, for example, an information communication network configured by a mobile phone network, a VPN (Virtual Private Network), a dedicated communication line network, a WAN (Wide Area Network), a LAN (Local Area Network), a PSTN (Public Switched Telephone Network), and the like, or a combination thereof.
  • the control unit 11 controls various configurations included in the information processing apparatus 10. A part or all of the functions of the control unit 11 may be realized, for example, by executing a program stored in the storage unit 13 by a CPU included in the information processing apparatus 10.
  • the control unit 11 includes a determination unit 111, an image processing unit 112, and a provision control unit 113.
  • the determination unit 111 extracts a person included in the image captured by the imaging unit 12.
  • Various methods can be used to extract a person. For example, a background-difference method may be used, in which a person is detected from the difference between an input image and an image in which only the background was captured in advance. Alternatively, a method may be used in which a pattern image, cut out by a detection window of a set size, is evaluated to determine whether it depicts a person.
  • the determination unit 111 may extract a part of the person (for example, a face) instead of the whole person and use it for the determination.
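The background-difference method mentioned above can be reduced to a toy sketch on grayscale images stored as 2D lists; the threshold value and image contents are assumptions, not taken from the patent:

```python
# A pixel is foreground when it differs from the pre-captured background
# image by more than a threshold.
def foreground_mask(background, frame, threshold=30):
    return [
        [abs(f - b) > threshold for f, b in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]

background = [[10, 10, 10],
              [10, 10, 10]]
frame      = [[10, 200, 10],
              [10, 210, 10]]
mask = foreground_mask(background, frame)
# The bright column where a person entered the scene is flagged as foreground.
```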
  • the determination unit 111 determines whether the extracted person matches a predetermined condition.
  • The predetermined condition is, for example, that the extracted person matches a person registered in the user database D1 of the storage unit 13. In that case, the determination unit 111 compares the parameters calculated from the extracted person with each parameter of the persons registered in the user database D1, and determines that the extracted person matches a registered person if any parameter matches or its degree of coincidence exceeds a predetermined threshold. The determination unit 111 may also determine that the extracted person matches a registered person when the combined degree of coincidence of a plurality of parameters exceeds a predetermined threshold.
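The two-way decision rule above (any single parameter exceeding its threshold, or several parameters jointly exceeding a combined threshold) can be sketched as follows; both threshold values are assumptions:

```python
# Sketch of the registration decision over per-parameter coincidence degrees.
def matches_registered_user(coincidences, single_threshold=0.9,
                            combined_threshold=0.8):
    """coincidences: dict of parameter name -> degree of coincidence in [0, 1]."""
    # A single strongly matching parameter (e.g. the QR code) is sufficient.
    if any(c > single_threshold for c in coincidences.values()):
        return True
    # Otherwise require the average over all compared parameters to be high.
    combined = sum(coincidences.values()) / len(coincidences)
    return combined > combined_threshold
```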
  • the determination unit 111 outputs the determined result to the image processing unit 112 and the provision control unit 113.
  • the image processing unit 112 processes a person image included in the image captured by the imaging unit 12 based on the determination result input from the determination unit 111, and generates a processed image.
  • The image processing unit 112 processes the image of a person determined not to meet the pre-registered condition so that the individual cannot be identified. For example, the image processing unit 112 performs a process of blurring (applying a mosaic to) the face portion of the target person. The granularity of the blurring process can be set as appropriate.
  • the image processing unit 112 may fill the face portion of the target person, or may superimpose and display another image.
  • the portion to be processed is not limited to the face portion of the target person, but may be a part or all of the body of the target person or a region including the body.
  • the image processing unit 112 may perform processing so that the target person does not exist by using a background image stored in advance.
  • the image processing unit 112 may perform processing so that the target person does not exist by superimposing and displaying the background image that should exist behind the target person on the front (front side) of the target person.
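Of the processing options above, the mosaic can be sketched on a grayscale image stored as a 2D list: each block of `grain` × `grain` pixels in the target region is replaced by its average, so the individual can no longer be identified. The region coordinates and grain size are assumptions:

```python
def mosaic(image, top, left, height, width, grain=2):
    """Return a copy of `image` with a mosaic applied to the given region."""
    out = [row[:] for row in image]
    for by in range(top, top + height, grain):
        for bx in range(left, left + width, grain):
            block = [
                out[y][x]
                for y in range(by, min(by + grain, top + height))
                for x in range(bx, min(bx + grain, left + width))
            ]
            avg = sum(block) // len(block)
            for y in range(by, min(by + grain, top + height)):
                for x in range(bx, min(bx + grain, left + width)):
                    out[y][x] = avg
    return out
```

Filling the face portion or superimposing another image would amount to writing a constant value or a second image's pixels into the same region instead of the block average.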
  • the provision control unit 113 acquires transmission destination information of the person in the image determined to meet a predetermined condition based on the determination result input from the determination unit 111.
  • the destination information is, for example, an e-mail address or an address for sending an SNS (Social Networking Service) message.
  • the provision control unit 113 acquires the transmission destination information of the target person from the server device 20 via the communication unit 14.
  • the acquisition destination of the transmission destination information is not limited to the server device 20 and may be acquired from another external device or the storage unit 13.
  • the providing control unit 113 transmits the processed image processed by the image processing unit 112 via the communication unit 14 based on the acquired transmission destination information.
  • The storage unit 22 includes, for example, an HDD, a flash memory, an EEPROM, a ROM, or a RAM, and stores various programs executed by a CPU (not shown) included in the server device 20, such as firmware and application programs, as well as the results of processing executed by the CPU.
  • the storage unit 22 stores a provision destination database D2 (not shown).
  • the provision destination database D2 has a record for each registered user, and each record has each value of an image transmission method and transmission destination information.
  • the transmission method indicates a method (means) for transmitting an image such as an electronic mail or an SNS message, and the destination information includes an electronic mail address corresponding to the transmission method, an address for transmitting an SNS message, or the like.
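One plausible layout for such records, purely as an illustration (user names, methods, and addresses are hypothetical), is a per-user record holding the transmission method and its matching destination:

```python
# Hypothetical provision destination database D2: one record per registered user.
DESTINATION_DB = {
    "C": {"method": "email", "destination": "c@example.com"},
    "D": {"method": "sns",   "destination": "@user_d"},
}

def destination_for(user, db=DESTINATION_DB):
    """Return (method, destination) for a registered user, or None."""
    record = db.get(user)
    if record is None:
        return None
    return record["method"], record["destination"]
```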
  • the communication unit 23 includes a communication interface for performing communication between apparatuses via a wired or wireless network, and communicates with the information processing apparatus 10.
  • the network is, for example, an information communication network configured by a mobile phone network, a VPN network, a dedicated communication line network, WAN, LAN, PSTN, or the like, or a combination thereof.
  • the control unit 21 controls various configurations included in the server device 20. A part or all of the functions of the control unit 21 may be realized, for example, by executing a program stored in the storage unit 22 by a CPU included in the server device 20.
  • the control unit 21 includes a provision destination specifying unit 211.
  • The provision destination specifying unit 211 refers to the provision destination database D2 based on a request from the information processing apparatus 10, and extracts the transmission destination information corresponding to the designated user's transmission method.
  • the provision destination specifying unit 211 notifies the information processing apparatus 10 of the extracted transmission destination information via the communication unit 23.
  • FIG. 4 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment.
  • Step S101 The imaging unit 12 of the information processing apparatus 10 captures an image within the imaging range based on an instruction from the control unit 11.
  • The trigger for capturing may be a capture instruction from the outside, or capturing may be performed periodically.
  • In this example, the image of FIG. 2A is captured. Thereafter, the process proceeds to step S102.
  • Step S102 The determination unit 111 extracts a person included in the captured image. In this example, five persons A, B, C, D, and E are extracted. Thereafter, the process proceeds to step S103.
  • Step S103 The determination unit 111 determines whether each of the extracted persons A, B, C, D, and E is registered in the user database.
  • the determination unit 111 performs registration determination by comparing each feature amount (parameter) extracted from A, B, C, D, and E with each determination parameter in the user database D1. Thereafter, the process proceeds to step S104.
  • Step S104 If none of the parameters of the extracted persons matches any of the determination parameters in the user database D1, the determination unit 111 determines that no one is registered in the user database D1 (step S104 / NO), and the process is terminated. If any parameter of an extracted person matches any of the determination parameters in the user database D1, the determination unit 111 determines that the person is registered in the user database D1 (step S104 / YES), and the process proceeds to step S105. In this example, since the degree of coincidence between the face feature parameters extracted from C and D and the face feature parameters of users C and D in the user database D1 exceeds a predetermined threshold, C and D are determined to be registered in the user database D1. The determination unit 111 may also perform the determination based on the degree of coincidence of the clothing parameter, the ornament parameter, or the QR code parameter.
  • Step S105 The determination unit 111 determines whether there is a person in the image other than the person registered in the user database D1. When it is determined that there is no person other than the person registered in the user database D1 in the image (step S105 / NO), the process proceeds to step S107. When it is determined that there is a person other than the person registered in the user database D1 in the image (step S105 / YES), the process proceeds to step S106. In this example, since A, B, and E that are not registered in the user database D1 exist in the image, the process proceeds to step S106.
  • Step S106 The image processing unit 112 processes an image of a person who is not registered in the user database D1.
  • In this example, blurring is performed on the face portions of A, B, and E, who are not registered in the user database D1. Thereafter, the process proceeds to step S107.
  • Step S107 The providing control unit 113 requests and acquires the transmission destination information of the person registered in the user database D1 from the server device 20.
  • In this example, the provision control unit 113 requests the server device 20 for the transmission destination information of C and D, who are the image transmission targets, and acquires the corresponding e-mail addresses.
  • Step S108 The providing control unit 113 transmits an image based on the acquired transmission destination information.
  • In this example, the processed image is transmitted as an attached file to the e-mail addresses of C and D.
  • This completes the description of the operation of the information processing system 1 according to the present embodiment.
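The flow of steps S101 through S108 can be condensed into a small sketch in which detection, matching, blurring, and mail transport are reduced to placeholders; every name and address here is hypothetical, not taken from the patent:

```python
def process_and_distribute(persons, user_db, destination_db):
    """persons: labels of persons extracted from the captured image (S102)."""
    registered   = [p for p in persons if p in user_db]          # S103-S104
    unregistered = [p for p in persons if p not in user_db]      # S105
    processed = {p: "blurred" for p in unregistered}             # S106
    destinations = {p: destination_db[p] for p in registered}    # S107
    # S108 would attach the processed image and send it to each destination.
    return processed, destinations

persons = ["A", "B", "C", "D", "E"]
user_db = {"C", "D"}
destination_db = {"C": "c@example.com", "D": "d@example.com"}
```

Running this with the example of FIG. 2A blurs A, B, and E and produces destinations only for C and D.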
  • As described above, the information processing apparatus 10 according to the present embodiment includes the communication unit 14 and the control unit 11; the control unit 11 extracts, from an acquired image, an image area including part or all of a person, determines whether or not the image area corresponds to a predetermined person based on specific information for specifying the predetermined person, and, when it is determined that the image area does not correspond to the predetermined person, generates a processed image by processing the image area and provides the processed image to the predetermined person via the communication unit 14.
  • This configuration makes it possible to identify persons other than service-registered users appearing in the captured image and process their images so that the individuals cannot be identified. Therefore, the privacy of persons included in the provided image can be protected. In addition, an image can be provided to service-registered users simply and quickly.
  • FIG. 5 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment. Description of the same steps as those in FIG. 4 will be omitted, and differences will be described.
  • The determination unit 111 determines whether the users determined to be registered in the user database D1 belong to the same group, based on the value of the group information in the user database D1. For example, in the example of FIG. 2A, assume that A and B are registered as group 1 and C and D are registered as group 2 in the user database D1. When making the determination for C, the determination unit 111 determines that A and B belong to another group, D belongs to the same group, and E is not registered. Therefore, since persons (A, B, E) other than the person (D) belonging to the same group exist in the image (step S205 / YES), the process proceeds to step S206, and the images of those persons (A, B, E) are processed.
  • step S207 the provision control unit 113 acquires transmission destination information from the server device 20.
  • The providing control unit 113 may acquire only the transmission destination information of the transmission target (C), or may additionally acquire the transmission destination information of the person (D) belonging to the same group as the transmission target (C).
  • the control unit 11 does not process an image region corresponding to a person belonging to the same group as a predetermined person. According to this configuration, since an image of a person who does not belong to the same group is processed so that an individual cannot be specified, the privacy of the person included in the provided image can be protected.
  • the control unit 11 provides the processed image to at least one person who belongs to the same group as the predetermined person. With this configuration, an image can be provided to members belonging to the group simply and quickly.
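The group rule described above can be sketched as a selection of which persons to process when sending to a given recipient; the group assignments mirror the example (A and B in group 1, C and D in group 2, E unregistered), and the function name is hypothetical:

```python
# Persons in the recipient's own group keep their faces; everyone else
# (other groups or unregistered) is processed so individuals cannot be identified.
GROUPS = {"A": 1, "B": 1, "C": 2, "D": 2}  # E is not registered

def persons_to_process(recipient, persons):
    target_group = GROUPS.get(recipient)
    return [
        p for p in persons
        if p != recipient and GROUPS.get(p) != target_group
    ]
```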
  • In the present embodiment, the information processing apparatus 10 is an information processing apparatus having an imaging function, such as a smartphone, a tablet, or a digital camera possessed by the user; an example in which the user captures a landscape using the information processing apparatus 10 will be described.
  • FIG. 6 is an explanatory diagram illustrating an example of the operation of the information processing system 1 according to the present embodiment.
  • Step S301 The imaging unit 12 of the information processing apparatus 10 captures a landscape image in the imaging range based on a user operation.
  • The control unit 11 may recognize that the user is capturing a landscape when the setting is switched to a landscape mode or the like before capturing. Thereafter, the process proceeds to step S302.
  • Step S302 The determination unit 111 extracts a person included in the captured image. Thereafter, the process proceeds to step S303.
  • Step S303 The determination unit 111 determines whether or not a person is included in the captured image. If the person is included in the captured image (YES in Step S303), the process proceeds to Step S304. If no person is included in the captured image (step S303 / NO), the process proceeds to step S307.
  • Step S304 The determination unit 111 determines whether the extracted person is registered in the user database D1. The determination unit 111 further determines whether or not the extracted person belongs to the same group as the user. Thereafter, the process proceeds to step S305.
  • Step S305 When there is no person belonging to the same group as the user (step S305 / NO), the process proceeds to step S306. In this example, since the user is photographing a landscape, it is assumed that no person belonging to the same group as the user exists in the captured image; if such a person does exist (step S305 / YES), the process proceeds to step S105 of the first embodiment or step S205 of the second embodiment.
  • Step S306 The image processing unit 112 performs a blurring process on the extracted person image.
  • A person extracted from an image captured when the user photographed the landscape is either not registered in the image transmission service, or is registered but belongs to a different group from the user. Therefore, by processing the extracted person's image so that the individual cannot be identified, that person's privacy can be protected. Thereafter, the process proceeds to step S307.
  • Step S307 The provision control unit 113 requests the server device 20 for transmission destination information of a person belonging to the same group as the user.
  • the server device 20 refers to the user database D1, extracts the transmission destination information of a person belonging to the same group as the requesting user, and notifies the user. Thereafter, the process proceeds to step S308.
  • Step S308 The providing control unit 113 transmits an image to persons belonging to the same group based on the acquired transmission destination information.
  • This completes the description of the operation of the information processing system 1 according to the present embodiment.
  • the information processing system 1 can be applied even when a user or a person in the same group as the user is not shown.
  • FIG. 7 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment. Description of the same steps as those in FIG. 4 will be omitted, and differences will be described.
  • The image processing unit 112 determines whether to perform image trimming. The determination may be based on information on the necessity of trimming registered in the user database D1, or on information in another database held by the storage unit 13. The image processing unit 112 may also receive a user operation on an input unit (not shown) and determine whether to perform trimming accordingly. If it is determined that trimming is to be performed (step S407 / YES), the process proceeds to step S408. If it is determined that trimming is not to be performed (step S407 / NO), the process proceeds to step S409.
  • step S408 the image processing unit 112 performs image trimming.
  • An example of the trimming process will be described with reference to FIGS. 8A and 8B.
  • 8A and 8B are explanatory diagrams illustrating an example of trimming processing according to the present embodiment.
  • This example assumes that the image subjected to the blurring process in step S406 is the image shown in FIG.
  • the image processing unit 112 selects C as a trimming target.
  • the image processing unit 112 may perform the determination based on information registered in the user database D1, or may perform the determination based on information in another database held by the storage unit 13.
  • The image processing unit 112 may select the user to whom the image is transmitted as the trimming target. The image processing unit 112 may also receive a user operation on an input unit (not shown) and select the trimming target accordingly.
  • the image processing unit 112 performs trimming so that the face C to be trimmed is positioned at the center of the image.
  • FIG. 8A is an example of an image after performing trimming on C. As shown in FIG. 8A, cropping is performed so that the face of C is positioned at the center of the image. The center portion of the image refers to a certain area near the center of the image, but the position of the target person's face after trimming only needs to be closer to the center than in the original captured image.
  • the image processing unit 112 may perform image enlargement processing together with trimming.
  • FIG. 8B is an example of an image after the image enlargement process is performed. The magnification of the enlargement process can be set as appropriate.
  • the image processing unit 112 is not limited to the face of the person to be trimmed, for example, obtains the center coordinates of the body of the person to be trimmed, and performs trimming so that the center part of the body of the target person is located at the center of the image. Also good. Further, for example, the image processing unit 112 may perform the trimming so that the central part of the face or body of the members (C and D in FIGS. 8A and 8B) belonging to the same group is located in the central part of the image. As a result, an image in which the group member is located at the center can be generated.
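The trimming just described can be sketched as computing a crop window centered on the target face (or body-center) coordinates, clamped so the window stays inside the image; the window size is an assumption, since the patent only requires the face to end up nearer the center than in the original image:

```python
def crop_box(face_x, face_y, img_w, img_h, crop_w, crop_h):
    """Return (left, top, right, bottom) of a crop centered on the face,
    clamped to the image bounds."""
    left = min(max(face_x - crop_w // 2, 0), img_w - crop_w)
    top  = min(max(face_y - crop_h // 2, 0), img_h - crop_h)
    return left, top, left + crop_w, top + crop_h
```

For a group, the same function could be called with the midpoint of the members' face coordinates so that the group is positioned at the center, matching the FIG. 8A/8B example; enlargement would then be a separate scaling step.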
  • As described above, in the present embodiment, the control unit 11 generates (trims) a processed image that includes the image region determined to correspond to the predetermined person and excludes at least part of the original image. With this configuration, the image is processed so that the user is positioned near the center of the image, so an image suited to each user can be provided simply and quickly.
  • FIG. 9 is a block diagram illustrating a functional configuration of the information processing apparatus 30 according to the present embodiment.
  • the information processing apparatus 30 is an information processing apparatus having an imaging function such as a smartphone, a tablet, or a digital camera.
  • the information processing apparatus 30 includes a control unit 11, a determination unit 111, an image processing unit 112, a provision control unit 313, an imaging unit 12, a storage unit 33, and a communication unit 14.
  • the storage unit 33 has the same function as the storage unit 13 in the first embodiment, and further stores a provision destination database D2.
  • the provision destination database D2 has a record for each registered user, and each record has each value of an image transmission method and transmission destination information.
  • the transmission method indicates a method (means) for transmitting an image such as an electronic mail or an SNS message, and the destination information includes an electronic mail address corresponding to the transmission method, an address for transmitting an SNS message, or the like.
  • the storage unit 33 may hold the user database D1 and the provision destination database D2 as a single database.
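A minimal sketch of how the provision destination database D2 might be consulted, assuming simple in-memory dictionaries standing in for D1 and D2 (all identifiers and addresses here are illustrative, not from the patent):

```python
# Hypothetical stand-ins for user database D1 and provision destination database D2.
user_db = {"C": {"group": 1}, "D": {"group": 1}}           # registered users
destination_db = {                                         # one record per user
    "C": {"method": "email", "address": "c@example.com"},
    "D": {"method": "sns",   "address": "@d_account"},
}

def destinations_for(user_ids):
    """Return the transmission method and address for each registered user id."""
    return {uid: destination_db[uid] for uid in user_ids if uid in destination_db}
```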
  • the providing control unit 313 acquires the transmission destination information of the person registered in the user database D1 with reference to the providing destination database D2 stored in the storage unit 33.
  • in step S107 of the above-described embodiment, the provision control unit 113 requests and obtains the transmission destination information of the person registered in the user database D1 from the server device 20; in the present embodiment, however, the provision control unit 313 acquires that transmission destination information by referring to the provision destination database D2 stored in the storage unit 33, as described above.
  • Other operations are the same as those in the above-described embodiment.
  • the information processing apparatus 30 includes the communication unit 14 and the control unit 11, and the control unit 11 extracts an image area including a part or all of a person from the acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
  • This configuration makes it possible to detect persons other than service-registered users who appear in the captured image and to process the image so that those individuals cannot be identified. The privacy of the persons included in the provided image can therefore be protected, while images can still be provided to service-registered users simply and quickly.
  • FIG. 10 is a configuration diagram showing a configuration of the information processing system 2 according to the present embodiment.
  • the information processing system 2 includes an information processing device 40 and a server device 50.
  • the information processing apparatus 40 includes a control unit 41, an imaging unit 12, a storage unit 43, and a communication unit 14.
  • the control unit 41 controls various configurations included in the information processing apparatus 40.
  • unlike the control unit 11, however, the control unit 41 does not include the determination unit 111, the image processing unit 112, or the provision control unit 113.
  • the storage unit 43 stores various programs to be executed by the CPU included in the information processing apparatus 40, results of processes executed by the CPU, and the like.
  • unlike the storage unit 13, however, the storage unit 43 does not store the user database D1.
  • the server device 50 includes a control unit 51, a storage unit 52, and a communication unit 53.
  • the storage unit 52 stores various programs to be executed by the CPU included in the server device 50, results of processing executed by the CPU, and the like.
  • the storage unit 52 further stores a user database D1 and a provision destination database D2. The contents of each database are the same as in the above embodiment. Since the communication unit 53 is the same as the communication unit 23 according to the first embodiment, a description thereof will be omitted.
  • the control unit 51 includes a determination unit 511, an image processing unit 512, a provision destination specifying unit 513, and a provision control unit 514. Since each operation is the same as that of the above-described embodiment, description thereof is omitted.
  • the server device 50 includes the communication unit 53 and the control unit 51, and the control unit 51 extracts an image area including a part or all of a person from the acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
  • This configuration makes it possible to detect persons other than service-registered users who appear in the captured image and to process the image so that those individuals cannot be identified. The privacy of the persons included in the provided image can therefore be protected, while images can still be provided to service-registered users simply and quickly.
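The server-side flow described here (extract person regions, judge each against the registered users, anonymize the non-matches, then provide the processed image to the registered persons) can be sketched end to end. The callables and data shapes below are assumptions for illustration, not the patent's implementation:

```python
def handle_captured_image(regions, registered_ids, anonymize, destinations, send):
    """regions: mapping of detected-person id -> image region.
    Regions that do not match a registered user are anonymized; the resulting
    processed image is then sent to every registered person in the picture."""
    processed = {
        pid: (region if pid in registered_ids else anonymize(region))
        for pid, region in regions.items()
    }
    for pid in processed:
        if pid in registered_ids and pid in destinations:
            send(destinations[pid], processed)
    return processed
```

Simulated with strings for regions and a list collecting the "sent" addresses, only the registered person C receives the image, and the unregistered person A is anonymized.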
  • a part of the information processing apparatus and the server apparatus in the above-described embodiments, for example, the determination unit, the image processing unit, and the provision control unit, may be realized by a computer.
  • a program for realizing this function may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed.
  • the “computer system” here is a computer system built into the information processing apparatus or the server apparatus, and includes an OS (Operating System) and hardware such as peripheral devices.
  • the “computer-readable recording medium” means a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system.
  • the “computer-readable recording medium” may also include a medium that dynamically holds the program for a short time, such as a communication line used when the program is transmitted via a network such as the Internet or a communication line such as a telephone line, and a medium that holds the program for a certain period of time in that case, such as a volatile memory inside a computer system serving as a server or a client.
  • the program may realize only a part of the functions described above, or may realize the functions described above in combination with a program already recorded in the computer system.
  • part or all of the information processing apparatus and the server apparatus in the above-described embodiment may be realized as an integrated circuit such as an LSI (Large Scale Integration).
  • Each functional unit of the information processing apparatus and the server apparatus may be individually made into a processor, or a part or all of them may be integrated into a processor.
  • the method of circuit integration is not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
  • if an integrated-circuit technology that replaces LSI emerges as semiconductor technology advances, an integrated circuit based on that technology may be used.
  • the image to be processed has been described as a still image, but the processing target may be a set of a plurality of continuous images, that is, a moving image.
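Treating a moving image as a set of continuous still images means the same per-image processing simply runs on every frame. A trivial sketch (names illustrative):

```python
def process_moving_image(frames, process_still):
    """Apply the still-image processing to each frame of a moving image."""
    return [process_still(frame) for frame in frames]
```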
  • the image transmission may be performed by either the information processing apparatus or the server apparatus in the information processing system.
  • the information processing apparatus and the server apparatus may perform transmission processing in a distributed manner.
  • the provision of an image is not limited to a form in which an image is directly transmitted, and may be realized by instructing transmission to another device.
  • an image may be uploaded on the Internet, and a URL (Uniform Resource Locator) indicating the upload location may be notified.
  • an image may be posted (uploaded) on an SNS site used by the user.
  • the functions of the information processing system in the above-described embodiments are not limited to those shown in the drawings, and may be distributed or combined functionally or physically as appropriate.
  • the object of the present invention may also be achieved by concentrating each function in either the information processing apparatus or the server apparatus.
  • An information processing apparatus including a communication unit and a control unit, wherein the control unit extracts an image area including a part or all of a person from an acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
  • control unit provides the processed image to at least one person belonging to the same group as the predetermined person.
  • the control unit generates the processed image that includes the image area determined to correspond to the predetermined person and does not include at least a part of the image.
  • An information processing apparatus according to claim 1.
  • An information processing method performed by an information processing apparatus, including: a step of extracting an image area including a part or all of a person from an acquired image; a step of determining whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person; a step of processing the image area to generate a processed image when it is determined that the image area does not correspond to the predetermined person; and a step of providing the processed image to the predetermined person.
  • A program for causing a computer to execute: a procedure of extracting an image area including a part or all of a person from an acquired image; a procedure of determining whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person; a procedure of processing the image area to generate a processed image when it is determined that the image area does not correspond to the predetermined person; and a procedure of providing the processed image to the predetermined person.
  • Some aspects of the present invention can be applied to an information processing apparatus, an information processing method, a program, and the like that are required to protect the privacy of a person included in a provided image.

Abstract

Provided is an information processing device, comprising a communication unit and a control unit. The control unit extracts from an acquired image an image region which includes all or part of a person, assesses whether the image region corresponds to a prescribed person on the basis of identification information for identifying the prescribed person, and if it has been assessed that the image region does not correspond to the prescribed person, processes the image region to generate a processed image and provides the processed image to the prescribed person via the communication unit.

Description

Information processing apparatus, information processing method, and program
Some embodiments of the present invention relate to an information processing apparatus, an information processing method, and a program.
This application claims priority on Japanese Patent Application No. 2016-224092 filed in Japan on November 17, 2016, the contents of which are incorporated herein by reference.
Conventionally, persons included in an image have been identified using face recognition or similar techniques, and images have been distributed to the identified persons by e-mail or the like.
For example, Patent Literature 1 discloses an image distribution server that recognizes a person appearing in an image whose distribution a user has requested and distributes the image to that person. With this server, for each user, the face image of a person to whom the user wishes to distribute person images and the destination e-mail address of that person are registered in a user database. When a person image is uploaded from a user terminal, the distribution destination is identified by comparing the face of the person appearing in the received image with the registered recognition-target face images, and the person image is distributed based on the e-mail address.
JP 2004-326281 A
However, with the technique described in Patent Literature 1, even when a captured image includes persons other than the intended recipient, the image is transmitted without those persons' consent, which may infringe their privacy. This problem is particularly likely when transmitting images captured by a camera installed in a place where many unspecified people are present, such as a theme park.
Some aspects of the present invention have been made in view of the above points, and provide an information processing apparatus, an information processing method, and a program that protect the privacy of persons included in a provided image.
One aspect of the present invention has been made to solve the above problem. One aspect of the present invention is an information processing apparatus including a communication unit and a control unit, wherein the control unit extracts an image area including a part or all of a person from an acquired image, determines whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person, processes the image area to generate a processed image when determining that the image area does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
Another aspect of the present invention is an information processing method performed by an information processing apparatus, including: a step of extracting an image area including a part or all of a person from an acquired image; a step of determining whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person; a step of processing the image area to generate a processed image when it is determined that the image area does not correspond to the predetermined person; and a step of providing the processed image to the predetermined person.
Another aspect of the present invention is a program for causing a computer to execute: a procedure of extracting an image area including a part or all of a person from an acquired image; a procedure of determining whether or not the image area corresponds to a predetermined person based on identification information for specifying the predetermined person; a procedure of processing the image area to generate a processed image when it is determined that the image area does not correspond to the predetermined person; and a procedure of providing the processed image to the predetermined person.
According to one aspect of the present invention, the privacy of persons included in a provided image can be protected.
FIG. 1 is a configuration diagram showing the configuration of the information processing system according to the first embodiment.
FIG. 2A is a first explanatory diagram showing an example of image processing according to the first embodiment.
FIG. 2B is a second explanatory diagram showing an example of image processing according to the first embodiment.
FIG. 3 is a diagram showing an example of the configuration of the user database according to the first embodiment.
FIG. 4 is a flowchart showing an example of the operation of the information processing system according to the first embodiment.
FIG. 5 is a flowchart showing an example of the operation of the information processing system according to the second embodiment.
FIG. 6 is a flowchart showing an example of the operation of the information processing system according to the third embodiment.
FIG. 7 is a flowchart showing an example of the operation of the information processing system according to the fourth embodiment.
FIG. 8A is a first explanatory diagram showing an example of trimming processing according to the fourth embodiment.
FIG. 8B is a second explanatory diagram showing an example of trimming processing according to the fourth embodiment.
FIG. 9 is a block diagram showing the functional configuration of the information processing apparatus according to the fifth embodiment.
FIG. 10 is a configuration diagram showing the configuration of the information processing system according to the sixth embodiment.
[First Embodiment]
The first embodiment of the present invention will be described below with reference to the drawings.
FIG. 1 is a configuration diagram showing a configuration of an information processing system 1 according to the present embodiment.
The information processing system 1 includes an information processing device 10 and a server device 20, and the information processing device 10 and the server device 20 are connected by a wired or wireless network.
The information processing apparatus 10 is an information processing apparatus having an imaging function, such as a digital camera, a smartphone, or a tablet; in the present embodiment, it is a fixed camera installed in a theme park. The information processing apparatus 10 includes a control unit 11, a determination unit 111, an image processing unit 112, a provision control unit 113, an imaging unit 12, a storage unit 13, and a communication unit 14. The function of each unit will be described later.
The server device 20 is a server device installed on a network, for example, a personal computer or a laptop computer. The server device 20 includes a control unit 21, a provision destination specifying unit 211, a storage unit 22, and a communication unit 23. The function of each unit will be described later.
FIGS. 2A and 2B are explanatory diagrams illustrating an example of image processing by the information processing system 1.
FIG. 2A is an example of an image captured by the imaging unit 12 of the information processing apparatus 10. The image shown in FIG. 2A includes five persons, A, B, C, D, and E, from the left. In this example, it is assumed that C and D are registered in advance in the user database of the information processing apparatus 10 as users of the captured-image distribution service. The determination unit 111 of the information processing apparatus 10 extracts the persons (A, B, C, D, E) included in the captured image and determines, for each person, whether he or she matches a person registered in the database. As a result, the determination unit 111 determines that A, B, and E do not match any registered person. The image processing unit 112 processes the images of A, B, and E, who were determined not to match; specifically, it blurs the faces of A, B, and E so that they cannot be identified.
FIG. 2B is an example of the image after processing. As shown in FIG. 2B, the faces of A, B, and E are blurred so that the individuals cannot be identified. Thereafter, the provision control unit 113 of the information processing apparatus 10 requests the transmission destination information for the processed image from the server device 20. The provision destination specifying unit 211 of the server device 20 refers to the provision destination database, extracts the transmission destination information of C and D (e-mail addresses or the like), and notifies the information processing apparatus 10. Based on the acquired transmission destination information, the provision control unit 113 of the information processing apparatus 10 transmits the processed image to C and D.
In this way, by processing the images of persons other than service-registered users that appear in an image captured at a theme park or the like, the privacy of those persons can be protected. Moreover, captured images can be provided to service-registered users simply and quickly while protecting that privacy.
Returning to FIG. 1, each function of the information processing apparatus 10 will be described.
The imaging unit 12 includes a lens, a CCD (Charge Coupled Device) or CMOS (Complementary MOS) imaging device, a signal processing unit, and the like.
The imaging unit 12 periodically outputs the captured image to the storage unit 13.
The storage unit 13 includes, for example, an HDD (Hard Disk Drive), a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), a ROM (Read Only Memory), or a RAM (Random Access Memory), and stores various programs, such as firmware and application programs, to be executed by a CPU (Central Processing Unit, not shown) of the information processing apparatus 10, as well as the results of processing executed by the CPU.
The storage unit 13 stores a user database D1. Here, the user database D1 will be described with reference to FIG.
FIG. 3 is a diagram illustrating an example of the configuration of the user database D1 stored in the storage unit 13.
The user database D1 has a plurality of records representing information of users registered in the image transmission service. Each record has specific information for specifying a user. The specific information includes a face feature parameter, an accessory parameter, a QR (Quick Response) code (registered trademark) parameter, and the like.
The face feature parameter is information indicating the features of the user's face, for example, feature values quantifying the shape, size, and positional relationship of each facial part (eyes, nose, mouth, etc.); whether two faces belong to the same person can be determined from the degree of agreement of at least one of these feature values. The face feature parameter may also be image information of the face or of each facial part.
The clothing parameter is a parameter representing the clothing worn by the user on the day. The clothing parameter may be acquired from an image taken at the time of entering the theme park, or may be obtained by receiving an image of the clothing of the day from the user.
The accessory parameter is a parameter representing the accessories the user is wearing on the day. It is obtained in the same way as the clothing parameter.
The QR code parameter is information representing the QR code identifier assigned to the user. For example, a sticker printed with a QR code encoding this identifier is attached to the user's clothes when entering the theme park; the imaging unit 12 captures the QR code, the control unit 11 decodes it, and the user can be identified by comparing the decoded value with each user's registered QR code identifier.
The user database D1 also has group information, which indicates the group to which each user belongs. The same identifier is assigned to members of the same group based on information provided at user registration. The group information need not be held in the user database D1 itself; for example, information in another database (such as a user's phone book or friend list) may be referenced instead.
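One way to picture a D1 record and the QR-code comparison described above is the following sketch; the field names are illustrative assumptions, not the patent's schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UserRecord:
    """One record of user database D1 (field names are illustrative)."""
    user_id: str
    face_features: dict                  # e.g. {"eyes": ..., "nose": ..., "mouth": ...}
    clothing: Optional[str] = None       # clothing parameter for the day
    accessories: Optional[str] = None    # accessory parameter for the day
    qr_identifier: Optional[str] = None  # identifier encoded in the user's QR sticker
    group_id: Optional[int] = None       # same value for members of one group

def identify_by_qr(decoded_id, records):
    """Return the user whose registered QR identifier equals the decoded value."""
    for rec in records:
        if rec.qr_identifier == decoded_id:
            return rec.user_id
    return None
```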
Returning to FIG. 1, the description of the information processing apparatus 10 will be continued.
The communication unit 14 includes a communication interface for communicating between apparatuses via a wired or wireless network, and communicates with the server device 20. The network is an information communication network constituted by, for example, a mobile phone network, a VPN (Virtual Private Network), a dedicated communication line network, a WAN (Wide Area Network), a LAN (Local Area Network), a PSTN (Public Switched Telephone Network), or the like, or a combination thereof.
The control unit 11 controls various configurations included in the information processing apparatus 10. A part or all of the functions of the control unit 11 may be realized, for example, by executing a program stored in the storage unit 13 by a CPU included in the information processing apparatus 10.
The control unit 11 includes a determination unit 111, an image processing unit 112, and a provision control unit 113.
The determination unit 111 extracts the persons included in the image captured by the imaging unit 12. Various methods can be used for person extraction. For example, a background-difference method may be used, in which persons are detected from the difference between an input image and an image of the background alone captured in advance. Alternatively, for example, a method may be used in which a pattern image cut out by a detection window of a set size is classified as a person or not. Note that the determination unit 111 may extract and use only a part of a person (for example, the face) for the determination, instead of the whole person.
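The background-difference extraction can be illustrated on grayscale images represented as 2D lists: a pixel is foreground (potentially part of a person) when it differs from the pre-captured background image by more than a threshold. This is a toy sketch under assumed data shapes, not the patent's implementation:

```python
def foreground_mask(background, frame, threshold=30):
    """Mark 1 where the frame differs from the pre-captured background image
    by more than `threshold` (both images are 2D lists of grayscale values)."""
    return [
        [1 if abs(f - b) > threshold else 0 for f, b in zip(f_row, b_row)]
        for f_row, b_row in zip(frame, background)
    ]
```

Connected regions of 1s in the mask would then be candidate person regions.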
Next, the determination unit 111 determines whether the extracted person meets a predetermined condition, for example, whether the extracted person matches a person registered in the user database D1 of the storage unit 13. In that case, the determination unit 111 compares the parameters calculated from the extracted person with each registered person's parameters, and determines that the extracted person matches a registered person when any parameter is identical or its degree of agreement exceeds a predetermined threshold. Alternatively, the determination unit 111 may determine a match only when the degrees of agreement of several parameters all exceed the threshold, for example, when the agreement of multiple face feature parameters (eyes, nose, mouth) is above the threshold for every one of them. The predetermined condition may also be whether the extracted person is a user registered in the user database D1 as belonging to the same group. The determination unit 111 outputs the determination result to the image processing unit 112 and the provision control unit 113.
Based on the determination result input from the determination unit 111, the image processing unit 112 processes the person images included in the image captured by the imaging unit 12 and generates a processed image. For a person determined not to meet the registered condition, the image processing unit 112 processes that person's image so that the individual cannot be identified. For example, the image processing unit 112 blurs (mosaics) the target person's face; the granularity of the blur can be set as appropriate. Alternatively, the image processing unit 112 may fill in the target person's face, or superimpose another image on it. The processed portion is not limited to the target person's face; it may be part or all of the person's body, or a region including the body. The image processing unit 112 may also use a pre-stored background image to make the target person appear absent, for example by superimposing, over (in front of) the target person, the background image that would exist behind that person.
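The blur (mosaic) processing can be illustrated on a grayscale image held as a 2D list: each tile of the face region is replaced by its average value, so fine detail, and hence identity, is lost; the block size plays the role of the adjustable granularity. A sketch under assumed data shapes, not the patent's implementation:

```python
def mosaic_region(img, x0, y0, x1, y1, block=2):
    """Replace each block x block tile inside (x0, y0)-(x1, y1) with its
    average value, coarsening the region so the person cannot be identified."""
    for by in range(y0, y1, block):
        for bx in range(x0, x1, block):
            tile = [img[y][x]
                    for y in range(by, min(by + block, y1))
                    for x in range(bx, min(bx + block, x1))]
            avg = sum(tile) // len(tile)
            for y in range(by, min(by + block, y1)):
                for x in range(bx, min(bx + block, x1)):
                    img[y][x] = avg
    return img
```

Filling the region with a constant value or compositing a stored background patch over it, the other options mentioned above, would follow the same region-addressing pattern.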
The provision control unit 113 acquires, based on the determination result input from the determination unit 111, the transmission destination information of the person in the image who was determined to meet the predetermined condition. The transmission destination information is, for example, an e-mail address or an address for sending an SNS (Social Networking Service) message. The provision control unit 113 acquires the transmission destination information of the target person from the server device 20 via the communication unit 14. The source of the transmission destination information is not limited to the server device 20; it may be acquired from another external device or from the storage unit 13.
The provision control unit 113 transmits the processed image generated by the image processing unit 112 via the communication unit 14, based on the acquired transmission destination information.
Next, each function of the server device 20 will be described.
The storage unit 22 includes, for example, an HDD, flash memory, EEPROM, ROM, or RAM, and stores various programs, such as firmware and application programs, to be executed by a CPU (not shown) of the server device 20, as well as the results of processing executed by the CPU.
The storage unit 22 stores a provision destination database D2 (not shown). The provision destination database D2 has a record for each registered user, and each record holds values for an image transmission method and transmission destination information. The transmission method indicates the means of transmitting an image, such as e-mail or an SNS message, and the transmission destination information is the e-mail address, SNS message address, or the like corresponding to that transmission method.
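One possible shape of a record in the provision destination database D2 is sketched below. The field names, user IDs, and addresses are purely hypothetical; the embodiment does not specify a concrete schema.

```python
# Hypothetical layout of provision destination database D2: one record per
# registered user, holding a transmission method and its destination.
provision_destination_db = {
    "C": {"transmission_method": "email", "destination": "c@example.com"},
    "D": {"transmission_method": "sns",   "destination": "sns://user-d"},
}

def destination_for(user_id, method=None):
    """Look up a user's destination, optionally requiring a specific method."""
    record = provision_destination_db.get(user_id)
    if record is None or (method and record["transmission_method"] != method):
        return None
    return record["destination"]
```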
The communication unit 23 includes a communication interface for communicating between apparatuses via a wired or wireless network, and communicates with the information processing apparatus 10. The network is, for example, an information communication network constituted by a mobile phone network, a VPN, a dedicated communication line network, a WAN, a LAN, a PSTN, or the like, or a combination thereof.
The control unit 21 controls the various components of the server device 20. Part or all of the functions of the control unit 21 may be realized, for example, by a CPU of the server device 20 executing a program stored in the storage unit 22.
The control unit 21 includes a provision destination specifying unit 211.
The provision destination specifying unit 211 refers to the provision destination database D2 in response to a request from the information processing apparatus 10, and extracts the transmission destination information corresponding to the transmission method of the designated user. The provision destination specifying unit 211 notifies the information processing apparatus 10 of the extracted transmission destination information via the communication unit 23.
Next, the operation of the information processing system 1 according to the present embodiment will be described with reference to FIG. 4.
FIG. 4 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment.
(Step S101) The imaging unit 12 of the information processing apparatus 10 captures an image of the imaging range based on an instruction from the control unit 11. The trigger for capturing may be an external capture instruction, or capturing may be performed periodically. In this example, the image of FIG. 2A is captured. Thereafter, the process proceeds to step S102.
(Step S102) The determination unit 111 extracts the persons included in the captured image. In this example, five persons A, B, C, D, and E are extracted. Thereafter, the process proceeds to step S103.
(Step S103) The determination unit 111 determines, for each of the extracted persons A, B, C, D, and E, whether the person is registered in the user database. The determination unit 111 performs the registration determination by comparing the feature amounts (parameters) extracted from A, B, C, D, and E with the determination parameters in the user database D1. Thereafter, the process proceeds to step S104.
(Step S104) If none of the parameters of the extracted persons matches any of the determination parameters in the user database D1, the determination unit 111 determines that no one is registered in the user database D1 (step S104 / NO), and the process ends. If any one of the parameters of an extracted person matches one of the determination parameters in the user database D1, the determination unit 111 determines that the person is registered in the user database D1 (step S104 / YES), and the process proceeds to step S105.
In this example, since the degree of coincidence between the facial feature parameters extracted from C and D and the facial feature parameters of users C and D in the user database D1 is greater than the predetermined threshold, C and D are determined to be registered in the user database D1. The determination unit 111 may also make the determination based on the degree of coincidence of the clothing parameter, the accessory parameter, or the QR code parameter.
(Step S105) The determination unit 111 determines whether any person other than those registered in the user database D1 is present in the image. If it is determined that no person other than those registered in the user database D1 is present (step S105 / NO), the process proceeds to step S107. If it is determined that a person other than those registered in the user database D1 is present (step S105 / YES), the process proceeds to step S106. In this example, A, B, and E, who are not registered in the user database D1, are present in the image, so the process proceeds to step S106.
(Step S106) The image processing unit 112 processes the images of the persons who are not registered in the user database D1. In this example, blurring is applied to the face portions of A, B, and E. Thereafter, the process proceeds to step S107.
(Step S107) The provision control unit 113 requests and acquires from the server device 20 the transmission destination information of the persons registered in the user database D1. In this example, the provision control unit 113 requests the server device 20 for the transmission destination information of C and D, who are the transmission targets of the image, and acquires the corresponding e-mail addresses.
(Step S108) The provision control unit 113 transmits the image based on the acquired transmission destination information. In this example, the processed image is sent as an attached file to the e-mail addresses of C and D.
This concludes the description of the example of the operation of the information processing system 1 according to the present embodiment.
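The flow of steps S101 through S108 above can be condensed into the following sketch. Each function argument stands in for a unit of the embodiment (determination unit 111, image processing unit 112, provision control unit 113); the data representations are assumptions for illustration only.

```python
# Condensed sketch of steps S101-S108. Data shapes are illustrative:
# persons maps names to image regions, user_db is the set of registered
# users, destinations maps registered users to addresses.

def process_captured_image(persons, user_db, destinations):
    """Return (regions to blur, recipient addresses) for one captured image."""
    registered = [p for p in persons if p in user_db]            # S103-S104
    if not registered:
        return None, []                                          # S104 / NO
    unregistered = [p for p in persons if p not in user_db]      # S105
    blurred = {p: ("blurred", persons[p]) for p in unregistered} # S106
    recipients = [destinations[p] for p in registered]           # S107
    return blurred, recipients                                   # S108

blurred, recipients = process_captured_image(
    persons={"A": (0, 0), "B": (1, 0), "C": (2, 0), "D": (3, 0), "E": (4, 0)},
    user_db={"C", "D"},
    destinations={"C": "c@example.com", "D": "d@example.com"})
```

With the example of FIG. 2A, A, B, and E are blurred and the processed image is sent to C and D, matching the flow described above.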
As described above, in the information processing system 1 according to the present embodiment, the information processing apparatus 10 includes the communication unit 14 and the control unit 11. The control unit 11 extracts, from an acquired image, an image region containing part or all of a person, determines whether the image region corresponds to a predetermined person based on specifying information for specifying the predetermined person, processes the image region to generate a processed image when it determines that the image region does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
With this configuration, persons other than service-registered users who appear in a captured image are identified and processed so that the individuals cannot be identified, so the privacy of the persons included in the provided image can be protected. In addition, images can be provided to service-registered users simply and quickly.
[Second Embodiment]
A second embodiment of the present invention will be described below with reference to the drawings. Components similar to those of the embodiment described above are given the same reference numerals, and their description is incorporated by reference.
This embodiment has the same system configuration as the first embodiment. In the first embodiment, whether the extracted person matches a person registered in the user database D1 of the storage unit 13 was used as the predetermined condition; this embodiment differs in that whether the persons belong to the same group is also used as a condition of the determination. The following description focuses on the differences from the first embodiment.
FIG. 5 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment. Description of the steps that are the same as those in FIG. 4 is omitted, and only the differences are described.
In step S205, the determination unit 111 determines whether the users determined to be registered in the user database D1 belong to the same group. The determination unit 111 makes this determination based on the value of the group information in the user database D1.
For example, in the example of FIG. 2A, assume that A and B are registered in the user database D1 as group 1, and C and D as group 2. Then, when making the determination for C, the determination unit 111 determines that A and B belong to a different group, that D belongs to the same group, and that E is not registered. Accordingly, since persons other than the person belonging to the same group (D) appear in the image (A, B, and E) (step S205 / YES), the process proceeds to step S206 and the images of those persons (A, B, and E) are processed.
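The group-based determination of step S205 can be sketched as follows. The group assignments mirror the example in the text (A and B in group 1, C and D in group 2, E unregistered); the data layout itself is an assumption.

```python
# Sketch of the step S205 group determination. Group assignments follow the
# example in the text; E is not registered, so it has no entry.
groups = {"A": 1, "B": 1, "C": 2, "D": 2}

def persons_to_process(target, persons_in_image):
    """Return the persons whose image regions must be processed for `target`:
    everyone who is not registered in the same group as the target."""
    target_group = groups.get(target)
    return [p for p in persons_in_image
            if p != target and groups.get(p) != target_group]

# Determination for C over the captured persons A-E:
to_blur = persons_to_process("C", ["A", "B", "C", "D", "E"])
```

For target C this yields A, B, and E, exactly the persons processed in step S206 of the example.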
In step S207, the provision control unit 113 acquires transmission destination information from the server device 20. Here, the provision control unit 113 may acquire only the transmission destination information of the transmission target (C), or may acquire the transmission destination information of both the transmission target (C) and the person belonging to the same group as the transmission target (D).
As described above, in the information processing system 1 according to the present embodiment, the control unit 11 (image processing unit 112) does not process image regions corresponding to persons belonging to the same group as the predetermined person.
With this configuration, the images of persons who do not belong to the same group are processed so that the individuals cannot be identified, so the privacy of the persons included in the provided image can be protected.
Further, the control unit 11 (provision control unit 113) provides the processed image to at least one person belonging to the same group as the predetermined person.
With this configuration, an image can be provided to the members of the group simply and quickly.
[Third Embodiment]
A third embodiment of the present invention will be described below with reference to the drawings. Components similar to those of the embodiments described above are given the same reference numerals, and their description is incorporated by reference.
This embodiment has the same system configuration as the first embodiment. In this embodiment, the information processing apparatus 10 is an apparatus with an imaging function possessed by the user, such as a smartphone, tablet, or digital camera, and an example in which the user captures a landscape with the information processing apparatus 10 is described.
FIG. 6 is an explanatory diagram illustrating an example of the operation of the information processing system 1 according to the present embodiment.
(Step S301) The imaging unit 12 of the information processing apparatus 10 captures a landscape image of the imaging range based on a user operation. At this time, the control unit 11 may recognize that the user is capturing a landscape by the user switching the setting to a landscape mode or the like before capturing. Thereafter, the process proceeds to step S302.
(Step S302) The determination unit 111 extracts the persons included in the captured image. Thereafter, the process proceeds to step S303.
(Step S303) The determination unit 111 determines whether a person is included in the captured image. If a person is included in the captured image (step S303 / YES), the process proceeds to step S304. If no person is included in the captured image (step S303 / NO), the process proceeds to step S307.
(Step S304) The determination unit 111 determines whether each extracted person is registered in the user database D1. The determination unit 111 further determines whether each extracted person belongs to the same group as the user. Thereafter, the process proceeds to step S305.
(Step S305) If no person belonging to the same group as the user is present (step S305 / NO), the process proceeds to step S306. In this example, since the user is photographing a landscape, it is assumed that no person belonging to the same group as the user is present in the captured image; however, if a registered user is present (step S305 / YES), the process proceeds to step S105 of the first embodiment or step S205 of the second embodiment.
(Step S306) The image processing unit 112 applies blurring to the images of the extracted persons.
An extracted person is someone who appeared in the frame when the user photographed the landscape and who either is not registered in the image transmission service or is registered but belongs to a different group from the user. Therefore, by processing the extracted person's image so that the individual cannot be identified, that person's privacy can be protected. Thereafter, the process proceeds to step S307.
(Step S307) The provision control unit 113 requests the server device 20 for the transmission destination information of the persons belonging to the same group as the user. The server device 20 refers to the user database D1, extracts the transmission destination information of the persons belonging to the same group as the requesting user, and notifies the user. Thereafter, the process proceeds to step S308.
(Step S308) The provision control unit 113 transmits the image to the persons belonging to the same group based on the acquired transmission destination information.
This concludes the description of the example of the operation of the information processing system 1 according to the present embodiment.
As described above, the information processing system 1 according to the present embodiment is applicable even when neither the user nor a person in the same group as the user appears in the image.
[Fourth Embodiment]
A fourth embodiment of the present invention will be described below with reference to the drawings. Components similar to those of the embodiments described above are given the same reference numerals, and their description is incorporated by reference.
This embodiment has the same system configuration as the first embodiment. In addition to the processing of the embodiments described above, the information processing system 1 according to this embodiment performs trimming so that the target user is positioned at the center of the image. The following description focuses on the differences from the first embodiment.
FIG. 7 is a flowchart showing an example of the operation of the information processing system 1 according to the present embodiment. Description of the steps that are the same as those in FIG. 4 is omitted, and only the differences are described.
In step S407, the image processing unit 112 determines whether to trim the image. The image processing unit 112 may make this determination based on trimming necessity information registered in the user database D1, or based on information in another database held by the storage unit 13. The image processing unit 112 may also accept a user operation on an input unit (not shown) and determine whether to perform trimming. If it determines that trimming is to be performed (step S407 / YES), the process proceeds to step S408. If it determines that trimming is not to be performed (step S407 / NO), the process proceeds to step S409.
In step S408, the image processing unit 112 trims the image. An example of the trimming process is described with reference to FIGS. 8A and 8B.
FIGS. 8A and 8B are explanatory diagrams illustrating an example of the trimming process according to the present embodiment.
In this example, the image blurred in step S406 is assumed to be the image shown in FIG. 2B, and trimming is then performed for C.
The image processing unit 112 selects C as the trimming target. The image processing unit 112 may make this selection based on information registered in the user database D1, or based on information in another database held by the storage unit 13. For example, the image processing unit 112 may select the user to whom the image is transmitted as the trimming target. The image processing unit 112 may also accept a user operation on an input unit (not shown) and determine the trimming target.
The image processing unit 112 performs trimming so that the face of C, the trimming target, is positioned at the center of the image. FIG. 8A shows an example of the image after trimming for C. As shown in FIG. 8A, the image is cropped so that C's face is positioned at its center. The center of the image refers to a certain area near the image's center; it is sufficient that the face of the target person after trimming is positioned closer to the center than in the original captured image. The image processing unit 112 may enlarge the image in addition to trimming. FIG. 8B shows an example of the image after enlargement. The magnification of the enlargement can be set as appropriate. The image processing unit 112 is not limited to the face of the trimming target; for example, it may obtain the center coordinates of the target person's body and trim so that the center of the body is positioned at the center of the image. Further, for example, the image processing unit 112 may trim so that the center of the faces or bodies of the members belonging to the same group (C and D in FIGS. 8A and 8B) is positioned at the center of the image, thereby generating an image in which the group members are centered.
This concludes the description of the example of the operation of the information processing system 1 according to the present embodiment.
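The centering trimming described above can be sketched as a crop-window computation: a window of a chosen size is centered on the target's face and clamped to the image bounds, so the face ends up at least closer to the center than in the original capture. The window size and coordinates are illustrative assumptions.

```python
# Sketch of centering trimming: compute a crop window of the given size whose
# center coincides with the target face center, clamped to the image bounds.

def centering_crop(img_w, img_h, face_cx, face_cy, crop_w, crop_h):
    """Return (left, top, right, bottom) of a crop centered on the face."""
    left = min(max(face_cx - crop_w // 2, 0), img_w - crop_w)
    top = min(max(face_cy - crop_h // 2, 0), img_h - crop_h)
    return left, top, left + crop_w, top + crop_h

# A face at (300, 120) in a 640x480 image, cropped to 320x240:
box = centering_crop(640, 480, 300, 120, 320, 240)
```

When the face is near an edge, the clamping keeps the window inside the image, which is consistent with the text's weaker requirement that the face merely end up closer to the center than before.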
As described above, in the information processing system 1 according to the present embodiment, the control unit 11 (image processing unit 112) generates (trims) a processed image that includes the image region determined to correspond to the predetermined person and excludes at least part of the original image.
With this configuration, the image is processed so that the user is positioned near its center, so an image suited to each user can be provided simply and quickly.
[Fifth Embodiment]
A fifth embodiment of the present invention will be described below with reference to the drawings. Components similar to those of the embodiments described above are given the same reference numerals, and their description is incorporated by reference.
This embodiment differs from the first embodiment in that each process is performed by the information processing apparatus 30 alone, without using a server device.
FIG. 9 is a block diagram illustrating the functional configuration of the information processing apparatus 30 according to the present embodiment.
The information processing apparatus 30 is an apparatus with an imaging function, such as a smartphone, tablet, or digital camera. The information processing apparatus 30 includes the control unit 11, the determination unit 111, the image processing unit 112, a provision control unit 313, the imaging unit 12, a storage unit 33, and the communication unit 14.
The storage unit 33 has the same functions as the storage unit 13 in the first embodiment, and further stores the provision destination database D2. The provision destination database D2 has a record for each registered user, and each record holds values for an image transmission method and transmission destination information. The transmission method indicates the means of transmitting an image, such as e-mail or an SNS message, and the transmission destination information is the e-mail address, SNS message address, or the like corresponding to that transmission method.
The storage unit 33 may hold the user database D1 and the provision destination database D2 as a single integrated database.
The provision control unit 313 acquires the transmission destination information of the persons registered in the user database D1 by referring to the provision destination database D2 stored in the storage unit 33.
In the first embodiment, in step S107, the provision control unit 113 requested and acquired the transmission destination information of the persons registered in the user database D1 from the server device 20; as described above, the provision control unit 313 instead acquires that information by referring to the provision destination database D2 stored in the storage unit 33.
The other operations are the same as those of the embodiments described above.
As described above, the information processing apparatus 30 according to the present embodiment includes the communication unit 14 and the control unit 11. The control unit 11 extracts, from an acquired image, an image region containing part or all of a person, determines whether the image region corresponds to a predetermined person based on specifying information for specifying the predetermined person, processes the image region to generate a processed image when it determines that the image region does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
With this configuration, persons other than service-registered users who appear in a captured image are identified and processed so that the individuals cannot be identified, so the privacy of the persons included in the provided image can be protected. In addition, images can be provided to service-registered users simply and quickly.
[Sixth Embodiment]
A sixth embodiment of the present invention will be described below with reference to the drawings. Components similar to those of the embodiments described above are given the same reference numerals, and their description is incorporated by reference.
FIG. 10 is a configuration diagram showing the configuration of the information processing system 2 according to the present embodiment.
The information processing system 2 includes an information processing apparatus 40 and a server device 50.
The information processing apparatus 40 includes a control unit 41, the imaging unit 12, a storage unit 43, and the communication unit 14.
The control unit 41 controls the various components of the information processing apparatus 40. Unlike the control unit 11 according to the first embodiment, it does not include the determination unit 111, the image processing unit 112, or the provision control unit 113.
The storage unit 43 stores various programs to be executed by the CPU of the information processing apparatus 40, the results of processing executed by the CPU, and the like. Unlike the storage unit 13 according to the first embodiment, it does not store the user database D1.
The server device 50 includes a control unit 51, a storage unit 52, and a communication unit 53.
The storage unit 52 stores various programs executed by the CPU of the server device 50, results of processing executed by the CPU, and the like. The storage unit 52 further stores the user database D1 and the provision destination database D2. The contents of each database are the same as in the embodiments described above.
The communication unit 53 is the same as the communication unit 23 according to the first embodiment, and its description is therefore omitted.
The control unit 51 includes a determination unit 511, an image processing unit 512, a provision destination specifying unit 513, and a provision control unit 514. Their operations are the same as in the embodiments described above, and their descriptions are therefore omitted.
As described above, the server device 50 according to the present embodiment includes the communication unit 53 and the control unit 51. The control unit 51 extracts an image region including part or all of a person from an acquired image, determines whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person, processes the image region to generate a processed image when it determines that the image region does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit 53.
With this configuration, persons other than the registered service user who appear in a captured image are identified and processed so that the individuals cannot be recognized, so the privacy of persons included in the provided image can be protected. In addition, images can be provided to the registered service user simply and quickly.
[Appendix 1]
Part of the information processing device and the server device in each of the embodiments described above, for example the determination unit, the image processing unit, and the provision control unit, may be realized by a computer. In that case, a program for realizing these functions may be recorded on a computer-readable recording medium, and the program recorded on the recording medium may be read into a computer system and executed. The "computer system" here is a computer system built into the information processing device or the server device, and includes an OS (Operating System) and hardware such as peripheral devices.
A "computer-readable recording medium" means a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk built into a computer system. A "computer-readable recording medium" may further include a medium that dynamically holds a program for a short time, such as a communication line used when transmitting a program via a network such as the Internet or a communication line such as a telephone line, and a medium that holds a program for a certain period of time, such as the volatile memory inside a computer system serving as the server or client in that case. The program may be one for realizing part of the functions described above, or one that realizes the functions described above in combination with a program already recorded in the computer system.
Part or all of the information processing device and the server device in the embodiments described above may be realized as an integrated circuit such as an LSI (Large Scale Integration). The functional units of the information processing device and the server device may each be made into an individual processor, or some or all of them may be integrated into a single processor. The method of circuit integration is not limited to LSI and may be realized by a dedicated circuit or a general-purpose processor. If a circuit integration technology that replaces LSI emerges with the advancement of semiconductor technology, an integrated circuit based on that technology may be used.
In the embodiments described above, the image to be processed has been described as a still image, but the processing target may be a set of a plurality of consecutive images, that is, a moving image.
The transmission of the image may be performed by either the information processing device or the server device in the information processing system, or the information processing device and the server device may perform the transmission processing in a distributed manner.
The provision of the image is not limited to direct transmission of the image and may be realized by instructing another device to transmit it. For example, the image may be uploaded to the Internet and a URL (Uniform Resource Locator) indicating the upload location may be notified. As another example, the image may be posted (uploaded) to an SNS site used by the user.
The functions of the information processing system in the embodiments described above are not limited to those illustrated and may be distributed or combined functionally or physically as appropriate. For example, the object of the present invention may be realized by concentrating the functions in the information processing device or the server device.
Although an embodiment of the present invention has been described above in detail with reference to the drawings, the specific configuration is not limited to the above, and various design changes and the like can be made without departing from the gist of the present invention.
One aspect of the present invention can be modified in various ways within the scope of the claims, and embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention. Configurations in which elements described in the embodiments and modifications above are replaced with elements having similar effects are also included.
[Appendix 2]
(1) An information processing device including a communication unit and a control unit, wherein the control unit extracts an image region including part or all of a person from an acquired image, determines whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person, processes the image region to generate a processed image when it determines that the image region does not correspond to the predetermined person, and provides the processed image to the predetermined person via the communication unit.
(2) The information processing device according to (1), wherein the control unit does not process the image region corresponding to a person belonging to the same group as the predetermined person.
(3) The information processing device according to (1) or (2), wherein the control unit provides the processed image to at least one person belonging to the same group as the predetermined person.
(4) The information processing device according to any one of (1) to (3), wherein the control unit generates the processed image so that it includes the image region determined to correspond to the predetermined person and does not include at least part of the image.
(5) An information processing method performed by an information processing device, the method including: a step of extracting an image region including part or all of a person from an acquired image; a step of determining whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person; a step of processing the image region to generate a processed image when it is determined that the image region does not correspond to the predetermined person; and a step of providing the processed image to the predetermined person.
(6) A program for causing a computer to execute: a procedure of extracting an image region including part or all of a person from an acquired image; a procedure of determining whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person; a procedure of processing the image region to generate a processed image when it is determined that the image region does not correspond to the predetermined person; and a procedure of providing the processed image to the predetermined person.
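The group-based behavior of items (2) and (3) can be illustrated with a short sketch. The user table, group names, and function names below are hypothetical stand-ins for the user database D1 and the provision destination database D2; they are not the embodiment's actual data structures.

```python
# Hypothetical illustration of appendix items (2) and (3): regions for persons
# in the same group as the predetermined person are left unprocessed, and the
# processed image may be provided to every member of that group.

USER_DB = {  # person -> group, standing in for the user database D1
    "alice": "family",
    "bob": "family",
    "carol": "coworkers",
}

def persons_to_mask(detected_persons, predetermined_person):
    """Return the detected persons whose image regions should be processed."""
    group = USER_DB.get(predetermined_person)
    same_group = {p for p, g in USER_DB.items() if g == group}
    # Persons in the predetermined person's group are exempt from processing.
    return [p for p in detected_persons if p not in same_group]

def provision_targets(predetermined_person):
    """Everyone in the predetermined person's group may receive the image."""
    group = USER_DB.get(predetermined_person)
    return sorted(p for p, g in USER_DB.items() if g == group)

masked = persons_to_mask(["alice", "bob", "carol", "stranger"], "alice")
targets = provision_targets("alice")
```

Here "carol" (a registered user in a different group) and "stranger" (an unregistered person) are masked, while "alice" and "bob" remain visible and both may receive the processed image.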
Some aspects of the present invention can be applied to an information processing device, an information processing method, a program, and the like that need to protect the privacy of persons included in a provided image.
DESCRIPTION OF REFERENCE NUMERALS: 1, 2: information processing system; 10, 30, 40: information processing device; 20, 50: server device; 11, 21, 41, 51: control unit; 12: imaging unit; 13, 22, 33, 43, 52: storage unit; 14, 23, 53: communication unit; 111, 511: determination unit; 112, 512: image processing unit; 113, 313, 514: provision control unit; 211, 513: provision destination specifying unit

Claims (6)

  1.  An information processing device comprising:
     a communication unit; and
     a control unit,
     wherein the control unit:
     extracts an image region including part or all of a person from an acquired image;
     determines whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person;
     processes the image region to generate a processed image when it determines that the image region does not correspond to the predetermined person; and
     provides the processed image to the predetermined person via the communication unit.
  2.  The information processing device according to claim 1, wherein the control unit does not process the image region corresponding to a person belonging to the same group as the predetermined person.
  3.  The information processing device according to claim 1 or claim 2, wherein the control unit provides the processed image to at least one person belonging to the same group as the predetermined person.
  4.  The information processing device according to any one of claims 1 to 3, wherein the control unit generates the processed image so that it includes the image region determined to correspond to the predetermined person and does not include at least part of the image.
  5.  An information processing method performed by an information processing device, the method comprising:
     extracting an image region including part or all of a person from an acquired image;
     determining whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person;
     processing the image region to generate a processed image when it is determined that the image region does not correspond to the predetermined person; and
     providing the processed image to the predetermined person.
  6.  A program for causing a computer to execute:
     a procedure of extracting an image region including part or all of a person from an acquired image;
     a procedure of determining whether the image region corresponds to a predetermined person on the basis of identification information for identifying the predetermined person;
     a procedure of processing the image region to generate a processed image when it is determined that the image region does not correspond to the predetermined person; and
     a procedure of providing the processed image to the predetermined person.
PCT/JP2017/030514 2016-11-17 2017-08-25 Information processing device, information processing method, and program WO2018092378A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016224092 2016-11-17
JP2016-224092 2016-11-17

Publications (1)

Publication Number Publication Date
WO2018092378A1 true WO2018092378A1 (en) 2018-05-24

Family

ID=62145480

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/030514 WO2018092378A1 (en) 2016-11-17 2017-08-25 Information processing device, information processing method, and program

Country Status (1)

Country Link
WO (1) WO2018092378A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008040938A (en) * 2006-08-09 2008-02-21 Nikon Corp Facility inside monitoring system
JP2010021921A (en) * 2008-07-14 2010-01-28 Nikon Corp Electronic camera and image processing program
JP2013171311A (en) * 2012-02-17 2013-09-02 Nec Casio Mobile Communications Ltd Image processing device, terminal device, image processing method, and program
JP2015222460A (en) * 2014-05-22 2015-12-10 日本電信電話株式会社 Disclosed image edition method and disclosed image edition device


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11347978B2 (en) 2018-02-07 2022-05-31 Sony Corporation Image processing apparatus, image processing method, and image processing system
JP2020086504A (en) * 2018-11-15 2020-06-04 日本電気株式会社 Image processing apparatus, image processing method and program
JP7387981B2 (en) 2018-11-15 2023-11-29 日本電気株式会社 Image processing device, image processing method, program
WO2021250805A1 (en) * 2020-06-10 2021-12-16 マクセル株式会社 Watching monitoring device and watching monitoring method


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17871059; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 17871059; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: JP)