WO2022050217A1 - Information processing device, information processing method, and program - Google Patents

Information processing device, information processing method, and program Download PDF

Info

Publication number
WO2022050217A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
distance
persons
information processing
threshold value
Prior art date
Application number
PCT/JP2021/031723
Other languages
French (fr)
Japanese (ja)
Inventor
Takashi Oya (大矢 崇)
Original Assignee
Canon Inc. (キヤノン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc. (キヤノン株式会社)
Publication of WO2022050217A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18 Status alarms
    • G08B 21/24 Reminder alarms, e.g. anti-loss alarms
    • G08B 25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to information processing technology.
  • Japanese Patent Application Laid-Open No. 2019-200718 discloses a method for outputting a warning when the distance between a person belonging to a minority group and a person belonging to a majority group is less than a preset threshold value.
  • the information processing apparatus has the following configuration so that it can appropriately present to the user whether the distance between people is a distance requiring attention. That is, it includes: a detection means for detecting a person included in an image captured by an image pickup means; a determination means for determining at least one of whether the person detected by the detection means is wearing a mask and the age of that person; and an output control means for outputting predetermined information based on the distance between the persons detected by the detection means in the image and the determination result by the determination means.
  • FIG. 1 is a diagram showing a system configuration in this embodiment.
  • the system in this embodiment includes an information processing device 100, an image pickup device 110, a recording device 120, and a display 130.
  • the information processing device 100, the image pickup device 110, and the recording device 120 are connected to each other via the network 140.
  • the network 140 is realized by a plurality of routers, switches, cables, and the like conforming to a communication standard such as Ethernet (registered trademark).
  • the network 140 may also be realized by the Internet, a wired LAN (Local Area Network), a wireless LAN, a WAN (Wide Area Network), or the like.
  • the information processing device 100 is realized by, for example, a personal computer or the like in which a program for realizing the information processing function described later is installed.
  • the image pickup device 110 is a device for taking an image and functions as an image pickup means.
  • the image pickup device 110 associates the image data of the captured image with information on the date and time when the image was captured and with identification information for identifying the image pickup device 110, and transmits them via the network 140 to external devices such as the information processing device 100 and the recording device 120.
  • in the present embodiment, the number of image pickup devices 110 is one, but there may be a plurality of them. That is, a plurality of image pickup devices 110 may be connected to the information processing device 100 and the recording device 120 via the network 140.
  • in that case, the information processing device 100 and the recording device 120 use, for example, the identification information associated with a transmitted image to judge which of the plurality of image pickup devices 110 captured that image.
  • the recording device 120 records the image data of the image captured by the image pickup device 110, the information on the date and time when the image was captured, and the identification information for identifying the image pickup device 110 in association with each other. Then, in accordance with the request from the information processing device 100, the recording device 120 transmits the recorded data (image, identification information, etc.) to the information processing device 100.
  • the display 130 is composed of an LCD (Liquid Crystal Display) or the like, and displays an image or the like captured by the image pickup device 110.
  • the display 130 is connected to the information processing apparatus 100 via a display cable compliant with a communication standard such as HDMI (registered trademark) (High Definition Multimedia Interface). At least two or all of the display 130, the information processing device 100, and the recording device 120 may be provided in a single housing.
  • the image captured by the image pickup device 110 is not limited to being displayed on the display 130 connected to the information processing device 100 via the display cable; it may also be displayed, for example, on the display of the following external device. That is, it may be displayed on the display of a mobile device such as a smartphone or tablet terminal connected via the network 140.
  • each function shown in FIG. 2 is assumed to be realized using the ROM (Read Only Memory) 920, RAM (Random Access Memory) 910, and CPU (Central Processing Unit) 900 described later with reference to FIG. 9. That is, each function shown in FIG. 2 is realized by the CPU 900 of the information processing apparatus 100 loading the computer program stored in the ROM 920 of the information processing apparatus 100 into the RAM 910 and executing it.
  • the acquisition unit 200 sequentially acquires the images of the frames constituting the moving image captured by the image pickup device 110.
  • the acquisition unit 200 may acquire the moving image transmitted from the image pickup device 110, or may acquire the moving image transmitted from the recording device 120.
  • the storage unit 201 can be realized by the RAM (Random Access Memory) 910, an HDD (Hard Disk Drive) 930, and the like described later with reference to FIG. 9, and stores (retains), for example, the image data of the image acquired by the acquisition unit 200.
  • the operation receiving unit 202 receives an operation performed by the user via an input device (not shown) such as a keyboard or a mouse.
  • the detection unit 203 executes a process of detecting a person included in the image acquired by the acquisition unit 200.
  • the detection unit 203 in the present embodiment detects a person in an image by performing pattern matching processing using a matching pattern (dictionary).
  • the person may be detected from the image by using a plurality of matching patterns such as a matching pattern when the person is facing forward and a matching pattern when the person is facing sideways.
  • as the matching pattern, a pattern for the case where a specific object is viewed from another angle, such as obliquely or from above, may also be prepared.
  • the detection unit 203 in the present embodiment uses pattern matching processing as a method of detecting a person from an image, but may detect a person from an image by using another conventional technique.
  • the tracking unit 204 tracks a person detected by the detection unit 203.
  • when the detection unit 203 detects, in the frame of interest, the same person as a person detected from the image of a frame one or more frames earlier, the tracking unit 204 in the present embodiment associates those persons across the frames. That is, a person is tracked across the images of a plurality of temporally close frames.
  • the tracking unit 204 may associate a person with a high correlation between images of a plurality of frames by using the color, shape, size (number of pixels), and the like of the person.
  • the tracking unit 204 is not limited to a specific method, as long as it can determine that a person is the same person across the images of a plurality of frames and track that person.
  • the tracking unit 204 assigns a unique ID to each person to be tracked.
  • the tracking unit 204 assigns an ID “a” to a person detected by the detection unit 203 from an image of a frame one or more before the frame of interest. Then, when the detection unit 203 also detects the person from the frame of interest, the tracking unit 204 assigns the same ID “a” to the person. If a person newly detected in the frame of interest exists, the tracking unit 204 newly assigns a unique ID to the person.
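As an illustration of the ID assignment just described, the following sketch keeps only the bookkeeping; the actual cross-frame association (matching by color, shape, size, and so on) is assumed to happen elsewhere. All Python names here (`Tracker`, `update`) are hypothetical, not from the patent.

```python
import itertools

class Tracker:
    """Sketch of the tracking unit's ID bookkeeping (hypothetical names).

    Each detection arrives as (matched_id, data), where matched_id is the
    ID carried over from an earlier frame, or None for a person detected
    for the first time.
    """

    def __init__(self):
        self._counter = itertools.count()   # source of fresh unique IDs
        self.active = {}                    # person ID -> latest detection data

    def update(self, detections):
        """Process one frame; return the ID assigned to each detection."""
        ids = []
        for matched_id, data in detections:
            if matched_id is None:
                # newly detected person: assign a fresh unique ID
                matched_id = f"id{next(self._counter)}"
            self.active[matched_id] = data
            ids.append(matched_id)
        return ids
```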
  • the determination unit 205 determines at least one of whether the person detected from the image by the detection unit 203 is wearing a mask and the age of that person. A known technique may be used for determining whether the person detected from the image is wearing a mask. Specifically, the determination unit 205 executes the following processing using, for example, a classifier trained by machine learning with images of the faces of people wearing masks as training data. That is, the determination unit 205 inputs the image of the face of the person detected from the image by the detection unit 203 into the classifier, and acquires, as its output, information on whether or not the person is wearing a mask.
  • similarly, the determination unit 205 inputs the image of the face of a person detected from the image by the detection unit 203 into a classifier trained by machine learning using images of the faces of people of different ages as training data, and acquires the age of that person as its output.
  • the calculation unit 206 calculates the distance between the persons included in the image. Further, the calculation unit 206 compares the calculated distance between the persons with the threshold value. The details of the processing by the calculation unit 206 will be described later.
  • the output control unit 207 outputs predetermined information based on the distance between persons calculated by the calculation unit 206 and the determination result of at least one of the presence / absence of wearing a mask and the age by the determination unit 205.
  • FIG. 3 is a diagram showing how people in a line are imaged by the image pickup apparatus 110.
  • in the present embodiment, the queue is assumed to be regulated by a guide pole or the like and to have an entrance and an exit.
  • a person enters the queue from the queue entrance 301, proceeds along the guide 300, and leaves from the queue exit 302. That is, the queue is formed along the guide 300 from the queue exit 302 toward the queue entrance 301.
  • the image pickup apparatus 110 in the present embodiment is installed so as to image the queue formed along the guide 300.
  • the information processing apparatus 100 in the present embodiment detects the persons included in the queue imaged by the image pickup apparatus 110, and acquires the number of people in the queue by counting the detected persons.
  • in the present embodiment, the number of image pickup devices 110 is one, but a plurality of image pickup devices 110 may be used, dividing the queue into a plurality of parts for imaging.
  • in that case, the detection unit 203 detects persons in each of the plurality of images obtained by imaging the respective parts of the divided queue, and the calculation unit 206 may obtain the number of people in the queue by summing the numbers of detected persons.
  • the information processing apparatus 100 in the present embodiment notifies the user of an alert (predetermined information) according to the distance between the persons.
  • the detection unit 203 in the present embodiment detects the persons included in the queue from the image obtained by capturing the queue shown in FIG. Then, based on the result of detecting the persons in the image, the calculation unit 206 calculates the average Davr of the distances between adjacent people in the queue using equation (1).
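Equation (1) itself is not reproduced in this text. Given that the embodiment measures the queue length along the path and counts N people, a plausible reading is Davr = L / (N − 1); the sketch below assumes that form, and the function name is illustrative.

```python
def average_spacing(queue_length_m: float, n_people: int) -> float:
    """Average front-to-back spacing Davr between adjacent people in a queue.

    Assumed reading of equation (1), which is not reproduced in the text:
        Davr = L / (N - 1)
    where L is the queue length along the path and N the number of people.
    """
    if n_people < 2:
        raise ValueError("need at least two people to define a spacing")
    return queue_length_m / (n_people - 1)
```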
  • the detection unit 203 detects the persons included in the queue imaged by the image pickup apparatus 110, and the calculation unit 206 acquires the number of people in the queue by counting the detected persons. In this case, the calculation unit 206 calculates, for example, the sum of the numbers of persons detected by the detection unit 203 in each of the regions R1 (region 411) to R5 (region 415) set in the image, and acquires the calculated sum as the number of people in the queue, N.
  • the information processing apparatus 100 may acquire the number of people in the queue by executing the following processing, for example. That is, the information processing apparatus 100 may calculate the number of people in the queue as the difference Nin - Nout between the number of people who have passed the queue entrance 301 and the number of people who have passed the queue exit 302. At this time, the calculation unit 206 determines, for example, whether a person tracked by the tracking unit 204 has passed the queue entrance 301, and acquires the number of persons determined to have passed as the passing count Nin. Similarly, it determines whether a person tracked by the tracking unit 204 has passed the queue exit 302, and acquires the number of persons determined to have passed as the passing count Nout. Then, the calculation unit 206 acquires the number of people in the queue, N, by subtracting Nout from Nin.
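The Nin - Nout bookkeeping described above can be sketched as follows; the class name is hypothetical, and the detection of entrance/exit passages is assumed to happen upstream in the tracking unit.

```python
class QueueCounter:
    """People currently in the queue, counted as N = Nin - Nout."""

    def __init__(self):
        self.n_in = 0    # passages through the queue entrance 301
        self.n_out = 0   # passages through the queue exit 302

    def passed_entrance(self):
        self.n_in += 1

    def passed_exit(self):
        self.n_out += 1

    @property
    def in_queue(self):
        return self.n_in - self.n_out
```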
  • the "queue length" is the distance from the head to the tail of the queue, measured along the queue path.
  • the information processing apparatus 100 in the present embodiment sets regions R1 to R5 on the queue path shown in the image, and holds, for each region, the distance along the queue path.
  • each region Rn is set on the image according to a user operation, and information on the real-space distance along the queue path within the region Rn is acquired.
  • the detection unit 203 executes a process of detecting persons in each region of the image, and the calculation unit 206 identifies the region Rmax with the largest n among the regions Rn in which a stationary person exists.
  • the calculation unit 206 can then obtain the total queue length by adding the distances along the path in order from R1 to Rmax.
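The region-based queue-length computation can be sketched as follows, assuming the per-region path lengths for R1 to R5 are held in advance; the function name and list encoding are illustrative.

```python
def queue_length(region_lengths, occupied):
    """Total queue length from per-region path lengths.

    region_lengths: path length (metres) of regions R1..Rn, index 0 = R1.
    occupied: booleans of the same length, True where a stationary person
    was detected. The queue length is the sum of region lengths from R1 up
    to the occupied region with the largest index (Rmax); 0 if no region
    is occupied.
    """
    if not any(occupied):
        return 0.0
    r_max = max(i for i, occ in enumerate(occupied) if occ)
    return sum(region_lengths[: r_max + 1])
```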
  • as the method for obtaining the queue length, other known techniques may be used, such as individually measuring the positions of the people waiting in the queue and calculating the distances between them.
  • the calculation unit 206 further compares the average Davr of the distance between persons calculated using equation (1) with the threshold value. When the average Davr is less than the threshold value, the output control unit 207 notifies the user of predetermined information, on the assumption that the possibility of infection is high and the distance between persons is a distance requiring attention. On the other hand, when the average Davr is equal to or greater than the threshold value, the output control unit 207 does not notify the user of the predetermined information, since the possibility of infection is low and the distance between persons is not a distance requiring attention.
  • the output control unit 207 may display a message, a mark, or the like indicating a warning on the display 130 as the predetermined information. Further, as a method of notifying the user of the predetermined information, the output control unit 207 may play a sound indicating a warning from a speaker (not shown).
  • the calculation unit 206 in the present embodiment may calculate the latest average Davr at predetermined time intervals using equation (1) and compare it with the threshold value. Then, the output control unit 207 may notify the user of the predetermined information only when the average Davr is less than the threshold value a predetermined number of times in a row. For example, when the average Davr is less than the threshold value three times in a row, the output control unit 207 causes the display 130 to display a message indicating a warning as the predetermined information; otherwise, it does not display the warning message on the display 130.
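The consecutive-violation rule just described (e.g. three times in a row) can be sketched as follows; the function name is hypothetical.

```python
def should_alert(davr_history, threshold, consecutive=3):
    """True when the latest `consecutive` Davr samples are all below the
    threshold, i.e. the 'three times in a row' notification rule."""
    if len(davr_history) < consecutive:
        return False
    return all(d < threshold for d in davr_history[-consecutive:])
```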
  • the calculation unit 206 in the present embodiment may correct the average Davr according to whether masks are worn. Specifically, the calculation unit 206 corrects the calculated average Davr upward, on the assumption that the possibility of virus transmission is reduced when people wear masks.
  • the calculation unit 206 calculates, for example, the corrected average Davr', a value obtained by correcting the average Davr, using equation (2).
  • Cm in equation (2) denotes the mask wearing rate, i.e., the proportion of mask wearers among the people in the queue.
  • the determination unit 205 in the present embodiment determines whether each of the persons located along the route from the queue entrance 301 to the queue exit 302 shown in FIG. 4 is wearing a mask. Then, the calculation unit 206 calculates the mask wearing rate Cm by dividing the number of people determined by the determination unit 205 to be wearing masks by the number of people in the queue, N.
  • the calculation unit 206 in the present embodiment calculates the corrected average Davr' by correcting the calculated average Davr of the distance between persons upward as the mask wearing rate becomes higher. Then, the calculation unit 206 compares the corrected average Davr' with the threshold value. When the corrected average Davr' is less than the threshold value, the output control unit 207 notifies the user of a message indicating a warning as the predetermined information, on the assumption that infection is likely and the distance requires attention. When the corrected average Davr' is equal to or greater than the threshold value, the output control unit 207 does not notify the user of the warning message, since the possibility of infection is low and the distance does not require attention.
  • the information processing apparatus 100 in the present embodiment may thus correct the average Davr of the distance between people according to the mask wearing rate, and notify the user of information (an alert) according to the result of comparing the corrected average Davr' with the threshold value.
  • when the mask wearing rate is high and it is not necessary to alert the user, unnecessary alert notifications can thereby be reduced.
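The explicit form of equation (2) is not reproduced in this text; from the combined equation (4), Davr' = (1 + Cm)(1 − Ck/2)·Davr, a consistent reading is Davr' = (1 + Cm)·Davr. The sketch below assumes that form, and the function name is illustrative.

```python
def corrected_average_mask(davr, n_masked, n_total):
    """Mask-rate correction of the average distance between people.

    Assumed form of equation (2), inferred from equation (4):
        Davr' = (1 + Cm) * Davr,   Cm = n_masked / n_total
    so a higher mask wearing rate enlarges the effective average distance,
    making an alert less likely.
    """
    cm = n_masked / n_total
    return (1 + cm) * davr
```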
  • the calculation unit 206 in the present embodiment may correct the average Davr according to the ages of the persons. In the present embodiment, it is assumed that the older a person is, the higher the risk of severe illness if infected with the virus. Therefore, the calculation unit 206 corrects the calculated average Davr downward, so that the user is more likely to be notified of an alert as the number of elderly people increases. Specifically, the calculation unit 206 calculates the corrected average Davr', a value obtained by correcting the average Davr, using, for example, equation (3).
  • the determination unit 205 in the present embodiment determines the age of each person in the queue along the route from the queue entrance 301 to the queue exit 302 shown in FIG. 4, and determines, based on the determined age, whether each person is elderly. At this time, the determination unit 205 determines, for example, that a person of a predetermined age or older is elderly, and that a person under the predetermined age is not.
  • in the present embodiment the predetermined age is 70, but other criteria such as 65 or 75 may be used.
  • the calculation unit 206 calculates the elderly rate Ck by dividing the number of people determined by the determination unit 205 to be elderly by the number of people in the queue, N. As shown in equation (3), the calculation unit 206 in the present embodiment calculates the corrected average Davr' by correcting the calculated average Davr of the distance between persons downward as the elderly rate Ck becomes higher. Then, the calculation unit 206 compares the corrected average Davr' with the threshold value. When the corrected average Davr' is less than the threshold value, the output control unit 207 notifies the user of the predetermined information, treating the distance as one requiring attention.
  • otherwise, the output control unit 207 does not notify the user of the predetermined information, since the distance does not require attention.
  • the information processing apparatus 100 in the present embodiment may thus notify the user of an alert according to the result of comparing, with the threshold value, the corrected average Davr' obtained by correcting the average Davr of the distance between persons according to the persons' ages. By doing so, it is possible to reduce cases in which an alert that should be issued because the people in the queue are elderly is not notified to the user. In other words, it is possible to adaptively determine whether the distance between persons is a distance requiring attention, and to notify the user of an alert based on the result of that determination.
  • the calculation unit 206 in the present embodiment may correct the average Davr according to both the presence or absence of masks and the ages of the persons. At this time, the calculation unit 206 may obtain the corrected average Davr' by correcting the calculated average Davr using, for example, equation (4).
  • Davr' = (1 + Cm)(1 - Ck/2) · Davr ... Equation (4)
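The combined correction of equation (4), Davr' = (1 + Cm)(1 − Ck/2)·Davr, can be sketched directly; the function name is illustrative.

```python
def corrected_average(davr, cm, ck):
    """Equation (4): Davr' = (1 + Cm) * (1 - Ck/2) * Davr,
    where Cm is the mask wearing rate and Ck the elderly rate
    (both in [0, 1])."""
    return (1 + cm) * (1 - ck / 2) * davr
```

With a high mask rate and no elderly people the effective distance grows (fewer alerts); with many elderly people it shrinks (more alerts).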
  • the flow processing shown in FIG. 5 is, for example, started or terminated according to an instruction by the user.
  • the processing of the flowchart shown in FIG. 5 is assumed to be executed by the functional blocks shown in FIG. 2, which are realized by the CPU 900 of the information processing apparatus 100 executing the computer program stored in the ROM 920 of the information processing apparatus 100.
  • the acquisition unit 200 acquires the image captured by the image pickup device 110. At this time, the acquisition unit 200 acquires, as the image to be processed (hereinafter, the processing target image), the image of one frame among the images of the plurality of frames constituting the moving image captured by the image pickup apparatus 110.
  • the detection unit 203 detects a person included in the image to be processed. At this time, the detection unit 203 detects a person included in the image to be processed, for example, by pattern matching processing using a person matching pattern.
  • the tracking unit 204 tracks the person detected by the detection unit 203.
  • when the detection unit 203 detects, in the processing target image, the same person as a person detected from the image of a frame one or more frames earlier, the tracking unit 204 tracks the person by associating the persons in the respective frames. The tracking unit 204 assigns a unique ID to each person to be tracked. For example, suppose the tracking unit 204 assigns the ID "a" to a person detected by the detection unit 203 from the image of a frame one or more frames before the processing target image. When the detection unit 203 also detects that person in the processing target image, the tracking unit 204 assigns the same ID "a" to that person.
  • if a person newly detected in the processing target image exists, the tracking unit 204 newly assigns a unique ID to that person. Further, the storage unit 201 stores the IDs assigned by the tracking unit 204 to the persons currently in the queue in the person information 600 shown in FIG. 6.
  • the person information 600 shown in FIG. 6 stores (retains) information about the people currently in the queue. Specifically, the person information 600 records (retains), in association with one another, the ID 601 assigned to the person by the tracking unit 204, the age information 602 indicating the person's age, and the mask information 603 indicating whether the person is wearing a mask.
  • the person information 600 is not limited to the example shown in FIG.
  • the storage unit 201 may also hold, for example, information on the size and position of the person in each frame, for each person to whom an ID 601 is assigned.
  • the storage unit 201 newly stores the newly assigned ID in the ID 601 of the person information 600.
  • when a person leaves the queue, the storage unit 201 deletes the information about that person (ID 601, age information 602, mask information 603) from the person information 600.
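For illustration, the person information 600 of FIG. 6 can be modeled as a small record type, where a None field stands for a blank (not-yet-determined) entry; all Python names here are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class PersonInfo:
    """One row of the person information 600 (FIG. 6); None = blank field."""
    person_id: str                        # ID 601 assigned by the tracking unit
    age: Optional[int] = None             # age information 602
    wearing_mask: Optional[bool] = None   # mask information 603

people: Dict[str, PersonInfo] = {}        # keyed by ID 601

def add_person(pid: str) -> None:
    """Register a newly tracked person with blank age/mask fields."""
    people[pid] = PersonInfo(pid)

def remove_person(pid: str) -> None:
    """Drop a person's row when they leave the queue."""
    people.pop(pid, None)
```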
  • the determination unit 205 determines whether or not the person being tracked by the tracking unit 204 is wearing a mask.
  • the determination unit 205 in the present embodiment may determine whether or not the mask is worn only for a person who has not yet been determined whether or not the mask is worn.
  • the mask information 603 of the person with ID 601 "c" is blank, indicating that it has not yet been determined whether the person with ID "c" is wearing a mask. This can occur when the presence or absence of a mask could not be properly determined, for example when no frontal image of the person's face has been acquired.
  • the determination unit 205 may determine the presence or absence of a mask only for the person with ID "c", for whom it has not yet been determined. In this case, the determination unit 205 does not re-determine the presence or absence of a mask for the persons with IDs 601 "a", "b", and "d", for whom it has already been determined.
  • the storage unit 201 stores the information on whether or not the mask is worn in the mask information 603.
  • the determination unit 205 determines the age of the person tracked by the tracking unit 204.
  • the determination unit 205 in the present embodiment may determine the age of a person whose age has not yet been determined.
  • the age information 602 of the person with ID 601 "d" is blank, indicating that the age of the person with ID "d" has not yet been determined. This can occur when the age could not be properly determined, for example when no frontal image of the person's face has been acquired.
  • the determination unit 205 determines the age of the person with ID "d", whose age has not been determined, and may skip age determination for the persons with IDs 601 "a", "b", and "c", whose ages have already been determined.
  • the age information is stored in the age information 602.
  • the calculation unit 206 calculates N, the number of people in the queue.
  • the calculation unit 206 can obtain this as the difference Nin - Nout between the number of passers Nin through the queue entrance 301 and the number of passers Nout through the queue exit 302.
  • the calculation unit 206 identifies the queue length, the distance along the queue path from the head to the tail of the queue.
  • the calculation unit 206 in the present embodiment identifies the region Rmax with the largest n among the regions Rn in which a stationary person exists. Then, based on the information on the distance along the queue path held in advance for each of the regions R1 to R5, the calculation unit 206 can obtain the total queue length by adding the distances in order from R1 to Rmax.
  • the calculation unit 206 calculates the average Davr of the distance between people.
  • the calculation unit 206 in the present embodiment calculates the average Davr using equation (1), but the average distance between people may be calculated using another method.
  • the calculation unit 206 corrects the average Davr based on at least one of the presence or absence of the person wearing a mask and the age of the person.
  • the calculation unit 206 obtains the corrected average Davr' by correcting the average Davr using any one of equations (2) to (4).
  • the calculation unit 206 executes a comparison process that compares the threshold value with either the calculated average Davr or the corrected average Davr' obtained from it.
  • when no correction is performed, the calculation unit 206 compares the calculated average Davr with the threshold value in S511; when correction is performed, it compares the corrected average Davr' with the threshold value.
  • the threshold value here is a parameter that is preset by the user and can be changed as appropriate.
  • when the average Davr (or the corrected average Davr') is less than the threshold value (Yes in S512), the process transitions to S513, and in S513 the output control unit 207 notifies the user of a message indicating a warning as the predetermined information.
  • when the average Davr (or the corrected average Davr') is equal to or greater than the threshold value (No in S512), the process transitions to S514.
  • the process of the flow shown in FIG. 5 is terminated.
  • otherwise, the process returns to S501, and the acquisition unit 200 newly acquires the next image to be processed.
  • as described above, the information processing apparatus 100 in the present embodiment calculates the average Davr of the distance between persons, and corrects it based on at least one of the presence or absence of masks and the ages of the persons. Then, the information processing apparatus 100 compares the corrected average Davr' with the threshold value and, according to the comparison result, notifies the user of a message indicating a warning as the predetermined information. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring attention, and to notify the user of an alert based on the result of that determination.
  • in the above description, the information processing apparatus 100 notifies the user of predetermined information according to the result of comparing the average Davr of the distance between people with the threshold value, but the present invention is not limited to this.
  • the calculation unit 206 calculates the distance D between the first person in the line and the second person located in front of or behind the first person.
  • the calculation unit 206 identifies, for example, the position of the first person and the position of the second person detected from the image, and takes the distance on the image from the position of the first person to the position of the second person as the distance D.
  • the calculation unit 206 may convert the on-image distance from the position of the first person to the position of the second person into a distance in real space, and may use the distance in real space as the distance D.
  • the process of converting the distance on the image into the distance on the real space shall be performed using a known technique.
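The conversion to real-space distance is left to "a known technique". The minimal sketch below assumes the simplest such technique, a fixed ground-plane scale in meters per pixel; the scale value and function names are hypothetical, and a real deployment would typically use camera calibration or a homography instead.

```python
import math

def image_distance(p1, p2):
    """Euclidean distance in pixels between two detected positions (x, y)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def to_real_space(pixel_distance, meters_per_pixel):
    """Convert an on-image distance to a real-space distance, assuming a
    fixed ground-plane scale."""
    return pixel_distance * meters_per_pixel
```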
  • the determination unit 205 determines, for each of the first person and the second person, at least one of whether a mask is worn and the person's age. Then, the calculation unit 206 corrects the distance D between the first person and the second person based on at least one of the mask-wearing determination and the age. At this time, for example, the calculation unit 206 corrects the distance D by using the equation (5).
  • M indicates the ratio of mask wearers among the first person and the second person: 1 if both are wearing masks, 1/2 if only one of them is, and 0 if neither is. Further, K indicates the ratio of elderly persons among the first person and the second person: 1 if both are elderly, 1/2 if only one is, and 0 if neither is. Further, Rm and Rk each indicate a predetermined coefficient.
  • when the corrected distance D' is less than the threshold value, the output control unit 207 notifies the user of a message indicating a warning as predetermined information; if the distance D' is equal to or greater than the threshold value, the message is not notified.
  • as described above, the information processing apparatus 100 corrects the distance between a certain person (first person) and another person (second person) based on the result of determining at least one of mask wearing and age. Then, the information processing apparatus 100 notifies the user of predetermined information according to the result of comparing the corrected distance, obtained by correcting the distance between the first person and the second person, with the threshold value. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of the determination.
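A sketch of this pairwise correction, under the assumption that equation (5) has the same multiplicative form as equation (13) of the fourth embodiment; the coefficient values Rm and Rk are placeholders.

```python
def pair_ratio(flag_a, flag_b):
    """Ratio of persons with a given attribute among a pair:
    1 if both have it, 1/2 if only one does, 0 if neither does."""
    return (bool(flag_a) + bool(flag_b)) / 2

def corrected_pair_distance(d, mask_a, mask_b, elderly_a, elderly_b,
                            rm=0.5, rk=0.5):
    """Corrected distance D' between two persons (assumed form, modeled
    on Equation (13)): masks enlarge the effective distance, elderly
    persons shrink it, making alerts less or more likely respectively."""
    m = pair_ratio(mask_a, mask_b)        # mask-wearing ratio M
    k = pair_ratio(elderly_a, elderly_b)  # elderly ratio K
    return (1 + m * rm) * (1 - k * rk) * d
```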
  • in the first embodiment, the distance between persons is corrected according to the mask wearing rate and the ages of the persons, and the corrected distance between persons is compared with the threshold value.
  • the information processing apparatus 100 in the second embodiment instead corrects the threshold value that is compared with the distance between persons. That is, the information processing apparatus 100 in the second embodiment corrects the threshold value according to the mask wearing rate and the ages of the persons, compares the corrected threshold value with the distance between persons, and notifies the user of a warning message as predetermined information according to the result of the comparison.
  • in the following, the parts different from the first embodiment will be mainly described; components and processes that are the same as or equivalent to those of the first embodiment are given the same reference numerals, and duplicated description is omitted.
  • the calculation unit 206 in the present embodiment calculates the average Davr of the distances between the persons in front of and behind each other in the queue by using the equation (1). Since the process of calculating the average Davr is the same as in the first embodiment, the description thereof is omitted. Then, the calculation unit 206 in the present embodiment acquires the threshold value r' by correcting the threshold value r using the equation (6).
  • Threshold value r' = r - Cm * Rm + Ck * Rk ... Equation (6)
  • Cm indicates the mask wearing rate
  • Ck indicates the elderly rate
  • Rm and Rk indicate predetermined coefficients, respectively.
  • when Rm is zero and Rk is a predetermined value, the threshold value r is corrected to be larger as the elderly rate Ck becomes higher. That is, the higher the elderly rate Ck, the larger the corrected threshold value r', so that an alert is more likely to be notified to the user.
  • when Rm is a predetermined value and Rk is zero, the threshold value r is corrected to be smaller as the mask wearing rate Cm becomes higher, so that an alert is less likely to be notified to the user.
  • each of Rm and Rk may have a predetermined value. That is, the threshold value r' may depend on both the mask wearing rate and the elderly rate.
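Equation (6) translates directly into code; the coefficient values Rm and Rk used here are placeholders.

```python
def corrected_threshold(r, cm, ck, rm=0.3, rk=0.3):
    """Corrected threshold r' per Equation (6): r' = r - Cm*Rm + Ck*Rk.
    A higher mask rate Cm lowers the threshold (fewer alerts); a higher
    elderly rate Ck raises it (more alerts)."""
    return r - cm * rm + ck * rk
```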
  • the calculation unit 206 compares the average Davr of the distances between people with the corrected threshold value r'. At this time, if the average Davr is less than the corrected threshold value r', the output control unit 207 notifies the user of a message indicating a warning as an alert. On the other hand, the output control unit 207 does not notify the user of the warning message when the average Davr is equal to or greater than the corrected threshold value r'.
  • in the example described above, the average Davr of the distances between people is compared with the corrected threshold value r', but the present invention is not limited to this.
  • the calculation unit 206 in the present embodiment may compare the distance D between a certain person (first person) and another person (second person) with the corrected threshold value r'. At this time, when the distance D between the first person and the second person is less than the corrected threshold value r', the output control unit 207 notifies the user of a message indicating a warning as an alert. On the other hand, the output control unit 207 does not notify the user of a warning message when the distance D between the first person and the second person is equal to or greater than the corrected threshold value r'.
  • as described above, the information processing apparatus 100 in the present embodiment compares the threshold value corrected based on at least one of mask wearing and the ages of the persons with the distance between persons (the average Davr or the distance D). Then, an alert is notified to the user according to the result of the comparison. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of the determination.
  • in the third embodiment, a configuration is described in which the information processing apparatus 100 uses information on the queue length and the number of people waiting, as well as information on the waiting time, to calculate a caution level, which is an index indicating whether the distance between persons is a distance requiring caution.
  • the lower the caution level, the less the distance between people is a distance requiring caution; the higher the caution level, the more the distance between people is a distance requiring caution. For example, in a queue, the shorter the distance between the persons in front and behind, and the longer the waiting time, the higher the possibility of infection, and thus the higher the caution level.
  • the calculation unit 206 in this embodiment calculates the caution level R using the equation (7).
  • the average Davr of the distance between people is obtained by the formula (1).
  • Tavr indicates an "estimated waiting time" which is the result of estimating the waiting time of the queue.
  • a person enters the queue from the queue entrance 301 and then exits from the queue exit 302.
  • the constant C1 is a normalization coefficient for keeping the caution level R within a predetermined range.
  • the calculation unit 206 in the present embodiment acquires the number of persons who have passed through the queue exit 302 and left the queue within a predetermined time. Then, the calculation unit 206 calculates the number of people leaving the queue per unit time by dividing the acquired number of people by the predetermined time. Here, the number of people leaving the queue per unit time is defined as the exit frequency. Then, the calculation unit 206 acquires the number of people N in the queue, which is the number of people lined up in the queue formed along the path from the queue entrance 301 to the queue exit 302. Since the method of calculating the number of people N in the queue is the same as in the first embodiment, the description thereof is omitted. The calculation unit 206 calculates the estimated waiting time Tavr by dividing the number of people N in the queue by the exit frequency.
  • alternatively, the tracking unit 204 tracks each person from the queue entrance 301 to the queue exit 302, and the calculation unit 206 acquires the movement time required for the person to move from the queue entrance 301 to the queue exit 302. Then, the calculation unit 206 may calculate the average of the movement times acquired for a plurality of persons, and acquire the average value as the estimated waiting time Tavr.
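A sketch of the estimated-waiting-time and caution-level computation. Equation (7) itself is not reproduced in this excerpt, so the form R = C1 * Tavr / Davr is an assumption consistent with the description that R rises with waiting time and falls with inter-person distance.

```python
def exit_frequency(num_exited, elapsed_time):
    """People leaving the queue per unit time (exit frequency)."""
    return num_exited / elapsed_time

def estimated_waiting_time(queue_length, freq):
    """Tavr = number of people N in the queue / exit frequency."""
    return queue_length / freq

def caution_level(davr, tavr, c1=1.0):
    """Caution level R (assumed form of Equation (7)): grows with the
    estimated waiting time Tavr, shrinks with the average distance Davr,
    normalized by the coefficient C1."""
    return c1 * tavr / davr
```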
  • the calculation unit 206 in the present embodiment executes a comparison process for comparing the caution level R calculated using the equation (7) with the threshold value.
  • when the caution level R is less than the threshold value, the output control unit 207 determines that the distance between persons is not a distance requiring caution, and does not notify the user of an alert.
  • when the caution level R is equal to or greater than the threshold value, the output control unit 207 determines that the distance between persons is a distance requiring caution, and notifies the user of a message indicating a warning as an alert.
  • the calculation unit 206 in the present embodiment may use the equation (7) to calculate the latest caution level R at predetermined time intervals and compare the calculated latest caution level R with the threshold value.
  • when the caution level R is equal to or higher than the threshold value a predetermined number of times in a row, the output control unit 207 may notify the user of a message indicating a warning as an alert. For example, the output control unit 207 causes the display 130 to display a message indicating a warning when the caution level R is equal to or higher than the threshold value three times in a row. On the other hand, the output control unit 207 does not display the warning message on the display 130 when the caution level R is not equal to or higher than the threshold value three times in a row.
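The "three times in a row" rule can be sketched with a simple counter; the class name and interface are illustrative, not from the disclosure.

```python
class ConsecutiveAlert:
    """Raise an alert only after the caution level has been at or above
    the threshold a given number of consecutive evaluations (three in
    the example in the text)."""

    def __init__(self, threshold, required=3):
        self.threshold = threshold
        self.required = required
        self.count = 0

    def update(self, caution_level):
        """Feed the latest caution level R; return True when the alert
        condition is met."""
        if caution_level >= self.threshold:
            self.count += 1
        else:
            self.count = 0  # streak broken, start over
        return self.count >= self.required
```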
  • the calculation unit 206 in the present embodiment may correct the caution level R depending on whether or not masks are worn. Specifically, the calculation unit 206 in the present embodiment corrects the calculated caution level R downward, on the assumption that the possibility of virus infection decreases when a person is wearing a mask. For example, the calculation unit 206 calculates the corrected caution level R', which is a value obtained by correcting the caution level R, using the equation (8).
  • R' = R / (1 + Cm) ... Equation (8)
  • Similar to the first embodiment, Cm indicates the mask wearing rate of the persons in the queue. Further, in the example of the equation (8), the caution level R of the queue is halved when all the persons in the queue wear masks.
  • the calculation unit 206 in the present embodiment may also correct the caution level R according to the ages of the persons.
  • the calculation unit 206 in the present embodiment corrects the calculated caution level R so that the larger the number of elderly people, the more easily an alert is notified to the user.
  • the calculation unit 206 calculates the corrected caution level R', which is a value obtained by correcting the caution level R, using, for example, the equation (9).
  • R' = R * (1 + Ck / 2) ... Equation (9)
  • Ck indicates the elderly rate, which is the ratio of the elderly among the persons in the line.
  • the calculation unit 206 in the present embodiment may correct the caution level R based on both whether the persons wear masks and the ages of the persons. At this time, the calculation unit 206 calculates the corrected caution level R' by correcting the caution level R using, for example, the equation (10).
  • R' = R * (1 - Cm / 2 + Ck / 2) ... Equation (10)
  • as described above, the calculation unit 206 calculates the corrected caution level R' by correcting the calculated caution level R based on at least one of whether the persons wear masks and the ages of the persons. Then, the calculation unit 206 executes a comparison process for comparing the corrected caution level R', acquired by correcting the caution level R, with the threshold value. When the corrected caution level R' is less than the threshold value, the output control unit 207 determines that the distance between persons is not a distance requiring caution, and does not notify the user of an alert.
  • when the corrected caution level R' is equal to or greater than the threshold value, the output control unit 207 determines that the distance between persons is a distance requiring caution, and notifies the user of a message indicating a warning as an alert.
  • as described above, the information processing apparatus 100 in the present embodiment corrects the caution level R based on at least one of whether the persons wear masks and the ages of the persons, and notifies the user of an alert according to the result of comparing the corrected caution level R' obtained by the correction with the threshold value. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of the determination.
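Equations (8) to (10) translate directly into code; Cm is the mask wearing rate and Ck the elderly rate of the persons in the queue.

```python
def corrected_by_mask(r, cm):
    """Equation (8): R' = R / (1 + Cm); R is halved when everyone masks."""
    return r / (1 + cm)

def corrected_by_age(r, ck):
    """Equation (9): R' = R * (1 + Ck / 2)."""
    return r * (1 + ck / 2)

def corrected_by_both(r, cm, ck):
    """Equation (10): R' = R * (1 - Cm/2 + Ck/2)."""
    return r * (1 - cm / 2 + ck / 2)
```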
  • the information processing apparatus 100 in the fourth embodiment calculates the distance between each person in the queue and the people around that person, and calculates the density of the queue based on the calculated distances. Then, the information processing apparatus 100 notifies the user of an alert according to the result of comparing the calculated density with the threshold value. In the following, the parts different from each of the above-described embodiments will be mainly described; components and processes that are the same as or equivalent to those of the above-described embodiments are given the same reference numerals, and duplicated description is omitted.
  • n indicates the number of people in the queue.
  • the predetermined value Dth is a numerical value specified in advance by the user, for example, 1.8 m.
  • the calculation unit 206 compares the calculated density Den with the threshold value, and if the density Den is equal to or greater than the threshold value, determines that the distance between the persons in the queue is a distance requiring caution. Then, the output control unit 207 notifies the user of the alert. On the other hand, when the calculated density Den is less than the threshold value, the calculation unit 206 determines that the distance between the persons in the queue is not a distance requiring caution. Then, the output control unit 207 does not notify the user of the alert.
  • the calculation unit 206 in the present embodiment calculates the distance Dij between the i-th person and the j-th person, and may further correct the calculated Dij based on at least one of mask wearing and age. Specifically, the calculation unit 206 may calculate the corrected distance Dij' by correcting the distance Dij using the equation (13).
  • Dij' = (1 + Mij * Rm) * (1 - Kij * Rk) * Dij ... Equation (13)
  • Mij here indicates the ratio of mask wearers among the two persons, the i-th and the j-th: 1 if both are wearing masks, 1/2 if only one of them is, and 0 if neither is.
  • Kij indicates the ratio of elderly persons among the two persons, the i-th and the j-th: 1 if both are elderly, 1/2 if only one is, and 0 if neither is.
  • Rm and Rk each indicate a predetermined coefficient.
  • when Rm is zero and Rk is a predetermined value, the distance Dij between the i-th person and the j-th person is corrected to be smaller as the elderly rate Kij becomes higher. That is, the higher the elderly rate Kij, the more easily an alert is notified to the user.
  • when Rm is a predetermined value and Rk is zero, the distance Dij between the i-th person and the j-th person is corrected to be larger as the mask wearing rate Mij becomes higher. In other words, the higher the mask wearing rate Mij, the less likely an alert is to be notified to the user.
  • the density Den' of the queue is assumed to be calculated based on the total number of people whose corrected distance Dij' is less than Dth.
  • the calculation unit 206 compares the density Den', calculated based on the corrected distance Dij', with the threshold value, and when the density Den' is equal to or greater than the threshold value, determines that the distance between the persons in the queue is a distance requiring caution. Then, the output control unit 207 notifies the user of the alert. On the other hand, when the calculated density Den' is less than the threshold value, the calculation unit 206 determines that the distance between the persons in the queue is not a distance requiring caution. Then, the output control unit 207 does not notify the user of the alert. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of the determination.
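A sketch of the density computation. Equation (13) is given above; equations (14) and (15) are not reproduced in this excerpt, so the definition of Den' used here (the fraction of persons having at least one other person closer than Dth) is an assumption based on the surrounding description.

```python
def corrected_pairwise(dij, mij, kij, rm=0.5, rk=0.5):
    """Equation (13): Dij' = (1 + Mij*Rm) * (1 - Kij*Rk) * Dij."""
    return (1 + mij * rm) * (1 - kij * rk) * dij

def density(dist, n, dth=1.8):
    """Assumed density Den' of the queue: the number of persons with at
    least one other person within Dth, normalized by the queue size n.
    dist[i][j] holds the (corrected) distance between persons i and j."""
    close = 0
    for i in range(n):
        if any(dist[i][j] < dth for j in range(n) if j != i):
            close += 1
    return close / n
```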
  • as a modification, the calculation unit 206 corrects the distance Dij between the i-th person and the j-th person in the queue based on the orientations of the faces of the i-th person and the j-th person.
  • the determination unit 205 in the present embodiment determines the orientation of the face of the person detected by the detection unit 203 from the image. A known technique may be used to determine the orientation of the face.
  • the information processing apparatus 100 creates in advance, for example, a classifier trained by machine learning using face images having different face orientations as learning data.
  • the determination unit 205 inputs the face image of the person detected from the image by the detection unit 203 into the classifier, and obtains information on the orientation of the person's face as output.
  • the calculation unit 206 corrects the distance Dij to be smaller when the persons are facing each other, and conversely to be larger when the persons are back to back.
  • the process of correcting the distance Dij will be described with reference to FIG.
  • the counterclockwise direction is positive with respect to the straight line drawn from the center position of the face area of the i-th person 801 toward the center position of the face area of the j-th person 802.
  • let the face orientation of the person 801, with the center of the face area of the i-th person 801 as the origin, be the angle θi in radians with respect to the straight line, and let the face orientation of the person 802, with the center of the face area of the j-th person 802 as the origin, be the angle θj in radians with respect to the straight line.
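The geometric setup of FIG. 8 can be sketched as follows: given the face-area centers and absolute face-orientation angles of the two persons, compute θi and θj relative to the straight line drawn from person 801 toward person 802, counterclockwise positive. The function and parameter names are illustrative, and the absolute angles are assumed to come from the face-orientation classifier.

```python
import math

def relative_face_angles(center_i, center_j, face_dir_i, face_dir_j):
    """Angles (radians, counterclockwise positive) of each person's face
    orientation relative to the straight line from the i-th person's
    face center toward the j-th person's face center."""
    # direction of the straight line from person i to person j
    base = math.atan2(center_j[1] - center_i[1], center_j[0] - center_i[0])

    def wrap(a):
        # normalize an angle to (-pi, pi]
        return math.atan2(math.sin(a), math.cos(a))

    theta_i = wrap(face_dir_i - base)
    theta_j = wrap(face_dir_j - base)
    return theta_i, theta_j
```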
  • the calculation unit 206 in this modification calculates the corrected distance D'ij by correcting the distance Dij using the equation (16).
  • the calculation unit 206 calculates the density Den' using the equations (14) and (15) described in the fourth embodiment. Then, when the density Den' is equal to or greater than the threshold value, the output control unit 207 in the present embodiment notifies the user of an alert, on the determination that the distance between the persons in the queue is a distance requiring caution. On the other hand, when the calculated density Den' is less than the threshold value, the output control unit 207 does not notify the user of the alert, since the distance between the persons in the queue is not a distance requiring caution. In this way, it is possible to adaptively determine whether the distance between people is a distance requiring caution, and to notify the user of an alert based on the result of the determination.
  • the information processing apparatus 100 in the present embodiment has a CPU 900, a RAM 910, a ROM 920, an HDD 930, and an I / F 940.
  • the CPU 900 is a central processing unit that controls the information processing device 100 in an integrated manner.
  • the RAM 910 temporarily stores a computer program executed by the CPU 900.
  • the RAM 910 also provides a work area used by the CPU 900 to execute processing. Further, the RAM 910 functions as, for example, a frame memory or a buffer memory.
  • the ROM 920 stores a program or the like for the CPU 900 to control the information processing apparatus 100.
  • the HDD 930 is a storage device for recording image data and the like.
  • the I/F 940 communicates with an external device via the network 140 according to TCP/IP, HTTP, or the like.
  • although the CPU 900 executes the processing in the above description, for example, the process of displaying a GUI (Graphical User Interface) or image data on the display 130 may be executed by a GPU (Graphics Processing Unit).
  • the process of reading the program code from the ROM 920 and expanding it to the RAM 910 may be executed by a DMA (Direct Memory Access) that functions as a transfer device.
  • the present invention can also be realized by a process in which one or more processors read and execute a program that realizes one or more functions of the above-described embodiment.
  • the program may be supplied to a system or device having a processor via a network or storage medium.
  • the present invention can also be realized by a circuit (for example, an ASIC) that realizes one or more functions of the above-described embodiment.
  • each part of the information processing apparatus 100 may be realized by the hardware shown in FIG. 9 or by software.
  • another device may have one or more functions of the information processing device 100 according to each of the above-described embodiments.
  • the image pickup device 110 may have one or more functions of the information processing device 100 according to each embodiment. The above-described embodiments may also be arbitrarily combined.
  • although the present invention has been described above with reference to the embodiments, the above embodiments are merely examples of implementing the present invention, and the technical scope of the present invention should not be interpreted as being limited by them. That is, the present invention can be implemented in various forms without departing from its technical idea or main features. For example, combinations of the respective embodiments are also included in the disclosure of the present specification.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Emergency Management (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)
  • Emergency Alarm Devices (AREA)
  • Alarm Systems (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention comprises a detection means for detecting a person included in an image captured by an image capturing means, a determination means for determining at least one of whether the person detected by the detection means is wearing a mask and the age of the person, and an output control means for outputting predetermined information on the basis of a distance between persons detected by the detection means in the image and the result of the determination by the determination means.

Description

Information processing device, information processing method, and program
The present invention relates to information processing technology.
There is a technology that compares the distance between persons included in an image captured by an image pickup device with a threshold value and, if the distance is less than the threshold value, notifies the user of an alert indicating that the distance is a distance requiring caution. Japanese Patent Application Laid-Open No. 2019-200718 discloses a method of outputting a warning when the distance between a person belonging to a minority group and a person belonging to a majority group is less than a preset threshold value.
Japanese Unexamined Patent Publication No. 2019-200718
However, in Japanese Patent Application Laid-Open No. 2019-200718, since the threshold value used to determine whether the distance between persons is a distance requiring caution is uniformly fixed, an alert may be notified to the user even in cases where, depending on the characteristics of the persons, no alert is actually necessary.
To make it possible to appropriately present to the user whether the distance between persons is a distance requiring caution, for example, the image processing apparatus according to the present invention has the following configuration. That is, it has a detection means for detecting a person included in an image captured by an image pickup means, a determination means for determining at least one of whether the person detected by the detection means is wearing a mask and the age of the person, and an output control means for outputting predetermined information based on the distance between persons detected by the detection means in the image and the determination result by the determination means.
According to each of the above embodiments, it is possible to appropriately present to the user whether the distance between persons is a distance requiring caution.
FIG. 1 is a diagram showing an example of a system configuration. FIG. 2 is a diagram showing the functional blocks of the information processing device. FIG. 3 is a diagram showing a queue being imaged. FIG. 4 is a diagram for explaining a process of determining whether the distance between persons is a distance requiring caution. FIG. 5 is a flowchart showing the flow of a process of determining whether the distance between persons is a distance requiring caution. FIG. 6 is a diagram showing an example of person information. FIG. 7 is a diagram for explaining a process of determining whether the distance between persons is a distance requiring caution. FIG. 8 is a diagram for explaining a process of correcting the distance between persons. FIG. 9 is a diagram showing an example of the hardware configuration of each device.
Hereinafter, embodiments according to the present invention will be described with reference to the accompanying drawings. The configurations shown in the following embodiments are merely examples, and the present invention is not limited to the illustrated configurations.
(Embodiment 1)
FIG. 1 is a diagram showing the system configuration in this embodiment. The system in this embodiment includes an information processing device 100, an image pickup device 110, a recording device 120, and a display 130.
The information processing device 100, the image pickup device 110, and the recording device 120 are connected to each other via a network 140. The network 140 is realized by a plurality of routers, switches, cables, and the like conforming to a communication standard such as ETHERNET (registered trademark).
The network 140 may also be realized by the Internet, a wired LAN (Local Area Network), a wireless LAN, a WAN (Wide Area Network), or the like.
The information processing device 100 is realized by, for example, a personal computer in which a program for realizing the information processing functions described later is installed. The image pickup device 110 is a device that captures images and functions as an image pickup means. The image pickup device 110 associates the image data of a captured image, information on the date and time the image was captured, and identification information identifying the image pickup device 110, and transmits them via the network 140 to external devices such as the information processing device 100 and the recording device 120. In the system according to the present embodiment, there is one image pickup device 110, but there may be more than one. That is, a plurality of image pickup devices 110 may be connected to the information processing device 100 and the recording device 120 via the network 140. In this case, the information processing device 100 and the recording device 120 use, for example, the identification information associated with a transmitted image to determine which of the plurality of image pickup devices 110 captured that image.
The recording device 120 records the image data of an image captured by the image pickup device 110, information on the date and time the image was captured, and identification information identifying the image pickup device 110 in association with each other. Then, in accordance with a request from the information processing device 100, the recording device 120 transmits the recorded data (images, identification information, and so on) to the information processing device 100.
The display 130 is composed of an LCD (Liquid Crystal Display) or the like, and displays images captured by the image pickup device 110. The display 130 is connected to the information processing device 100 via a display cable compliant with a communication standard such as HDMI (registered trademark) (High Definition Multimedia Interface). At least two or all of the display 130, the information processing device 100, and the recording device 120 may be provided in a single housing.
 Note that images captured by the image capturing device 110 are not limited to being displayed on the display 130 connected to the information processing device 100 via a display cable; they may also be displayed, for example, on the display of an external device such as a smartphone or tablet terminal connected via the network 140.
 Next, information processing by the information processing device 100 according to the present embodiment will be described with reference to the functional blocks of the information processing device 100 shown in FIG. 2. In the present embodiment, each function shown in FIG. 2 is realized using a ROM (Read Only Memory) 920, a RAM (Random Access Memory) 910, and a CPU (Central Processing Unit) 900, which will be described later with reference to FIG. 9. That is, each function shown in FIG. 2 is realized by the CPU 900 of the information processing device 100 loading a computer program stored in the ROM 920 of the information processing device 100 into the RAM 910 and executing it.
 The acquisition unit 200 sequentially acquires the images of each frame constituting the moving image captured by the image capturing device 110. The acquisition unit 200 may acquire the moving image transmitted from the image capturing device 110, or may acquire the moving image transmitted from the recording device 120.
 The storage unit 201 can be realized by the RAM (Random Access Memory) 910, an HDD (Hard Disk Drive) 930, or the like described later with reference to FIG. 9, and stores (holds), for example, the image data of images acquired by the acquisition unit 200. The operation reception unit 202 receives operations performed by the user via an input device (not shown) such as a keyboard or a mouse.
 The detection unit 203 executes processing for detecting persons included in an image acquired by the acquisition unit 200. The detection unit 203 in the present embodiment detects persons in an image by performing pattern matching processing using matching patterns (dictionaries). Note that persons may be detected from an image using a plurality of matching patterns, such as a matching pattern for a person facing forward and a matching pattern for a person facing sideways. Executing detection processing using a plurality of matching patterns in this way can be expected to improve detection accuracy. Matching patterns for a specific object viewed from other angles, such as obliquely or from above, may also be prepared. Furthermore, when detecting a person, it is not always necessary to prepare a matching pattern (dictionary) representing the features of the whole body; matching patterns may be prepared for parts of a person such as the upper body, lower body, head, face, or legs. Although the detection unit 203 in the present embodiment uses pattern matching processing as the method for detecting persons from an image, persons may instead be detected using other conventional techniques.
 The tracking unit 204 tracks the persons detected by the detection unit 203. When the detection unit 203 detects, in the image of the frame of interest, the same person that it detected in the image of a frame one or more frames earlier, the tracking unit 204 in the present embodiment associates those persons across the frames. That is, a person is tracked between the images of a plurality of temporally close frames.
 As a method for the tracking unit 204 to determine that a person is the same across the images of a plurality of frames, for example, the person may be judged to be the same if the predicted movement position obtained from the movement vector of the detected person and the detected person position are within a certain distance of each other. The tracking unit 204 may also associate highly correlated persons across the images of a plurality of frames using the person's color, shape, size (number of pixels), and the like. As described above, the tracking unit 204 need only be able to execute processing that determines that a person is the same across the images of a plurality of frames and tracks that person; it is not limited to a specific method. The tracking unit 204 assigns a unique ID to each person to be tracked. For example, suppose the tracking unit 204 has assigned the ID "a" to a person detected by the detection unit 203 in the image of a frame one or more frames before the frame of interest. When the detection unit 203 also detects that person in the frame of interest, the tracking unit 204 assigns the same ID "a" to that person. When a person is newly detected in the frame of interest, the tracking unit 204 assigns a new unique ID to that person.
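The association rule described above (reuse an ID when a detection falls within a certain distance of a tracked person's predicted position, otherwise issue a new ID) can be sketched as follows. This is a minimal illustration, not the embodiment's implementation; the class name, the nearest-match rule, and the distance threshold are assumptions.

```python
import math

class SimpleTracker:
    """Hypothetical sketch of the tracking rule: a detection is associated with
    an existing person if it lies within max_dist of that person's last known
    (predicted) position; otherwise it is registered under a new unique ID."""

    def __init__(self, max_dist=50.0):
        self.max_dist = max_dist
        self.positions = {}   # person_id -> (x, y)
        self._next_id = 0

    def update(self, detections):
        new_positions = {}
        unmatched = dict(self.positions)
        for det in detections:
            match = min(
                ((pid, math.dist(det, pos)) for pid, pos in unmatched.items()),
                key=lambda pair: pair[1],
                default=None,
            )
            if match and match[1] <= self.max_dist:
                pid = match[0]          # same person across frames: reuse ID
                del unmatched[pid]
            else:
                pid = self._next_id     # newly detected person: new unique ID
                self._next_id += 1
            new_positions[pid] = det
        self.positions = new_positions
        return new_positions
```

In a real system the predicted position would come from the person's movement vector rather than the raw last position, and appearance features (color, shape, size) could break ties between nearby candidates.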
 The determination unit 205 determines at least one of whether a person detected from the image by the detection unit 203 is wearing a mask and the age of that person. Known techniques may be used for the processing of determining whether a person detected from the image is wearing a mask. Specifically, the determination unit 205 executes the following processing using, for example, a classifier trained by machine learning with images of the faces of persons wearing masks as training data: the determination unit 205 inputs the image of the face of a person detected from the image by the detection unit 203 into the classifier and obtains, as its output, information on whether that person is wearing a mask. Known techniques may likewise be used for the processing of determining the age of a person detected from the image. For example, the determination unit 205 inputs the image of the face of a person detected from the image by the detection unit 203 into a classifier trained by machine learning with images of the faces of persons of different ages as training data, and obtains the age information of that person as its output.
 The calculation unit 206 calculates the distance between persons included in the image and compares the calculated inter-person distance with a threshold value. Details of the processing by the calculation unit 206 will be described later. The output control unit 207 outputs predetermined information based on the inter-person distance calculated by the calculation unit 206 and the determination result of the determination unit 205 regarding at least one of mask wearing and age.
 Next, information processing by the information processing device 100 in the present embodiment will be described with reference to FIGS. 3 and 4. FIG. 3 is a diagram showing how persons standing in a queue are imaged by the image capturing device 110. The present embodiment assumes a queue regulated by guide poles or the like, with an entrance and an exit. A user enters the queue from the queue entrance 301, proceeds along the guide 300, and exits from the queue exit 302. That is, the queue is formed along the guide 300 from the queue exit 302 at its head toward the queue entrance 301. The image capturing device 110 in the present embodiment is assumed to be installed so as to image the queue formed along the guide 300. The information processing device 100 in the present embodiment detects the persons included in the queue imaged by the image capturing device 110 and counts the number of detected persons, thereby obtaining the number of people in the queue. Although there is one image capturing device 110 in the example shown in FIG. 3, a plurality of image capturing devices 110 may be used to image the queue divided into a plurality of sections. In this case, the detection unit 203 detects persons in each of the plurality of images obtained by imaging the respective sections of the divided queue, and the calculation unit 206 may obtain the number of people in the queue by calculating the total number of detected persons.
 Next, information processing by the information processing device 100 in the present embodiment will be described with reference to FIG. 4. In a queue, the shorter the interval between a person and the persons in front of and behind them (the inter-person distance), the higher the possibility that a virus infection spreads between persons (hereinafter, infection possibility). Therefore, the information processing device 100 in the present embodiment notifies the user of an alert (predetermined information) according to the inter-person distance.
 In the example shown in FIG. 4, the detection unit 203 in the present embodiment detects the persons included in the queue from an image capturing the queue shown in FIG. 4. Then, based on the person detection results in the image, the calculation unit 206 calculates the average distance Davr between adjacent persons in the queue using Equation (1).
 Davr = L / (N - 1) ... Equation (1)
 where L is the queue length and N is the number of people in the queue.
 In Equation (1), the "number of people in the queue" is the total number of persons standing in the queue. In the present embodiment, the detection unit 203 detects the persons included in the queue imaged by the image capturing device 110, and the calculation unit 206 obtains the number of people in the queue by counting the number of detected persons. In this case, the calculation unit 206 calculates, for example, the total number of persons detected by the detection unit 203 in each of the regions R1 (region 411) to R5 (region 415) set in the image, and obtains the calculated total as the number of people in the queue, N. Alternatively, the information processing device 100 may obtain the number of people in the queue by, for example, calculating it as the difference Nin - Nout between the number of people Nin who have passed the queue entrance 301 and the number of people Nout who have passed the queue exit 302. In this case, the calculation unit 206 determines, for example, whether a person tracked by the tracking unit 204 has passed the queue entrance 301, and obtains the number of persons determined to have passed it as the passing count Nin. Similarly, it determines whether a person tracked by the tracking unit 204 has passed the queue exit 302, and obtains the number of persons determined to have passed it as the passing count Nout. The calculation unit 206 then obtains the number of people in the queue, N, by subtracting Nout from Nin.
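Equation (1) and the entrance/exit counting above can be sketched as follows. This is a minimal illustration under the formulas stated in the text; the function names are ours, not from the embodiment.

```python
def queue_size(n_in, n_out):
    """Number of people in the queue as the difference Nin - Nout between
    persons counted passing the entrance and those counted passing the exit."""
    return n_in - n_out

def average_distance(queue_length, n):
    """Equation (1): Davr = L / (N - 1).
    With fewer than two people there is no inter-person distance to average."""
    if n < 2:
        return None
    return queue_length / (n - 1)
```

For example, a queue of length 12 m containing 5 people yields an average inter-person distance of 3 m.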
 In Equation (1), the "queue length" is the distance from the head to the tail of the queue measured along its path. In FIG. 4, the information processing device 100 in the present embodiment sets regions R1 to R5 on the path of the queue shown in the image and holds, for each region, the distance along the queue path. For example, a region Rn is set on the image according to a user operation, and information on the real-space distance along the queue path for that region Rn is obtained. Next, the detection unit 203 executes processing for detecting persons within each region of the image, and the calculation unit 206 identifies the region Rend having the largest n among the regions Rn in which a stationary person is present. The calculation unit 206 can then obtain the total queue length by adding the distances along the path in order from R1 to Rend. Other known techniques may also be used to obtain the queue length, such as a method of individually measuring the positions of persons waiting in the queue and calculating the distances between them.
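The region-based queue-length computation above can be sketched as follows, assuming each region carries its real-space path distance together with a flag indicating whether a stationary person was detected in it; this data layout is an assumption for illustration only.

```python
def queue_length(region_distances, has_person):
    """region_distances: path length of each region R1..Rn, in order.
    has_person: whether a stationary person was detected in each region.
    The queue length is the sum of path distances from R1 through Rend,
    the highest-indexed region containing a stationary person."""
    occupied = [i for i, present in enumerate(has_person) if present]
    if not occupied:
        return 0.0          # empty queue
    rend = occupied[-1]
    return sum(region_distances[: rend + 1])
```

With regions of 2, 2, 3, 3, and 2 m and people present only in the first three regions, Rend is R3 and the queue length is 7 m.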
 The calculation unit 206 further compares the average inter-person distance Davr calculated using Equation (1) with a threshold value. When the average Davr is less than the threshold value, the output control unit 207 notifies the user of the predetermined information, treating the inter-person distance as a distance requiring attention because the infection possibility is high. Conversely, when the average Davr is equal to or greater than the threshold value, the output control unit 207 does not notify the user of the predetermined information, since the infection possibility is low and the inter-person distance is not a distance requiring attention. The output control unit 207 may display a message, mark, or the like indicating a warning on the display 130 as the predetermined information. As a method of notifying the user of the predetermined information, the output control unit 207 may also cause audio indicating a warning to be played from a speaker (not shown).
 Note that the calculation unit 206 in the present embodiment may calculate the latest average Davr at predetermined time intervals using Equation (1) and compare the calculated latest average Davr with the threshold value. When the average Davr remains below the threshold value a predetermined number of consecutive times, the output control unit 207 may notify the user of the predetermined information. For example, when the average Davr is less than the threshold value three consecutive times, the output control unit 207 causes the display 130 to display a message indicating a warning as the predetermined information. Conversely, when the average Davr has not been below the threshold value three consecutive times, the output control unit 207 may refrain from displaying the warning message on the display 130.
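The "alert only after a predetermined number of consecutive below-threshold measurements" behavior can be sketched as follows; the class name and parameter names are assumptions, and the default of three consecutive times follows the example above.

```python
class DistanceAlert:
    """Signals an alert only when the latest average distance Davr has been
    below the threshold the required number of consecutive measurements."""

    def __init__(self, threshold, required_consecutive=3):
        self.threshold = threshold
        self.required = required_consecutive
        self.count = 0

    def observe(self, davr):
        if davr < self.threshold:
            self.count += 1
        else:
            self.count = 0          # a safe measurement resets the streak
        return self.count >= self.required
```

A single crowded measurement thus does not trigger a warning; the condition must persist across consecutive sampling intervals.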
 The calculation unit 206 in the present embodiment may also correct the average Davr according to whether masks are worn. Specifically, on the premise that the possibility of virus infection is reduced when persons wear masks, the calculation unit 206 in the present embodiment corrects the calculated average Davr so that its value becomes larger. For example, the calculation unit 206 calculates a corrected average Davr', a value obtained by correcting the average Davr, using Equation (2).
 Davr' = (1 + Cm) Davr ... Equation (2)
 where 0 ≤ Cm ≤ 1.
 Cm in Equation (2) denotes the "mask wearing rate," the proportion of persons in the queue who are wearing masks. The determination unit 205 in the present embodiment determines, for each person located along the path from the queue entrance 301 to the queue exit 302 shown in FIG. 4, whether that person is wearing a mask. The calculation unit 206 then calculates the mask wearing rate Cm by dividing the number of persons determined by the determination unit 205 to be wearing masks by the number of people in the queue, N. As shown in Equation (2), the calculation unit 206 in the present embodiment calculates the corrected average Davr' by correcting the calculated average inter-person distance Davr so that its value becomes larger as the mask wearing rate increases. The calculation unit 206 then compares the corrected average Davr' with the threshold value. When the corrected average Davr' is less than the threshold value, the output control unit 207 notifies the user of a message indicating a warning as the predetermined information, treating the distance as one requiring attention because the infection possibility is high. When the corrected average Davr' is equal to or greater than the threshold value, the output control unit 207 does not notify the user of a warning message, since the infection possibility is low and the distance does not require attention. In this way, the information processing device 100 in the present embodiment may correct the average inter-person distance Davr according to the mask wearing rate and notify the user of the predetermined information (alert) according to the result of comparing the corrected average Davr' with the threshold value. Doing so reduces cases in which an alert is issued to the user even though the mask wearing rate is high and no alert is actually needed. In other words, whether the inter-person distance is a distance requiring attention can be determined adaptively, and the user can be notified of an alert based on the result of that determination.
 The calculation unit 206 in the present embodiment may also correct the average Davr according to the ages of the persons. In the present embodiment, it is assumed that the older a person is, the higher the possibility of severe illness when infected with the virus. The calculation unit 206 in the present embodiment therefore corrects the calculated average Davr so that its value becomes smaller as the number of elderly persons increases, making it more likely that an alert is issued to the user. Specifically, the calculation unit 206 calculates a corrected average Davr', a value obtained by correcting the average Davr, using, for example, Equation (3).
 Davr' = (1 - Ck/2) Davr ... Equation (3)
 where 0 ≤ Ck ≤ 1.
 Ck in Equation (3) denotes the "elderly rate," the proportion of elderly persons among the persons in the queue. The determination unit 205 in the present embodiment determines the age of each person standing in the queue along the path from the queue entrance 301 to the queue exit 302 shown in FIG. 4, and determines, based on the determined age, whether that person is elderly. In this case, the determination unit 205 determines, for example, the age of each person in the queue, judges persons of a predetermined age or older to be elderly, and judges persons under the predetermined age not to be elderly. In the present embodiment the predetermined age is 70, but other criteria such as 65 or 75 may be used. The calculation unit 206 then calculates the elderly rate Ck by dividing the number of persons determined by the determination unit 205 to be elderly by the number of people in the queue, N. As shown in Equation (3), the calculation unit 206 in the present embodiment calculates the corrected average Davr' by correcting the calculated average inter-person distance Davr so that its value becomes smaller as the elderly rate Ck increases. The calculation unit 206 then compares the corrected average Davr' with the threshold value. When the corrected average Davr' is less than the threshold value, the output control unit 207 notifies the user of the predetermined information, treating the distance as one requiring attention. When the corrected average Davr' is equal to or greater than the threshold value, the output control unit 207 does not notify the user of the predetermined information, since the distance does not require attention. In this way, the information processing device 100 in the present embodiment may correct the average inter-person distance Davr according to the ages of the persons and notify the user of an alert according to the result of comparing the corrected average Davr' with the threshold value. Doing so reduces cases in which no alert is issued to the user even though one should be, given the high ages of the persons. In other words, whether the inter-person distance is a distance requiring attention can be determined adaptively, and the user can be notified of an alert based on the result of that determination.
 Note that the calculation unit 206 in the present embodiment may correct the average Davr according to both mask wearing and the ages of the persons. In this case, the calculation unit 206 may obtain the corrected average Davr' by correcting the calculated average Davr using, for example, Equation (4).
 Davr' = (1 + Cm)(1 - Ck/2) Davr ... Equation (4)
 In the example of Equation (4), the average Davr is corrected to be larger as the mask wearing rate increases, and corrected to be smaller as the elderly rate increases.
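Equations (2) through (4) and the threshold comparison can be sketched together as follows. This is a minimal illustration of the stated formulas; the function names are ours, and setting either rate to zero reduces Equation (4) to Equation (2) or (3).

```python
def corrected_average(davr, mask_rate=0.0, elderly_rate=0.0):
    """Equation (4): Davr' = (1 + Cm)(1 - Ck/2) Davr, with 0 <= Cm, Ck <= 1.
    mask_rate (Cm) inflates the average (masks reduce infection possibility);
    elderly_rate (Ck) deflates it (higher risk of severe illness)."""
    assert 0.0 <= mask_rate <= 1.0 and 0.0 <= elderly_rate <= 1.0
    return (1.0 + mask_rate) * (1.0 - elderly_rate / 2.0) * davr

def needs_alert(davr, threshold, mask_rate=0.0, elderly_rate=0.0):
    """Alert when the corrected average falls below the threshold."""
    return corrected_average(davr, mask_rate, elderly_rate) < threshold
```

For instance, with a raw average of 1.5 m and a 2 m threshold, a mask wearing rate of 1.0 lifts the corrected average to 3 m and suppresses the alert, whereas with no correction the alert would be issued.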
 Next, information processing by the information processing device 100 in the present embodiment will be described with reference to FIG. 5. The processing of the flow shown in FIG. 5 is, for example, started or ended in accordance with an instruction from the user. The processing of the flowchart shown in FIG. 5 is executed by the functional blocks shown in FIG. 2, which are realized by the CPU 900 of the information processing device 100 executing a computer program stored in the ROM 920 of the information processing device 100.
 First, in S501, the acquisition unit 200 acquires an image captured by the image capturing device 110. At this time, the acquisition unit 200 acquires the image of one frame, among the images of the plurality of frames constituting the moving image captured by the image capturing device 110, as the image to be processed (hereinafter, the processing target image). Next, in S502, the detection unit 203 detects persons included in the processing target image. At this time, the detection unit 203 detects persons included in the processing target image by, for example, pattern matching processing using person matching patterns. Next, in S503, the tracking unit 204 tracks the persons detected by the detection unit 203. When the detection unit 203 detects in the processing target image the same person that it detected in the image of a frame one or more frames earlier, the tracking unit 204 associates those persons across the frames, thereby tracking that person. The tracking unit 204 assigns a unique ID to each person to be tracked. For example, suppose the tracking unit 204 has assigned the ID "a" to a person detected by the detection unit 203 in the image of a frame one or more frames before the processing target image. When the detection unit 203 also detects that person in the processing target image, the tracking unit 204 assigns the same ID "a" to that person. When a person is newly detected in the processing target image, the tracking unit 204 assigns a new unique ID to that person. The storage unit 201 stores the IDs assigned by the tracking unit 204 to the persons currently in the queue in the person information 600 shown in FIG. 6. The person information 600 shown in FIG. 6 stores (holds) information about the persons currently in the queue. Specifically, in the person information 600, the ID 601 of a person assigned by the tracking unit 204, age information 602 indicating the age of that person, and mask information 603 indicating whether that person is wearing a mask are recorded (held) in association with one another. The person information 600 is not limited to the example shown in FIG. 6 and may include, for example, information on the size and position in each frame of each person assigned an ID 601. When a unique ID is assigned to a newly detected person in S503, the storage unit 201 newly stores the assigned ID in the ID 601 of the person information 600.
 When a person tracked by the tracking unit 204 in the present embodiment passes through the queue exit 302 and leaves the queue, the storage unit 201 deletes the information about that person (ID 601, age information 602, mask information 603) from the person information 600.
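The lifecycle of the person information 600 described above (create a blank record when a new ID is assigned, fill in attributes once they can be judged, delete the record when the person exits) can be sketched as follows; the class and field names merely mirror ID 601, age information 602, and mask information 603, and are assumptions for illustration.

```python
class PersonInfo:
    """Hypothetical sketch of person information 600: ID -> {age, mask}.
    A value of None marks an attribute not yet determined, as for the blank
    mask information of ID "c" and the blank age information of ID "d"."""

    def __init__(self):
        self.records = {}

    def add(self, person_id):
        # New unique ID assigned by the tracking step: start with blanks.
        self.records.setdefault(person_id, {"age": None, "mask": None})

    def set_attribute(self, person_id, key, value):
        # Store an attribute once the determination unit has judged it.
        self.records[person_id][key] = value

    def pending(self, key):
        # Persons for whom this attribute has not yet been determined.
        return [pid for pid, rec in self.records.items() if rec[key] is None]

    def remove(self, person_id):
        # Person passed the queue exit: delete the record.
        self.records.pop(person_id, None)
```

The `pending` lookup matches the behavior of S504 and S505, where the determination unit re-examines only persons whose mask or age field is still blank.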
Next, in S504, the determination unit 205 determines whether each person tracked by the tracking unit 204 is wearing a mask. In S504, the determination unit 205 in the present embodiment may determine the presence or absence of a mask only for persons for whom this has not yet been determined. In the present embodiment, in the person information 600 shown in FIG. 6, the mask information 603 of the person with ID 601 "c" is blank, indicating that whether the person with ID 601 "c" is wearing a mask has not yet been determined. This can occur when the determination could not be made properly, for example, when no frontal face image of that person has been acquired. In S504, the determination unit 205 may determine whether the person with person ID "c", for whom no determination has yet been made, is wearing a mask. In this case, the determination unit 205 does not repeat the determination for the persons with IDs 601 "a", "b", and "d", for whom it has already been made. When information on whether the person with person ID "c" is wearing a mask is obtained from the determination by the determination unit 205, the storage unit 201 stores that information in the mask information 603.
Next, in S505, the determination unit 205 determines the age of each person tracked by the tracking unit 204. In S505, the determination unit 205 in the present embodiment may determine the age only for persons whose age has not yet been determined. In the person information 600 shown in FIG. 6, the age information 602 of the person with ID 601 "d" is blank, indicating that the age of the person with ID 601 "d" has not yet been determined. This can occur when the determination could not be made properly, for example, when no frontal face image of that person has been acquired. In S505, the determination unit 205 may determine the age of the person with person ID "d", whose age has not been determined, and may skip the determination for the persons with IDs 601 "a", "b", and "c", whose ages have already been determined. When age information is obtained for the person with person ID "d" from the determination by the determination unit 205, that information is stored in the age information 602.
Next, in S506, the calculation unit 206 calculates the number of people in the line, N. For example, the calculation unit 206 can obtain N as the difference Nin − Nout between the number of people Nin that have passed through the line entrance 301 and the number of people Nout that have passed through the line exit 302.
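A minimal sketch of the S506 computation (the function name is hypothetical, not part of the embodiment):

```python
def queue_size(n_in: int, n_out: int) -> int:
    """Number of people currently in the line (S506): entrants minus leavers."""
    assert n_out <= n_in, "more people cannot have left than entered"
    return n_in - n_out


print(queue_size(12, 5))  # 7 people currently waiting
```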
Next, in S507, the calculation unit 206 specifies the line length, which is the distance along the path of the line from its head to its tail. In the example shown in FIG. 4, the calculation unit 206 in the present embodiment specifies, among the regions Rn in which a stationary person exists, the region Rmax with the largest n. Then, based on information held in advance on the distance along the path of the line for each of the regions R1 to R5, the calculation unit 206 can obtain the total line length by adding the distances in order from R1 to Rmax.
Next, in S508, the calculation unit 206 calculates the average Davr of the distances between persons. The calculation unit 206 in the present embodiment calculates the average Davr using Equation (1), but the average Davr of the distances between persons may be calculated by another method.
Next, in S509, it is determined whether to correct the average Davr. For example, if the user has instructed in advance that the average Davr be corrected (Yes in S509), the process proceeds to S510; if the user has not so instructed (No in S509), the process proceeds to S511. In S510, the calculation unit 206 corrects the average Davr based on at least one of whether the persons are wearing masks and the ages of the persons. Here, for example, the calculation unit 206 obtains the corrected average Davr' by correcting the average Davr using any one of Equations (2) to (4).
In S511, the calculation unit 206 executes a comparison process that compares either the calculated average Davr or the corrected average Davr' with a threshold value. When the average Davr was not corrected in S510 (the process transitioned from S509 to S511), the calculation unit 206 compares the calculated average Davr with the threshold value in S511. On the other hand, when the average Davr was corrected in S510 (the process transitioned from S509 to S510), the calculation unit 206 compares the corrected average Davr' with the threshold value in S511. The threshold value here is preset by the user and is a parameter that can be changed as appropriate.
When the average Davr (or the corrected average Davr') is less than the threshold value (Yes in S512), the process proceeds to S513, and in S513 the output control unit 207 notifies the user of a message indicating a warning as the predetermined information. On the other hand, when the average Davr (or the corrected average Davr') is equal to or greater than the threshold value (No in S512), the process proceeds to S514. If the user has instructed termination in S514 (Yes in S514), the processing of the flow shown in FIG. 5 ends. Otherwise (No in S514), the process returns to S501, and the acquisition unit 200 newly acquires the next image to be processed.
As described above, the information processing apparatus 100 in the present embodiment calculates the average Davr of the distances between persons and corrects it based on at least one of whether the persons are wearing masks and the ages of the persons. The information processing apparatus 100 then compares the corrected average Davr', obtained by correcting the average Davr, with the threshold value and, according to the comparison result, notifies the user of a message indicating a warning as the predetermined information. In this way, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of that determination.
In the above description, the information processing apparatus 100 uses the average Davr of the distances between persons to notify the user of predetermined information according to the result of comparison with the threshold value, but the present invention is not limited to this. For example, instead of the average of the distances between persons, only the distance between two particular persons may be used. Specifically, in the example shown in FIG. 4, the calculation unit 206 calculates the distance D between a first person in the line and a second person located immediately in front of or behind the first person. At this time, the calculation unit 206, for example, identifies the positions of the first and second persons detected from the image and takes the distance on the image from the position of the first person to the position of the second person as the distance D. Alternatively, the calculation unit 206 may convert the distance on the image between the positions of the first and second persons into a distance in real space and use that real-space distance as the distance D. The process of converting a distance on the image into a distance in real space is performed using a known technique.
Further, the determination unit 205 determines, for each of the first and second persons, at least one of whether the person is wearing a mask and the person's age. The calculation unit 206 then corrects the distance D between the first and second persons based on at least one of the mask-wearing status and the ages. At this time, for example, the calculation unit 206 corrects the distance D using Equation (5).
 D' = (1 + M × Rm)(1 − K × Rk) D ... Equation (5)
Here, M indicates the proportion of the first and second persons who are wearing masks: for example, 1 if both are wearing masks, 1/2 if only one is, and 0 if neither is. K indicates the proportion of the first and second persons who are elderly: for example, 1 if both are elderly, 1/2 if only one is, and 0 if neither is. Rm and Rk are predetermined coefficients. Here, for example, when Rm is zero and Rk is a predetermined value, the higher K is, the smaller the corrected distance between the first and second persons becomes, so an alert is more likely to be issued. Conversely, when Rm is a predetermined value and Rk is zero, the higher M is, the larger the corrected distance becomes, so an alert is less likely to be issued. Then, when the distance D', obtained by correcting the distance D between the first and second persons, is less than the threshold value, the output control unit 207 notifies the user of a message indicating a warning as the predetermined information; when the distance D' is equal to or greater than the threshold value, it does not issue the notification. In this way, the information processing apparatus 100 in the present embodiment corrects the distance between a given person (the first person) and another person (the second person) based on the result of determining at least one of mask-wearing status and age, and notifies the user of predetermined information according to the result of comparing the corrected distance with the threshold value. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of that determination.
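As a non-authoritative sketch of the correction of Equation (5) — the function name and the coefficient values for Rm and Rk below are hypothetical, not values given in the embodiment:

```python
def corrected_distance(d, mask_flags, elderly_flags, rm=0.5, rk=0.5):
    """Correct the distance D between two persons per Equation (5):
    D' = (1 + M*Rm) * (1 - K*Rk) * D,
    where M is the fraction of the two persons wearing masks and K the
    fraction who are elderly. rm and rk are example coefficients."""
    m = sum(mask_flags) / len(mask_flags)
    k = sum(elderly_flags) / len(elderly_flags)
    return (1 + m * rm) * (1 - k * rk) * d


# Both masked, neither elderly: the effective distance grows,
# so the corrected distance is less likely to fall below the threshold.
d_prime = corrected_distance(1.6, mask_flags=[True, True],
                             elderly_flags=[False, False])
threshold = 2.0  # metres; a user-set parameter (illustrative value)
alert = d_prime < threshold
```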
(Embodiment 2)
In the information processing apparatus 100 of the first embodiment, the distance between persons is corrected according to the mask-wearing rate and the ages of the persons, and the corrected distance between persons is compared with a threshold value. In the second embodiment, the threshold value against which the distance between persons is compared is corrected instead. That is, the information processing apparatus 100 in the second embodiment corrects the threshold value according to the mask-wearing rate and the ages of the persons, compares the corrected threshold value with the distance between persons, and, according to the comparison result, notifies the user of a message indicating a warning as the predetermined information. The description below focuses mainly on the differences from the first embodiment; components and processes identical or equivalent to those of the first embodiment are given the same reference numerals, and duplicate descriptions are omitted.
In the example shown in FIG. 4, the calculation unit 206 in the present embodiment calculates the average Davr of the distances between adjacent persons in the line using Equation (1). Since the process of calculating the average Davr is the same as in the first embodiment, its description is omitted. The calculation unit 206 in the present embodiment then obtains the threshold value r' by correcting the threshold value r using Equation (6).
 Threshold value r' = r − Cm × Rm + Ck × Rk ... Equation (6)
Here, Cm indicates the mask-wearing rate and Ck the elderly rate of the line, and, as in Equation (5), Rm and Rk are predetermined coefficients. For example, when Rm is zero and Rk is a predetermined value, the higher the elderly rate Ck, the larger the corrected threshold value becomes. That is, the higher the elderly rate Ck, the larger the threshold value, so an alert is more likely to be issued to the user. Conversely, when Rm is a predetermined value and Rk is zero, the higher the mask-wearing rate Cm, the smaller the corrected threshold value becomes. In other words, the higher the mask-wearing rate, the smaller the threshold value, so an alert is less likely to be issued to the user. Rm and Rk may also each be given a predetermined value; that is, the threshold value r' may depend on both the mask-wearing rate and the elderly rate.
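A sketch of the threshold correction of Equation (6) — the function name and the coefficient values are hypothetical assumptions, not values specified in the embodiment:

```python
def corrected_threshold(r, cm, ck, rm=0.4, rk=0.6):
    """Correct the comparison threshold per Equation (6):
    r' = r - Cm*Rm + Ck*Rk,
    where Cm is the mask-wearing rate and Ck the elderly rate of the line.
    rm and rk are illustrative coefficient values."""
    return r - cm * rm + ck * rk


r = 2.0  # metres; a user-set parameter (illustrative value)
# A high mask-wearing rate lowers the threshold (alerts less likely);
# a high elderly rate raises it (alerts more likely).
r_dash = corrected_threshold(r, cm=0.8, ck=0.25)
```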
The calculation unit 206 compares the average Davr of the distances between persons with the corrected threshold value r'. At this time, if the average Davr is less than the corrected threshold value r', the output control unit 207 notifies the user of a message indicating a warning as an alert. On the other hand, if the average Davr is equal to or greater than the corrected threshold value r', the output control unit 207 does not notify the user of the warning message.
In the above example, the average Davr of the distances between persons is compared with the corrected threshold value r', but the present invention is not limited to this. For example, the calculation unit 206 in the present embodiment may compare the distance D between a given person (the first person) and another person (the second person) with the corrected threshold value r'. At this time, if the distance D between the first and second persons is less than the corrected threshold value r', the output control unit 207 notifies the user of a message indicating a warning as an alert. On the other hand, if the distance D is equal to or greater than the corrected threshold value r', the output control unit 207 does not notify the user of the warning message.
As described above, the information processing apparatus 100 in the present embodiment compares a threshold value corrected based on at least one of the ages of the persons and whether they are wearing masks with the distance between persons (the average Davr or the distance D), and notifies the user of an alert according to the result of that comparison. In this way, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of that determination.
(Embodiment 3)
The information processing apparatus 100 in the third embodiment uses information on waiting time, in addition to information on the line length and the number of people waiting, to calculate a caution level, an index indicating whether the distance between persons is a distance requiring caution. The lower the caution level, the less the distance between persons requires caution; the higher the caution level, the more it does. For example, in a line, the shorter the spacing between adjacent persons and the longer the waiting time, the higher the possibility of infection may become, so the caution level is higher. The description below focuses mainly on the differences from the above embodiments; components and processes identical or equivalent to those of the above embodiments are given the same reference numerals, and duplicate descriptions are omitted.
The calculation unit 206 in the present embodiment calculates the caution level R using Equation (7).
 R = C1 × Tavr / Davr ... Equation (7)
In Equation (7), the average Davr of the distances between persons is obtained by Equation (1). Tavr denotes the estimated waiting time, the result of estimating the waiting time of the line; in the example shown in FIG. 4, it corresponds to an estimate of the time from when a person enters the line at the line entrance 301 until the person leaves it at the line exit 302. The constant C1 is a normalization coefficient for keeping the caution level R within a predetermined range.
Here, a method of calculating the estimated waiting time Tavr will be described with reference to FIG. 4. In the example shown in FIG. 4, the calculation unit 206 in the present embodiment acquires the number of persons who passed through the line exit 302 and left the line within a given period of time. The calculation unit 206 then divides the acquired number of persons by that period to obtain the number of persons leaving the line per unit time; this is referred to as the exit frequency. The calculation unit 206 also acquires the number of people N in the line formed along the path from the line entrance 301 to the line exit 302. Since the method of calculating N is the same as in the first embodiment, its description is omitted. The calculation unit 206 calculates the estimated waiting time Tavr by dividing the number of people N by the exit frequency.
The estimated waiting time Tavr may also be calculated by other known techniques. For example, the tracking unit 204 may track a person from the line entrance 301 to the line exit 302, and the calculation unit 206 may acquire the travel time the person required to move from the line entrance 301 to the line exit 302. The calculation unit 206 may then calculate the average of the travel times acquired for a plurality of persons and use that average as the estimated waiting time Tavr.
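The estimated waiting time and the caution level R of Equation (7) could be sketched as follows; the function names, the normalization constant C1, and the numeric values are illustrative assumptions:

```python
def estimated_wait(queue_len, leavers, period_s):
    """Estimate the waiting time Tavr: the queue length divided by the exit
    frequency (people leaving through the line exit 302 per unit time)."""
    exit_freq = leavers / period_s
    return queue_len / exit_freq


def caution_level(t_avr, d_avr, c1=1.0):
    """Caution level R per Equation (7): R = C1 * Tavr / Davr.
    c1 is a normalization coefficient (value hypothetical)."""
    return c1 * t_avr / d_avr


# 10 people in line; 12 people left during a 600-second observation window.
t = estimated_wait(queue_len=10, leavers=12, period_s=600)  # 500 seconds
r = caution_level(t, d_avr=1.5)
```

Shorter spacing (smaller Davr) and longer waits (larger Tavr) both raise R, matching the behaviour described above.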
The calculation unit 206 in the present embodiment then executes a comparison process that compares the caution level R calculated using Equation (7) with a threshold value. When the caution level R is less than the threshold value, the output control unit 207 regards the distance between persons as not requiring caution and does not notify the user of a warning message as an alert. On the other hand, when the caution level R is equal to or greater than the threshold value, the output control unit 207 regards the distance between persons as requiring caution and notifies the user of a warning message as an alert. The calculation unit 206 in the present embodiment may also calculate the latest caution level R at predetermined intervals using Equation (7) and compare it with the threshold value. In that case, when the caution level R has been equal to or greater than the threshold value a predetermined number of times in succession, the output control unit 207 may notify the user of a warning message as an alert. For example, the output control unit 207 causes the display 130 to display a warning message when the caution level R has been equal to or greater than the threshold value three times in succession, and does not display the warning message otherwise.
The calculation unit 206 in the present embodiment may also correct the caution level R according to mask-wearing status. Specifically, when persons are wearing masks, the possibility of virus transmission is considered to be reduced, so the calculation unit 206 corrects the calculated caution level R to a smaller value. For example, the calculation unit 206 calculates the corrected caution level R' using Equation (8).
 R' = R / (1 + Cm) ... Equation (8)
As in the first embodiment, Cm indicates the mask-wearing rate of the persons in the line. The example of Equation (8) assumes that when all the persons in the line wear masks, the caution level R of the line is halved.
The calculation unit 206 in the present embodiment may also correct the caution level R according to the ages of the persons. Here, when the persons in the line are elderly, it is assumed that the possibility of severe illness upon infection with the virus is relatively high. The calculation unit 206 in the present embodiment therefore corrects the calculated caution level R to a larger value as the number of elderly persons increases, so that an alert is more likely to be issued to the user. Specifically, the calculation unit 206 calculates the corrected caution level R' using, for example, Equation (9).
 R' = R × (1 + Ck / 2) ... Equation (9)
As in the first embodiment, Ck indicates the elderly rate, that is, the proportion of elderly persons among the persons in the line.
The calculation unit 206 in the present embodiment may also correct the caution level R based on both the mask-wearing status and the ages of the persons. At this time, the calculation unit 206 calculates the corrected caution level using, for example, Equation (10).
 R' = R × (1 − Cm / 2 + Ck / 2) ... Equation (10)
In the example of Equation (10), the higher the mask-wearing rate of the persons in the line, the lower the caution level, and the larger the number of elderly persons in the line, the higher the caution level.
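Equations (8) to (10) can be collected in one sketch; the function name, the `mode` parameter, and the example values are assumptions for illustration, not part of the embodiment:

```python
def corrected_caution(r, cm=0.0, ck=0.0, mode="both"):
    """Correct the caution level R with the mask-wearing rate Cm and the
    elderly rate Ck, following Equations (8), (9), and (10)."""
    if mode == "mask":                     # Equation (8)
        return r / (1 + cm)
    if mode == "age":                      # Equation (9)
        return r * (1 + ck / 2)
    return r * (1 - cm / 2 + ck / 2)       # Equation (10)


r = 100.0
print(corrected_caution(r, cm=1.0, mode="mask"))          # everyone masked -> 50.0
print(corrected_caution(r, ck=1.0, mode="age"))           # everyone elderly -> 150.0
print(corrected_caution(r, cm=0.5, ck=0.5, mode="both"))  # effects cancel -> 100.0
```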
As described above, the calculation unit 206 calculates the corrected caution level R' by correcting the calculated caution level R based on at least one of the mask-wearing status and the ages of the persons, and executes a comparison process that compares the corrected caution level R' with the threshold value. When the corrected caution level R' is less than the threshold value, the output control unit 207 regards the distance between persons as not requiring caution and does not notify the user of a warning message as an alert. On the other hand, when the corrected caution level R' is equal to or greater than the threshold value, the output control unit 207 regards the distance between persons as requiring caution and notifies the user of a warning message as an alert. In this way, the information processing apparatus 100 in the present embodiment corrects the caution level R based on at least one of the mask-wearing status and the ages of the persons, and notifies the user of an alert according to the result of comparing the corrected caution level R' with the threshold value. By doing so, it is possible to adaptively determine whether the distance between persons is a distance requiring caution, and to notify the user of an alert based on the result of that determination.
(Embodiment 4)
The information processing apparatus 100 in the fourth embodiment calculates, for each person in the line, the distances to the persons around that person, and calculates the density of the line based on the calculated distances. The information processing apparatus 100 then notifies the user of an alert according to the result of comparing the calculated density with a threshold value. The description below focuses mainly on the differences from the above embodiments; components and processes identical or equivalent to those of the above embodiments are given the same reference numerals, and duplicate descriptions are omitted.
 FIG. 8 is a diagram for explaining the method of calculating the distance between persons in the present embodiment. Assume that the persons are lined up in order starting from the person P1 (person 711) at the head of the queue. Letting the position of the i-th person be Pi(x, y), the distance between the i-th person and the j-th person is Dij = |Pi - Pj|. The position of a person may be a position on the screen or a position converted into real space. Next, for person i, let Ni be the number of surrounding persons whose distance to person i is less than a predetermined value Dth. This can be written as
 Ni = COUNTIF(Dij < Dth), (j ≠ i, j = 1, 2, ..., n)   Equation (11)
where n denotes the number of persons in the queue. The predetermined value Dth is a value specified in advance by the user, for example 1.8 m. Defining the density Den of the queue as the total count of persons whose distance to a surrounding person is less than Dth,
 Den = ΣNi, (i = 1, 2, ..., n)   Equation (12)
The calculation unit 206 then compares the calculated density Den with a threshold value, and when the density Den is equal to or greater than the threshold value, determines that the distances between persons in the queue are distances requiring caution; the output control unit 207 then notifies the user of an alert. Conversely, when the calculated density Den is less than the threshold value, the calculation unit 206 determines that the distances between persons in the queue are not distances requiring caution, and the output control unit 207 does not notify the user of an alert.
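 As an illustration, the density computation of Equations (11) and (12) can be sketched as follows. This is a minimal hypothetical sketch, not part of the patent disclosure; the function and parameter names are our own, and person positions are assumed to be given as 2-D coordinates in metres:

```python
import math

def queue_density(positions, d_th=1.8):
    """Density Den of a queue: the total count, over all persons i, of
    other persons j whose distance Dij = |Pi - Pj| is below the
    threshold Dth (Equations (11) and (12))."""
    n = len(positions)
    den = 0
    for i in range(n):
        # Ni = COUNTIF(Dij < Dth) over all j != i  -- Equation (11)
        ni = sum(
            1
            for j in range(n)
            if j != i and math.dist(positions[i], positions[j]) < d_th
        )
        den += ni  # Den = sum of Ni  -- Equation (12)
    return den
```

 For example, three people standing 1 m apart in a straight line give Den = 4 with the default Dth = 1.8 m: each of the two adjacent pairs contributes to both members' counts, while the 2 m gap between the two end persons does not.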
 The calculation unit 206 in the present embodiment may also calculate the distance Dij between the i-th person and the j-th person and further correct the calculated Dij based on at least one of mask wearing and age. Specifically, the calculation unit 206 may calculate a corrected distance Dij' by correcting the distance Dij using Equation (13):
 Dij' = (1 + Mij × Rm)(1 - Kij × Rk) Dij   Equation (13)
Here, Mij denotes the proportion of the two persons i and j who are wearing masks: for example, 1 if both are wearing masks, 1/2 if only one is, and 0 if neither is. Similarly, Kij denotes the proportion of the two persons i and j who are elderly: for example, 1 if both are elderly, 1/2 if only one is, and 0 if neither is. Rm and Rk are predetermined coefficients. For example, when Rm is zero and Rk is a predetermined value, the distance Dij between the i-th person and the j-th person is corrected to be smaller as the elderly ratio Kij increases; that is, the higher the elderly ratio Kij, the more readily an alert is issued to the user. Conversely, when Rm is a predetermined value and Rk is zero, the distance Dij is corrected to be larger as the mask-wearing ratio Mij increases; in other words, the higher the mask-wearing ratio Mij, the less readily an alert is issued to the user.
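 The correction of Equation (13) can be sketched as below. This is a hypothetical illustration (the names and the default coefficient values are our own assumptions; the patent only states that Rm and Rk are predetermined coefficients):

```python
def corrected_distance(dij, mask_ratio, elderly_ratio, rm=0.5, rk=0.5):
    """Corrected distance Dij' = (1 + Mij*Rm) * (1 - Kij*Rk) * Dij
    (Equation (13)). mask_ratio (Mij) and elderly_ratio (Kij) are each
    0, 1/2, or 1, depending on how many of the two persons i and j
    wear a mask or are elderly, respectively."""
    return (1.0 + mask_ratio * rm) * (1.0 - elderly_ratio * rk) * dij
```

 With Rm = Rk = 0.5, two masked, non-elderly persons 2 m apart are treated as 3 m apart (alert less likely), while two elderly, unmasked persons 2 m apart are treated as only 1 m apart (alert more likely).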
 Next, for the i-th person, let Ni' be the number of persons whose corrected distance Dij' is less than the predetermined value Dth. This can be written as
 Ni' = COUNTIF(Dij' < Dth), (j ≠ i, j = 1, 2, ..., n)   Equation (14)
Defining the density Den' of the queue as the total count of persons for whom the corrected distance Dij' is less than Dth,
 Den' = ΣNi', (i = 1, 2, ..., n)   Equation (15)
The calculation unit 206 then compares the density Den' calculated from the corrected distances Dij' with the threshold value, and when the density Den' is equal to or greater than the threshold value, determines that the distances between persons in the queue are distances requiring caution; the output control unit 207 then notifies the user of an alert. Conversely, when the calculated density Den' is less than the threshold value, the calculation unit 206 determines that the distances between persons in the queue are not distances requiring caution, and the output control unit 207 does not notify the user of an alert. This makes it possible to adaptively determine whether the distances between persons require caution, and to notify the user of an alert based on the result of that determination.
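 Putting Equations (13) through (15) together, the corrected-density alert decision can be sketched as follows. Again a hypothetical sketch with names and default coefficients of our own choosing; per-person attributes are assumed to be (wears mask, is elderly) flags:

```python
import math

def corrected_density(positions, attrs, d_th=1.8, rm=0.5, rk=0.5):
    """Density Den' computed from corrected distances Dij'
    (Equations (13)-(15)). attrs[i] = (mask: bool, elderly: bool)."""
    n = len(positions)
    den = 0
    for i in range(n):
        ni = 0  # Ni' = COUNTIF(Dij' < Dth)  -- Equation (14)
        for j in range(n):
            if j == i:
                continue
            dij = math.dist(positions[i], positions[j])
            m_ij = (attrs[i][0] + attrs[j][0]) / 2.0  # mask ratio Mij
            k_ij = (attrs[i][1] + attrs[j][1]) / 2.0  # elderly ratio Kij
            dij_corr = (1.0 + m_ij * rm) * (1.0 - k_ij * rk) * dij
            if dij_corr < d_th:
                ni += 1
        den += ni  # Den' = sum of Ni'  -- Equation (15)
    return den

def should_alert(positions, attrs, threshold, **kw):
    # Alert when Den' >= threshold: the queue's distances require caution.
    return corrected_density(positions, attrs, **kw) >= threshold
```

 For instance, two unmasked persons 1.3 m apart yield Den' = 2, but if both wear masks the corrected distance grows to 1.95 m, which exceeds the default Dth of 1.8 m, so Den' drops to 0 and no alert is issued.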
 (Modification)
 As a modification of Embodiment 4, another method of calculating the corrected distance Dij' will now be described. The calculation unit 206 according to this modification corrects the distance Dij between the i-th person and the j-th person in the queue based on the face orientation of each of the two persons. The determination unit 205 in this modification determines the orientation of the face of each person detected from the image by the detection unit 203. A known technique may be used to determine face orientation. For example, the information processing apparatus 100 creates in advance a classifier trained by machine learning using face images with different face orientations as training data. The determination unit 205 then inputs the face image of a person detected from the image by the detection unit 203 into the classifier, and obtains information on the orientation of that person's face as its output. Taking as a baseline the common facing direction typically seen among people standing in a queue, the distance Dij is corrected to be smaller when the two persons face each other and, conversely, larger when they stand back to back.
 The process of correcting the distance Dij will now be described with reference to FIG. 8. In the example shown in FIG. 8, the counterclockwise direction is taken as positive with respect to the straight line drawn from the center of the face region of the i-th person 801 toward the center of the face region of the j-th person 802. In the unit circle whose x-axis is this straight line, let Θi be the face orientation of person 801, in radians relative to the line, with the center of the face region of the i-th person 801 as the origin, and let Θj be the face orientation of person 802, in radians relative to the line, with the center of the face region of the j-th person 802 as the origin. The calculation unit 206 in this modification then calculates the corrected distance D'ij by correcting the distance Dij using Equation (16):
 D'ij = {sin((Θi - Θj)/2.0)/2.0 + 1.0} × Dij   Equation (16)
 Using the corrected distance D'ij, the calculation unit 206 calculates the density Den' with Equations (14) and (15) described in Embodiment 4. The output control unit 207 in this modification then notifies the user of an alert when the density Den' is equal to or greater than the threshold value, on the grounds that the distances between persons in the queue require caution. Conversely, when the calculated density Den' is less than the threshold value, the output control unit 207 does not notify the user of an alert, since the distances between persons in the queue do not require caution. This makes it possible to adaptively determine whether the distances between persons require caution, and to notify the user of an alert based on the result of that determination.
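 The orientation correction of Equation (16) can be sketched as below (a hypothetical illustration with names of our own choosing). Note how the formula behaves at the extremes: facing each other halves the effective distance, while standing back to back multiplies it by 1.5:

```python
import math

def orientation_corrected_distance(dij, theta_i, theta_j):
    """Corrected distance D'ij = {sin((Θi - Θj)/2)/2 + 1} * Dij
    (Equation (16)). theta_i and theta_j are the face orientations of
    persons i and j in radians, measured counterclockwise from the
    straight line drawn from person i toward person j."""
    return (math.sin((theta_i - theta_j) / 2.0) / 2.0 + 1.0) * dij

# Facing each other (theta_i = 0, theta_j = pi):
#   factor = sin(-pi/2)/2 + 1 = 0.5, so the distance shrinks and an
#   alert becomes more likely.
# Back to back (theta_i = pi, theta_j = 0):
#   factor = sin(pi/2)/2 + 1 = 1.5, so the distance grows.
```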
 (Other Embodiments)
 Next, with reference to FIG. 9, the hardware configuration of the information processing apparatus 100 for realizing the functions of each embodiment will be described. Although the following description covers the hardware configuration of the information processing apparatus 100, the recording device 120 and the image pickup device 110 are realized by a similar hardware configuration.
 The information processing apparatus 100 in the present embodiment has a CPU 900, a RAM 910, a ROM 920, an HDD 930, and an I/F 940.
 The CPU 900 is a central processing unit that performs overall control of the information processing apparatus 100. The RAM 910 temporarily stores computer programs executed by the CPU 900 and provides a work area used by the CPU 900 when executing processing. The RAM 910 also functions, for example, as a frame memory or a buffer memory.
 The ROM 920 stores programs and the like with which the CPU 900 controls the information processing apparatus 100. The HDD 930 is a storage device that records image data and the like.
 The I/F 940 communicates with external devices via the network 140 in accordance with TCP/IP, HTTP, or the like.
 Although the description of each of the above embodiments gives examples in which the CPU 900 executes the processing, at least part of the processing of the CPU 900 may be performed by dedicated hardware. For example, the processing for displaying a GUI (graphical user interface) and image data on the display 130 may be executed by a GPU (graphics processing unit). The processing of reading program code from the ROM 920 and loading it into the RAM 910 may be executed by a DMA (direct memory access) controller functioning as a transfer device.
 The present invention can also be realized by processing in which one or more processors read out and execute a program that realizes one or more functions of the above embodiments. The program may be supplied to a system or apparatus having a processor via a network or a storage medium. The present invention can also be realized by a circuit (for example, an ASIC) that realizes one or more functions of the above embodiments. Each unit of the information processing apparatus 100 may be realized by the hardware shown in FIG. 9, or by software.
 One or more functions of the information processing apparatus 100 according to each of the above embodiments may be provided by another device. For example, the image pickup apparatus 110 may have one or more functions of the information processing apparatus 100 according to each embodiment. The above embodiments may also be combined with one another in any manner.
 Although the present invention has been described together with embodiments, the above embodiments merely show concrete examples of implementing the present invention, and the technical scope of the present invention is not to be interpreted restrictively by them. That is, the present invention can be implemented in various forms without departing from its technical idea or principal features. For example, combinations of the respective embodiments are also included in the disclosure of this specification.
 The present invention is not limited to the above embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Accordingly, the following claims are attached in order to make public the scope of the present invention.
 This application claims priority based on Japanese Patent Application No. 2020-147156 filed on September 1, 2020, the entire contents of which are incorporated herein by reference.

Claims (15)

  1.  An information processing apparatus comprising:
     a detection means for detecting a person included in a captured image;
     a determination means for determining at least one of whether the person detected by the detection means is wearing a mask and the age of the person;
     a correction means for correcting a distance between persons detected by the detection means in the image according to a determination result by the determination means; and
     an output control means for outputting predetermined information according to the distance between the persons corrected by the correction means.
  2.  The information processing apparatus according to claim 1, wherein the output control means outputs the predetermined information according to a result of comparing the distance between the persons corrected by the correction means with a threshold value.
  3.  The information processing apparatus according to claim 2, wherein the output control means outputs the predetermined information when the distance between the persons corrected by the correction means is less than the threshold value.
  4.  The information processing apparatus according to claim 1, wherein the correction means corrects the distance between the persons so that the distance becomes larger when a detected person is wearing a mask.
  5.  The information processing apparatus according to claim 1, wherein the correction means corrects the distance between the persons so that the distance becomes smaller when a detected person is of a predetermined age or older.
  6.  The information processing apparatus according to claim 1, wherein the detection means detects the persons in a queue included in the image obtained by an imaging means that captures the queue, and
     the correction means corrects the distance between the persons in the queue based on the determination result by the determination means.
  7.  The information processing apparatus according to claim 6, wherein the distance between the persons is an average of the distances between persons standing one behind another in the queue, and the correction means corrects the distance between the persons based on at least one of the number of persons wearing a mask relative to the number of persons in the queue and the number of persons under a predetermined age relative to the number of persons in the queue.
  8.  The information processing apparatus according to claim 7, wherein the correction means corrects the distance between the persons so that the distance becomes larger as the number of persons wearing a mask increases, or as the number of persons under the predetermined age increases.
  9.  An information processing apparatus comprising:
     a detection means for detecting a person included in an image;
     a determination means for determining whether the person detected by the detection means is wearing a mask;
     a correction means for correcting, according to a determination result by the determination means, a threshold value used to determine whether to output predetermined information; and
     an output control means for outputting the predetermined information based on a distance between persons detected by the detection means and the threshold value corrected by the correction means.
  10.  The information processing apparatus according to claim 9, wherein the output control means outputs the predetermined information when the distance between the persons detected by the detection means is less than the threshold value corrected by the correction means.
  11.  The information processing apparatus according to claim 9, wherein the correction means corrects the threshold value so that the threshold value becomes smaller when the person detected by the detection means is wearing a mask.
  12.  The information processing apparatus according to claim 11, wherein the distance between the persons is a distance between a first person and a second person detected by the detection means, and
     the correction means corrects the threshold value so that the threshold value when both the first person and the second person are determined to be wearing masks is smaller than the threshold value when only one of the first person and the second person is determined to be wearing a mask.
  13.  An information processing method comprising:
     a detection step of detecting a person included in a captured image;
     a determination step of determining at least one of whether the person detected in the detection step is wearing a mask and the age of the person;
     a correction step of correcting a distance between persons detected in the detection step in the image according to a determination result in the determination step; and
     an output control step of outputting predetermined information according to the distance between the persons corrected in the correction step.
  14.  An information processing method comprising:
     a detection step of detecting a person included in an image;
     a determination step of determining whether the person detected in the detection step is wearing a mask;
     a correction step of correcting, according to a determination result in the determination step, a threshold value used to determine whether to output predetermined information; and
     an output control step of outputting the predetermined information based on a distance between persons detected in the detection step and the threshold value corrected in the correction step.
  15.  A program for causing a computer to function as each means of the information processing apparatus according to any one of claims 1 to 12.
PCT/JP2021/031723 2020-09-01 2021-08-30 Information processing device, information processing method, and program WO2022050217A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020147156A JP7476038B2 (en) 2020-09-01 2020-09-01 Information processing device, information processing method, and program
JP2020-147156 2020-09-01

Publications (1)

Publication Number Publication Date
WO2022050217A1 true WO2022050217A1 (en) 2022-03-10

Family

ID=80491752

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/031723 WO2022050217A1 (en) 2020-09-01 2021-08-30 Information processing device, information processing method, and program

Country Status (2)

Country Link
JP (1) JP7476038B2 (en)
WO (1) WO2022050217A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015087841A (en) * 2013-10-29 2015-05-07 パナソニック株式会社 Congestion status analyzer, congestion status analyzing system, and congestion status analyzing method
JP2019200718A (en) * 2018-05-18 2019-11-21 キヤノン株式会社 Monitoring device, monitoring method, and program
JP2020086994A (en) * 2018-11-27 2020-06-04 キヤノン株式会社 Information processor, information processing method and program


Also Published As

Publication number Publication date
JP2022041755A (en) 2022-03-11
JP7476038B2 (en) 2024-04-30


Legal Events

Code Title Description
121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21864275; Country of ref document: EP; Kind code of ref document: A1)
NENP — Non-entry into the national phase (Ref country code: DE)
122 — Ep: pct application non-entry in european phase (Ref document number: 21864275; Country of ref document: EP; Kind code of ref document: A1)