WO2017061239A1 - Surveillance system - Google Patents
- Publication number
- WO2017061239A1 (PCT/JP2016/076955)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- terminal device
- distribution
- unit
- frame rate
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Definitions
- the present invention relates to a monitoring system.
- Surveillance systems are installed, for the purpose of crime and accident prevention, in facilities visited by unspecified numbers of people, such as hotels, buildings, convenience stores, financial institutions, dams, and roads.
- A person to be monitored is photographed by an imaging device such as a camera, and the captured image is transmitted to a monitoring center such as a management office or security room, where a resident supervisor monitors it and, according to purpose and necessity, issues warnings or records the video.
- Monitoring systems are increasingly being installed not only in the facilities described above, but also in shopping streets, downtown areas, and residential roads.
- Patent Document 1 discloses an invention that can seamlessly use, with a low burden, various communication forms such as voice communication, video communication, and intermediate forms between them.
- Patent Document 2 discloses an invention in which a best shot is reliably recorded without missing a facial expression of a person.
- An object of the present invention is to reduce the network flow rate.
- To achieve the above object, the monitoring system of the present invention is a monitoring system comprising an imaging device, a user imaging device, and a terminal device. The imaging device has means for changing the image quality and frame rate of the distribution image, and the user imaging device photographs the supervisor at the terminal device. The terminal device has means for acquiring a gaze frequency for the monitoring image from the captured image of the user imaging device, means for determining the image quality or frame rate of the distribution image based on the gaze frequency, and means for instructing the imaging device according to the determination result.
- In another aspect, the terminal device displays the distribution image from the imaging device and has means for personal identification of the supervisor from the captured image of the user imaging device and means for determining the image quality and frame rate of the distribution image based on the personal identification result.
- According to the present invention, the network flow rate can be reduced.
- FIG. 1 is a diagram showing a configuration of a monitoring system according to an embodiment of the present invention.
- the monitoring system includes a network 100, monitoring imaging devices 101 to 105, a terminal device 106, and a user imaging device 107.
- The network 100 connects the monitoring imaging devices 101 to 105, the terminal device 106, and the user imaging device 107 to each other for data communication, and comprises, for example, a dedicated line such as USB (Universal Serial Bus), an intranet, the Internet, or a wireless LAN (Local Area Network).
- The surveillance imaging devices 101 to 105 are devices such as monitoring cameras, each equipped with a zoom lens capable of perspective and focus control, an imaging element such as a CCD or CMOS sensor, an A/D circuit that converts the analog signal to digital, a temporary storage memory such as RAM, a data transmission bus, a timing circuit, an external input/output interface, a power supply circuit, a pan/tilt head, and illumination such as visible-light or near-infrared LEDs (Light Emitting Diodes).
- The monitoring imaging devices 101 to 105 convert the light passing through the lens into an electrical signal with the imaging element, digitize it with the A/D circuit, and store it as image data in the temporary storage memory.
- the stored image data is output from the external input / output interface to the network in response to an external video request input to the external input / output interface or an instruction from the timing circuit.
- The terminal device 106 is a device such as a computer, comprising an arithmetic circuit such as a CPU (Central Processing Unit), a temporary storage memory such as RAM (Random Access Memory), a recording medium such as an HDD (Hard Disk Drive), a data transmission bus, an external input/output interface, a power supply circuit, a screen such as a liquid crystal display, and user input devices such as a keyboard and mouse.
- the terminal device 106 stores the image data from the monitoring imaging devices 101 to 105 input from the network 100 to the external input / output interface in the temporary storage memory.
- the stored image data is converted into a form suitable for display using an arithmetic circuit and displayed on the screen.
- The recording medium stores a set of software implementing the method of the present invention, an OS (operating system), and the like.
- a user operation on the terminal device 106 is performed on a user input device.
- The user imaging device 107 is a device such as a USB camera, equipped with a zoom lens capable of perspective and focus control, an imaging element such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor, an A/D circuit that converts the analog signal to digital, a temporary storage memory such as RAM (Random Access Memory), a data transmission bus, a timing circuit, an external input/output interface, and a power supply circuit.
- The user imaging device 107 converts the light that has passed through the lens into an electrical signal with the imaging element, digitizes it with the A/D circuit, and stores it as image data in the temporary storage memory.
- the stored image data is output from the external input / output interface to the network in response to an external video request input to the external input / output interface or an instruction from the timing circuit. Similarly, it is possible to change the image quality of the image to be distributed, the distribution frame rate, and the like in accordance with an external control command input to the external input / output interface.
- The user imaging device 107 is arranged at a position where the user's (supervisor's) face can be photographed from the front, for example near the screen of the terminal device 106.
- FIG. 2 is a diagram showing a functional configuration of the monitoring system according to one embodiment of the present invention.
- the same reference numerals as those in FIG. 1 denote the same devices.
- The terminal device 106 comprises a distribution instruction unit 200, a monitoring image receiving unit 201, a screen display unit 202, a user image acquisition unit 203, a face detection unit 204, a gaze determination unit 205, a determination result holding unit 206, and a control unit 207.
- the distribution instruction unit 200 is a functional unit that instructs distribution of monitoring images to the monitoring imaging devices 101 to 105.
- the distribution instruction includes an instruction on a distribution frame rate and image quality.
- the distribution instruction unit 200 holds distribution instruction data. The mode of the distribution instruction data will be described later.
- the monitoring image receiving unit 201 is a functional unit that acquires a monitoring image from the monitoring imaging device. In this embodiment, image data input from the monitoring imaging devices 101 to 105 is received.
- the screen display unit 202 is a functional unit that displays a monitor image on the screen.
- The image data received by the monitoring image receiving unit 201 is converted into a form suitable for display and displayed.
- The user image acquisition unit 203 is a functional unit that acquires a user (supervisor) image from the user imaging device.
- image data is acquired from the user imaging device 107.
- The face detection unit 204 is a functional unit that detects the supervisor's face in the user image. In the present embodiment, face detection using image recognition technology is performed on the image data acquired by the user image acquisition unit 203 to determine whether a supervisor is present; if one is present, the coordinates of the face region are output.
- The gaze determination unit 205 is a functional unit that determines whether the supervisor is gazing at a monitoring image and, if so, which image is the gaze target. In the present embodiment, when the face detection unit 204 determines that a supervisor is present, the gaze direction is detected; when the gaze point falls on a monitoring image displayed by the screen display unit 202, it is determined that the supervisor is gazing, and the monitoring image that is the gaze target is also identified.
- The determination result holding unit 206 is a functional unit that holds the results determined by the gaze determination unit 205 for a predetermined time. In this embodiment, each result is stored in the temporary storage memory together with its determination time. The holding time is set in advance, and results older than the holding time are discarded. Any manner of holding may be used.
- the control unit 207 is a functional unit that controls each functional unit of the terminal device 106.
- FIG. 3 is a diagram showing an aspect of distribution instruction data in the distribution instruction unit of the monitoring system according to one embodiment of the present invention.
- the distribution instruction data 300 stores one type of distribution instruction as one record. In this embodiment, seven records 301 to 307, that is, seven types of distribution instructions can be stored. This data is set in advance.
- the record includes a frequency cell 310, a distribution frame rate cell 311, and an image quality cell 312.
- the frequency cell 310 is an area for storing a gaze frequency range.
- the gaze frequency is the number of gazes in a certain time, for example, 1 minute. In this example, specific numerical values are not shown, but it is assumed that the range stored in the record 301 is the most frequent and the value stored in the record 307 is the least frequent.
- the distribution frame rate cell 311 is an area for storing a distribution frame rate.
- the distribution frame rate is the number of images output from the imaging device in a certain time, for example, 1 second.
- specific numerical values are not shown, but it is assumed that the value stored in the record 301 is the highest frame rate and the value stored in the record 307 is the lowest frame rate.
- the image quality cell 312 is an area for storing image quality.
- the image quality is constituted by, for example, image resolution, compression rate, or a combination thereof.
- specific numerical values are not shown, but it is assumed that the value stored in the record 301 has the highest image quality and the value stored in the record 307 has the lowest image quality.
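The lookup described above, matching a gaze frequency against the frequency ranges of records 301 to 307 to obtain a distribution frame rate and image quality, can be sketched as follows. Since the patent shows no concrete numbers, every value below is an illustrative placeholder:

```python
# Sketch of the distribution instruction data (FIG. 3) and its lookup.
# Records map gaze-frequency ranges to distribution conditions; all
# numeric values and quality labels are hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class DistributionRecord:
    min_gazes_per_min: int   # lower bound of the gaze-frequency range (cell 310)
    frame_rate_fps: int      # distribution frame rate (cell 311)
    quality: str             # image quality setting (cell 312)

# Records 301..307, ordered from most frequent gaze to least frequent.
DISTRIBUTION_INSTRUCTION_DATA = [
    DistributionRecord(30, 30, "high"),
    DistributionRecord(20, 25, "med-high"),
    DistributionRecord(15, 20, "medium"),
    DistributionRecord(10, 15, "med-low"),   # record 304: used as the default
    DistributionRecord(5, 10, "low"),
    DistributionRecord(1, 5, "very-low"),
    DistributionRecord(0, 1, "minimal"),
]

def lookup_distribution(gaze_freq: int) -> DistributionRecord:
    """Return the first record whose frequency range contains gaze_freq."""
    for record in DISTRIBUTION_INSTRUCTION_DATA:
        if gaze_freq >= record.min_gazes_per_min:
            return record
    return DISTRIBUTION_INSTRUCTION_DATA[-1]
```

A higher observed gaze frequency thus selects a record with a higher frame rate and image quality, mirroring the ordering of records 301 to 307.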
- FIG. 4 is a flowchart for explaining the flow of processing in the terminal device of the monitoring system according to one embodiment of the present invention.
- The processing in the terminal device 106 consists of the flows shown in FIG. 4(A) and FIG. 4(B).
- The process in FIG. 4(B) is executed once for each of the monitoring imaging devices 101 to 105.
- One instance of the FIG. 4(A) process and five instances of the FIG. 4(B) process are therefore executed in parallel inside the terminal device 106.
- In step 400, the control unit 207 of the terminal device 106 waits for a predetermined time.
- The length of the waiting time is set in advance.
- In step 401, the user image acquisition unit 203 of the terminal device 106 acquires a user image from the user imaging device 107.
- In step 402, the face detection unit 204 of the terminal device 106 performs face detection on the user image data acquired in step 401.
- Face detection is performed, for example, by obtaining the difference from a background image and judging from the area or shape of the difference region, or by searching the image for a region having a facial image feature pattern prepared and learned in advance, such as the arrangement of the main components (eyes, nose, and mouth) or the contrast between the forehead and the eyes. Any method may be used in an embodiment of the present invention. If a face is detected in step 402 (YES), the face detection unit 204 proceeds to step 403; if no face is detected (NO), it returns to step 400.
- In step 403, the gaze determination unit 205 of the terminal device 106 detects facial landmark positions, such as the outer and inner corners of the eyes, the centers of the pupils, the nose, and the mouth, within the face region obtained in step 402.
- In step 404, the gaze determination unit 205 of the terminal device 106 detects the gaze direction.
- The gaze direction is detected, for example, by estimating the face orientation from the positional relationship of the landmarks obtained in step 403 and then judging from the face orientation and the pupil positions. Any method may be used in the present invention.
- In step 405, the gaze determination unit 205 of the terminal device 106 determines whether the supervisor is gazing at a monitoring image.
- If gazing is determined (YES), the process proceeds to step 406; if not (NO), the process returns to step 400.
- In step 406, the determination result holding unit 206 of the terminal device 106 stores the determination result obtained in step 405 together with the determination time.
- In step 407, the determination result holding unit 206 of the terminal device 106 calculates the gaze frequency over a predetermined time.
- The frequency is calculated as the number of gaze determination results within a fixed past window. After this step, the process returns to step 400.
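Steps 406 and 407 amount to keeping a time-stamped log of gaze hits and counting those inside a sliding window. A minimal sketch; the window length and the externally supplied clock values are illustrative assumptions:

```python
# Sketch of the determination result holding unit (steps 406-407):
# store gaze timestamps, discard entries older than the holding time,
# and report the gaze frequency within the remaining window.
from collections import deque

class GazeFrequencyCounter:
    def __init__(self, holding_time_s: float = 60.0):
        self.holding_time_s = holding_time_s  # preset holding time
        self._timestamps: deque[float] = deque()

    def record_gaze(self, now_s: float) -> None:
        """Step 406: store the determination time of a gaze hit."""
        self._timestamps.append(now_s)

    def gaze_frequency(self, now_s: float) -> int:
        """Step 407: number of gaze hits within the past holding window."""
        cutoff = now_s - self.holding_time_s
        while self._timestamps and self._timestamps[0] < cutoff:
            self._timestamps.popleft()  # discard results past the holding time
        return len(self._timestamps)
```

The returned count is what would be matched against the frequency cells of the distribution instruction data in step 414.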
- In step 410, the distribution instruction unit 200 of the terminal device 106 transmits an image distribution request to the monitoring imaging device.
- The requested distribution frame rate and image quality are taken from the distribution instruction data 300, for example the default values stored in record 304.
- The distribution frame rate and image quality instructed in this step are stored as the current distribution conditions.
- In step 411, the monitoring image receiving unit 201 of the terminal device 106 waits for image reception; when an incoming image from one of the monitoring imaging devices 101 to 105 is detected, the process proceeds to step 412.
- In step 412, the monitoring image receiving unit 201 of the terminal device 106 receives the monitoring image from the monitoring imaging device.
- In step 413, the screen display unit 202 of the terminal device 106 decompresses the compressed image and converts the image format as necessary, and displays the received monitoring image data on the screen.
- Display control is performed according to the current distribution conditions stored in step 410: for example, an image delivered at high quality is displayed in a large area and one at low quality in a small area, while an image delivered at a high frame rate is displayed at the top of the screen and one at a low frame rate at the bottom. This display control is coordinated among the five processes executing in parallel so that their display positions do not overlap.
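The display rule in step 413 can be sketched as a pure function from a stream's current distribution conditions to its display slot; the quality labels and the frame-rate threshold below are illustrative assumptions, not values from the patent:

```python
# Sketch of the display control in step 413: derive a display slot
# (area size and vertical position) from a stream's distribution conditions.
# The "high" label and the 15 fps threshold are hypothetical.
def display_slot(frame_rate_fps: int, quality: str) -> dict:
    """High quality gets a large area; a high frame rate goes to the top."""
    size = "large" if quality == "high" else "small"
    position = "top" if frame_rate_fps >= 15 else "bottom"
    return {"size": size, "position": position}
```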
- In step 414, the distribution instruction unit 200 of the terminal device 106 retrieves the gaze frequency calculated and stored in step 407 and refers to the distribution instruction data 300. The gaze frequency is compared, for records 301 to 307 in order, with the value stored in the frequency cell 310 to find the matching record, and the distribution frame rate and image quality stored in that record are extracted and stored as the new distribution conditions.
- In step 415, the distribution instruction unit 200 of the terminal device 106 compares the current distribution conditions with the new distribution conditions. If they do not match (YES), the process proceeds to step 416; otherwise (NO), the process returns to step 411.
- In step 416, the distribution instruction unit 200 of the terminal device 106 transmits an image distribution change request to the monitoring imaging device, with the requested distribution frame rate and image quality set to the values extracted in step 414. The conditions instructed in this step are stored as the new current distribution conditions. After this step, the process returns to step 411.
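Steps 414 to 416 form a simple change-detection loop: recompute the desired conditions from the gaze frequency and send a change request only when they differ from the conditions currently in force. A sketch with an illustrative frequency table; `send_change_request` stands in for the actual control command, which the patent does not specify:

```python
# Sketch of steps 414-416: issue a distribution change request only when
# the conditions derived from the gaze frequency differ from the current ones.
def update_distribution(current: tuple, gaze_freq: int, send_change_request) -> tuple:
    """Return the distribution conditions in force after this iteration."""
    # Step 414: derive new conditions from the gaze frequency
    # (hypothetical thresholds and condition values).
    table = [(30, (30, "high")), (10, (15, "medium")), (0, (5, "low"))]
    new = next(cond for threshold, cond in table if gaze_freq >= threshold)
    # Step 415: compare with the current conditions.
    if new != current:
        # Step 416: request the imaging device to change its distribution.
        send_change_request(*new)
        return new
    return current
```

Because a request is sent only on a change, a stable gaze frequency produces no control traffic.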
- As described above, the present invention provides a method of obtaining the gaze frequency for each monitoring image in real time and distributing images while changing the distribution conditions according to that frequency.
- A frequently watched monitoring image is displayed at a higher frame rate and higher image quality.
- A rarely watched monitoring image is displayed at a lower frame rate and lower image quality.
- The present invention thereby makes it possible to optimize the network flow rate of the entire system.
- Although five monitoring imaging devices and one terminal device are shown in this configuration, the numbers of devices connected to the network may differ from the above.
- FIG. 5 is a diagram showing the configuration of a monitoring system according to another embodiment of the present invention. In FIG. 5, devices with the same reference numerals as in FIG. 1 are the same devices.
- the monitoring system includes a network 100, monitoring imaging devices 101 to 105, a terminal device 506, and a user imaging device 107.
- The terminal device 506 is a device such as a computer, comprising an arithmetic circuit such as a CPU, a temporary storage memory such as RAM, a recording medium such as an HDD, a data transmission bus, an external input/output interface, a power supply circuit, a screen such as a liquid crystal display, and user input devices such as a keyboard and mouse.
- the terminal device 506 stores (stores) the image data from the monitoring imaging devices 101 to 105 input from the network 100 to the external input / output interface in the temporary storage memory.
- the stored image data is converted into a form suitable for display using an arithmetic circuit and displayed on the screen.
- The recording medium stores a set of software implementing the method of the present invention, an OS (operating system), the person feature amounts of the users (supervisors), and the like.
- a user operation on the terminal device is performed on a user input device.
- The user imaging device 107 is arranged at a position where the user's (supervisor's) face can be photographed from the front, for example near the screen of the terminal device 506.
- FIG. 6 is a diagram showing a functional configuration of a monitoring system according to another embodiment of the present invention.
- The terminal device 506 comprises the functional units of a distribution instruction unit 200, a monitoring image receiving unit 201, a screen display unit 202, a user image acquisition unit 203, a face detection unit 204, a face feature amount calculation unit 608, a person identification unit 609, a person recording unit 610, and a control unit 611.
- the face feature amount calculation unit 608 is a functional unit that calculates a feature amount using an image recognition technique for the face area in the image detected by the face detection unit 204.
- The person identification unit 609 is a functional unit that identifies the supervisor detected by the face detection unit 204 by comparing the feature amount obtained by the face feature amount calculation unit 608 with the database stored in the person recording unit 610.
- The person recording unit 610 is a functional unit that records the person feature amount of each supervisor, in this embodiment the face feature amount, together with the person ID (identification information given to the person, such as a full name) and the like as person data.
- The feature amount used here is an image feature amount for identifying the individual shown in the image; it may be, for example, the positional relationship of facial parts such as the eyes and nose together with their contour information. Any feature amount may be used in this example.
- the person recording unit 610 records the delivery conditions for the monitoring imaging devices 101 to 105 for each person.
- the distribution conditions are composed of a distribution frame rate, image quality, and the like.
- The person data is given in advance and may be provided in any manner. The aspect of the person data will be described later.
- the control unit 611 is a functional unit that controls each functional unit of the terminal device 506.
- FIG. 7 is a diagram showing an aspect of person data in the person recording unit of the monitoring system according to another embodiment of the present invention.
- the person data 700 records information of one person as one record.
- The present embodiment shows the information of three supervisors, that is, a state in which three records, 701 to 703, are recorded. This data is set in advance.
- the record is composed of a record number cell 710, a person ID cell 711, a feature amount cell 712, distribution frame rate cells 713 to 717, and image quality cells 718 to 722.
- a record number cell 710 is an area for storing a record number.
- the record number is a number used for managing records, and is, for example, a continuous integer value uniquely assigned to each record.
- the person ID cell 711 is an area for storing the person ID of the supervisor. In this embodiment, an example in which a character string as a person name is stored is shown. However, an integer value or the like may be used as long as it is identification information given to a person.
- the feature quantity cell 712 is an area for storing the person feature quantity of the supervisor. In this embodiment, an example in which a decimal value as a one-dimensional value is stored is shown for ease of explanation, but a multidimensional value may be used.
- Delivery frame rate cells 713 to 717 are areas for storing delivery frame rates.
- the distribution frame rate is the number of images output from the imaging device in a predetermined time, for example, 1 second.
- The distribution frame rate cell 713 stores the distribution frame rate for the monitoring imaging device 101, cell 714 that for the monitoring imaging device 102, cell 715 that for the monitoring imaging device 103, cell 716 that for the monitoring imaging device 104, and cell 717 that for the monitoring imaging device 105.
- the value itself stored in the cell may be any value, and in this embodiment, the symbol is shown instead of a specific numerical value.
- Image quality cells 718 to 722 are areas for storing image quality.
- the image quality is constituted by, for example, image resolution, compression rate, or a combination thereof.
- the value itself stored in this cell may be any value, and in this example, a symbol is shown instead of a specific numerical value.
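The person data of FIG. 7 is essentially a per-supervisor preference table keyed by face feature amount, holding one distribution condition per monitoring imaging device. A minimal sketch; all names and values below are illustrative placeholders:

```python
# Sketch of a person data record (FIG. 7): record number, person ID,
# face feature amount, and per-camera distribution conditions.
# All names and numeric values are hypothetical examples.
from dataclasses import dataclass

@dataclass(frozen=True)
class PersonRecord:
    record_no: int
    person_id: str
    feature: float               # one-dimensional feature amount for simplicity
    frame_rates: dict[int, int]  # camera id -> distribution frame rate (cells 713-717)
    qualities: dict[int, str]    # camera id -> image quality (cells 718-722)

PERSON_DATA = [
    PersonRecord(1, "Sato", 0.2, {101: 30, 102: 5}, {101: "high", 102: "low"}),
    PersonRecord(2, "Suzuki", 0.8, {101: 5, 102: 30}, {101: "low", 102: "high"}),
]

def conditions_for(person: PersonRecord, camera_id: int) -> tuple[int, str]:
    """Distribution conditions recorded for this supervisor and camera."""
    return person.frame_rates[camera_id], person.qualities[camera_id]
```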
- FIG. 8 is a flowchart for explaining the flow of processing in the terminal device of the monitoring system according to another embodiment of the present invention.
- The processing in the terminal device 506 consists of the flows shown in FIG. 8(A) and FIG. 8(B). Since the process in FIG. 8(B) is executed once for each monitoring imaging device, in the system configuration of FIG. 5 with five monitoring imaging devices, one instance of the FIG. 8(A) process and five instances of the FIG. 8(B) process, six processes in total, are executed in parallel inside the terminal device 506.
- In step 800, the control unit 611 of the terminal device 506 waits for a predetermined time. The length of the waiting time is set in advance. After the time elapses, the process proceeds to step 801.
- In step 801, the user image acquisition unit 203 of the terminal device 506 acquires a user image from the user imaging device 107.
- In step 802, the face detection unit 204 of the terminal device 506 performs face detection on the user image data acquired in step 801.
- Face detection is performed, for example, by obtaining the difference from a background image and judging from the area or shape of the difference region, or by searching the image for a region having a facial image feature pattern prepared and learned in advance, such as the arrangement of the main components (eyes, nose, and mouth) or the contrast between the forehead and the eyes. Any method may be used in this embodiment of the present invention. If a face is detected (YES), the face detection unit 204 proceeds to step 803; if no face is detected (NO), it proceeds to step 808.
- In step 803, the control unit 611 of the terminal device 506 turns the no-monitorer flag OFF and advances the process to step 804.
- The no-monitorer flag is ON in its initial state.
- In step 804, the face feature amount calculation unit 608 of the terminal device 506 calculates feature amounts for the face region obtained in step 802.
- The feature amounts calculated here include, for example, image feature amounts such as hair and skin color, the shape and orientation of the face contour, and the size, shape, and arrangement of main components such as the eyes, nose, and mouth; any kind or number of feature amounts may be used in this embodiment of the present invention.
- In step 805, the person identification unit 609 of the terminal device 506 performs feature amount collation. Specifically, the feature amount calculated in step 804 is collated in turn (by calculating the degree of coincidence) with the feature amounts of every person in the person data 700 recorded in advance in the person recording unit 610, and the best match is found. The person with that feature amount is taken as the most similar person.
- The degree of coincidence is a numerical value indicating the closeness between images (image feature amounts); for its calculation, see, for example, "Representation model for large-scale image collection" (Non-Patent Document 1).
- In this embodiment, the degree of coincidence is the difference between feature amount values. Collating against the stored values of the feature amount cells 712 of the three records 701 to 703, the person stored in record 702, whose difference, that is, degree of coincidence, is 0.4, becomes the most similar person.
- Any method may be used for determining the collation order and calculating the degree of coincidence.
- In step 806, the person identification unit 609 of the terminal device 506 determines whether the most similar person is a supervisor.
- The person is determined to be a supervisor when the degree of coincidence is equal to or less than a threshold value.
- The threshold is set in advance. For example, if the threshold is 0.6, the degree of coincidence of the most similar person obtained in step 805 is 0.4, so the person photographed by the user imaging device 107 is determined to be the person stored in record 702, that is, a supervisor.
- The person identification unit 609 proceeds to step 807 if there is a supervisor (YES), and to step 808 if there is no supervisor (NO).
- In step 807, the control unit 611 of the terminal device 506 sets the person ID of the supervisor obtained in step 806 as the supervisor ID; for example, the stored value of the person ID cell 711 of record 702 is set. After this, the process returns to step 800.
- The supervisor ID is not set in the initial state.
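Purely as an illustrative sketch (not part of the disclosed embodiment), the collation of step 805 and the threshold decision of step 806 can be expressed as follows. It assumes, for simplicity, that each feature amount is a single scalar value and that the degree of coincidence is the absolute difference; the person data values and names are hypothetical.

```python
# Hypothetical sketch of steps 805-806: collate a detected face's feature
# amount against the person data 700 and decide whether it is a supervisor.
# Scalar features and all concrete values below are illustrative assumptions.

PERSON_DATA = [  # records 701-703: (person ID cell 711, feature amount cell 712)
    {"person_id": "P001", "feature": 1.2},
    {"person_id": "P002", "feature": 0.1},
    {"person_id": "P003", "feature": 2.0},
]

THRESHOLD = 0.6  # preset threshold used in step 806


def identify_supervisor(feature, person_data=PERSON_DATA, threshold=THRESHOLD):
    """Return the person ID of the supervisor, or None if no match is close enough."""
    # Step 805: degree of coincidence = difference between feature amount values;
    # the record with the smallest difference is the most similar person.
    most_similar = min(person_data, key=lambda rec: abs(rec["feature"] - feature))
    coincidence = abs(most_similar["feature"] - feature)
    # Step 806: the person is a supervisor when the degree of coincidence
    # is equal to or less than the threshold.
    if coincidence <= threshold:
        return most_similar["person_id"]
    return None


print(identify_supervisor(0.5))  # coincidence 0.4 to P002 -> "P002"
print(identify_supervisor(3.0))  # best coincidence 1.0 > 0.6 -> None
```

With the example threshold of 0.6 from the text, a degree of coincidence of 0.4 yields a supervisor, mirroring the record 702 example.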
- In step 808, the control unit 611 of the terminal device 506 turns on the no-monitorer flag, clears the supervisor ID, and returns to step 800.
- In step 810, the control unit 611 of the terminal device 506 waits for a predetermined time. The length of the waiting time is set in advance. After the predetermined time has elapsed, the process proceeds to step 811.
- In step 811, the control unit 611 of the terminal device 506 determines whether the no-monitorer flag is ON. If the flag is OFF (NO), the process proceeds to step 812; if it is ON (YES), the process returns to step 810.
- In step 812, the control unit 611 of the terminal device 506 determines whether the supervisor ID has changed. If the supervisor ID has changed since the previous determination (YES) and is set, the process proceeds to step 813; if there is no change (NO), the process proceeds to step 815.
- In step 813, the control unit 611 of the terminal device 506 acquires the distribution conditions. From the person data 700 stored in the person recording unit 610, the record whose person ID matches the supervisor ID is identified, and the distribution frame rate value and image quality value are extracted from the distribution frame rate cell and image quality cell of that record.
- In step 814, the distribution instruction unit 200 of the terminal device 506 transmits an image distribution request to the monitoring imaging device.
- The requested distribution conditions, that is, the distribution frame rate and image quality, are the values extracted in step 813.
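As an illustrative sketch of steps 813 and 814 (the record layout and all numeric values are assumptions, since the disclosure shows no concrete numbers; the request format is likewise hypothetical):

```python
# Hypothetical sketch of steps 813-814: look up the distribution conditions
# for the current supervisor ID in the person data 700 and build the image
# distribution request for one monitoring imaging device.

PERSON_DATA = [
    # person ID cell 711, per-device distribution frame rate cells 713-717
    # and image quality cells 718-722 (one value per monitoring imaging device).
    {"person_id": "P001", "frame_rates": [30, 15, 15, 5, 5], "qualities": [5, 3, 3, 1, 1]},
    {"person_id": "P002", "frame_rates": [5, 5, 30, 30, 15], "qualities": [1, 1, 5, 5, 3]},
]


def build_distribution_request(supervisor_id, device_index):
    # Step 813: identify the record whose person ID matches the supervisor ID
    # and extract the distribution frame rate value and image quality value.
    record = next((r for r in PERSON_DATA if r["person_id"] == supervisor_id), None)
    if record is None:
        return None
    # Step 814: the requested distribution conditions are the extracted values.
    return {
        "frame_rate": record["frame_rates"][device_index],
        "quality": record["qualities"][device_index],
    }


print(build_distribution_request("P002", 2))  # {'frame_rate': 30, 'quality': 5}
```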
- In step 815, the monitoring image receiving unit 201 of the terminal device 506 waits for image reception. When an incoming image from the monitoring imaging device is detected, the process proceeds to step 816.
- In step 816, the monitoring image receiving unit 201 of the terminal device 506 receives a monitoring image from the monitoring imaging device.
- In step 817, the screen display unit 202 of the terminal device 506 decompresses the compressed image and converts the image format as necessary for the received monitoring image data, and displays it on the screen.
- At this time, the distribution conditions acquired in step 813 are referenced and display control is performed in accordance with them: for example, a large area is used for high image quality and a small area for low image quality, and the image is displayed at the top of the screen for a high frame rate and at the bottom for a low frame rate. This display control is coordinated among the five processes executed in parallel so that their display positions do not overlap. After this, the process returns to step 811.
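The display control described above can be sketched like this (the thresholds and the size/position vocabulary are invented for illustration; the text specifies only the qualitative rules):

```python
# Hypothetical sketch of the display control in step 817: choose the display
# area size from the image quality and the vertical position from the frame
# rate. The numeric thresholds below are illustrative assumptions.

def display_layout(frame_rate, quality, high_quality=3, high_rate=15):
    size = "large" if quality >= high_quality else "small"     # high quality -> large area
    position = "top" if frame_rate >= high_rate else "bottom"  # high frame rate -> top
    return size, position


print(display_layout(30, 5))  # ('large', 'top')
print(display_layout(5, 1))   # ('small', 'bottom')
```

A real implementation would additionally coordinate the five parallel processes so their chosen positions do not overlap, as the text requires.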
- As described above, a system is provided that identifies the supervisor of a monitoring image in real time and performs distribution while changing the distribution conditions according to that supervisor. A system that does not perform distribution when no supervisor is present is also provided. As a result, wasted network flow across the entire system can be reduced.
- Although this embodiment shows a configuration of five monitoring imaging devices and one terminal device, the network may have other configurations.
- Although a configuration in which the monitoring video distribution source is an imaging device is shown, the source may instead be a recording device or a re-distribution device.
- Although the person identification processing is shown as implemented on the terminal device, it may be implemented in a separate apparatus, for example, a server apparatus.
- Although the first and second embodiments of the present invention are shown as separate examples, they may also be implemented in combination.
- As described above, the monitoring system according to the embodiments of the present invention can reduce the network flow rate.
- It can be applied to applications that seek to reduce network flow by changing the image quality and frame rate of the distributed image based on how frequently the supervisor gazes at the image.
- 101 to 105: monitoring imaging device, 106: terminal device, 107: user imaging device, 200: distribution instruction unit, 201: monitoring image receiving unit, 202: screen display unit, 203: user image acquisition unit, 204: face detection unit, 205: gaze determination unit, 206: determination result holding unit, 207: control unit, 300: distribution instruction data, 301 to 307: record, 310: frequency cell, 311: distribution frame rate cell, 312: image quality cell, 506: terminal device, 608: face feature amount calculation unit, 609: person identification unit, 610: person recording unit, 611: control unit, 700: person data, 701 to 703: record, 710: record number cell, 711: person ID cell, 712: feature amount cell, 713 to 717: distribution frame rate cell, and 718 to 722: image quality cell.
Abstract
The objective of the present invention is to reduce network flow rates.
The surveillance system according to the present invention comprises an image capturing device, a user image capturing device, and a terminal device, and is characterized in that: the image capturing device includes a means for changing the image quality and the frame rate of a delivered image; the user image capturing device captures images of a surveillant of the terminal device; and the terminal device includes a means for acquiring, from the images captured by the user image capturing device, the frequency with which the surveillant gazes at the displayed images, a means for setting the image quality or the frame rate of the delivered image on the basis of the gazing frequency, and a means for outputting a delivery instruction to the image capturing device on the basis of the determined result.
Description
The present invention relates to a monitoring system.
Conventionally, surveillance systems have been installed, for purposes such as crime deterrence and accident prevention, in facilities visited by an unspecified number of people, such as hotels, buildings, convenience stores, financial institutions, dams, and roads. In such a monitoring system, persons to be monitored are photographed by an imaging device such as a camera, the captured video is transmitted to a monitoring center such as a management office or security room, and a resident supervisor monitors it, calling attention or recording the video according to the purpose and necessity.
In recent years, with worsening public safety, growing social unease, and falling surveillance camera prices, such monitoring systems are increasingly installed not only in the facilities described above but also in shopping streets, entertainment districts, and residential streets.
As a prior art document, for example, Patent Document 1 discloses an invention that allows various communication forms, such as voice communication, video communication, and intermediate forms, to be used seamlessly with low burden.
As another prior art document, for example, Patent Document 2 discloses an invention that reliably records the best shot without missing the facial expression of a person turning around.
The spread of installation locations for monitoring systems has dramatically increased the number of imaging devices connected to a system, and the resulting increase in network flow pushes up the cost of building and maintaining the network.
An object of the present invention is to reduce network flow.
The monitoring system of the present invention is a monitoring system comprising an imaging device, a user imaging device, and a terminal device, characterized in that the imaging device has means for changing the image quality and frame rate of a distributed image; the user imaging device photographs the supervisor of the terminal device; and the terminal device has means for acquiring, from the images captured by the user imaging device, the frequency with which the supervisor gazes at the images, means for determining the image quality or frame rate of the distributed image based on the gaze frequency, and means for issuing a distribution instruction to the imaging device based on the determined result.
The monitoring system of the present invention is also a monitoring system comprising an imaging device, a user imaging device, and a terminal device, characterized in that the imaging device has means for changing the image quality and frame rate of a distributed image; the user imaging device photographs the supervisor of the terminal device; and the terminal device has means for displaying the distributed image from the imaging device on a screen, means for performing personal identification of the supervisor from the images captured by the user imaging device, means for determining the image quality and frame rate of the distributed image based on the personal identification result, and means for issuing a distribution instruction to the imaging device based on the determined result.
According to the present invention, the network flow rate can be reduced.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
(First Embodiment) A first embodiment of the present invention will be described with reference to FIGS. 1 to 4.
FIG. 1 is a diagram showing the configuration of a monitoring system according to an embodiment of the present invention.
In FIG. 1, the monitoring system comprises a network 100, monitoring imaging devices 101 to 105, a terminal device 106, and a user imaging device 107.
The network 100 is a communication line, such as a dedicated line (e.g., USB (Universal Serial Bus)), an intranet, the Internet, or a wireless LAN (Local Area Network), that interconnects the monitoring imaging devices 101 to 105, the terminal device 106, and the user imaging device 107 for data communication.
The monitoring imaging devices 101 to 105 are devices such as surveillance cameras, each equipped with a zoom lens capable of zoom and focus control, an image sensor such as a CCD or CMOS, an A/D circuit that digitizes analog signals, temporary storage memory such as RAM, a data transmission bus, a timing circuit, an external input/output interface, a power supply circuit, a pan/tilt head, and illumination such as visible light or near-infrared LEDs (Light Emitting Diodes).
The monitoring imaging devices 101 to 105 convert light passing through the lens into an electrical signal with the image sensor, digitize the signal with the A/D circuit, and store it as image data in the temporary storage memory. The stored image data is output from the external input/output interface to the network in response to an external video request input to the external input/output interface, an instruction from the timing circuit, or the like. Similarly, the image quality of the distributed image, the distribution frame rate, and so on can be changed in accordance with an external control command input to the external input/output interface.
The terminal device 106 is a device such as a computer, equipped with an arithmetic circuit such as a CPU (Central Processing Unit), temporary storage memory such as RAM (Random Access Memory), a recording medium such as an HDD (Hard Disk Drive), a data transmission bus, an external input/output interface, a power supply circuit, a screen such as a liquid crystal display, and user input devices such as a keyboard and mouse.
The terminal device 106 stores image data from the monitoring imaging devices 101 to 105, input from the network 100 to the external input/output interface, in the temporary storage memory. The stored image data is converted into a form suitable for display using the arithmetic circuit and displayed on the screen. The recording medium stores a set of software including the method of the present invention, an OS (operating system), and the like. User operations on the terminal device 106 are performed through the user input devices.
The user imaging device 107 is a device such as a USB camera, equipped with a zoom lens capable of zoom and focus control, an image sensor such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor), an A/D circuit that digitizes analog signals, temporary storage memory such as RAM, a data transmission bus, a timing circuit, an external input/output interface, a power supply circuit, and the like.
The user imaging device 107 converts light passing through the lens into an electrical signal with the image sensor, digitizes it with the A/D circuit, and stores it as image data in the temporary storage memory. The stored image data is output from the external input/output interface in response to an external video request input to the external input/output interface, an instruction from the timing circuit, or the like. Similarly, the image quality of the distributed image, the distribution frame rate, and so on can be changed in accordance with an external control command.
The user imaging device 107 is arranged at a position where the face of the user (supervisor) can be photographed from the front, for example, near the screen of the terminal device 106.
Next, the functional configuration of the monitoring system according to an embodiment of the present invention will be described with reference to FIG. 2. FIG. 2 is a diagram showing the functional configuration of the monitoring system according to an embodiment of the present invention. The same reference numerals as in FIG. 1 denote the same devices.
The terminal device 106 comprises a distribution instruction unit 200, a monitoring image receiving unit 201, a screen display unit 202, a user image acquisition unit 203, a face detection unit 204, a gaze determination unit 205, a determination result holding unit 206, and a control unit 207.
The distribution instruction unit 200 is a functional unit that instructs the monitoring imaging devices 101 to 105 to distribute monitoring images. A distribution instruction also includes instructions on the distribution frame rate and image quality. The distribution instruction unit 200 holds distribution instruction data, whose form is described later.
The monitoring image receiving unit 201 is a functional unit that acquires monitoring images from the monitoring imaging devices. In this embodiment, it receives image data input from the monitoring imaging devices 101 to 105.
The screen display unit 202 is a functional unit that displays monitoring images on the screen. In this embodiment, the image data received by the monitoring image receiving unit 201 is converted into a form suitable for display and displayed.
The user image acquisition unit 203 is a functional unit that acquires images of the user (supervisor) from the user imaging device. In this embodiment, image data is acquired from the user imaging device 107.
The face detection unit 204 is a functional unit that detects the supervisor's face in the user image. In this embodiment, face detection using image recognition technology is performed on the image data acquired by the user image acquisition unit 203 to determine whether a supervisor is present; if a supervisor is present, the coordinates of the face region are output.
The gaze determination unit 205 is a functional unit that determines whether the supervisor is gazing at a monitoring image and, if so, which image is being gazed at. In this embodiment, when the face detection unit 204 determines that a supervisor is present, the gaze direction is detected; if the line of sight points at a monitoring image displayed by the screen display unit 202, it is determined that the image is being gazed at, and the gazed-at monitoring image is also identified.
The determination result holding unit 206 is a functional unit that holds the results determined by the gaze determination unit 205 for a predetermined time. In this embodiment, each result is held in the temporary storage memory together with its determination time. The holding time is set in advance, and results older than the holding time are discarded. Any manner of holding may be used.
The control unit 207 is a functional unit that controls each functional unit of the terminal device 106.
Next, the form of the distribution instruction data in the distribution instruction unit will be described with reference to FIG. 3.
FIG. 3 is a diagram showing the form of the distribution instruction data in the distribution instruction unit of the monitoring system according to an embodiment of the present invention.
The distribution instruction data 300 stores one type of distribution instruction per record. In this embodiment, seven records 301 to 307, that is, seven types of distribution instruction, can be stored. This data is set in advance.
Each record consists of a frequency cell 310, a distribution frame rate cell 311, and an image quality cell 312.
The frequency cell 310 is an area that stores a range of gaze frequency. The gaze frequency is the number of gazes in a fixed time, for example, one minute. Although specific values are not shown in this example, the range stored in record 301 is the most frequent and the value stored in record 307 is the least frequent.
The distribution frame rate cell 311 is an area that stores a distribution frame rate. The distribution frame rate is the number of images output from the imaging device in a fixed time, for example, one second. Although specific values are not shown in this embodiment, the value stored in record 301 is the highest frame rate and the value stored in record 307 is the lowest.
The image quality cell 312 is an area that stores image quality. The image quality is expressed, for example, by the image resolution, the compression rate, or a combination thereof. Although specific values are not shown in this embodiment, the value stored in record 301 is the highest image quality and the value stored in record 307 is the lowest.
Next, the flow of processing in the terminal device 106 will be described with reference to FIG. 4.
FIG. 4 is a flowchart explaining the flow of processing in the terminal device of the monitoring system according to an embodiment of the present invention.
The processing in the terminal device 106 consists of FIG. 4(A) and FIG. 4(B).
The processing in FIG. 4(B) is performed for each of the monitoring imaging devices 101 to 105. In the system configuration of FIG. 1, the terminal device 106 internally executes a total of six processes in parallel: one instance of the processing in FIG. 4(A) and five instances of the processing in FIG. 4(B).
First, the flow of the processing in FIG. 4(A) will be described.
In step 400, the control unit 207 of the terminal device 106 waits for a predetermined time. The length of the waiting time is set in advance. After the predetermined time has elapsed, the process proceeds to step 401.
In step 401, the user image acquisition unit 203 of the terminal device 106 acquires a user image from the user imaging device 107.
In step 402, the face detection unit 204 of the terminal device 106 performs face detection on the user image data acquired in step 401. Face detection is performed, for example, by computing the difference from a background image and judging by the area or shape of the difference region, or by searching the image for regions having facial image feature patterns, such as the learned arrangement of major components (eyes, nose, and mouth) prepared in advance, or the contrast between the forehead and the eyes. Any method may be used in an embodiment of the present invention.
If a face is detected in step 402 (YES), the face detection unit 204 proceeds to step 403; if no face is detected (NO), it returns to step 400.
In step 403, the gaze determination unit 205 of the terminal device 106 detects, for the face area obtained in step 402, the positions of facial organs such as the outer and inner corners of the eyes, the centers of the pupils, the nose, and the mouth.
In step 404, the gaze determination unit 205 of the terminal device 106 detects the gaze direction, for example, by estimating the face orientation from the positional relationships of the organs obtained in step 403 and then deriving the gaze from the face orientation and the pupil positions. Any method may be used in the present invention.
ステップ404において、端末装置106の注視判定部205は、視線方向の検出を行う。視線方向の検出は、ステップ403で求めた器官の位置関係から顔向きを推定し、顔向きと黒目の位置関係から求める方法等により行う。本発明ではいずれの方法であってもよい。 In
In
In step 405, the gaze determination unit 205 of the terminal device 106 determines whether the monitoring image is being gazed at. If the gaze direction obtained in step 404 is directed at the monitoring image, it is determined that the monitoring image is being gazed at. If gaze is detected (YES), the process proceeds to step 406; if not (NO), the process returns to step 400.
In step 406, the determination result holding unit 206 of the terminal device 106 stores the determination result obtained in step 405 together with the determination time.
In step 407, the determination result holding unit 206 of the terminal device 106 calculates the gaze frequency over a predetermined period. The frequency is calculated as the number of gaze-detected determination results within a fixed past period. After this, the process returns to step 400.
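The frequency computation of steps 406 and 407, counting gaze-detected determinations within a fixed past window, can be sketched as follows. The window length and timestamps are illustrative; the patent does not specify concrete values.

```python
from collections import deque

class GazeFrequencyCounter:
    """Mirrors steps 406-407: store each gaze determination with its time,
    then count the gaze-detected results within the last `window` seconds."""

    def __init__(self, window_seconds=60.0):
        self.window = window_seconds
        self.hits = deque()  # timestamps of "gaze detected" determinations

    def record(self, gazing, now):
        if gazing:
            self.hits.append(now)

    def frequency(self, now):
        # Drop determinations older than the window, then count the rest.
        while self.hits and now - self.hits[0] > self.window:
            self.hits.popleft()
        return len(self.hits)

counter = GazeFrequencyCounter(window_seconds=60.0)
for t in (0.0, 10.0, 30.0, 65.0):
    counter.record(True, t)
print(counter.frequency(70.0))  # 3 -- the hit at t=0.0 falls outside the window
```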
Next, the flow of processing in FIG. 4B will be described.
In step 410, the distribution instruction unit 200 of the terminal device 106 transmits an image distribution request to the monitoring imaging device. The requested distribution frame rate and image quality are taken from the distribution instruction data 300; as defaults, for example, the values stored in record 304 are used. The distribution frame rate and image quality specified in this step are stored as the current distribution conditions.
In step 411, the monitoring image receiving unit 201 of the terminal device 106 waits for image reception; when an incoming image from one of the monitoring imaging devices 101 to 105 is detected, the process proceeds to step 412.
In step 412, the monitoring image receiving unit 201 of the terminal device 106 receives the monitoring image from the monitoring imaging device.
In step 413, the screen display unit 202 of the terminal device 106 decompresses the compressed image and converts the image format of the received monitoring image data as necessary, and displays it on the screen. In doing so, the current distribution conditions stored in step 410 are retrieved, and display control matched to those conditions is also performed: for example, displaying the image in a larger area when the image quality is high and a smaller area when it is low, or displaying it at the top of the screen when the frame rate is high and at the bottom when it is low. This display control is coordinated among the five processes running in parallel so that their display positions do not overlap.
In step 414, the distribution instruction unit 200 of the terminal device 106 retrieves the gaze frequency calculated and stored in step 407 and refers to the distribution instruction data 300. For records 301 to 307, the value stored in the frequency cell 310 is compared in order with the retrieved gaze frequency to find the matching record. The distribution frame rate and image quality stored in that record are then retrieved and stored as the new distribution conditions.
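The lookup in step 414 can be sketched as a threshold table scanned in order. The threshold, frame rate, and quality values below are hypothetical; the distribution instruction data 300 in the patent stores symbols rather than concrete numbers.

```python
# Sketch of step 414: the distribution instruction data maps gaze-frequency
# thresholds to distribution conditions, scanned from the highest tier down.
# All values below are placeholders, not figures from the patent.

DISTRIBUTION_INSTRUCTION_DATA = [
    # (minimum gaze frequency, frame rate in fps, quality label)
    (10, 30, "high"),
    (5,  15, "medium"),
    (0,   5, "low"),
]

def lookup_conditions(gaze_frequency):
    """Return the (frame_rate, quality) of the first record whose
    frequency threshold the observed gaze frequency meets."""
    for min_freq, frame_rate, quality in DISTRIBUTION_INSTRUCTION_DATA:
        if gaze_frequency >= min_freq:
            return frame_rate, quality
    return None

print(lookup_conditions(12))  # (30, 'high')
print(lookup_conditions(2))   # (5, 'low')
```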
In step 415, the distribution instruction unit 200 of the terminal device 106 compares the current distribution conditions with the new distribution conditions. If they do not match (YES), the process proceeds to step 416; otherwise (NO), the process returns to step 411.
In step 416, the distribution instruction unit 200 of the terminal device 106 transmits an image distribution change request to the monitoring imaging device. The requested distribution frame rate and image quality are the values retrieved in step 414. The distribution frame rate and image quality specified in this step are stored as the current distribution conditions. After this, the process returns to step 411.
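The compare-and-request logic of steps 415 and 416 can be sketched as follows; `send_change_request` is a hypothetical stand-in for the network message of step 416, and the condition values are placeholders.

```python
# Sketch of steps 415-416: compare current and new distribution conditions,
# and send a change request to the camera only on mismatch; the newly
# requested conditions then become the stored current conditions.

def maybe_update_conditions(current, new, send_change_request):
    """current/new: (frame_rate, quality) tuples. Returns the conditions
    in effect after this check."""
    if new != current:             # step 415: mismatch
        send_change_request(*new)  # step 416: request the change
        return new                 # store as current conditions
    return current

sent = []
current = (5, "low")
current = maybe_update_conditions(current, (30, "high"), lambda *a: sent.append(a))
current = maybe_update_conditions(current, (30, "high"), lambda *a: sent.append(a))
print(current, sent)  # (30, 'high') [(30, 'high')] -- only one request was sent
```

The second call is a no-op because the conditions already match, which is exactly what keeps the loop in FIG. 4B from re-requesting the same conditions every iteration.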
As described so far, the present invention provides a method of obtaining the gaze frequency for a monitoring image in real time and performing distribution while varying the distribution conditions according to that frequency. The higher the gaze frequency, the higher the frame rate and image quality at which the monitoring image is displayed; conversely, the lower the gaze frequency, the lower the frame rate and image quality. In general, higher frame rates and higher image quality mean higher network traffic, and lower frame rates and lower image quality mean lower network traffic. The present invention therefore makes it possible to optimize the network traffic of the entire system.
In the description so far, for simplicity, a configuration of five monitoring imaging devices and one terminal device has been shown, but any other number of devices may be connected to the network.
(Second Embodiment)
Next, a second embodiment of the present invention will be described with reference to FIGS. 5 to 8.
FIG. 5 is a diagram showing the configuration of a monitoring system according to another embodiment of the present invention.
In FIG. 5, devices with the same reference numerals as in FIG. 1 denote the same devices.
The monitoring system comprises a network 100, monitoring imaging devices 101 to 105, a terminal device 506, and a user imaging device 107.
The terminal device 506 is a device such as a computer, comprising an arithmetic circuit such as a CPU, temporary storage memory such as RAM, a recording medium such as an HDD, a data transmission bus, an external input/output interface, a power supply circuit, a screen such as a liquid crystal display, and user input devices such as a keyboard and mouse.
The terminal device 506 stores the image data from the monitoring imaging devices 101 to 105, input from the network 100 through the external input/output interface, in the temporary storage memory. The stored image data is converted into a form suitable for display using the arithmetic circuit and displayed on the screen. The recording medium stores a software suite including the method of the present invention, an OS (operating system), a set of images (person feature amounts) of the users (observers), and the like. User operations on the terminal device are performed through the user input devices.
The user imaging device 107 is placed at a position from which the face of the user (observer) can be photographed from the front, for example, near the screen of the terminal device 506.
Next, the functional configuration of a monitoring system according to another embodiment of the present invention will be described with reference to FIG. 6.
FIG. 6 is a diagram showing the functional configuration of a monitoring system according to another embodiment of the present invention.
In FIG. 6, the same reference numerals as in FIGS. 1, 2, and 5 denote the same devices.
The terminal device 506 comprises the following functional units: a distribution instruction unit 200, a monitoring image receiving unit 201, a screen display unit 202, a user image acquisition unit 203, a face detection unit 204, a face feature amount calculation unit 608, a person identification unit 609, a person recording unit 610, and a control unit 611.
The face feature amount calculation unit 608 is a functional unit that calculates feature amounts, using image recognition techniques, for the face area detected in the image by the face detection unit 204.
The person identification unit 609 is a functional unit that identifies the observer detected by the face detection unit 204 by matching the feature amounts obtained by the face feature amount calculation unit 608 against the database stored in the person recording unit 610.
The person recording unit 610 is a functional unit that records the person feature amounts of observers (face feature amounts in this embodiment) as person data, together with each person's ID (a name or other identification information assigned to the person). The feature amounts used here are image feature amounts for identifying the individual shown in an image, for example, the positional relationships of organs such as the eyes, nose, and mouth, or their contour information. In this example, any feature amounts may be used.
The person recording unit 610 also records, for each person, the distribution conditions for the monitoring imaging devices 101 to 105. Here, the distribution conditions comprise the distribution frame rate, the image quality, and the like. The person data is given in advance, and may be given in any manner. The form of the person data will be described later.
The control unit 611 is a functional unit that controls the functional units of the terminal device 506.
Next, the form of the person data in the person recording unit 610 will be described with reference to FIG. 7.
FIG. 7 is a diagram showing the form of the person data in the person recording unit of a monitoring system according to another embodiment of the present invention.
The person data 700 records the information of one person as one record. This embodiment shows the information of three observers, that is, a state in which three records are stored, comprising records 701 to 703. This data is set in advance.
Each record comprises a record number cell 710, a person ID cell 711, a feature amount cell 712, distribution frame rate cells 713 to 717, and image quality cells 718 to 722.
The record number cell 710 is an area for storing the record number. The record number is a number used to manage records, for example, a consecutive integer value uniquely assigned to each record.
The person ID cell 711 is an area for storing the person ID of the observer. This embodiment shows an example in which a character string (the person's name) is stored, but an integer value or the like may be used as long as it is identification information unique to the person.
The feature amount cell 712 is an area for storing the person feature amount of the observer. For ease of explanation, this embodiment shows an example in which a decimal value is stored as a one-dimensional value, but a multidimensional value may be used.
The distribution frame rate cells 713 to 717 are areas for storing distribution frame rates. The distribution frame rate is the number of images output from an imaging device in a predetermined time, for example, one second. The distribution frame rate cell 713 stores the distribution frame rate for the monitoring imaging device 101; cell 714, the rate for the monitoring imaging device 102; cell 715, the rate for the monitoring imaging device 103; cell 716, the rate for the monitoring imaging device 104; and cell 717, the rate for the monitoring imaging device 105. Any value may be stored in these cells; in this embodiment, symbols are shown in place of specific numerical values.
The image quality cells 718 to 722 are areas for storing image quality. The image quality is defined by, for example, the image resolution, the compression rate, or a combination thereof. Any value may be stored in these cells; in this example, symbols are shown in place of specific numerical values.
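The record layout of the person data 700 described above can be sketched as a simple structure. The person name and all numeric values below are placeholders standing in for the symbolic cells of FIG. 7.

```python
from dataclasses import dataclass, field

@dataclass
class PersonRecord:
    """One record of the person data 700: record number (cell 710),
    person ID (711), feature amount (712), and per-camera distribution
    conditions (cells 713-717 and 718-722). Values are placeholders."""
    record_number: int
    person_id: str
    feature: float                                   # one-dimensional, as in the example
    frame_rates: dict = field(default_factory=dict)  # camera id -> fps
    qualities: dict = field(default_factory=dict)    # camera id -> quality label

record_702 = PersonRecord(
    record_number=702,
    person_id="Suzuki",  # hypothetical observer name
    feature=4.1,
    frame_rates={101: 30, 102: 15, 103: 15, 104: 5, 105: 5},
    qualities={101: "high", 102: "medium", 103: "medium", 104: "low", 105: "low"},
)
print(record_702.frame_rates[101])  # 30
```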
Next, the flow of processing in the terminal device 506 will be described with reference to FIG. 8.
FIG. 8 is a flowchart for explaining the flow of processing in the terminal device of a monitoring system according to another embodiment of the present invention.
The processing in the terminal device 506 consists of FIG. 8A and FIG. 8B. Since the processing in FIG. 8B is performed for each monitoring imaging device, in the system configuration of FIG. 5 with five monitoring imaging devices, a total of six processes run in parallel inside the terminal device 506: one instance of the processing in FIG. 8A and five instances of the processing in FIG. 8B.
First, the flow of processing in FIG. 8A will be described.
In step 800, the control unit 611 of the terminal device 506 waits for a predetermined time. The length of the waiting time is set in advance. After the predetermined time has elapsed, the process proceeds to step 801.
In step 801, the user image acquisition unit 203 of the terminal device 506 acquires a user image from the user imaging device 107.
In step 802, the face detection unit 204 of the terminal device 506 performs face detection on the user image data acquired in step 801. Face detection may be performed, for example, by computing the difference from a background image and judging by the area or shape of the difference region, or by searching the image for a region having facial image feature patterns, such as the arrangement of major components (eyes, nose, mouth) learned and prepared in advance, or the contrast between the forehead and the eyes. Any of these methods may be used in this embodiment of the present invention.
If a face is detected in step 802 (YES), the face detection unit 204 proceeds to step 803; if no face is detected (NO), it proceeds to step 808.
In step 803, the control unit 611 of the terminal device 506 turns OFF the no-observer flag and advances the process to step 804. For simplicity of explanation this is not shown explicitly in FIG. 8, but the no-observer flag is ON in the initial state.
In step 804, the face feature amount calculation unit 608 of the terminal device 506 calculates feature amounts for the face area obtained in step 802.
The feature amounts calculated here include, for example, image feature amounts such as hair and skin color, the shape and direction of the face contour, and the size, shape, and arrangement of major components such as the eyes, nose, and mouth; in this embodiment of the present invention, any kind and number of feature amounts may be used.
In step 805, the person identification unit 609 of the terminal device 506 performs feature amount matching. Specifically, the feature amount calculated in step 804 is compared in turn (by computing a degree of match) against all the feature amounts of the persons in the person data 700 recorded in advance in the person recording unit 610, and the feature amount with the highest match is found. The person with this feature amount is taken as the most similar person.
Here, the degree of match is a numerical value indicating the closeness between images (image feature amounts); for its calculation, reference can be made to papers such as Non-Patent Document 1, "A Representation Model for Large-Scale Image Collections" (Atsushi Hiroike et al., Journal of the Society of Photographic Science and Technology of Japan, 2003, Vol. 66, No. 1, pp. 93-101). In this example, a smaller value indicates a higher degree of match. For example, if the feature amount calculated in step 804 is 4.5 and the degree of match is calculated as the difference between feature amount values, then matching against the values stored in the feature amount cells 712 of the three records 701 to 703 yields, as the most similar person, the person stored in record 702, whose difference, that is, degree of match, is 0.4. In the present invention, any method may be used to determine the matching order and to calculate the degree of match.
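The matching of step 805 and the numerical example above can be reproduced with a one-dimensional nearest-neighbor search in which the degree of match is the absolute difference between feature values. Record 702's feature of 4.1 gives the 0.4 match from the example; the other feature values and the person names are placeholders.

```python
# Sketch of step 805: compare the observed feature amount against every
# recorded person and keep the smallest difference (smaller = better match).
# Only record 702's feature is chosen to reproduce the text's 0.4 example;
# the rest of the data is hypothetical.

records = {
    701: ("Sato",   2.0),
    702: ("Suzuki", 4.1),
    703: ("Tanaka", 7.0),
}

def most_similar(observed_feature):
    best_record, best_match = None, float("inf")
    for record_no, (person_id, feature) in records.items():
        match = abs(observed_feature - feature)  # degree of match
        if match < best_match:
            best_record, best_match = record_no, match
    return best_record, best_match

record_no, match = most_similar(4.5)
print(record_no, round(match, 1))  # 702 0.4
```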
In step 806, the person identification unit 609 of the terminal device 506 determines whether the person is an observer. The determination compares the degree of match of the most similar person obtained in step 805 with a predetermined value (threshold): if the degree of match is at or below the threshold, the person is judged to be an observer. The threshold is set in advance. For example, if the threshold is 0.6, then since the degree of match of the most similar person obtained in step 805 is 0.4, the person photographed by the user imaging device 107 is judged to be the person stored in record 702, that is, an observer.
If an observer is present in step 806 (YES), the person identification unit 609 proceeds to step 807; if not (NO), it proceeds to step 808.
In step 807, the control unit 611 of the terminal device 506 sets the observer ID to the person ID of the observer determined in step 806, for example, the value stored in the person ID cell 711 of record 702. After this, the process returns to step 800. For simplicity of explanation this is not shown explicitly in FIG. 8, but the observer ID is unset in the initial state.
In step 808, the control unit 611 of the terminal device 506 turns ON the no-observer flag, clears the observer ID, and returns to step 800.
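Steps 803 and 806 to 808 together maintain the no-observer flag and the observer ID. A simplified sketch of that state update follows; it folds the flag handling of step 803 into the identification outcome, uses the example threshold of 0.6, and the person ID is hypothetical.

```python
# Sketch of the state update in steps 803/806-808: given the best degree of
# match from step 805, either record the observer's ID or mark "no observer".
# The 0.6 threshold is the example value from the text.

class ObserverState:
    def __init__(self):
        self.no_observer = True   # flag is ON in the initial state
        self.observer_id = None   # unset in the initial state

    def update(self, person_id, degree_of_match, threshold=0.6):
        if degree_of_match <= threshold:  # step 806: observer present
            self.no_observer = False      # step 803
            self.observer_id = person_id  # step 807
        else:                             # step 808: no observer
            self.no_observer = True
            self.observer_id = None

state = ObserverState()
state.update("Suzuki", 0.4)
print(state.no_observer, state.observer_id)  # False Suzuki
state.update("Suzuki", 0.9)
print(state.no_observer, state.observer_id)  # True None
```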
Next, the flow of processing in FIG. 8B will be described.
In step 810, the control unit 611 of the terminal device 506 waits for a predetermined time. The length of the waiting time is set in advance. After the predetermined time has elapsed, the process proceeds to step 811.
In step 811, the control unit 611 of the terminal device 506 determines whether the no-observer flag is ON. If the flag is OFF (NO), the process proceeds to step 812; if it is ON (YES), the process returns to step 810.
In step 812, the control unit 611 of the terminal device 506 determines whether the observer ID has changed. If the observer ID has changed since the previous determination (YES), the process proceeds to step 813; if there is no change (NO), the process proceeds to step 815.
In step 813, the control unit 611 of the terminal device 506 acquires the distribution conditions. From the person data 700 stored in the person recording unit 610, the record whose person ID matches the observer ID is identified, and the distribution frame rate values and image quality values are retrieved from the distribution frame rate cells and image quality cells of that record.
In step 814, the distribution instruction unit 200 of the terminal device 506 transmits an image distribution request to the monitoring imaging device. The requested distribution conditions, that is, the distribution frame rate and image quality, are the values retrieved in step 813.
In step 815, the monitoring image receiving unit 201 of the terminal device 506 waits for image reception. When an incoming image from the monitoring imaging device is detected, the process proceeds to step 816.
In step 816, the monitoring image receiving unit 201 of the terminal device 506 receives the monitoring image from the monitoring imaging device.
In step 817, the screen display unit 202 of the terminal device 506 decompresses the compressed image and converts its format as necessary, and displays the received monitoring image data on the screen. In doing so, it refers to the distribution conditions acquired in step 813 and controls the display accordingly: an image delivered at high quality is shown in a large area and one at low quality in a small area, while an image delivered at a high frame rate is shown at the top of the screen and one at a low frame rate at the bottom. This display control is coordinated among the five processes running in parallel so that their display positions do not overlap. After this processing, the process returns to step 811.
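The layout rule of step 817 can be sketched as a simple mapping from distribution conditions to display area and position. All names, thresholds, and sizes below are invented for illustration; the patent only states the qualitative rule.

```python
# Hypothetical sketch of the step-817 display-control rule: high image
# quality maps to a large display area and low quality to a small one;
# a high distribution frame rate places the view at the top of the
# screen and a low frame rate at the bottom.

def layout_for(quality, frame_rate, high_fps_threshold=15):
    """Return (area, position) for one monitoring view."""
    area = "large" if quality == "high" else "small"
    position = "top" if frame_rate >= high_fps_threshold else "bottom"
    return area, position

print(layout_for("high", 30))  # ('large', 'top')
print(layout_for("low", 5))    # ('small', 'bottom')
```

In the described system, a coordinating step would additionally offset the five parallel views so their regions do not overlap.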
As described above, one embodiment of the present invention provides a scheme that identifies the supervisor of a monitoring image in real time and performs distribution while varying the distribution conditions according to that supervisor. It also provides a scheme that suspends distribution when no person is present in front of the terminal device, or when a person is present but is not a registered supervisor. Together, these reduce wasted network traffic across the entire system.
In the above description, for simplicity, a configuration with five monitoring imaging devices and one terminal device was shown; however, any other number of devices may be connected to the network.

Although the distribution source of the monitoring video was shown as an imaging device, it may instead be a recording device or a re-distribution device.

Although the person identification processing was shown as being performed on the terminal device, it may instead be performed on a separate device, for example, a server device.

Furthermore, although the first and second embodiments of the present invention were shown as separate examples, the first and second embodiments may be combined.
The monitoring system according to an embodiment of the present invention can reduce network traffic.
While one embodiment of the present invention has been described in detail above, the present invention is not limited to the embodiment described, and various modifications may be made without departing from the spirit of the invention.
By changing the image quality and frame rate of the distributed image according to how frequently the supervisor gazes at it, the invention is applicable wherever network traffic is to be reduced.
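The gaze-frequency rule stated above can be illustrated as a small mapping from gaze frequency to distribution conditions. The thresholds, units, and returned values below are invented for illustration; the patent only specifies that more frequently watched images receive higher frame rate and quality.

```python
# Hypothetical illustration of the gaze-frequency rule: the more often
# the supervisor gazes at a given camera's image, the higher the
# distribution frame rate and image quality requested for it, so that
# rarely watched streams consume little network bandwidth.

def conditions_from_gaze(gaze_frequency):
    """Map a gaze frequency (e.g. gazes per minute) to (fps, quality)."""
    if gaze_frequency >= 10:
        return 30, "high"
    if gaze_frequency >= 3:
        return 10, "medium"
    return 1, "low"  # rarely watched: minimal traffic

print(conditions_from_gaze(12))  # (30, 'high')
print(conditions_from_gaze(0))   # (1, 'low')
```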
100: network, 101-105: monitoring imaging devices, 106: terminal device, 107: user imaging device, 200: distribution instruction unit, 201: monitoring image receiving unit, 202: screen display unit, 203: user image acquisition unit, 204: face detection unit, 205: gaze determination unit, 206: determination result holding unit, 207: control unit, 300: distribution instruction data, 301-307: records, 310: frequency cell, 311: distribution frame rate cell, 312: image quality cell, 506: terminal device, 608: face feature amount calculation unit, 609: person identification unit, 610: person recording unit, 611: control unit, 700: person data, 701-703: records, 710: record number cell, 711: person ID cell, 712: feature amount cell, 713-717: distribution frame rate cells, 718-722: image quality cells.
Claims (2)
- A monitoring system comprising an imaging device, a user imaging device, and a terminal device, wherein
the imaging device has means for changing the image quality and frame rate of a distribution image,
the user imaging device photographs a supervisor of the terminal device, and
the terminal device has means for obtaining, from images captured by the user imaging device, the frequency at which the supervisor gazes at a displayed image, means for determining the image quality or frame rate of the distribution image based on the gaze frequency, and means for issuing a distribution instruction to the imaging device based on the determined result.
- A monitoring system comprising an imaging device, a user imaging device, and a terminal device, wherein
the imaging device has means for changing the image quality and frame rate of a distribution image,
the user imaging device photographs a supervisor of the terminal device, and
the terminal device has means for displaying the distribution image from the imaging device on a screen, means for identifying the supervisor as an individual from images captured by the user imaging device, means for determining the image quality and frame rate of the distribution image based on the identification result, and means for issuing a distribution instruction to the imaging device based on the determined result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017544429A (JP6632632B2) | 2015-10-09 | 2016-09-13 | Monitoring system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015201025 | 2015-10-09 | ||
JP2015-201025 | 2015-10-09 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017061239A1 (en) | 2017-04-13 |
Family
ID=58487501
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/076955 WO2017061239A1 (en) | 2015-10-09 | 2016-09-13 | Surveillance system |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6632632B2 (en) |
WO (1) | WO2017061239A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7500167B2 (en) * | 2019-08-23 | 2024-06-17 | キヤノン株式会社 | Display control device, control method thereof, program thereof, and storage medium thereof |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008072447A (en) * | 2006-09-14 | 2008-03-27 | Fujitsu Ltd | Image distribution system, image distribution program, image distribution method |
JP2008167101A (en) * | 2006-12-28 | 2008-07-17 | Hitachi Ltd | Video recorder for monitoring |
JP2008219484A (en) * | 2007-03-05 | 2008-09-18 | Victor Co Of Japan Ltd | Monitoring camera, display control device, and monitoring system |
JP2010233114A (en) * | 2009-03-27 | 2010-10-14 | Sogo Keibi Hosho Co Ltd | Monitoring system and image transfer method of the same |
JP2011199737A (en) * | 2010-03-23 | 2011-10-06 | Hitachi Ltd | Apparatus and method for recording of monitor video, and program |
2016
- 2016-09-13: JP application JP2017544429A, patent JP6632632B2 (active)
- 2016-09-13: WO application PCT/JP2016/076955, publication WO2017061239A1 (application filing)
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021052301A (en) * | 2019-09-25 | 2021-04-01 | 株式会社日立国際電気 | Monitoring system and monitoring method |
JP7252107B2 (en) | 2019-09-25 | 2023-04-04 | 株式会社日立国際電気 | Monitoring system and monitoring method |
Also Published As
Publication number | Publication date |
---|---|
JPWO2017061239A1 (en) | 2018-07-12 |
JP6632632B2 (en) | 2020-01-22 |
Similar Documents
Publication | Title |
---|---|
WO2021232587A1 (en) | Bifocal camera temperature measurement method based on image processing, and related device |
CN110089104B (en) | Event storage device, event search device, and event alarm device |
JP5567853B2 (en) | Image recognition apparatus and method |
JP5213105B2 (en) | Video network system and video data management method |
TWI430186B (en) | Image processing apparatus and image processing method |
JP4701356B2 (en) | Privacy protection image generation device |
JP5669082B2 (en) | Verification device |
JP5999395B1 (en) | Imaging device, recording device, and video output control device |
JP2015072578A (en) | Person identification apparatus, person identification method, and program |
KR102249498B1 (en) | The Apparatus And System For Searching |
JP2011060058A (en) | Imaging apparatus and monitoring system |
WO2017046838A1 (en) | Specific person detection system and specific person detection method |
CN113495629A (en) | Notebook computer display screen brightness adjusting system and method |
JP5088463B2 (en) | Monitoring system |
JP5865584B2 (en) | Specific person detection system and detection method |
WO2017061239A1 (en) | Surveillance system |
US10783365B2 (en) | Image processing device and image processing system |
CN113409056B (en) | Payment method and device, local identification equipment, face payment system and equipment |
JP2019029747A (en) | Image monitoring system |
CN112488647B (en) | Attendance checking system and method, storage medium and electronic equipment |
JP2015073191A (en) | Image processing system and control method therefor |
JP7329967B2 (en) | Image processing apparatus, system, image processing apparatus control method, and program |
TWI503759B (en) | Cloud-based smart monitoring system |
JP2012049774A (en) | Video monitoring device |
WO2022057329A1 (en) | Safety monitoring method, apparatus, and system, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16853393; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017544429; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 16853393; Country of ref document: EP; Kind code of ref document: A1 |