WO2016170765A1 - Target identification system, target identification method and program storage medium - Google Patents
- Publication number
- WO2016170765A1 (PCT/JP2016/002051)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- detection target
- target
- detection
- condition
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- the present invention relates to a target specifying system, a target specifying method, and a program recording medium for specifying a detection target from a plurality of targets.
- a safer society can be realized if detection targets, such as criminals and persons requiring attention, can be reliably detected even in a crowded environment.
- with face authentication technology, a detection target can be authenticated from an image taken by a surveillance camera or the like, and the detection target can thus be extracted from the captured image.
- Patent Document 1 discloses a method that allows an observer to easily confirm a person's identity by performing face authentication between a person captured by a monitoring camera and stored person images.
- the detection target can be marked by using a color ball, paint bullet, color spray, or the like that attaches a liquid containing a special dye to the detection target.
- a technique for reliably hitting the moving detection target with such an object is required.
- Patent Document 2 discloses an apparatus that can cause a spotlight irradiation position to follow the position of an unspecified moving body. According to the apparatus of Patent Document 2, a spotlight can be kept on a moving subject as a mark.
- a detection target can be detected by a monitoring camera, and a spotlight can be irradiated as a mark only to the detected detection target.
- however, if a conspicuous mark is attached to a detection target detected in a crowd, a panic may occur when the surroundings notice the mark, making it difficult to specify the detection target. Moreover, the detection target itself may notice that it has been marked, which may trigger it to escape.
- An object of the present invention is to solve the above-described problems by providing a target specifying system in which only the detector can recognize desired information projected on the detection target.
- The target specifying system of the present invention includes an information input/output device and an information recognition device. The information input/output device has: an imaging means that images a detection area to generate image data, measures the brightness of the detection area to generate photometric data, and outputs the image data and the photometric data; a target specifying means that extracts feature data from the input image data, collates the feature data extracted from the image data against feature data of the detection target stored in a database, and, when the detection target is detected, outputs a notification signal including the collation result and the position coordinates of the detection target; a control means that performs control for causing the imaging means to image the detection area, sets control conditions for the signal light based on the photometric data and the notification signal, and performs control to emit the light; and a projection means that emits signal light toward the detection target in accordance with the control of the control means. The information recognition device includes a shutter that controls the amount of incident light reaching the detector that detects the detection target, sets an opening/closing condition of the shutter based on the control conditions, and controls the opening and closing of the shutter in accordance with the notification signal.
- In the target specifying method of the present invention, a detection area is imaged to generate image data, and the brightness of the detection area is measured to generate photometric data. Feature data is extracted from the image data and collated against the feature data stored in a database storing feature data of the detection target, and the position coordinates of the detection position of the detection target are extracted. When the detection target is detected, a notification signal including the collation result and the position coordinates of the detection target is generated. Control conditions for the signal light to be irradiated onto the detection target are set based on the photometric data and the notification signal, and the signal light is emitted based on the control conditions. Further, based on the control conditions, an opening/closing condition of a shutter that controls the incident amount of the light that is irradiated onto the detection target and reaches the detector detecting the detection target is set, and the shutter is opened and closed in accordance with the notification signal.
- The program recording medium of the present invention records a target specifying program that causes a computer to execute: processing of imaging a detection area to generate image data and measuring the brightness of the detection area to generate photometric data; processing of extracting feature data from the image data, collating the extracted feature data against the feature data stored in a database storing feature data of the detection target, and extracting the position coordinates of the detection position of the detection target; processing of generating, when the detection target is detected, a notification signal including the collation result and the position coordinates; processing of setting control conditions for the signal light based on the photometric data and the notification signal and emitting the signal light toward the detection target; processing of setting, based on the control conditions, an opening/closing condition of a shutter that controls the amount of incident light reaching the detector that detects the detection target; and processing of controlling the opening and closing of the shutter in accordance with the notification signal.
- FIG. 1 is a block diagram showing the configuration of an information input/output device according to a first embodiment of the present invention. Also provided are block diagrams showing the configurations of the imaging means and the projection means according to the first embodiment of the present invention, and a block diagram showing the configuration of an example of the projection means according to the first embodiment.
- FIG. 1 is a conceptual diagram showing an example of a usage scene of the target identification system 1 according to the present embodiment.
- the interface device 100 captures an image of the detection area 200 and verifies whether or not the detection target 210 is included in the captured detection area 200.
- the interface device 100 detects the detection target 210 within the detection area 200, the interface device 100 projects the marking light 300 onto the detection target 210.
- the marking light 300 emitted by the projection unit 30 is emitted as pulses so as to lower its average illuminance.
- a detector that recognizes the marking light 300 can capture the marking light 300 emitted in a pulse shape using the shutter glasses 600 and visually recognize the marking light 300 with relatively improved contrast.
- the shutter glasses 600 are realized by the same principle as the secure display disclosed in Non-Patent Document 1, for example.
- the shutter glasses 600 may be one using a technique different from that of Non-Patent Document 1.
- Non-Patent Document 1: “Liquid Crystal Privacy-Enhanced Displays for Mobile PCs”, J. Ishii, G. Saito, M. Imai and F. Okumura, SID'09 Digest, pp. 243-246, 2009.
- Under daytime sunlight, the illuminance can be as high as 50000 to 60000 lux.
- the detection target 210 is likely to escape if it notices the marking light 300. Therefore, it is necessary to irradiate the detection target 210 with signal light strong enough to be detected in daylight, without the light being recognized by the surroundings.
- accordingly, the marking light 300, which would be visible to the naked eye under sunlight, is irradiated onto the detection target 210 in the form of very short pulses.
- the average illuminance of the pulsed marking light 300 is set to about 10 to 50 lux.
- the contrast of the signal light with respect to the background light is 1000: 1 or more, and the signal light is hardly visible with the naked eye.
- FIG. 2 is a conceptual diagram for explaining the illuminance of the marking light 300 irradiated on the detection target 210.
- the illuminance of the marking light 300 applied to the detection target 210 is 1000 lux
- the average illuminance of the marking light 300 is 50 lux. If the contrast of the marking light 300 irradiated to the detection target 210 is sufficiently small with respect to the background light, the irradiated signal light cannot be recognized with the naked eye. This is because the naked eye cannot detect light whose contrast is too small with respect to the brightness of the background light.
- FIG. 2 illustrates signal light with constant luminance and pulse interval, but the pulse interval of signal light can be arbitrarily set. For example, when the luminance of the signal light is lowered, the pulse interval can be made close and the brightness visually perceived by the detector can be made constant.
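The relationship between peak illuminance, duty cycle, average illuminance, and contrast described above can be sketched as follows. This is a minimal illustration; the function names are placeholders, and the 1000 lux / 50 lux / 50000 lux figures are taken from the examples in the text:

```python
def average_illuminance(pulse_illuminance_lux, pulse_width_s, pulse_period_s):
    """Average illuminance of pulsed light = peak illuminance x duty cycle."""
    duty_cycle = pulse_width_s / pulse_period_s
    return pulse_illuminance_lux * duty_cycle

def contrast_ratio(background_lux, signal_avg_lux):
    """Contrast of the background light against the time-averaged signal light."""
    return background_lux / signal_avg_lux

# Example from the text: 1000 lux pulses at a 5% duty cycle
avg = average_illuminance(1000, pulse_width_s=0.001, pulse_period_s=0.02)
print(avg)                         # about 50 lux average illuminance
print(contrast_ratio(50000, avg))  # about 1000:1 against 50000 lux sunlight
```

Conversely, as the text notes, the pulse illuminance can be lowered and the pulse interval shortened so that the product of peak illuminance and duty cycle, and hence the brightness perceived by the detector, stays constant.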
- FIG. 3 is a conceptual diagram for explaining the visibility when the marking light 300 irradiated to the detection target 210 is viewed with the naked eye when sunlight is used as background light.
- the contrast between the illuminance of the background light and the illuminance of the projection light becomes 1000: 1. That is, the marking light 300 cannot be visually recognized with the naked eye.
- in this way, the detection target 210 is irradiated with marking light 300 that cannot be detected with the naked eye.
- FIG. 4 is a conceptual diagram for explaining the visibility when the marking light 300 irradiated on the detection target 210 is viewed through the shutter glasses 600 when sunlight is used as background light.
- the illuminance of the marking light 300 and the shutter opening time are automatically set according to the environment.
- the marking light 300 can always be kept on the detection target 210.
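The contrast improvement obtained by viewing through the shutter glasses 600 can be sketched as follows. This is a hedged illustration: it assumes the shutter opens for exactly the pulse width, which the text only implies, and reuses the illuminance figures from the earlier examples:

```python
def naked_eye_contrast(background_lux, pulse_lux, duty_cycle):
    """To the naked eye, pulsed light integrates to its average illuminance,
    so its contrast against the background is very small."""
    return (pulse_lux * duty_cycle) / background_lux

def shutter_contrast(background_lux, pulse_lux):
    """If the shutter opens only while a pulse is emitted, the background
    light is gated by the same duty cycle, restoring the peak-to-background
    ratio for the detector."""
    return pulse_lux / background_lux

# 1000 lux pulses, 5% duty cycle, 50000 lux sunlight as background
print(naked_eye_contrast(50000, 1000, 0.05))  # 0.001 -> invisible to the naked eye
print(shutter_contrast(50000, 1000))          # 0.02  -> 20x higher relative contrast
```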
- FIG. 5 is an example in which the marking light 300 is projected only to the detection target 210 from among a plurality of people.
- the marking light 300 projected on the detection target 210 has a sufficiently small contrast with the background light, so that it cannot be visually recognized by surrounding people.
- the detector can visually recognize the marking light 300 by wearing the shutter glasses 600.
- the wearable interface device 100 detects the detection target 210, such as a criminal, and irradiates the detection target 210 with the marking light 300.
- even if a detector such as a police officer loses sight of the detection target 210, the detector can locate the detection target 210 by observing the marking light 300 irradiated by the interface device 100.
- FIG. 6 is a block diagram showing the configuration of the target identification system 1 according to this embodiment.
- the target identification system 1 includes an information input / output device 10 and an information recognition device 60.
- the information input / output device 10 corresponds to the interface device 100
- the information recognition device 60 corresponds to the shutter glasses 600.
- the information input / output device 10 includes an imaging unit 20, a control unit 40, and a projection unit 30.
- FIG. 8 is a block diagram showing the configuration of the imaging means 20 according to this embodiment.
- the imaging unit 20 includes an imaging device 21, an image processor 23, an internal memory 24, a photometric sensor 25, and a data output unit 26.
- At least one lens is incorporated in the imaging means 20.
- the lens incorporated in the imaging unit 20 may have a zoom function that can change the focal length.
- the imaging unit 20 may be equipped with an autofocus function for automatically focusing.
- it is preferable that the imaging unit 20 is equipped with a function applied to a general digital camera, such as a function of preventing camera shake.
- the imaging element 21 is an element for imaging an area including the detection area 200 and acquiring an image signal.
- the imaging element 21 is a photoelectric conversion element in which semiconductor components are integrated into an integrated circuit.
- the image sensor 21 can be realized by a solid-state image sensor such as a CCD (Charge-Coupled Device) or a CMOS (Complementary Metal-Oxide-Semiconductor), for example.
- the imaging element 21 is configured by an element that captures light in the visible region, but may be configured by an element that can capture and detect electromagnetic waves such as infrared rays, ultraviolet rays, X-rays, gamma rays, radio waves, and microwaves.
- the image processor 23 performs image processing such as dark current correction, interpolation calculation, color space conversion, gamma correction, aberration correction, noise reduction, and image compression on the image data captured by the image sensor 21.
- the image processor 23 can be realized by an integrated circuit dedicated to such processing. Note that when the image data is output without being processed, the image processor 23 can be omitted.
- the image processor 23 may be a processor designed to perform necessary processing.
- the internal memory 24 is a storage element that temporarily stores image data that cannot be processed immediately by the image processor 23, as well as processed image data.
- the image data captured by the image sensor 21 may be temporarily stored in the internal memory 24.
- the internal memory 24 may be configured by a general memory.
- the photometric sensor 25 is a sensor for measuring the illuminance of the detection area 200.
- the photometric sensor 25 can be realized by an electronic circuit that detects light using a photoelectric effect, such as a photoresistor or a photodiode. If the image sensor 21 has a photometric function, the photometric sensor 25 may be omitted.
- the photometric sensor 25 outputs the measured illuminance of the detection area 200 to the data output unit 26. Note that the illuminance of the detection region 200 acquired by the photometric sensor 25 may be used for the calculation of the image processor 23.
- as the photometric sensor 25, sensors using methods classified as mirror-meter photometry, finder optical path photometry, cut-capacitor photometry, direct-capacitor photometry, sub-mirror photometry, retractable sensor photometry, and the like can be applied.
- methods classified as stopped-down photometry, instantaneous stopped-down photometry, open-aperture photometry, and the like can also be applied.
- further, sensors using methods classified as spot photometry, partial photometry, full-surface photometry, center-weighted photometry, multi-segment photometry, and the like can be applied. Note that the photometry methods given here are merely examples, and the scope of the present invention is not limited thereto.
- the data output unit 26 outputs the processed image data processed by the image processor 23 to the control means 40. Further, the data output unit 26 outputs the photometric data (illuminance) of the detection area 200 measured by the photometric sensor 25 to the control means 40.
- FIG. 9 is a block diagram showing the configuration of the projection means 30 according to this embodiment.
- the projection unit 30 includes a light source 31, a light source driving unit 32, and a projection unit 38.
- the light source 31 has a light source that emits light 34 having a specific wavelength.
- the light source 31 is configured to emit light 34 in the visible light region.
- the light source 31 may be configured to emit light 34 other than the visible light region such as the infrared region and the ultraviolet region.
- the light source 31 may be a light source such as a laser light source, an LED (Light Emitting Diode), a lamp, a discharge lamp, or a light bulb. Note that the light emission source used for the light source 31 is not limited to the above.
- the light source 31 may be a reflector that reflects light emitted from another light source.
- the light source driving unit 32 drives the light source 31 based on the irradiation condition set by the control means 40 so that pulsed light with a specified output is emitted. Details of the irradiation conditions will be described later.
- Projection unit 38 projects light 34 emitted from light source 31 as signal light 39.
- the projection means 30 may be a projection means 30-1 equipped with a phase modulation type spatial modulation element as shown in FIG.
- FIG. 10 is a block diagram showing the configuration of the projection means 30-1 equipped with a phase modulation type spatial modulation element.
- the projection unit 30-1 includes a light source 310, a light source driving unit 320, a phase modulation element 350, a phase modulation element control unit 360, and a projection unit 380.
- the light source 310 has a light source that emits laser light 340 having a specific wavelength.
- the light source 310 is configured to emit laser light 340 in the visible light region.
- the light source 310 may be configured to emit laser light 340 other than the visible light region such as the infrared region or the ultraviolet region.
- the light source 310 does not uniformly project light onto an area within the projection range, but projects the light while concentrating it on a part.
- the projection unit 30-1 according to the present embodiment is often used for the purpose of projecting line drawings such as characters and symbols.
- in such uses, the required amount of laser light 340 is small, so the overall laser output can be suppressed. That is, the light source 310 can be composed of a small, low-power light emitter and a low-output power source for driving it.
- the light source driving unit 320 drives the light source 310 based on the irradiation condition set by the control means 40 so that pulsed light with a specified output is emitted.
- the phase modulation element 350 includes a phase modulation type spatial modulation element that receives coherent laser light 340 having the same phase and modulates the phase of the incident laser light 340.
- the phase modulation element 350 emits the modulated light 370 that is modulated toward the projection unit 380.
- in the display area of the phase modulation element 350, a phase distribution for displaying a target image on the projection surface is displayed.
- the modulated light 370 reflected from the display region of the phase modulation element 350 becomes an image in which a kind of diffraction grating forms an aggregate, and the target image is formed by collecting the light diffracted by these diffraction gratings.
- the phase modulation element 350 is realized by a spatial modulation element using, for example, a ferroelectric liquid crystal, a homogeneous liquid crystal, a vertical alignment liquid crystal, or the like.
- the phase modulation element 350 can be realized by, for example, LCOS (Liquid Crystal on Silicon). Further, the phase modulation element 350 may be realized by, for example, a MEMS (Micro Electro Mechanical System).
- the phase modulation element control unit 360 controls the phase modulation element 350 so as to change a parameter that determines the difference between the phase of the laser light 340 irradiated onto its display area and the phase of the modulated light 370 reflected from that display area.
- such parameters are, for example, optical characteristics including the refractive index and the optical path length.
- the phase modulation element control unit 360 changes the refractive index of the display area by controlling the voltage applied to the display area on the phase modulation element 350. As a result, the laser light 340 applied to the display area is appropriately diffracted based on the refractive index of the display area.
- the phase distribution of the laser light 340 irradiated onto the phase modulation element 350 is thus modulated according to the optical characteristics of the display area.
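A phase distribution that makes the diffracted light reconstruct a target image, as described above, is commonly computed with an iterative Fourier-transform (Gerchberg-Saxton) algorithm. The patent does not specify any particular algorithm, so the following NumPy sketch is only one plausible way to generate such a distribution:

```python
import numpy as np

def compute_phase_distribution(target_image, iterations=50):
    """Gerchberg-Saxton sketch: find a phase-only pattern whose far-field
    (Fourier-plane) intensity approximates the target image."""
    target_amp = np.sqrt(target_image / target_image.max())
    phase = np.random.uniform(0, 2 * np.pi, target_image.shape)
    for _ in range(iterations):
        # Constrain the element plane to uniform amplitude (phase-only element)
        field = np.exp(1j * phase)
        far = np.fft.fft2(field)
        # Impose the target amplitude in the image plane, keep the phase
        far = target_amp * np.exp(1j * np.angle(far))
        phase = np.angle(np.fft.ifft2(far))
    return phase  # values in [-pi, pi], to be displayed on the element
```

In practice the resulting phase map would be quantized to the levels supported by the LCOS or MEMS element before display.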
- control of the phase modulation element 350 by the phase modulation element control unit 360 is not limited to the above.
- Projection unit 380 includes an optical system that projects modulated light 370 reflected by phase modulation element 350 as signal light 390.
- FIG. 11 is a conceptual diagram for explaining the optical configuration of the projection means 30-1 according to the present embodiment.
- the laser light emitted from the light source 310 is converted into parallel laser light 340 by the collimator 53 and is incident on the display surface of the phase modulation element 350.
- the incident angle of the laser light 340 is made non-perpendicular to the display surface of the phase modulation element 350. That is, in the present embodiment, the emission axis of the laser light emitted from the light source 310 is inclined with respect to the display surface of the phase modulation element 350. If the emission axis of the laser light 340 is set obliquely to the display surface, the laser light 340 can be made incident on the display surface of the phase modulation element 350 without using a beam splitter, so the efficiency can be improved.
- the modulated light 370 modulated by the phase modulation element 350 is projected as signal light 390 by the projection unit 380.
- the projection unit 380 includes a Fourier transform lens 381, an aperture 382, and a projection lens 383.
- the Fourier transform lens 381 is an optical lens for forming an image formed when the modulated light 370 reflected by the display surface on the phase modulation element 350 is projected at infinity at a nearby focal position.
- the aperture 382 has a function of erasing higher-order light included in the light focused by the Fourier transform lens 381 and specifying an image region.
- the opening of the aperture 382 is set to be smaller than the image area of the projection image at the position of the aperture 382 and to block the peripheral area of the projection image.
- the opening of the aperture 382 is formed to be rectangular or circular.
- the aperture 382 is preferably installed at the focal position of the Fourier transform lens 381, but may be shifted from the focal position as long as the function of erasing high-order light can be exhibited.
- the projection lens 383 is an optical lens that magnifies and projects the light focused by the Fourier transform lens 381.
- the projection lens 383 projects the signal light 390 so that the target image corresponding to the phase distribution input to the phase modulation element 350 is displayed on the projection surface.
- when the detection target 210 is included in the detection area 200 imaged by the imaging unit 20, the control unit 40 controls the projection unit 30 to project an appropriate image toward the detection target 210.
- the control unit 40 provides the projection unit 30 with image information corresponding to an operation performed on the user interface image included in the detection area 200 imaged by the imaging unit 20, and controls the projection unit 30 so as to project an image related to that image information.
- the control means 40 can be realized by the function of a microcomputer including, for example, an arithmetic device and a control device.
- control unit 40 may be configured to transmit the result of detecting the detection target 210 and information related to the operation performed on the detection area 200 to a host system such as a server at a predetermined timing.
- the control means 40 includes an imaging control unit 41, an irradiation condition setting unit 42, a processing unit 43, a storage unit 44, a projection control unit 45, and a communication unit 46.
- the control unit 40 includes a target specifying unit 50 for specifying the detection target 210.
- although FIG. 12 shows a configuration in which the control means 40 includes the target specifying means 50, the control means 40 and the target specifying means 50 may be configured separately. The target specifying means 50 will be described in detail later.
- the imaging control unit 41 controls the imaging unit 20 to image the detection area 200.
- the imaging control unit 41 causes the imaging unit 20 to image the detection area 200 at a predetermined timing. Further, the imaging control unit 41 acquires image data and photometry data output from the imaging unit 20.
- the imaging control unit 41 outputs the image data to the target specifying unit 50, outputs the photometric data to the irradiation condition setting unit 42, and outputs the image data and the photometric data to the processing unit 43.
- the irradiation condition setting unit 42 calculates an irradiation condition including a pulse condition and an output condition of the signal light emitted from the projection unit 30 based on the illuminance of the detection area 200 included in the photometric data.
- the pulse condition includes a condition relating to a pulse width corresponding to the irradiation time of the signal light and a repetition period of emitting the signal light.
- the output conditions include conditions regarding the output intensity of the signal light, the illuminance in the detection region 200 of the signal light emitted with the output intensity, and the like.
- the irradiation condition setting unit 42 outputs the calculated irradiation condition of the signal light to the processing unit 43 and outputs the pulse condition of the signal light to the communication unit 46.
- the irradiation condition setting unit 42 may be configured to transmit the irradiation conditions to the projection control unit 45 or the communication unit 46 without going through the processing unit 43.
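The role of the irradiation condition setting unit 42 can be sketched as follows. This is a hypothetical calculation, not the patent's actual method: the 1000:1 target contrast and the 20x peak factor are assumptions drawn from the illuminance examples earlier in this document:

```python
def set_irradiation_condition(background_lux,
                              target_contrast=1000.0,
                              pulse_width_s=0.001):
    """Choose pulse and output conditions so the time-averaged signal light
    stays below the visibility threshold relative to the background light."""
    max_average_lux = background_lux / target_contrast  # e.g. 50 lux under 50000 lux
    # Assumed factor: make the peak well above the average so the pulse is
    # clearly visible through the opened shutter.
    pulse_lux = 20 * max_average_lux                    # e.g. 1000 lux
    duty_cycle = max_average_lux / pulse_lux            # e.g. 0.05
    period_s = pulse_width_s / duty_cycle               # repetition period
    return {"pulse_lux": pulse_lux,
            "pulse_width_s": pulse_width_s,
            "period_s": period_s}

cond = set_irradiation_condition(50000)
print(cond["pulse_lux"])  # 1000.0 lux peak illuminance
```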
- the processing unit 43 acquires, from the target specifying unit 50 described later, a notification signal including the position coordinates at which the detection target 210 was detected, and also acquires the irradiation condition calculated by the irradiation condition setting unit 42.
- the notification signal includes a collation result indicating whether or not the detection target 210 is detected, and includes the position coordinates of the detection target 210 when the detection target 210 is detected.
- the processing unit 43 determines the position coordinates of the projection image on the captured image and generates control conditions for projecting the signal light onto the detection target 210. Based on the position coordinates of the detection target 210, the processing unit 43 calculates an irradiation direction condition, including the irradiation direction of the signal light irradiated toward the detection target 210, as one of the control conditions.
- the target specifying unit 50 detects the detection target 210 by face authentication. Therefore, the processing unit 43 determines the irradiation direction of the signal light so that the signal light can be irradiated to the body part below the face position of the detection target 210.
- the control conditions generated by the processing unit 43 include conditions relating to the pulse condition, output condition, and irradiation direction condition of the signal light.
- the processing unit 43 outputs the control conditions of the projection unit 30 to the projection control unit 45.
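Because the face is detected but the light should land on the body below the face, the irradiation direction condition could be derived roughly as follows. This is a hedged sketch: the pixel offset, field-of-view angles, and the linear pixel-to-angle conversion are all assumptions for illustration:

```python
def irradiation_direction(face_x_px, face_y_px,
                          image_w_px, image_h_px,
                          fov_h_deg=60.0, fov_v_deg=40.0,
                          body_offset_px=80):
    """Convert the detected face position on the image into pan/tilt angles,
    aiming below the face so the signal light hits the torso."""
    aim_x = face_x_px
    aim_y = face_y_px + body_offset_px  # image y grows downward
    # Normalized offsets from the image center, in [-0.5, 0.5]
    nx = aim_x / image_w_px - 0.5
    ny = aim_y / image_h_px - 0.5
    pan_deg = nx * fov_h_deg
    tilt_deg = -ny * fov_v_deg          # positive tilt = aiming upward
    return pan_deg, tilt_deg

pan, tilt = irradiation_direction(640, 300, 1280, 720)
print(pan, tilt)  # pan is 0 at image-center x; tilt aims below the face
```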
- the storage unit 44 stores a target image projected by the projection unit 30.
- the target image includes, for example, a predetermined image, such as an × mark or a ○ mark, for identifying that the target irradiated with it is the detection target 210.
- the storage unit 44 may be configured to store data other than the target image.
- the projection unit 30 includes a phase modulation type spatial modulation element as shown in FIG. 10, the storage unit 44 may store the phase distribution of the target image.
- when the projection control unit 45 acquires the control condition and the irradiation condition, it acquires an appropriate target image from the storage unit 44. The projection control unit 45 then provides the acquired target image to the projection means 30, and controls the projection means 30 based on the control conditions and the irradiation conditions to project the target image.
- the communication unit 46 transmits the pulse condition of the signal light set by the irradiation condition setting unit 42 to the projection unit 30.
- the communication unit 46 receives the notification signal output from the target specifying unit 50.
- FIG. 13 is a block diagram illustrating a configuration of the target specifying unit 50 according to the present embodiment.
- the target specifying unit 50 according to the present embodiment includes a database 51, an image acquisition unit 52, an image analysis unit 53, a verification unit 54, and a verification result output unit 55.
- the database 51 stores feature data of face images, including that of the detection target 210.
- the database 51 may store, for example, information on the shape of facial parts such as the nose, eyes, chin, cheekbones, and the relative positional relationship and size of these parts as feature data. Note that the data stored in the database 51 is not limited to the above. Further, the database 51 may store difference data between the feature data of the average face image created from a large number of face images and the feature data of each face image.
- in the present embodiment, the detection target 210 is detected using feature data of a human face image, but the detection target 210 may also be detected using feature data other than a face image.
- for example, a person's hairstyle and physique can be set as feature data.
- if a moving image can be analyzed, feature data such as a person's gait or manner of walking can also be set.
- the image acquisition unit 52 acquires image data from the imaging control unit 41.
- the image acquisition unit 52 outputs the acquired image data to the image analysis unit 53.
- the image analysis unit 53 acquires image data from the image acquisition unit 52 and analyzes the acquired image data.
- the image analysis unit 53 detects a face image corresponding to a person's face from the analyzed image data, and extracts feature data from the detected face image.
- the image analysis unit 53 extracts, from the face image, feature data such as the shapes of facial parts (nose, eyes, chin, cheekbones) and the relative positional relationship and sizes of these parts.
- the image analysis unit 53 associates the position data on the image data from which the face image is extracted with the feature data of the face image extracted from the image data.
- the image analysis unit 53 outputs data including the feature data of the extracted face image and the position coordinates corresponding to the face image to the collation unit 54.
- the collation unit 54 compares the facial image feature data extracted by the image analysis unit 53 with the individual facial image feature data stored in the database 51.
- the collation unit 54 may compare the feature data of the face images as they are, or may statistically quantify the face images and compare the results.
- a general algorithm may be used as a specific algorithm for comparing face images.
- the collation unit 54 can collate whether or not the detection target 210 is included in the detection region 200 imaged by the imaging unit 20 by performing template matching using a combination of characteristic facial parts.
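As a rough illustration of the collation described above, feature data can be compared as numeric vectors. The cosine-similarity scoring, the function names, and the threshold below are illustrative assumptions only; the disclosure itself permits any general face-matching algorithm.

```python
import math

def match_score(probe, reference):
    # Cosine similarity between two feature vectors (a hypothetical
    # encoding of facial-part shapes and relative positions/sizes).
    dot = sum(p * r for p, r in zip(probe, reference))
    norm = (math.sqrt(sum(p * p for p in probe))
            * math.sqrt(sum(r * r for r in reference)))
    return dot / norm if norm else 0.0

def find_detection_target(probe, database, threshold=0.95):
    # Return the index of the best database entry whose score meets the
    # threshold, or None when the detection target is not in the region.
    best_index, best_score = None, threshold
    for i, reference in enumerate(database):
        score = match_score(probe, reference)
        if score >= best_score:
            best_index, best_score = i, score
    return best_index
```

A returned index would correspond to a registered face in the database 51; `None` corresponds to the "No in step S14" branch described later in the flowchart.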
- when the detection target 210 is detected, the collation unit 54 outputs a determination result indicating that the detection target 210 has been detected to the collation result output unit 55. At this time, the collation unit 54 also outputs the position coordinates of the face image of the detection target 210 together with the determination result.
- the collation result output unit 55 outputs a notification signal, including information indicating that the detection target 210 has been detected and the position coordinates of the face image of the detection target 210, to the projection unit 30.
- FIG. 14 is a block diagram showing the configuration of the information recognition apparatus 60 according to this embodiment.
- the information recognition apparatus 60 according to the present embodiment includes a receiving unit 61, an opening / closing timing setting unit 62, an opening / closing control unit 63, a shutter driving unit 64, and a shutter 65.
- the information recognition device 60 includes a shutter 65 that controls the amount of incident light reaching the detector from the detection region 200.
- the information recognition device 60 sets the opening / closing conditions of the shutter 65 based on the control conditions set by the information input / output device 10.
- the information recognition device 60 controls the opening and closing of the shutter 65 in accordance with the notification signal output from the information input / output device 10.
- the receiving means 61 is means for receiving the data output from the information input / output device 10.
- the receiving unit 61 receives data by wireless communication, for example.
- the receiving unit 61 receives the control condition set by the irradiation condition setting unit 42 of the control unit 40.
- the receiving unit 61 outputs the received control condition to the opening / closing timing setting unit 62.
- the receiving means 61 may include an image memory of a general liquid crystal module.
- the opening / closing timing setting means 62 sets the opening / closing conditions of the shutter 65 based on the control conditions received from the receiving means 61.
- the opening / closing timing setting means 62 outputs the set shutter opening / closing conditions to the opening / closing control means 63.
- the opening / closing timing setting means 62 can be realized by a timing controller of a general liquid crystal module.
- the opening / closing timing setting means 62 sets the shutter 65 to open in a time zone including the timing when the signal light is emitted.
- the opening / closing timing setting means 62 does not always need to open the shutter 65 at the timing when the signal light is emitted.
- the opening / closing timing setting means 62 calculates a condition for relatively improving the contrast, with respect to the background light seen through the shutter 65, of the pulsed signal light irradiated onto the detection target 210. Then, the opening / closing timing setting means 62 sets a shutter opening / closing condition that makes the signal light visible, based on the calculated condition and the pulse condition included in the control condition.
- the pulse interval of the signal light may not be uniform.
- in another time zone, the luminance of the signal light is lowered compared with time zone A, but the apparent brightness is kept equal to that of the signal light in time zone A by narrowing the pulse interval. In this case, the opening / closing timing setting means 62 sets the time for opening the shutter 65 shorter than in time zone A in order to improve the contrast.
- the opening / closing timing setting means 62 sets the opening / closing timing of the shutter 65 based on the brightness of the background light measured by the information input / output device 10 and the average brightness of the signal light irradiated on the detection target 210.
- the background light is darker in the morning, in the evening, and in cloudy or rainy weather than in the daytime in fine weather.
- in such cases, the opening / closing timing setting means 62 sets the shutter opening / closing conditions so as to obtain an appropriate contrast while keeping the average luminance of the irradiated signal light low.
- in some situations, the background light is brighter than in normal daytime.
- in such a case, the opening / closing timing setting means 62 sets the average luminance of the irradiated signal light higher and sets the shutter opening / closing conditions so as to obtain an appropriate contrast.
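The trade-off described above, a shorter shutter-open time when the background is bright and a longer one when it is dark, can be sketched with a simplified contrast model. The formula, names, and the assumption that each signal pulse fits entirely inside an open window are illustrative, not taken from this disclosure.

```python
def shutter_open_fraction(background_luminance, signal_peak, target_contrast):
    # Fraction of each pulse period the shutter stays open so that the
    # signal/background contrast seen through the shutter reaches
    # target_contrast. Simplified model (an assumption): the signal pulse
    # is fully contained in an open window, so perceived signal brightness
    # is fixed while perceived background brightness scales with the open
    # fraction:  contrast = signal_peak / (background_luminance * fraction)
    open_fraction = signal_peak / (background_luminance * target_contrast)
    return min(open_fraction, 1.0)
```

Under a bright background the open fraction shrinks toward a short open window; under a dark background it saturates at 1.0, so the signal light can be kept weak.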
- the opening / closing control means 63 controls the shutter driving means 64 that opens and closes the shutter 65 based on the shutter opening / closing conditions set by the opening / closing timing setting means 62.
- the opening / closing control means 63 can be realized by a display controller of a general liquid crystal module.
- the opening / closing timing setting means 62 and the opening / closing control means 63 may be integrated into a single unit.
- the shutter driving unit 64 opens and closes the shutter 65 according to the control of the opening / closing control unit 63.
- the shutter driving unit 64 can be realized by an XY driver of a general liquid crystal module.
- the shutter 65 is opened and closed by the shutter driving means 64.
- the shutter 65 blocks external light when in the closed state and allows external light to pass through when in the open state.
- the shutter 65 need not completely block the external light in the closed state; it may instead merely dim the external light.
- the shutter 65 may be realized using, for example, a high-speed response liquid crystal, or may be realized using a MEMS (Micro Electro Mechanical System) shutter.
- when a TN (Twisted Nematic) liquid crystal is used, the brightness of the light passing through the shutter 65 falls to about 40% of the external light because of the polarizing plates. Therefore, the view through a shutter 65 using TN liquid crystal is darker than the view with the naked eye.
- with a MEMS shutter, since there is no polarizing plate, the view is brighter than with a shutter 65 using TN liquid crystal.
- the aperture ratio is not 100%, so it is darker than when viewed with the naked eye.
- because the entire field of view becomes darker, the pupil dilates, so the view feels brighter than when a TN liquid crystal is used.
- the structure of the shutter 65 is not limited to that described here as long as external light can be blocked, passed, or dimmed.
- FIG. 15 is an example in which the irradiation timing of the signal light and the opening / closing timing of the shutter 65 are synchronized.
- when receiving the pulse condition of the signal light, the information recognition device 60 controls the shutter 65, based on the pulse condition, to open at the timing when the signal light irradiates the detection target 210 and to close at the timing when the signal light is not irradiated.
- by opening and closing the shutter 65 in this way, the information recognition device 60 relatively improves the contrast of the signal light with respect to the background light sensed through the shutter 65, making the signal light visible.
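One way to derive shutter open/close windows from a received pulse condition might look like the following sketch. The periodic pulse train, the guard interval, and the function names are assumptions for illustration, not the exact control of FIG. 15.

```python
def shutter_windows(pulse_period_ms, pulse_width_ms, n_pulses, guard_ms=0.2):
    # One (open, close) window per signal-light pulse, padded by a small
    # guard interval so the shutter and pulse need not align perfectly.
    windows = []
    for k in range(n_pulses):
        pulse_start = k * pulse_period_ms
        windows.append((max(0.0, pulse_start - guard_ms),
                        pulse_start + pulse_width_ms + guard_ms))
    return windows

def shutter_is_open(t_ms, windows):
    # State the shutter driving means should apply at time t_ms.
    return any(lo <= t_ms <= hi for lo, hi in windows)
```

The shutter is thus open exactly around each pulse and closed between pulses, which is the synchronization described above.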
- FIGS. 17 and 18 show an example of shutter glasses 600 according to the present embodiment.
- the shutter glasses 600 have a frame for disposing the shutter 65 in front of at least one eye of a person who detects the detection target 210.
- a transparent body such as transparent glass or plastic may be installed at the position of the glasses lens, or only the shutter 65 may be installed.
- FIGS. 17 and 18 are examples for embodying the structure of the shutter glasses 600 according to the present embodiment, and do not limit the scope of the present invention.
- the shutter glasses 611 in FIG. 17 have a shutter 612 having the function of the shutter 65 mounted on at least one lens.
- the shutter glasses 611 are equipped with a control device 613 having functions of a receiving unit 61, an opening / closing timing setting unit 62, an opening / closing control unit 63, and a shutter driving unit 64.
- the control device 613 controls the opening and closing of the shutter 612 according to the pulse condition received from the information input / output device 10. Since a detector wearing the shutter glasses 611 can visually recognize the projection light projected by the information input / output device 10, the detector can identify the detection target 210.
- the shutter glasses 621 in FIG. 18 include a shutter 622 having the function of the shutter 65 on at least one lens.
- the shutter glasses 621 are equipped with a control device 623 having at least the function of the shutter driving means 64. Furthermore, the shutter glasses 621 are equipped with a control device 624 having any of the functions of the receiving unit 61, the opening / closing timing setting unit 62, and the opening / closing control unit 63.
- the shutter glasses 621 can be made lighter by dividing the control system of the shutter 622 into two units. Normally, electric power is required to drive the shutter glasses 621; if the power source is placed in the control device 624, it is possible to achieve both a lighter pair of shutter glasses 621 and a larger-capacity power source.
- FIG. 19 is a flowchart for explaining the operation of the information input / output device 10 of the target identification system 1 according to this embodiment.
- the information input / output device 10 images the detection area 200 (step S11).
- when imaging the detection area 200, it is assumed that photometric data of the detection area 200 is also acquired.
- the information input / output device 10 generates image data from the captured image signal (step S12).
- the imaging means 20 of the information input / output device 10 converts an image signal that is analog data into image data that is digital data.
- the imaging unit 20 performs image processing on the image data as necessary.
- the information input / output device 10 verifies the image data (step S13).
- the target specifying unit 50 of the information input / output device 10 extracts a face image from the image data, and verifies whether or not the extracted face image matches the detection target 210.
- next, the information input / output device 10 sets the irradiation condition of the signal light projected onto the detection target 210 (step S15).
- the irradiation condition setting unit 42 of the information input / output device 10 sets appropriate pulse conditions and output conditions for the signal light based on the photometric data. Note that step S15 may be executed after step S11, in parallel with steps S12 to S14.
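A minimal sketch of how pulse and output conditions might be chosen from photometric data in step S15; the lux thresholds, condition values, and key names are purely illustrative assumptions, not values from this disclosure.

```python
def set_irradiation_condition(background_lux):
    # Choose pulse and output conditions for the signal light from the
    # photometric data (illustrative rule of thumb): the brighter the
    # background, the shorter and stronger the pulses, so the light stays
    # unnoticed by the naked eye yet visible through the shutter.
    if background_lux > 10000:    # outdoor daylight
        return {"pulse_width_ms": 0.5, "pulse_period_ms": 20.0, "peak_output": 1.0}
    if background_lux > 1000:     # cloudy sky or bright interior
        return {"pulse_width_ms": 1.0, "pulse_period_ms": 20.0, "peak_output": 0.5}
    return {"pulse_width_ms": 2.0, "pulse_period_ms": 20.0, "peak_output": 0.1}
```

The pulse condition part of the returned dictionary is what would be transmitted to the information recognition apparatus 60 in step S16.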
- in step S14, when the extracted face image does not match the detection target 210, that is, when the detection target 210 is not found (No in step S14), the process returns to step S11.
- the information input / output device 10 transmits the pulse condition included in the irradiation condition to the information recognition apparatus 60 (step S16).
- the information input / output device 10 projects the projected light of the mark onto the detection target 210 (step S17).
- the above is the description of the operation of the information input / output device 10 of the target identification system 1 according to the present embodiment.
- the information recognition device 60 opens and closes the shutter 65 based on the received pulse condition.
- according to the target specifying system, it is possible to irradiate the detection target with signal light strong enough to be seen even in daylight, without the light being noticed by the surroundings.
- the detector can recognize the signal light emitted to the detection target using an information recognition device such as shutter glasses.
- FIG. 20 shows a usage scene in which the target identification system 1 is installed on the ceiling.
- when the target identification system 1 detects the detection target 210 at position A, it irradiates the detection target 210 with the marking light 300. At position A, the target identification system 1 irradiates the detection target 210 with two marking lights 300. Even if the detection target 210 realizes that it is being irradiated with the marking light 300 and tries to escape, the target identification system 1 keeps detecting the detection target 210 and continues irradiating it with the marking light 300 at position B. That is, the target identification system 1 follows the moving detection target 210 in real time and continues to irradiate the detection target 210 with the marking light 300.
- FIG. 21 shows a use scene when a criminal who is the detection target 210 is positioned behind another person and the detection target 210 cannot be directly irradiated with the marking light 300.
- the target specifying system 1 irradiates the person positioned immediately in front of the detection target 210 with marking light 300 reading "criminal behind" to specify the detection target 210. That is, even when the marking light 300 cannot be directly irradiated onto the detection target 210, the target identification system 1 can identify the detection target 210 by irradiating the marking light 300 onto another target object.
- FIG. 22 shows a usage scene when the target identification system 1 cannot directly irradiate the detection target 210 with the marking light 300.
- the target identification system 1 receives a detection notification of the detection target 210 from another system, and irradiates the ground with marking light 300 that notifies the detector that the detection target is approaching according to the received detection notification.
- the detector wearing the shutter glasses 600 knows the approach of the criminal who is the detection target 210 by visually recognizing the marking light 300 irradiated on the ground.
- the target identification system 1 may also irradiate the marking light 300 onto the displayable line 302, from which the detection target 210 can be directly irradiated with the marking light 300, to show which of a plurality of approaching persons is the detection target 210.
- FIG. 23 is a block diagram showing the configuration of the information recognition apparatus 60-2 of the target identification system according to the second embodiment of the present invention.
- the target identification system according to the present embodiment has a configuration in which a photometric unit 66 is added to the information recognition device 60 of the target identification system 1 according to the first embodiment.
- the photometric means 66 is a photometric sensor that measures the brightness of the background light that has passed through the shutter 65.
- the photometric unit 66 may have the same configuration as the photometric sensor 25.
- the opening / closing timing setting means 62 sets conditions for making the signal light visible, based on the brightness of the background light through the shutter 65 measured by the photometry means 66 and on the averaged brightness of the pulsed signal light.
- according to the present embodiment, the visibility of the signal light can be set in accordance with the actual state of the background light, such as the weather and illumination conditions. Therefore, the signal light can be visually recognized more reliably.
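The feedback implied here, measuring the background through the shutter and re-setting the visibility condition, could be sketched as a simple adjustment loop. The multiplicative step sizes and clamping limits are illustrative assumptions only.

```python
def update_open_time(open_ms, measured_contrast, target_contrast,
                     min_ms=0.2, max_ms=5.0):
    # One feedback step (illustrative): when the contrast measured through
    # the shutter falls below the target, shorten the open time so less
    # background light is admitted; otherwise open slightly longer to
    # brighten the overall view. The result is clamped to safe limits.
    if measured_contrast < target_contrast:
        open_ms *= 0.9
    else:
        open_ms *= 1.05
    return min(max(open_ms, min_ms), max_ms)
```

Repeating this step as the photometric means 66 produces new readings would track changing weather or illumination, which is the benefit stated above.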
- FIG. 24 is a block diagram showing the configuration of the target identification system 3 according to the third embodiment of the present invention.
- the target specifying system 3 according to the present embodiment differs from the target identification system 1 according to the first embodiment in that the target specifying device 50-3 that specifies the detection target 210 is connected to the information input / output device 10-2 and the information recognition device 60 via the network 70.
- the function of the target specifying device 50-3 may be assigned to a general server or the like.
- FIG. 25 is a block diagram showing the configuration of the control means 40-3 according to this embodiment. Unlike the control unit 40 according to the first embodiment, the control unit 40-3 according to the present embodiment does not include the target specifying device 50-3.
- the communication unit 46 of the control unit 40-3 according to the present embodiment is connected to the target specifying device 50-3 via the network 70.
- the other configuration of the control unit 40-3 is the same as that of the control unit 40 according to the first embodiment, and thus description thereof is omitted.
- according to the present embodiment, since the target specifying device can be provided outside the control means, the information input / output device can be reduced in weight and size. In addition, if the target specifying device is placed on a high-performance server, the detection processing of the detection target can be sped up and the arithmetic load on the control means can be reduced. Furthermore, since the detection processing of the target specifying device no longer consumes power locally, the power source mounted on the information input / output device can be made smaller.
- FIG. 26 is a block diagram showing the configuration of the target identification system 4 according to the fourth embodiment of the present invention.
- the target identification system 4 according to this embodiment has a configuration in which a lighting device 80 is added to the target identification system 1 according to the first embodiment.
- the target identification system 4 according to this embodiment may have a configuration in which the illumination device 80 is added to the target identification system 3 according to the third embodiment.
- FIG. 27 is a block diagram showing the configuration of the control means 40-4 of the target identification system 4 according to this embodiment.
- the control unit 40-4 has a configuration in which an illumination control unit 47 is added to the control unit 40 of the target identification system 1 according to the first embodiment.
- the lighting device 80 irradiates a person who passes a certain point or a person staying at a certain point with specific light. That is, the illumination device 80 irradiates the target included in the detection region 200 with specific light.
- the specific light is light whose wavelength and properties differ from those of the marking light 300. For example, illumination light for increasing the contrast of the marking light 300 and dummy light for making the irradiation of the marking light 300 less noticeable correspond to the specific light.
- the illumination control unit 47 controls the illumination device 80 to emit specific light.
- the illumination control unit 47 controls the illumination device 80 to irradiate specific light to all persons passing through a certain point.
- the illumination control unit 47 may switch the irradiation timing of specific light according to the irradiation timing of the marking light 300 of the projection control unit 45.
- the illumination control unit 47 controls the illumination device 80 to emit specific light at the irradiation timing of the marking light 300 of the projection control unit 45.
- the illumination control unit 47 performs control to stop irradiation of specific light by the illumination device 80 at the irradiation timing of the marking light 300 of the projection control unit 45.
- the illumination control unit 47 may switch the irradiation of specific light at a timing different from the example given here.
- the illumination control unit 47 sets the area in which the illumination device 80 emits the specific light. For example, when the illumination device 80 can irradiate a plurality of areas individually, the illumination control unit 47 can control it to irradiate only a specific area among the plurality of areas. Specifically, when the illumination device 80 includes a plurality of LEDs, the illumination control unit 47 can control it to drive only the LEDs that irradiate the detection target 210 with the specific light and not to drive the other LEDs. Conversely, the illumination control unit 47 can also control the illumination device 80 so that only the LEDs that would irradiate the detection target 210 with the specific light are not driven.
- the illumination control unit 47 may be set to irradiate light only to a specific area by controlling the irradiation direction of the specific light by the lighting device 80.
- if a phase modulation type spatial modulation element is used for the illumination device 80, it is possible to irradiate only a specific area without changing the irradiation direction.
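Selecting which LEDs to drive for a given target area, as described above, can be sketched as a simple coverage test. The planar grid layout, circular illumination footprint, and function names are illustrative assumptions.

```python
def leds_covering_target(led_positions, target_xy, coverage_radius):
    # Indices of LEDs whose illumination footprint covers the detection
    # target; only these would be driven in the first control mode above.
    tx, ty = target_xy
    return [i for i, (x, y) in enumerate(led_positions)
            if (x - tx) ** 2 + (y - ty) ** 2 <= coverage_radius ** 2]
```

The converse control mode (drive every LED except those covering the target) is simply the complement of the returned index list.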
- FIG. 28 is a conceptual diagram illustrating an example of a usage scene of the target identification system 4 including the lighting device 80.
- background light with sufficient brightness may not be obtained at night or in a dark room.
- the illumination device 80 sufficiently brightens the background light in the detection area. If the background light is sufficiently bright, the information input / output device 10 can irradiate the detection target 210 with the marking light 300 using the method described in the first embodiment.
- the specific light emitted from the illumination device 80 may be illumination light with sufficient brightness.
- FIG. 29 is a conceptual diagram illustrating another example of a usage scene of the target identification system 4 including the lighting device 80.
- a detection area 200 is set through which all persons passing the checkpoint must pass.
- an illuminating device 80 irradiates dummy marking light (hereinafter referred to as dummy light 303).
- it suffices to use the interface device 100 instead of the lighting device 80 so that a target authenticated as a child, for example, is not irradiated with the dummy light 303.
- the interface apparatus 100 irradiates the detection target 210 with the marking light 300.
- the marking light 300 is visually recognized by the detector wearing the shutter glasses 600.
- if the dummy light 303 is a continuous wave, or if the pulse condition of the dummy light 303 differs from that of the marking light 300, the marking light 300 can be distinguished more easily.
- the interface device 100 may be stationary instead of wearable. However, even in the case of FIG. 29, the interface device 100 may be wearable.
- the detection target 210 can be identified.
- in general, multiple police officers are present as detectors in the case of inspections. In such a case, if the detection target 210 is specified only through a single monitor or the like, it is impossible to instantly tell which person is the criminal, that is, the detection target 210. According to the target identification system 4 of the present embodiment, all detectors wearing the shutter glasses 600 can identify the detection target 210 simultaneously. In a scene where a criminal is being arrested, removing such a time lag can greatly contribute to the arrest.
- FIGS. 30 to 32 are conceptual diagrams for explaining how the opening and closing of the shutter 65 is controlled in accordance with the pulse condition of the signal light in a scene such as that shown in FIG. 29. FIGS. 30 to 32 are examples and do not limit the scope of the present invention. Further, the pulse conditions and output conditions of the signal light in FIGS. 30 to 32 can be set arbitrarily.
- the interface device 100 controls the irradiation of the pulsed dummy light 303 to all persons passing through the detection region 200. Then, the interface apparatus 100 controls to irradiate the detection target 210 with the pulsed marking light 300 at a timing when the dummy light 303 is not irradiated. Since the dummy light 303 may be seen by all people, the pulse width can be set longer.
- for the marking light 300, a pulse condition with pulses too short to be visible to the naked eye is set, so that only a detector wearing the shutter glasses 600 can visually recognize it.
- the shutter 65 is set to open in accordance with the irradiation timing of the marking light 300. At this time, the shutter glasses 600 control the opening and closing of the shutter 65 so that the contrast of the marking light 300 with respect to the background light can be visually recognized through the shutter 65.
- the interface device 100 controls irradiation so that all persons passing through the detection region 200 are irradiated with the continuous-wave dummy light 303 and the detection target 210 is irradiated with the pulsed marking light 300.
- the continuous wave dummy light 303 is visible to all people.
- for the marking light 300, a pulse condition with pulses too short to be visually recognized by the naked eye is set, so that only a detector wearing the shutter glasses 600 can visually recognize it.
- the peak output of the marking light 300 is set to be stronger than the dummy light 303.
- the shutter 65 is set to open in accordance with the irradiation timing of the marking light 300. At this time, the shutter glasses 600 control the opening and closing of the shutter 65 so that the contrast of the marking light 300 with respect to the background light can be visually recognized through the shutter 65.
- the interface apparatus 100 performs control so that all persons passing through the detection region 200 are irradiated with the pulsed dummy light 303. Then, the interface apparatus 100 controls to irradiate the detection target 210 with the pulse-shaped marking light 300 at the timing when the dummy light 303 is not irradiated.
- the dummy light 303 may be visible to all or may not be visible.
- for the marking light 300, a pulse condition with pulses too short to be visually recognized with the naked eye is set, so that only a detector wearing the shutter glasses 600 can visually recognize it.
- the dummy light 303 is irradiated to each person passing through the detection region 200 at different irradiation timings.
- the shutter 65 is set to open only at the irradiation timing of the marking light 300. At this time, the shutter glasses 600 control the opening and closing of the shutter 65 so that the contrast of the marking light 300 with respect to the background light can be visually recognized through the shutter 65.
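The interleaving of dummy light and marking light described in these examples amounts to building two pulse trains whose intervals never overlap, so the marking light is emitted only while the dummy light is off. A sketch, with illustrative function names and a standard interval-overlap test:

```python
def pulse_train(period_ms, width_ms, n, offset_ms=0.0):
    # (start, end) intervals of a periodic pulse train.
    return [(offset_ms + k * period_ms, offset_ms + k * period_ms + width_ms)
            for k in range(n)]

def schedules_disjoint(dummy, marking):
    # True when no marking-light pulse overlaps any dummy-light pulse,
    # i.e. the marking light fires only in the gaps of the dummy light.
    return all(not (ms < de and ds < me)
               for ms, me in marking for ds, de in dummy)
```

With long dummy pulses and short marking pulses placed in the gaps, the dummy light may be seen by everyone while the marking light remains invisible to the naked eye, as described above.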
- according to the target specifying system of the present embodiment, projection light that can be detected only by a detector wearing the information recognition device can be projected onto the detection target even at night or in a dark room. Moreover, by irradiating all persons with the dummy light, the marking light can be irradiated onto the detection target without the detection target itself or the surroundings becoming aware of it.
- FIG. 33 is a block diagram showing the configuration of the information input / output system 110 that exhibits the information input / output function of the target identification system according to the fifth embodiment of the present invention.
- the information input / output system 110 includes an imaging device 120 having the function of the imaging unit 20, the projection unit 30, a control device 140 having the function of the control unit 40, and an illumination unit 81 having the function of the illumination device 80. That is, the information input / output system 110 is configured by adding an illumination unit 81 having the function of the illumination device 80 according to the fourth embodiment. Since the illumination unit 81 has the same function as the illumination device 80, detailed description thereof is omitted.
- the illumination unit 81 and the projection unit 30 constitute a light irradiation device 800.
- the light irradiation device 800 is provided as a device separate from the imaging device 120 and the control device 140 and operates under the control of the control device 140.
- FIG. 34 shows an example in which the detection target 210 is detected indoors.
- a plurality of light irradiation devices 800 are installed on the ceiling.
- the imaging device 120 and the control device 140 are installed at a position different from the installation position of the light irradiation device 800.
- the light irradiation device 800 may be configured to be controlled by the control device 140 through a network such as an intranet or the Internet.
- FIG. 34 shows a scene in which illumination light is irradiated as specific light from the light irradiation device 800 at the timing when the shutter glasses 600 worn by the detector are closed. At this time, illumination light from the light irradiation device 800 is irradiated to all persons in the room.
- FIG. 34 shows a scene in which the detection target 210 is irradiated with illumination light only from the light irradiation device 800 in the vicinity of the detection target 210 when the shutter glasses 600 worn by the detector are opened.
- at this timing, the light irradiation device 800 that has not been irradiating illumination light irradiates the detection target 210 with the marking light 300.
- other persons in the room do not recognize the marking light 300.
- in another scene, the detection target 210 is irradiated with illumination light from a certain light irradiation device 800, and the marking light 300 is irradiated onto the detection target 210 from a plurality of other light irradiation devices 800.
- the illumination light irradiated onto the detection target 210 from the light irradiation device 800 is made as dark as possible within a range in which the detection target 210 remains visible.
- the contrast of the marking light 300 is improved, and it becomes easier for only the detector wearing the shutter glasses 600 to detect the marking light 300.
- the detection target 210 is irradiated with illumination light from a plurality of light irradiation devices 800, it is possible to prevent a blind spot from occurring due to the posture of the detection target 210 or the standing position of the detector.
- the marking light 300 is irradiated only onto the detection target 210 while persons passing through a certain point are irradiated with the band-shaped light from the light irradiation device 800.
- the light irradiation by the light irradiation device 800 in FIGS. 35 to 38 can be controlled at the timings shown in FIGS. 30 to 32; the marking light 300 may be irradiated at the timing when the detection target 210 is irradiated with the signal light, and the dummy light 303 may be irradiated at other timings.
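The timing selection described above can be sketched as a simple per-frame rule. The following is an illustrative sketch only; the function name and string labels are hypothetical and are not part of the present disclosure:

```python
def select_light(is_detection_target: bool, signal_timing: bool) -> str:
    """Choose which light to project onto one scanned person in one frame.

    is_detection_target: True if collation identified this person as the
    detection target 210.
    signal_timing: True at the timings (cf. FIGS. 30 to 32) when the
    detection target is to be irradiated with the signal light, i.e. when
    the detector's shutter glasses are open.
    """
    if is_detection_target and signal_timing:
        return "marking_light_300"
    # At all other timings, the dummy light is projected instead,
    # so that third parties cannot tell who is being marked.
    return "dummy_light_303"

# The marking light is applied only to the target, and only at signal timing.
print(select_light(True, True))    # marking_light_300
print(select_light(True, False))   # dummy_light_303
print(select_light(False, True))   # dummy_light_303
```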
- in FIG. 35, band-shaped light is irradiated toward the ground from a light irradiation device 800 installed on a ceiling or the like.
- the light irradiation device 800 scans the band-shaped light from position C to position C′. Then, the light irradiation device 800 irradiates the detection target 210 with the marking light 300 at the timing when the band-shaped light scans over the detection target 210. At this time, the dummy light 303 is projected onto targets other than the detection target 210 that pass between C and C′.
- the light irradiation device 800 may scan the band-shaped light so as to reciprocate between position C and position C′.
- the control device 140 notifies the shutter glasses 600 of the timing of irradiating the detection target 210 with the marking light 300.
- the shutter glasses 600 open the shutter 65 at the timing when the marking light 300 is irradiated on the detection target 210.
- FIG. 36 is a conceptual diagram of the band-shaped light irradiated from the light irradiation device 800, viewed from an angle different from that in FIG. 35.
- the imaging device 120 images a person who passes through the band-shaped light, and the control device 140 detects the face and position of the person in the captured image.
- the light irradiation device 800 stops the light irradiation when the light would be irradiated onto the position of the person's face.
- the light irradiation device 800 irradiates light near the shoulders when the person is at position B, near the waist when the person is at position C, and near the knees when the person is at position D. That is, the light irradiation device 800 irradiates light while avoiding the person's face.
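The face-avoiding behavior of FIG. 36 follows from simple beam geometry: a fixed downward beam from the ceiling strikes lower on the body the farther the person walks. The following sketch uses assumed dimensions (ceiling height, beam angle, face height); all names and numbers are illustrative and not from the present disclosure:

```python
import math

def beam_hit_height(ceiling_h_m: float, beam_angle_deg: float, offset_m: float) -> float:
    """Height at which a fixed downward beam from the ceiling hits a person.

    ceiling_h_m: height of the light irradiation device above the floor.
    beam_angle_deg: beam angle measured from the vertical.
    offset_m: horizontal distance of the person from the point directly
    below the device. The farther the person walks, the lower the beam
    strikes the body (shoulder -> waist -> knee, as in FIG. 36).
    """
    return ceiling_h_m - offset_m / math.tan(math.radians(beam_angle_deg))

def may_irradiate(hit_height_m: float, face_bottom_m: float) -> bool:
    """Irradiate only if the beam lands below the face region."""
    return hit_height_m < face_bottom_m

# Assumed: device at 3.0 m, beam 45 degrees from vertical, face bottom at 1.5 m.
h = beam_hit_height(3.0, 45.0, 1.2)   # beam lands at 1.8 m -> face region
print(may_irradiate(h, 1.5))           # False: stop irradiation
h = beam_hit_height(3.0, 45.0, 1.8)   # beam lands at 1.2 m -> around the waist
print(may_irradiate(h, 1.5))           # True: irradiation allowed
```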
- FIG. 37 shows an example in which persons other than the detection target 210, among the persons passing between C and C′, are irradiated with band-shaped light from the light irradiation device 800.
- the light irradiation device 800 irradiates persons other than the detection target 210 with band-shaped light, as shown on the left side of FIG. 37. At this time, the dummy light 303 is projected onto the persons irradiated with the band-shaped light. On the other hand, the light irradiation device 800 does not irradiate the detection target 210 with band-shaped light. Then, the light irradiation device 800 irradiates the detection target 210 with the marking light 300, as shown on the right side of FIG. 37.
- the detection target 210 can be detected without being known to the surroundings.
- FIG. 38 shows an example in which band-shaped light is irradiated from the light irradiation device 800 onto all persons passing between C and C′.
- the light irradiation device 800 irradiates all persons passing between positions C and C′ with band-shaped light at the timing when the shutter glasses 600 are closed, as shown on the left side of FIG. 38. At this time, the dummy light 303 is projected onto all persons including the detection target 210. Then, the light irradiation device 800 projects a dummy light 304 different from the dummy light 303 onto the detection target 210, as shown on the right side of FIG. 38.
- in other words, at the timing when the shutter glasses 600 are closed, the light irradiation device 800 need not irradiate everyone with the same dummy light; it may irradiate only the detection target 210 with a dummy light (second dummy light) different from the dummy light 303 (first dummy light).
- the information input / output system 100 forms an image having a specific meaning.
- the detector wearing the shutter glasses 600 can recognize an image having a specific meaning by perceiving the image in which the second dummy light and the marking light 300 are combined. According to the method of FIG. 38, when an image having a specific meaning is projected onto the detection target 210, that image can be made more difficult for a third party to recognize.
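The FIG. 38 scheme can be summarized as a small decision table: which light each person receives depending on whether the detector's shutter is open. This is an illustrative sketch; the function name and string labels are hypothetical and not part of the present disclosure:

```python
def light_for(is_detection_target: bool, shutter_open: bool) -> str:
    """Light projected onto one person in the FIG. 38 scheme (sketch)."""
    if not shutter_open:
        # Shutter closed: everyone receives a dummy light, but the target
        # may receive a second dummy light different from dummy light 303.
        return "dummy_304" if is_detection_target else "dummy_303"
    # Shutter open: only the detection target receives the marking light,
    # which the detector perceives combined with the second dummy light.
    return "marking_300" if is_detection_target else "none"

for target in (True, False):
    for shutter in (True, False):
        print(target, shutter, light_for(target, shutter))
```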
- FIG. 39 is a block diagram showing the configuration of the information recognition apparatus 60-6 of the target identification system according to the sixth embodiment of the present invention.
- the target identification system according to the present embodiment has a configuration in which an alarm unit 67 is added to the information recognition device 60 of the target identification system 1 according to the first embodiment.
- the alarm means 67 notifies the detector that the detection target 210 has been detected when the receiving means 61 of the information recognition apparatus 60-6 receives the notification signal.
- the alarm means 67 notifies the detector that the detection target 210 has been detected by sound, vibration, smell, heat, physical stimulation, or the like.
- the alarm means 67 may notify the detector that the detection target 210 has been detected by light that is visible only to the detector.
- stimuli other than those given here may also be used.
- since the detection of the detection target can be notified to the detector as an alarm using information other than visual information, the detector can grasp the detection target more reliably and easily.
- the computer 90 in FIG. 40 is a configuration example for enabling the information input / output device according to each embodiment, and does not limit the scope of the present invention.
- the information input / output device according to each embodiment is preferably a microcomputer having the function of the computer 90 of FIG. 40 in the case of the form of the interface device 100 as shown in FIG.
- the computer 90 includes a processor 91, a main storage device 92, an auxiliary storage device 93, an input / output interface 95, and a communication interface 96.
- the processor 91, the main storage device 92, the auxiliary storage device 93, the input / output interface 95, and the communication interface 96 are connected to each other via a bus 99 so as to exchange data.
- the processor 91, the main storage device 92, the auxiliary storage device 93, and the input / output interface 95 are connected to a network such as the Internet or an intranet via a communication interface 96.
- the computer 90 is connected to a server or computer of a higher system via a network, and receives information related to the projection image from the higher system.
- the processor 91 expands the program stored in the auxiliary storage device 93 or the like in the main storage device 92, and executes the expanded program.
- a configuration using a software program installed in the computer 90 may be adopted.
- the processor 91 executes arithmetic processing and control processing of the information input / output device according to the present embodiment.
- the main storage device 92 has an area where the program is expanded.
- the main storage device 92 may be a volatile memory such as a DRAM (Dynamic Random Access Memory). Further, a non-volatile memory such as an MRAM (Magnetoresistive Random Access Memory) may be configured and added as the main storage device 92.
- the auxiliary storage device 93 is means for storing data such as the phase distribution of the projected image.
- the auxiliary storage device 93 is configured by a local disk such as a hard disk or a flash memory.
- the phase distribution of the projected image can be stored in the main storage device 92, and the auxiliary storage device 93 can be omitted.
- the input / output interface 95 is a device that connects the computer 90 and peripheral devices based on the connection standard between the computer 90 and peripheral devices.
- the communication interface 96 is an interface for connecting to a network such as the Internet or an intranet based on standards and specifications. In FIG. 40, the interface is abbreviated as I / F (Interface).
- the input / output interface 95 and the communication interface 96 may be shared as an interface connected to an external device.
- the computer 90 may be configured so that input devices such as a keyboard, a mouse, and a touch panel can be connected as necessary. These input devices are used for inputting information and settings. Note that when a touch panel is used as an input device, the display unit of the display device may also serve as the input unit of the input device. Data exchange between the processor 91 and the input device may be mediated by the input/output interface 95.
- the communication interface 96 is connected to a host system such as another computer or server through a network.
- the host system transmits the phase distribution of the basic image used in each embodiment of the present invention to the computer 90 via the communication interface 96.
- the host system transmits information related to the projection image used in each embodiment of the present invention to the computer 90 via the communication interface 96.
- the host system may generate the phase distribution of the projection image used in each embodiment of the present invention by itself or may acquire it from another device.
- the computer 90 may be provided with a display device for displaying various information.
- the computer 90 is preferably provided with a display control device (not shown) for controlling the display of the display device.
- the display device may be connected to the computer 90 via the input/output interface 95.
- the computer 90 may be provided with a reader / writer as necessary.
- the reader/writer is connected to the bus 99 and mediates, between the processor 91 and a recording medium (program recording medium, not shown), the reading of data and programs from the recording medium and the writing of processing results of the computer 90 to the recording medium.
- the recording medium can be realized by a semiconductor recording medium such as an SD (Secure Digital) card or a USB (Universal Serial Bus) memory.
- the recording medium may also be realized by a magnetic recording medium such as a flexible disk, an optical recording medium such as a CD (Compact Disc) or a DVD (Digital Versatile Disc), or another recording medium.
- FIG. 40 is an example of a hardware configuration for enabling the information input / output device according to each embodiment, and does not limit the scope of the present invention.
- a processing program for causing a computer to execute processing by the information input / output device according to each embodiment is also included in the scope of the present invention.
- a program recording medium recording the processing program according to each embodiment is also included in the scope of the present invention.
- the method of each embodiment can also be applied to a scene in which a detector such as a security guard or a store clerk marks the detection target.
- for example, in a store such as a department store or a showroom, when a customer who has a history of damaging products in the store or behaving suspiciously visits the store, the method of the present embodiment can be applied to a scene in which the store clerk, who is the detector, marks the customer as a detection target.
- the method of the present embodiment can also be applied to a scene in which the administrator of a public facility, who is the detector, marks a visitor as a detection target.
- the technique of each embodiment of the present invention can be applied even when an animal or an object is a detection target.
- Imaging means, 21 Imaging element, 23 Image processor, 24 Internal memory, 25 Photometric sensor, 26 Data output unit, 30 Projection means, 31 Light source, 32 Light source drive unit, 35 Phase modulation element, 36 Phase modulation element control unit, 38 Projection unit, 40 Control unit, 41 Imaging control unit, 42 Irradiation condition setting unit, 43 Processing unit, 44 Storage unit, 45 Projection control unit, 46 Communication unit, 50 Target specifying unit, 51 Database, 52 Image acquisition unit, 53 Image analysis unit, 54 Collation unit, 55 Collation result output unit, 60 Information recognition device, 61 Receiving means, 62 Opening/closing timing setting means, 63 Opening/closing control means, 64 Shutter driving means, 65 Shutter, 80 Illuminating device, 81 Illuminating means, 90 Computer, 91 Processor, 92 Main storage device, 93 Auxiliary storage device, 95 Input/output interface, 96 Communication interface, 99 Bus, 100 Interface device, 110 Information output system, 120 Imaging device, 140 Control device, 800 Light irradiation device
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Projection Apparatus (AREA)
- Collating Specific Patterns (AREA)
- Closed-Circuit Television Systems (AREA)
- Burglar Alarm Systems (AREA)
- Alarm Systems (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
This target identification system enables only a detecting person to identify desired information projected onto a detection target. The system is provided with: an information input/output device comprising an imaging means which images a detection region, a target identification means which compares feature data extracted from image data with feature data stored in a database and generates positional coordinates of the detection position of the detection target, a control means which, on the basis of photometric data and a communication signal, sets a control condition for the signal light irradiated onto the detection target and performs control for emitting the signal light on the basis of the control condition, and a projection means which irradiates the signal light onto the detection target; and an information recognition device which, on the basis of the control condition, sets a shutter opening/closing condition for controlling the amount of incident light reaching the detecting person, and which opens and closes the shutter in accordance with the communication signal.
Description
The present invention relates to a target identification system, a target identification method, and a program recording medium for identifying a detection target from among a plurality of targets.
A safer society can be realized if detection targets such as criminals and persons requiring attention can be reliably detected in a crowd. In recent years, advances in face authentication technology and the like have made it possible to authenticate a detection target from an image taken by a surveillance camera or the like. If face authentication technology is used, the detection target can be extracted from an image taken by a surveillance camera.
Patent Document 1 discloses a method that allows monitoring staff to easily confirm a person by performing face authentication between a person captured by a surveillance camera and accumulated persons.
Also, in order to reliably capture a detected target, it is required to put some mark on the detection target. For example, the detection target can be marked by using color balls, paint bullets, color spray, or the like that attach a liquid containing a special dye to the detection target. However, attaching the special dye to the detection target requires a technique for reliably hitting a moving detection target with an object. In addition, it is difficult to attach the special dye only to the detection target in a crowd.
Patent Document 2 discloses an apparatus that can cause the spotlight irradiation position to follow the position of an unspecified moving body. According to the apparatus of Patent Document 2, only a subject who moves around can be irradiated with a spotlight as a mark.
If the techniques of Patent Documents 1 and 2 are used, a detection target can be detected by a surveillance camera and a spotlight can be irradiated as a mark only onto the detected target. However, if a conspicuous mark is attached to a detection target detected in a crowd, a panic may occur when the surroundings notice the mark, making it difficult to identify the detection target. It is also assumed that the detection target itself notices that it has been marked, which may trigger it to escape.
An object of the present invention is to solve the above-described problem by providing a target identification system in which only the detector can identify desired information projected onto the detection target.
A target identification system according to the present invention includes an information input/output device and an information recognition device. The information input/output device has: an imaging means that images a detection area to generate image data, measures the brightness of the detection area to generate photometric data, and outputs the image data and the photometric data; a target specifying means that includes a database storing feature data of detection targets, extracts feature data from the input image data, collates the feature data extracted from the image data with the feature data stored in the database, extracts position coordinates of the detection position of a detection target, and, when the detection target is detected, outputs a notification signal including the collation result and the position coordinates of the detection target; a control means that causes the imaging means to image the detection area, sets a control condition for the signal light to be irradiated onto the detection target on the basis of the input photometric data and notification signal, and performs control to emit the signal light on the basis of the control condition; and a projection means that irradiates the signal light toward the detection target under the control of the control means. The information recognition device has a shutter that controls the amount of light incident on the detector who detects the detection target, sets an opening/closing condition of the shutter on the basis of the control condition, and controls the opening and closing of the shutter in accordance with the notification signal.
In the target identification method of the present invention, a detection area is imaged to generate image data, and the brightness of the detection area is measured to generate photometric data; feature data is extracted from the image data, the feature data extracted from the image data is collated with feature data stored in a database storing feature data of detection targets, and position coordinates of the detection position of a detection target are extracted; when the detection target is detected, a notification signal including the collation result and the position coordinates of the detection target is generated; a control condition for the signal light to be irradiated onto the detection target is set on the basis of the photometric data and the notification signal, control is performed to emit the signal light on the basis of the control condition, and the signal light is emitted toward the detection target; and an opening/closing condition of a shutter that controls the amount of light incident on the detector who detects the detection target is set on the basis of the control condition, and the shutter is opened and closed in accordance with the notification signal.
The program recording medium of the present invention records a target identification program that causes a computer to execute: processing of imaging a detection area to generate image data and measuring the brightness of the detection area to generate photometric data; processing of extracting feature data from the image data, collating the feature data extracted from the image data with feature data stored in a database storing feature data of detection targets, extracting position coordinates of the detection position of a detection target, and generating a notification signal including the collation result and the position coordinates of the detection target when the detection target is detected; processing of setting a control condition for the signal light to be irradiated onto the detection target on the basis of the photometric data and the notification signal, performing control to emit the signal light on the basis of the control condition, and emitting the signal light toward the detection target; and processing of setting an opening/closing condition of a shutter that controls the amount of light incident on the detector who detects the detection target on the basis of the control condition, and opening and closing the shutter in accordance with the notification signal.
According to the present invention, it is possible to provide a target identification system in which only the detector can identify desired information projected onto the detection target.
Hereinafter, embodiments for carrying out the present invention will be described with reference to the drawings. Although the embodiments described below include limitations that are technically preferable for carrying out the present invention, the scope of the invention is not limited to the following. In all the drawings used for the description of the following embodiments, the same reference numerals are given to the same parts unless there is a particular reason. In the following embodiments, repeated description of similar configurations and operations may be omitted.
(First Embodiment)
First, an outline of the target identification system 1 according to the first embodiment of the present invention will be described. The specific configuration and operation of the target identification system 1 will be described later.
FIG. 1 is a conceptual diagram showing an example of a usage scene of the target identification system 1 according to the present embodiment.
The interface device 100 captures an image of the detection area 200 and verifies whether or not the detection target 210 is included in the captured detection area 200. When the interface device 100 detects the detection target 210 within the detection area 200, the interface device 100 projects the marking light 300 onto the detection target 210.
In the target identification system 1 according to this embodiment, in order to reduce visibility to the naked eye, the marking light 300 emitted by the projection means 30 is pulsed so as to lower its average illuminance.
A detector who recognizes the marking light 300 can capture the pulsed marking light 300 using the shutter glasses 600 and visually recognize the marking light 300 with relatively improved contrast. The shutter glasses 600 are realized, for example, by the same principle as the secure display disclosed in Non-Patent Document 1. Note that the shutter glasses 600 may use a technique different from that of Non-Patent Document 1.
(Non-Patent Document 1) “Liquid Crystal Privacy-Enhanced Displays for Mobile PCs”, J. Ishii, G. Saito, M. Imai and F. Okumura, SID'09 Digest, pp.243-246, 2009
Under daytime sunlight, the illuminance can reach 50000 to 60000 lux. In order to emit marking light 300 that is visible to the naked eye under sunlight, strong signal light must be emitted. In some cases, it would even become necessary to emit laser light exceeding legal limits as the signal light. However, since the law places an upper limit on the signal light, it is assumed that strong signal light cannot be emitted continuously.
In addition, if surrounding people can visually recognize the marking light 300, a panic may occur in some cases. The detection target 210 is likely to escape when it visually recognizes the marking light 300. Therefore, it is necessary to irradiate the detection target 210 with strong signal light that is visible under daytime sunlight without being noticed by the surroundings.
In order to overcome such a situation, the following approach is taken in this embodiment.
First, the marking light 300, which is visible to the naked eye under sunlight, is applied to the detection target 210 as very short pulses so that it stays within legal limits. Second, the average illuminance of the pulsed marking light 300 is set to about 10 to 50 lux. As a result, the contrast of the background light to the signal light becomes 1000:1 or more, and the signal light is hardly visible to the naked eye.
FIG. 2 is a conceptual diagram for explaining the illuminance of the marking light 300 irradiated on the detection target 210. For example, it is assumed that the illuminance of the marking light 300 applied to the detection target 210 is 1000 lux, and the average illuminance of the marking light 300 is 50 lux. If the contrast of the marking light 300 irradiated to the detection target 210 is sufficiently small with respect to the background light, the irradiated signal light cannot be recognized with the naked eye. This is because the naked eye cannot detect light whose contrast is too small with respect to the brightness of the background light.
Note that FIG. 2 illustrates signal light with constant luminance and constant pulse interval, but the pulse interval of the signal light can be set arbitrarily. For example, when the luminance of the signal light is lowered, the pulse interval can be shortened so that the brightness perceived by the detector remains constant.
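The relationship between peak illuminance, pulse width, and the average illuminance can be written down directly: the average is the peak times the duty cycle. The following sketch uses assumed pulse parameters (the 1000 lux and 50 lux figures echo the example of FIG. 2; the exact pulse widths are illustrative):

```python
def average_illuminance(peak_lux: float, pulse_width_s: float, period_s: float) -> float:
    """Average illuminance of pulsed light = peak illuminance x duty cycle."""
    return peak_lux * pulse_width_s / period_s

# 1000 lux pulses, 0.5 ms wide, repeated every 10 ms -> 5% duty cycle,
# giving an average of about 50 lux.
print(average_illuminance(1000.0, 0.0005, 0.010))

# If the peak luminance is halved, halving the period (denser pulses)
# keeps the average, and hence the perceived brightness, constant.
print(average_illuminance(500.0, 0.0005, 0.005))
```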
FIG. 3 is a conceptual diagram for explaining the visibility when the marking light 300 irradiated to the detection target 210 is viewed with the naked eye when sunlight is used as background light.
For example, suppose that daytime sunlight is 50000 lux. At this time, as shown in FIG. 3, when the detection target 210 is irradiated with marking light 300 having an average illuminance of 50 lux, the contrast between the illuminance of the background light and that of the projection light becomes 1000:1. That is, the marking light 300 cannot be visually recognized with the naked eye. In the present embodiment, this contrast between the background light and the marking light 300 is exploited to irradiate the detection target 210 with marking light 300 that cannot be detected with the naked eye.
FIG. 4 is a conceptual diagram for explaining the visibility when the marking light 300 irradiated on the detection target 210 is viewed through the shutter glasses 600 when sunlight is used as background light.
For a detector wearing the shutter glasses 600, the shutter is open for only 1/100 of each cycle, so the brightness of the background light falls to about 500 lux. The marking light 300, on the other hand, passes through the shutter only while the shutter is open, so its average illuminance remains about 10 to 50 lux. As a result, the contrast between the background light and the marking light 300 becomes about 10:1, and a detector wearing the shutter glasses 600 can fully recognize the marking light 300. Since these illuminance levels normally vary with the environment, the present embodiment is configured so that the illuminance of the marking light 300 and the shutter opening time are set automatically according to the environment.
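The shutter-glasses arithmetic above can be sketched as follows, assuming (as the embodiment describes) that the marking-light pulses are synchronized with the shutter-open interval:

```python
def perceived_illuminance(background_lux, marking_avg_lux, duty_cycle):
    """Illuminance seen through shutter glasses whose shutter is open
    for `duty_cycle` of each cycle, with the marking-light pulses
    synchronized to the open interval."""
    seen_background = background_lux * duty_cycle  # attenuated by the shutter
    seen_marking = marking_avg_lux                 # passes while shutter is open
    return seen_background, seen_marking

bg, mark = perceived_illuminance(50000.0, 50.0, 1 / 100)
contrast_ratio = bg / mark   # 10:1 -> marking light is visible to the wearer
```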
For example, even with a device such as a head-mounted display it is possible to attach a virtual mark to the detection target 210, but it is difficult to keep that mark coincident with the detection target 210. In contrast, the target identification system 1 according to the present embodiment can always keep the marking light 300 coincident with the detection target 210.
FIG. 5 shows an example in which the marking light 300 is projected only onto the detection target 210 among a plurality of people.
Because the marking light 300 projected onto the detection target 210 has sufficiently small contrast with the background light, it cannot be visually recognized by surrounding people. The detector, on the other hand, can see the marking light 300 by wearing the shutter glasses 600.
That is, in the example of FIG. 5, the wearable interface device 100 detects the detection target 210, such as a criminal, and irradiates the detection target 210 with the marking light 300. Even if a detector such as a police officer misses the detection target 210, the detector can locate it by observing the marking light 300 irradiated by the interface device 100.
The above is an outline of the target identification system 1 according to the present embodiment.
<Configuration>
Next, the configuration of the target identification system 1 according to the first embodiment of the present invention will be described with reference to the drawings.
FIG. 6 is a block diagram showing the configuration of the target identification system 1 according to this embodiment. As shown in FIG. 6, the target identification system 1 includes an information input/output device 10 and an information recognition device 60. Relative to FIG. 1, the information input/output device 10 corresponds to the interface device 100, and the information recognition device 60 corresponds to the shutter glasses 600.
[Information input/output device]
As illustrated in FIG. 7, the information input/output device 10 according to the present embodiment includes an imaging means 20, a control means 40, and a projection means 30.
[Imaging means]
FIG. 8 is a block diagram showing the configuration of the imaging means 20 according to this embodiment. The imaging means 20 includes an imaging element 21, an image processor 23, an internal memory 24, a photometric sensor 25, and a data output unit 26.
Although not shown, at least one lens is incorporated in the imaging means 20. This lens may have a zoom function that changes the focal length, and the imaging means 20 may be equipped with an autofocus function that focuses automatically. Preferably, the imaging means 20 also has functions found in general digital cameras, such as camera-shake prevention.
The imaging element 21 captures an area including the detection area 200 and acquires an image signal. It is a photoelectric conversion element in which semiconductor components are integrated into an integrated circuit, and can be realized by a solid-state image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) sensor. Normally, the imaging element 21 captures light in the visible region, but it may instead be an element capable of capturing and detecting electromagnetic waves such as infrared rays, ultraviolet rays, X-rays, gamma rays, radio waves, or microwaves.
The image processor 23 is an integrated circuit dedicated to image processing that performs dark current correction, interpolation, color space conversion, gamma correction, aberration correction, noise reduction, image compression, and the like on the image data captured by the imaging element 21. When the image data is output without processing, the image processor 23 can be omitted. Any processor designed to perform the necessary processing may be used as the image processor 23.
The internal memory 24 is a storage element that temporarily stores image data that the image processor 23 cannot process immediately, as well as processed image data. The image data captured by the imaging element 21 may also be stored temporarily in the internal memory 24. A general-purpose memory may be used as the internal memory 24.
The photometric sensor 25 measures the illuminance of the detection area 200. It can be realized by an electronic circuit that detects light using the photoelectric effect, such as a photoresistor or a photodiode. If the imaging element 21 itself has a photometric function, the photometric sensor 25 may be omitted.
The photometric sensor 25 outputs the measured illuminance of the detection area 200 to the data output unit 26. The illuminance acquired by the photometric sensor 25 may also be used in calculations by the image processor 23.
Classified by structure, the photometric sensor 25 may use a method such as mirror-meter photometry, finder optical-path photometry, cut-capacitor photometry, direct-capacitor photometry, sub-mirror photometry, or retractable-sensor photometry. Classified by method, it may use stopped-down photometry, instantaneous stopped-down photometry, full-aperture photometry, or the like. Classified by range, it may use spot photometry, partial photometry, full-frame photometry, center-weighted photometry, multi-segment photometry, or the like. The photometry methods given here are merely examples and do not limit the scope of the present invention.
The data output unit 26 outputs the image data processed by the image processor 23 to the control means 40. It also outputs the photometric data (illuminance) of the detection area 200 measured by the photometric sensor 25 to the control means 40.
[Projection means]
FIG. 9 is a block diagram showing the configuration of the projection means 30 according to this embodiment. The projection means 30 includes a light source 31, a light source driving unit 32, and a projection unit 38.
The light source 31 emits light 34 having a specific wavelength. In general, the light source 31 is configured to emit light 34 in the visible region, but it may instead be configured to emit light 34 outside the visible region, such as in the infrared or ultraviolet region.
The light source 31 may be an emitter such as a laser, an LED (Light Emitting Diode), a lamp, a discharge lamp, or a light bulb; the emitters listed here are not exhaustive. The light source 31 may also be a reflector that reflects light emitted from another source.
The light source driving unit 32 drives the light source 31 so that pulsed light of the specified output is emitted, based on the irradiation conditions set by the control means 40. Details of the irradiation conditions are described later.
The projection unit 38 projects the light 34 emitted from the light source 31 as signal light 39.
The projection means 30 may be, for example, a projection means 30-1 equipped with a phase modulation type spatial modulation element, as shown in FIG. 10.
FIG. 10 is a block diagram showing the configuration of the projection means 30-1 equipped with a phase modulation type spatial modulation element. The projection means 30-1 includes a light source 310, a light source driving unit 320, a phase modulation element 350, a phase modulation element control unit 360, and a projection unit 380.
The light source 310 emits laser light 340 having a specific wavelength. Usually, the light source 310 is configured to emit laser light 340 in the visible region, but it may instead be configured to emit laser light 340 outside the visible region, such as in the infrared or ultraviolet region.
The light source 310 does not project light uniformly over the projection range but concentrates it on a portion of the range. The projection means 30-1 according to the present embodiment is often used to project line drawings such as characters and symbols. Because projecting line drawings requires less laser light 340, the overall laser output can be kept low. That is, the light source 310 can consist of a small, low-power light source and a low-output power supply that drives it.
The light source driving unit 320 drives the light source 310 so that pulsed light of the specified output is emitted, based on the irradiation conditions set by the control means 40.
The phase modulation element 350 includes a phase modulation type spatial modulation element that receives coherent, phase-aligned laser light 340 and modulates the phase of the incident laser light 340. The phase modulation element 350 emits the modulated light 370 toward the projection unit 380.
The display area of the phase modulation element 350 displays a phase distribution for forming a target image on the projection surface. The modulated light 370 reflected from the display area behaves like light diffracted by an aggregate of diffraction gratings, and the target image is formed where the light diffracted by these gratings converges.
The phase modulation element 350 is realized by a spatial modulation element using, for example, ferroelectric liquid crystal, homogeneous liquid crystal, or vertically aligned liquid crystal. Concretely, it can be realized by LCOS (Liquid Crystal on Silicon), or alternatively by MEMS (Micro Electro Mechanical Systems).
The phase modulation element control unit 360 controls the phase modulation element 350 so as to vary a parameter that determines the difference between the phase of the laser light 340 incident on the display area of the phase modulation element 350 and the phase of the modulated light 370 reflected from that display area. Such parameters are optical characteristics, for example the refractive index or the optical path length. For instance, the phase modulation element control unit 360 changes the refractive index of the display area by controlling the voltage applied to it; as a result, the laser light 340 incident on the display area is diffracted according to the refractive index of the display area.
That is, the phase distribution of the laser light 340 incident on the phase modulation element 350 is modulated according to the optical characteristics of the display area. The control of the phase modulation element 350 by the phase modulation element control unit 360 is not limited to the above.
The projection unit 380 includes an optical system that projects the modulated light 370 reflected by the phase modulation element 350 as signal light 390.
FIG. 11 is a conceptual diagram explaining the optical configuration of the projection means 30-1 according to the present embodiment. As shown in FIG. 11, the laser light emitted from the light source 310 is collimated into parallel laser light 340 by the collimator 53 and is incident on the display surface of the phase modulation element 350.
In this embodiment, as shown in FIG. 11, the laser light 340 is incident on the display surface of the phase modulation element 350 at a non-perpendicular angle; that is, the emission axis of the laser light from the light source 310 is oblique to the display surface. Setting the emission axis obliquely allows the laser light 340 to reach the display surface without using a beam splitter, which improves efficiency.
The modulated light 370 modulated by the phase modulation element 350 is projected as signal light 390 by the projection unit 380. As illustrated in FIG. 11, the projection unit 380 includes a Fourier transform lens 381, an aperture 382, and a projection lens 383.
The Fourier transform lens 381 is an optical lens that forms, at a nearby focal position, the image that would be formed if the modulated light 370 reflected from the display surface of the phase modulation element 350 were projected to infinity.
The aperture 382 eliminates higher-order light contained in the light focused by the Fourier transform lens 381 and delimits the image region. Its opening is smaller than the image region of the projected image at the aperture position and is placed so as to block the peripheral region of the projected image; the opening may be formed, for example, in a rectangular or circular shape. The aperture 382 is preferably placed at the focal position of the Fourier transform lens 381, but it may be displaced from the focal position as long as it can still eliminate the higher-order light.
The projection lens 383 is an optical lens that magnifies and projects the light focused by the Fourier transform lens 381. It projects the signal light 390 so that the target image corresponding to the phase distribution input to the phase modulation element 350 is displayed on the projection surface.
[Control means]
When the detection target 210 is included in the detection area 200 captured by the imaging means 20, the control means 40 controls the projection means 30 to project an appropriate image toward the detection target 210. The control means 40 also provides the projection means 30 with image information corresponding to operations performed on a user interface image included in the captured detection area 200, and controls the projection means 30 to project an image based on that information. The control means 40 can be realized by the functions of a microcomputer including, for example, an arithmetic unit and a control unit.
The control means 40 may also be configured to transmit the result of detecting the detection target 210 and information on operations performed in the detection area 200 to a host system, such as a server, at predetermined timings.
As shown in FIG. 12, the control means 40 includes an imaging control unit 41, an irradiation condition setting unit 42, a processing unit 43, a storage unit 44, a projection control unit 45, and a communication unit 46. The control means 40 also includes a target specifying means 50 for specifying the detection target 210. Although FIG. 12 shows a configuration in which the control means 40 contains the target specifying means 50, the control means 40 and the target specifying means 50 may be configured separately. The target specifying means 50 is described in detail later.
The imaging control unit 41 causes the imaging means 20 to capture the detection area 200 at predetermined timings. It also acquires the image data and photometric data output from the imaging means 20, outputs the image data to the target specifying means 50, outputs the photometric data to the irradiation condition setting unit 42, and outputs both the image data and the photometric data to the processing unit 43.
The irradiation condition setting unit 42 calculates irradiation conditions, including pulse conditions and output conditions for the signal light emitted by the projection means 30, based on the illuminance of the detection area 200 contained in the photometric data. The pulse conditions include the pulse width, corresponding to the irradiation time of the signal light, and the repetition period at which the signal light is emitted. The output conditions include the output intensity of the signal light and the illuminance produced in the detection area 200 by signal light emitted at that intensity. The irradiation condition setting unit 42 outputs the calculated irradiation conditions to the processing unit 43 and outputs the pulse conditions to the communication unit 46. It may also be configured to transmit the irradiation conditions to the projection control unit 45 or the communication unit 46 without going through the processing unit 43.
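A minimal sketch of how the irradiation condition setting unit 42 might derive conditions from the measured illuminance; the function, the field names, and the 1000:1 and 1/100 defaults mirror the example numbers used earlier in this embodiment but are otherwise illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class IrradiationCondition:
    pulse_width_s: float   # irradiation time per pulse (pulse condition)
    period_s: float        # repetition period (pulse condition)
    output_lux: float      # illuminance at the detection area (output condition)

def set_irradiation_condition(background_lux,
                              naked_eye_ratio=1000.0,
                              shutter_duty=1 / 100,
                              period_s=0.01):
    """Choose signal-light conditions so the contrast against the
    background stays at the assumed sub-visibility ratio (1000:1)
    while the pulses fit inside the shutter-open interval."""
    output_lux = background_lux / naked_eye_ratio
    pulse_width_s = period_s * shutter_duty
    return IrradiationCondition(pulse_width_s, period_s, output_lux)

cond = set_irradiation_condition(50000.0)   # daytime sunlight example
```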
The processing unit 43 acquires a notification signal, including the position coordinates at which the detection target 210 was detected, from the target specifying means 50 described later, together with the irradiation conditions calculated by the irradiation condition setting unit 42. The notification signal contains a collation result indicating whether the detection target 210 was detected and, when it was detected, the position coordinates of the detection target 210.
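For illustration only, the notification signal described above could be modeled as a small record type; the field names are assumptions, not terms from the embodiment:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class NotificationSignal:
    """Collation result passed from the target specifying means
    to the processing unit (field names are illustrative)."""
    detected: bool
    position: Optional[Tuple[int, int]] = None  # present only when detected

hit = NotificationSignal(detected=True, position=(120, 80))
miss = NotificationSignal(detected=False)
```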
The processing unit 43 determines the position coordinates of the projection image in the captured image and generates control conditions for projecting the signal light onto the detection target 210. Based on the position coordinates of the detection target 210, the processing unit 43 calculates, as one of the control conditions, an irradiation direction condition specifying the direction in which the signal light is irradiated toward the detection target 210. Since the target specifying means 50 normally detects the detection target 210 by face recognition, the processing unit 43 determines the irradiation direction so that the signal light strikes the body below the face position of the detection target 210. The control conditions generated by the processing unit 43 include the pulse conditions, the output conditions, and the irradiation direction condition of the signal light.
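A sketch of the aim-point calculation described above (irradiating the torso below the detected face); the coordinate convention and the offset ratio are illustrative assumptions:

```python
def aim_point_below_face(face_box, body_offset_ratio=1.5):
    """Given a face bounding box (x, y, w, h) in image coordinates
    (y grows downward), return the point to aim the signal light at:
    horizontally centered on the face, vertically below it on the torso."""
    x, y, w, h = face_box
    aim_x = x + w / 2
    aim_y = y + h + body_offset_ratio * h   # torso region below the chin
    return aim_x, aim_y

ax, ay = aim_point_below_face((100, 80, 40, 40))
```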
The processing unit 43 outputs the control conditions for the projection means 30 to the projection control unit 45.
The storage unit 44 stores the target images projected by the projection means 30. A target image is a predetermined image, such as an × or a ○, that identifies the object on which it is projected as the detection target 210. The storage unit 44 may also store data other than target images. When the projection means 30 includes a phase modulation type spatial modulation element as in FIG. 10, the storage unit 44 may store the phase distribution of the target image instead.
Upon acquiring the control conditions and irradiation conditions, the projection control unit 45 retrieves an appropriate target image from the storage unit 44, provides it to the projection means 30, and controls the projection means 30 based on the control conditions and irradiation conditions so as to project that target image.
The communication unit 46 transmits the pulse conditions of the signal light set by the irradiation condition setting unit 42 to the projection means 30. When the target specifying means 50 is not included in the control means 40, the communication unit 46 receives the notification signal output by the target specifying means 50.
[Target specifying means]
FIG. 13 is a block diagram showing the configuration of the target specifying means 50 according to the present embodiment. The target specifying means 50 includes a database 51, an image acquisition unit 52, an image analysis unit 53, a collation unit 54, and a collation result output unit 55.
The database 51 stores feature data of face images, including that of the detection target 210. It may store, for example, the shapes of facial parts such as the nose, eyes, chin, and cheekbones, and the relative positions and sizes of those parts, as feature data; the data stored in the database 51 is not limited to these. The database 51 may also store difference data between the feature data of an average face image created from many face images and the feature data of each individual face image.
In the present embodiment, the detection target 210 is detected using feature data of a person's face image, but feature data other than face images may also be used. For example, a person's hairstyle or physique can be set as feature data, and if moving-image analysis is possible, characteristics such as a person's gait or mannerisms can also be set as feature data.
The image acquisition unit 52 acquires image data from the imaging control unit 41 and outputs the acquired image data to the image analysis unit 53.
The image analysis unit 53 acquires the image data from the image acquisition unit 52 and analyzes it. It detects face images corresponding to human faces in the analyzed image data and extracts feature data from each detected face image, such as the shapes of facial parts (nose, eyes, chin, cheekbones) and the relative positions and sizes of those parts. At this time, the image analysis unit 53 associates the feature data extracted from a face image with the position coordinates in the image data at which that face image was found.
The image analysis unit 53 outputs data including the feature data of the extracted face image and the position coordinates corresponding to that face image to the collation unit 54.
The collation unit 54 compares the face-image feature data extracted by the image analysis unit 53 with the feature data of the individual face images stored in the database 51. The collation unit 54 may compare the feature data of the face images directly, or may statistically quantify the face images and compare the resulting values. A general method may be used as the specific algorithm for comparing face images. For example, by performing template matching using combinations of characteristic facial parts, the collation unit 54 can determine whether or not the detection target 210 is included in the detection region 200 imaged by the imaging means 20.
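The comparison above can be sketched as a nearest-neighbor search over feature vectors. The encoding of facial parts as numeric vectors, the Euclidean distance, and the threshold below are illustrative assumptions, not the specific algorithm of the collation unit 54:

```python
import math

def match_score(features_a, features_b):
    """Distance between two face feature vectors (e.g. part shapes and
    relative positions encoded as numbers); smaller means more similar."""
    return math.dist(features_a, features_b)

def collate(extracted, database, threshold=0.5):
    """Return the database entry whose feature vector is closest to the
    extracted face image, or None when nothing is within the threshold."""
    best_id, best_score = None, threshold
    for target_id, features in database.items():
        score = match_score(extracted, features)
        if score < best_score:
            best_id, best_score = target_id, score
    return best_id

db = {"target_210": [0.31, 0.48, 0.22], "other": [0.90, 0.10, 0.55]}
print(collate([0.30, 0.50, 0.20], db))  # prints "target_210"
```

When a match is found, the position coordinates associated with the extracted features would then be forwarded, as described below, toward the collation result output unit 55.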
When the collation unit 54 detects the detection target 210, it outputs a determination result indicating that the detection target 210 has been detected to the collation result output unit 55. At this time, the collation unit 54 outputs the position coordinates of the face image of the detection target 210 together with the determination result.
When the collation result output unit 55 receives the determination result indicating that the detection target 210 has been detected, it outputs, to the projection means 30, a notification signal including information indicating that the detection target 210 has been detected and the position coordinates of the face image of the detection target 210.
[Information recognition device]
FIG. 14 is a block diagram showing the configuration of the information recognition device 60 according to this embodiment. The information recognition device 60 according to the present embodiment includes a receiving means 61, an opening/closing timing setting means 62, an opening/closing control means 63, a shutter driving means 64, and a shutter 65.
The information recognition device 60 includes a shutter 65 that controls the amount of light that reaches the detector from the detection region 200. The information recognition device 60 sets the opening/closing conditions of the shutter 65 based on the control conditions set by the information input/output device 10. The information recognition device 60 also controls the opening and closing of the shutter 65 in accordance with the notification signal output from the information input/output device 10.
The receiving means 61 is a means for receiving data output from the information input/output device 10. The receiving means 61 receives data by, for example, wireless communication. The receiving means 61 receives the control conditions set by the irradiation condition setting unit 42 of the control means 40, and outputs the received control conditions to the opening/closing timing setting means 62. Note that the receiving means 61 may include the image memory of a general liquid crystal module.
The opening/closing timing setting means 62 sets the opening/closing conditions of the shutter 65 based on the control conditions received from the receiving means 61, and outputs the set shutter opening/closing conditions to the opening/closing control means 63. For example, the opening/closing timing setting means 62 can be realized by a timing controller of a general liquid crystal module.
As shown in FIG. 15, the opening/closing timing setting means 62 sets the shutter 65 to open in a time zone that includes the timing at which the signal light is emitted. Note that the opening/closing timing setting means 62 does not always need to open the shutter 65 whenever the signal light is emitted. The opening/closing timing setting means 62 calculates a condition that relatively improves the contrast of the pulsed signal light irradiating the detection target 210 against the background light seen through the shutter 65. The opening/closing timing setting means 62 then sets a shutter opening/closing condition that makes the signal light visible, based on the calculated condition and the pulse condition included in the control conditions.
As shown in FIG. 16, the pulse intervals of the signal light need not be uniform. In the example of FIG. 16, the luminance of the signal light is lower in time zone B than in time zone A. In time zone B, the pulses are spaced more densely, so that the perceived brightness is equal to that of the signal light in time zone A. In this case, the opening/closing timing setting means 62 sets the shutter 65 to open for a shorter time in time zone B than in time zone A in order to improve the contrast.
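The trade-off in FIG. 16 can be expressed as a short calculation: the brightness perceived over time is roughly the pulse luminance scaled by the duty cycle. The numeric values below are illustrative assumptions, not figures from the embodiment:

```python
def perceived_brightness(pulse_luminance, pulse_width, pulse_interval):
    """Time-averaged brightness of pulsed signal light:
    luminance scaled by the duty cycle (width / interval)."""
    return pulse_luminance * (pulse_width / pulse_interval)

# Time zone A: bright pulses, sparse spacing.
a = perceived_brightness(pulse_luminance=100.0, pulse_width=1.0, pulse_interval=10.0)
# Time zone B: half the luminance, but pulses twice as dense.
b = perceived_brightness(pulse_luminance=50.0, pulse_width=1.0, pulse_interval=5.0)
assert a == b  # equal perceived brightness, as in the FIG. 16 example
```

Because the individual pulses in time zone B are dimmer, a shorter shutter-open window is needed there to keep the signal-to-background contrast up, as the paragraph above notes.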
The opening/closing timing setting means 62 sets the opening/closing timing of the shutter 65 based on the brightness of the background light measured by the information input/output device 10 and the average brightness of the signal light with which the detection target 210 is irradiated.
For example, in the morning or evening, or under cloudy or rainy skies, the background light is darker than in clear daytime. In that case, the opening/closing timing setting means 62 sets a smaller average brightness for the irradiated signal light and sets the shutter opening/closing conditions so that an appropriate contrast is obtained. Conversely, on a snowy mountain or a midsummer beach, for example, the background light is brighter than in ordinary daytime. In that case, the opening/closing timing setting means 62 sets a larger average brightness for the irradiated signal light and sets the shutter opening/closing conditions so that an appropriate contrast is obtained.
The opening/closing control means 63 controls the shutter driving means 64, which opens and closes the shutter 65, based on the shutter opening/closing conditions set by the opening/closing timing setting means 62. For example, the opening/closing control means 63 can be realized by a display controller of a general liquid crystal module. When the timing control and the display control are performed by the same means, the opening/closing timing setting means 62 and the opening/closing control means 63 may be combined.
The shutter driving means 64 opens and closes the shutter 65 under the control of the opening/closing control means 63. For example, the shutter driving means 64 can be realized by an XY driver of a general liquid crystal module.
The shutter 65 is opened and closed by the shutter driving means 64. For example, the shutter 65 blocks external light when in the closed state and passes external light when in the open state. Note that the shutter 65 does not have to completely block external light in the closed state; it may merely dim the external light relative to the open state.
The shutter 65 may be realized using, for example, a fast-response liquid crystal, or using a MEMS (Micro Electro Mechanical System) shutter. When an ordinary TN (Twisted Nematic) liquid crystal is used, the brightness of the light passing through the shutter 65 drops to about 40% of the external light because of the polarizing plates. Therefore, the view through a shutter 65 using TN liquid crystal is darker than the view with the naked eye. On the other hand, a MEMS shutter has no polarizing plate, so it is brighter than a shutter 65 using TN liquid crystal. However, even a MEMS shutter does not have an aperture ratio of 100%, so the view is still darker than with the naked eye. Nevertheless, in the case of a MEMS shutter, the pupil dilates because the entire field of view becomes darker, so the view feels brighter than with TN liquid crystal. The structure of the shutter 65 is not limited to those mentioned here, as long as it can switch between blocking, passing, and dimming external light.
FIG. 15 is an example in which the irradiation timing of the signal light and the opening / closing timing of the shutter 65 are synchronized.
When the information recognition device 60 receives the pulse condition of the signal light, it controls the shutter 65, based on that pulse condition, so as to open at the timing when the signal light irradiates the detection target 210 and to close at the timing when the signal light is not irradiated. By opening and closing the shutter 65, the information recognition device 60 relatively improves the contrast of the signal light against the background light perceived through the shutter 65, so that the signal light becomes visible.
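The synchronization described above can be sketched as an open/close schedule derived from the pulse condition; the guard margin and the time units are illustrative assumptions:

```python
def shutter_schedule(pulse_times, pulse_width, guard=0.2):
    """Build (open, close) intervals bracketing each signal-light pulse.
    The shutter opens slightly before each pulse, closes slightly after,
    and stays closed the rest of the time."""
    return [(t - guard, t + pulse_width + guard) for t in pulse_times]

def shutter_open(t, schedule):
    """True when the shutter should be open at time t."""
    return any(start <= t <= end for start, end in schedule)

schedule = shutter_schedule([0.0, 10.0, 20.0], pulse_width=1.0)
print(shutter_open(10.5, schedule))  # during a pulse -> prints True
print(shutter_open(5.0, schedule))   # between pulses -> prints False
```

A background-light photon arriving between pulses thus finds the shutter closed, which is what raises the relative contrast of the signal light.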
FIGS. 17 and 18 show an example of shutter glasses 600 according to the present embodiment. The shutter glasses 600 have a frame for placing the shutter 65 in front of at least one eye of the detector who detects the detection target 210. In the shutter glasses 600, a transparent body such as clear glass or plastic may be placed at the lens positions, or only the shutter 65 may be installed. Note that FIGS. 17 and 18 are examples for embodying the structure of the shutter glasses 600 according to the present embodiment and do not limit the scope of the present invention.
The shutter glasses 611 in FIG. 17 carry, on at least one lens, a shutter 612 having the function of the shutter 65. The shutter glasses 611 also carry a control device 613 having the functions of the receiving means 61, the opening/closing timing setting means 62, the opening/closing control means 63, and the shutter driving means 64. The control device 613 controls the opening and closing of the shutter 612 under the pulse condition received from the information input/output device 10. Since a detector wearing the shutter glasses 611 can visually recognize the projection light projected by the information input/output device 10, the detector can identify the detection target 210.
The shutter glasses 621 in FIG. 18 carry, on at least one lens, a shutter 622 having the function of the shutter 65. The shutter glasses 621 also carry a control device 623 having at least the function of the shutter driving means 64. Furthermore, the shutter glasses 621 are equipped with a control device 624 having any of the functions of the receiving means 61, the opening/closing timing setting means 62, and the opening/closing control means 63. In the example of FIG. 18, the shutter glasses 621 can be made lighter by dividing the control system of the shutter 622 into two. Electric power is normally required to drive the shutter glasses 621, but if a power source is built into the control device 624, it is possible to reduce the weight of the shutter glasses 621 while increasing the capacity of the power source.
The above is the description of the configuration of the target identification system 1 according to the present embodiment.
<Operation>
Next, the operation of the target identification system 1 according to this embodiment will be described.
FIG. 19 is a flowchart for explaining the operation of the information input / output device 10 of the target identification system 1 according to this embodiment.
In FIG. 19, first, the information input/output device 10 images the detection region 200 (step S11). In the example of FIG. 19, it is assumed that photometric data of the detection region 200 is acquired at the same time.
Next, the information input/output device 10 generates image data from the captured image signal (step S12). At this time, the imaging means 20 of the information input/output device 10 converts the image signal, which is analog data, into image data, which is digital data. The imaging means 20 also applies image processing to the image data as necessary.
Next, the information input/output device 10 verifies the image data (step S13). At this time, the target identification means 50 of the information input/output device 10 extracts face images from the image data and verifies whether each extracted face image matches the detection target 210.
When the extracted face image matches the detection target 210, that is, when the detection target 210 has been found (Yes in step S14), the information input/output device 10 sets the irradiation conditions of the signal light to be projected onto the detection target 210 (step S15). At this time, the irradiation condition setting unit 42 of the information input/output device 10 sets appropriate pulse conditions and output conditions for the signal light based on the photometric data. Note that step S15 may be executed immediately after step S11, in parallel with steps S12 to S14.
On the other hand, when the extracted face image does not match the detection target 210, that is, when the detection target 210 has not been found (No in step S14), the process returns to step S11.
When the irradiation conditions of the signal light projected onto the detection target 210 are set in step S15, the information input/output device 10 transmits the pulse condition included in the irradiation conditions to the information recognition device 60 (step S16).
Then, the information input/output device 10 projects the projection light serving as a mark onto the detection target 210 (step S17).
The above is the description of the operation of the information input / output device 10 of the target identification system 1 according to the present embodiment. The information recognition device 60 opens and closes the shutter 65 based on the received pulse condition.
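The flow of steps S11 to S17 can be sketched as a simple loop. All five callables below are hypothetical stand-ins for the imaging means 20, the target identification means 50, the irradiation condition setting unit 42, the transmission path to the information recognition device 60, and the projection means 30:

```python
def run_marking_loop(capture, find_target, set_conditions, transmit, project,
                     max_frames=100):
    """Illustrative control flow for steps S11-S17 of FIG. 19."""
    for _ in range(max_frames):
        image, photometry = capture()            # S11-S12: capture and digitize
        position = find_target(image)            # S13: match against database
        if position is None:                     # S14: not found -> image again
            continue
        pulse_cond = set_conditions(photometry)  # S15: pulse/output conditions
        transmit(pulse_cond)                     # S16: notify shutter glasses
        project(position, pulse_cond)            # S17: project marking light
```

In the actual system, step S15 may run in parallel with steps S12 to S14, and the loop keeps tracking the target from frame to frame.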
As described above, the target identification system according to the present embodiment can irradiate the detection target with signal light strong enough to be seen under daytime sunlight without being noticed by people nearby. Furthermore, with the target identification system according to the present embodiment, the detector can recognize the signal light irradiating the detection target by using an information recognition device such as shutter glasses. As a result, with the target identification system according to the present embodiment, only the detector can recognize the desired information projected onto the detection target.
(Use scene)
Here, usage scenes of the target identification system 1 according to the present embodiment will be described with reference to the drawings.
FIG. 20 shows a usage scene in which the target identification system 1 is installed on the ceiling.
When the target identification system 1 detects the detection target 210 at position A, it irradiates the detection target 210 with the marking light 300. At position A, the target identification system 1 irradiates the detection target 210 with two marking lights 300. Even if the detection target 210 notices that it is being irradiated with the marking light 300 and runs away, the target identification system 1 continues to detect the detection target 210 and keeps irradiating it with the marking light 300 at position B. That is, the target identification system 1 follows the detection target 210 as it moves around in real time and continues to irradiate the detection target 210 with the marking light 300.
FIG. 21 shows a usage scene in which a criminal or other person who is the detection target 210 is positioned behind another person, so that the detection target 210 cannot be directly irradiated with the marking light 300.
In such a case, the target identification system 1 irradiates the person positioned directly in front of the detection target 210 with marking light 300 reading "criminal behind", thereby identifying the detection target 210. That is, even when the target identification system 1 cannot irradiate the detection target 210 with the marking light 300 directly, it can identify the detection target 210 by irradiating another object with the marking light 300.
FIG. 22 shows a usage scene when the target identification system 1 cannot directly irradiate the detection target 210 with the marking light 300.
The target identification system 1 receives a detection notification for the detection target 210 from another system and, in response, irradiates the ground with marking light 300 informing the detector that the detection target is approaching. The detector wearing the shutter glasses 600 learns of the approach of the criminal who is the detection target 210 by visually recognizing the marking light 300 on the ground. Note that the target identification system 1 may also indicate which of the approaching persons is the detection target 210 by irradiating the marking light 300 onto a displayable line 302 from which the marking light 300 can directly reach the detection target 210.
(Second Embodiment)
FIG. 23 is a block diagram showing the configuration of the information recognition device 60-2 of the target identification system according to the second embodiment of the present invention. The target identification system according to the present embodiment has a configuration in which a photometric means 66 is added to the information recognition device 60 of the target identification system 1 according to the first embodiment.
The photometric means 66 is a photometric sensor that measures the brightness of the background light that has passed through the shutter 65. The photometric means 66 may have the same configuration as the photometric sensor 25.
The opening/closing timing setting means 62 sets a condition under which the signal light becomes visible, based on the brightness of the background light through the shutter 65 measured by the photometric means 66 and the averaged brightness of the pulsed signal light.
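The visibility condition can be sketched as a simple contrast-ratio check; the ratio form and the threshold value below are illustrative assumptions, not the condition specified in the embodiment:

```python
def signal_visible(signal_brightness, background_brightness, min_contrast=1.2):
    """The pulsed signal light is taken to be visible when the marked spot
    (background plus signal) is sufficiently brighter than the background
    seen through the shutter."""
    return (background_brightness + signal_brightness) / background_brightness >= min_contrast

def min_signal_brightness(background_brightness, min_contrast=1.2):
    """Smallest average signal brightness satisfying the contrast condition."""
    return background_brightness * (min_contrast - 1.0)

# A darker background (e.g. as measured through a TN shutter) needs less signal light.
print(min_signal_brightness(40.0))   # dim background
print(min_signal_brightness(100.0))  # brighter background needs a stronger signal
```

Measuring the background through the shutter itself, as the photometric means 66 does, makes the denominator of this ratio reflect the actual viewing conditions.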
According to the present embodiment, since the brightness of the background light that has passed through the shutter 65 is actually measured, the visibility of the signal light can be set in accordance with the actual state of the background light, such as the weather and illumination conditions. Therefore, the signal light can be recognized more reliably.
(Third Embodiment)
FIG. 24 is a block diagram showing the configuration of the target identification system 3 according to the third embodiment of the present invention. The target identification system 3 according to the present embodiment differs from the target identification system 1 according to the first embodiment in that a target identification device 50-3 that identifies the detection target 210 is connected to the information input/output device 10-2 and the information recognition device 60 via a network 70. The functions of the target identification device 50-3 may be assigned to a general server or the like.
FIG. 25 is a block diagram showing the configuration of the control means 40-3 according to this embodiment. Unlike the control means 40 according to the first embodiment, the control means 40-3 according to the present embodiment does not include the target identification device 50-3. The communication unit 46 of the control means 40-3 according to the present embodiment is connected to the target identification device 50-3 via the network 70. The rest of the configuration of the control means 40-3 is the same as that of the control means 40 according to the first embodiment, and its description is therefore omitted.
In the target identification system according to the present embodiment, the target identification device can be provided outside the control means, so the information input/output device can be made lighter and smaller. In addition, if the target identification device is placed on a high-performance server, the detection processing for the detection target can be sped up and the computational load on the control means can be reduced. Furthermore, since the power used for the detection processing by the target identification device can be reduced, the power source mounted on the information input/output device can be made smaller.
(Fourth Embodiment)
FIG. 26 is a block diagram showing the configuration of the target identification system 4 according to the fourth embodiment of the present invention. The target identification system 4 according to this embodiment has a configuration in which a lighting device 80 is added to the target identification system 1 according to the first embodiment. Note that the target identification system 4 according to this embodiment may instead be configured by adding the lighting device 80 to the target identification system 3 according to the third embodiment.
FIG. 27 is a block diagram showing the configuration of the control means 40-4 of the target identification system 4 according to this embodiment. The control means 40-4 has a configuration in which an illumination control unit 47 is added to the control means 40 of the target identification system 1 according to the first embodiment.
The lighting device 80 irradiates people passing through or staying at a certain point with specific light. That is, the lighting device 80 irradiates targets included in the detection region 200 with specific light. The specific light is light whose wavelength or properties differ from those of the marking light 300. For example, illumination light for raising the contrast of the marking light 300, or dummy light for making the irradiation of the marking light 300 less noticeable, corresponds to the specific light.
The illumination control unit 47 controls the lighting device 80 to emit the specific light. For example, the illumination control unit 47 controls the lighting device 80 to irradiate every person passing through a certain point with the specific light.
The illumination control unit 47 may switch the irradiation timing of the specific light in accordance with the irradiation timing of the marking light 300 set by the projection control unit 45. For example, the illumination control unit 47 may control the lighting device 80 to emit the specific light at the irradiation timing of the marking light 300, or conversely, to stop irradiating the specific light at that timing. The illumination control unit 47 may, however, switch the irradiation of the specific light at a timing different from the examples given here.
The illumination control unit 47 also sets the area that the lighting device 80 irradiates with the specific light. For example, when the lighting device 80 can irradiate a plurality of areas individually, the illumination control unit 47 can control it to irradiate a specific one of those areas with the specific light. Specifically, when the lighting device 80 is composed of a plurality of LEDs, the illumination control unit 47 can control it so that only the LEDs that irradiate the detection target 210 with the specific light are driven and the other LEDs are not. Conversely, the illumination control unit 47 can control the lighting device 80 so that only the LEDs that would irradiate the detection target 210 with the specific light are not driven. The illumination control unit 47 may also irradiate only a specific area with the light by controlling the irradiation direction of the specific light from the lighting device 80. Note that if a phase modulation type spatial modulation element is used in the lighting device 80, only a specific area can be irradiated with the light without changing the irradiation direction.
FIG. 28 is a conceptual diagram illustrating an example of a usage scene of the target identification system 4 including the illumination device 80.
For example, at night or in a dark room, background light of sufficient brightness may not be available. In such a case, the illumination device 80 brightens the background light in the detection area sufficiently. Once the background light is bright enough, the information input/output device 10 can irradiate the detection target 210 with the marking light 300 using the method described in the first embodiment. In the example of FIG. 28, the specific light emitted from the illumination device 80 need only be illumination light of sufficient brightness.
FIG. 29 is a conceptual diagram illustrating another example of a usage scene of the target identification system 4 including the illumination device 80.
For example, to conduct some kind of inspection, all persons passing a checkpoint are guided to a specific place. At that place, a detection area 200 is set through which every person passing the checkpoint must pass. An illumination device 80 is then provided that irradiates every person passing through the detection area 200 with dummy marking light (hereinafter, dummy light 303). If care is to be taken not to irradiate children or the like with the dummy light 303, the interface device 100 may be used instead of the illumination device 80 so that targets recognized as children are not irradiated with the dummy light 303.
When the detection target 210 is detected, the interface device 100 irradiates the detection target 210 with the marking light 300. For example, if the dummy light 303 and the marking light 300 have different wavelengths, the marking light 300 is visible to the detector wearing the shutter glasses 600. Making the dummy light 303 a continuous wave, or giving the dummy light 303 pulse conditions different from those of the marking light 300, makes the marking light 300 even easier to distinguish.
In a case such as that of FIG. 29, the interface device 100 may be stationary rather than wearable. Even in such a case, however, the interface device 100 may still be wearable.
As in the example of FIG. 29, if the dummy light 303 is applied to everyone, the surrounding people and the detection target 210 itself are less likely to notice the marking light 300. The detector wearing the shutter glasses 600 can visually recognize the marking light 300 striking the detection target 210, and can therefore identify the detection target 210.
In general, at an inspection such as a checkpoint, multiple police officers are present as detectors. In such a case, if the detection target 210 were identified only through a single monitor or the like, it would be impossible to tell instantly which person is the detection target 210. With the target identification system 4 according to the present embodiment, all detectors wearing the shutter glasses 600 can identify the detection target 210 simultaneously. In a scene where a criminal is to be arrested, this time difference can contribute greatly to the arrest.
FIGS. 30 to 32 are conceptual diagrams for explaining how the opening and closing of the shutter 65 is controlled in accordance with the pulse conditions of the signal light in a scene such as that of FIG. 29. FIGS. 30 to 32 are examples and do not limit the scope of the present invention; the pulse conditions, output conditions, and so on of the signal light in FIGS. 30 to 32 can be set arbitrarily.
In the example of FIG. 30, the interface device 100 performs control so that pulsed dummy light 303 is irradiated onto every person passing through the detection area 200, and so that pulsed marking light 300 is irradiated onto the detection target 210 at timings when the dummy light 303 is not being irradiated. Since the dummy light 303 may be seen by everyone, its pulse width can be set relatively long. The marking light 300, in contrast, is given pulse conditions short enough to be invisible to the naked eye, so that only the detector wearing the shutter glasses 600 can see it. The shutter 65 is set to open in synchronization with the irradiation timing of the marking light 300. The shutter glasses 600 control the opening and closing of the shutter 65 so that the contrast of the marking light 300 against the background light is visible through the shutter 65.
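As an illustrative sketch (not part of the publication), the interleaving of FIG. 30 can be modeled as periodic pulse trains with the shutter slaved to the marking-light pulses; all timing values below are invented for the example:

```python
# Sketch of the FIG. 30 timing: within each period, a long dummy pulse is
# emitted for everyone, then a short marking pulse falls in the gap where
# no dummy light is emitted. The shutter of the glasses opens exactly
# during the marking pulse, so only a wearer perceives the marking light.
# All numeric values are invented for illustration.

PERIOD = 10.0          # ms, one repetition of the pattern
DUMMY = (0.0, 6.0)     # dummy light on during [0, 6) ms of each period
MARK = (7.0, 7.2)      # marking light on during [7.0, 7.2) ms (too short for the naked eye)

def dummy_on(t: float) -> bool:
    p = t % PERIOD
    return DUMMY[0] <= p < DUMMY[1]

def marking_on(t: float) -> bool:
    p = t % PERIOD
    return MARK[0] <= p < MARK[1]

def shutter_open(t: float) -> bool:
    # The shutter is slaved to the marking-light timing.
    return marking_on(t)

# The two pulse trains never overlap, and the shutter admits marking
# light only.
for k in range(1000):
    t = k * 0.05
    assert not (dummy_on(t) and marking_on(t))
    assert shutter_open(t) == marking_on(t)
```

The variants of FIGS. 31 and 32 differ only in the dummy train (continuous wave, or per-person staggered pulses) while the shutter remains synchronized to the marking pulses in the same way.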
In the example of FIG. 31, the interface device 100 performs control so that continuous-wave dummy light 303 is irradiated onto every person passing through the detection area 200, and pulsed marking light 300 is irradiated onto the detection target 210. The continuous-wave dummy light 303 is visible to everyone. The marking light 300, in contrast, is given pulse conditions short enough to be invisible to the naked eye, so that only the detector wearing the shutter glasses 600 can see it. In the example of FIG. 31, the peak output of the marking light 300 is set higher than that of the dummy light 303. The shutter 65 is set to open in synchronization with the irradiation timing of the marking light 300. The shutter glasses 600 control the opening and closing of the shutter 65 so that the contrast of the marking light 300 against the background light is visible through the shutter 65.
In the example of FIG. 32, as in FIG. 30, the interface device 100 performs control so that pulsed dummy light 303 is irradiated onto every person passing through the detection area 200, and so that pulsed marking light 300 is irradiated onto the detection target 210 at timings when the dummy light 303 is not being irradiated. The dummy light 303 may or may not be visible to everyone. The marking light 300, in contrast, is given pulse conditions short enough to be invisible to the naked eye, so that only the detector wearing the shutter glasses 600 can see it. In the example of FIG. 32, each person passing through the detection area 200 is irradiated with the dummy light 303 at a different timing. The shutter 65 is set to open only at the irradiation timing of the marking light 300. The shutter glasses 600 control the opening and closing of the shutter 65 so that the contrast of the marking light 300 against the background light is visible through the shutter 65.
As described above, with the target identification system according to the present embodiment, projection light that can be detected only by a detector wearing the information recognition device can be projected onto the detection target even at night or in a dark room. Furthermore, by irradiating everyone with dummy light, the marking light can be applied to the detection target without the detection target itself or the people around it noticing.
(Fifth embodiment)
FIG. 33 is a block diagram showing the configuration of the information input/output system 110, which provides the information input/output function of the target identification system according to the fifth embodiment of the present invention.
The information input/output system 110 according to the present embodiment includes an imaging device 120 having the function of the imaging unit 20, the projection unit 30, a control device 140 having the function of the control unit 40, and an illumination unit 81 having the function of the illumination device 80. That is, the information input/output system 110 is configured by adding the illumination unit 81, which has the function of the illumination device 80 according to the fourth embodiment. Since the illumination unit 81 has the same function as the illumination device 80, a detailed description is omitted.
The illumination unit 81 and the projection unit 30 together constitute a light irradiation device 800. The light irradiation device 800 is provided as a device separate from the imaging device 120 and the control device 140, and operates under the control of the control device 140.
Usage scenes of the target identification system including the information input/output system 110 according to the present embodiment are described below.
FIG. 34 shows an example in which the detection target 210 is detected indoors. A plurality of light irradiation devices 800 are installed on the ceiling. The imaging device 120 and the control device 140 are installed at positions different from those of the light irradiation devices 800. The light irradiation devices 800 may be configured to be controlled by the control device 140 over a network such as an intranet or the Internet.
The left side of FIG. 34 shows a scene in which illumination light is emitted as the specific light from the light irradiation devices 800 at the timing when the shutter glasses 600 worn by the detector are closed. At this time, all persons in the room are irradiated with illumination light from the light irradiation devices 800.
The right side of FIG. 34 shows a scene in which, at the timing when the shutter glasses 600 worn by the detector open, illumination light is applied to the detection target 210 only from the light irradiation devices 800 near it. At this time, the light irradiation devices 800 that are not emitting illumination light irradiate the detection target 210 with the marking light 300. The other persons in the room do not recognize the marking light 300.
In the scene on the right side of FIG. 34, in synchronization with the timing at which the shutter glasses 600 open, one light irradiation device 800 irradiates only the detection target 210 with illumination light, while the other light irradiation devices 800 irradiate the detection target 210 with the marking light 300. At this time, the illumination light applied to the detection target 210 from the light irradiation device 800 is dimmed to the minimum level at which the detection target 210 remains visible. As a result, the contrast of the marking light 300 improves, and only the detector wearing the shutter glasses 600 can easily detect the marking light 300. In addition, irradiating the detection target 210 with illumination light from a plurality of light irradiation devices 800 prevents blind spots caused by the posture of the detection target 210, the standing position of the detector, and so on.
FIGS. 35 to 38 show examples in which band-shaped light is applied from the light irradiation device 800 to every person passing a certain point, while only the detection target 210 is irradiated with the marking light 300. The light irradiation by the light irradiation device 800 in FIGS. 35 to 38 can be controlled with timings such as those in FIGS. 30 to 32. In the examples of FIGS. 30 to 32, control may be performed, for example, by irradiating the marking light 300 at the timing when the detection target 210 is irradiated with the signal light, and irradiating the dummy light 303 at the other timings.
In FIG. 35, band-shaped light is irradiated toward the ground from a light irradiation device 800 installed on a ceiling or the like. The light irradiation device 800 scans the band-shaped light from position C toward position C'. At the timing when the scan passes over the detection target 210, the light irradiation device 800 irradiates the detection target 210 with the marking light 300. At this time, the dummy light 303 is projected onto targets other than the detection target 210 passing between C and C'. The light irradiation device 800 may also scan the band-shaped light back and forth between position C and position C'.
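As an illustrative sketch (not part of the publication), the scan of FIG. 35 can be modeled as a band of light sweeping back and forth between C and C', with the emitted light switched to marking light only while the band covers the detection target; the positions, speed, and band width below are assumptions:

```python
# Sketch of the FIG. 35 scan: a band of light sweeps between position C
# (0 m) and position C' (10 m). While the band covers the detection
# target, the device emits marking light; elsewhere it emits dummy light.
# All numeric values are invented for illustration.

C, C_PRIME = 0.0, 10.0   # scan endpoints along the floor, in meters
SPEED = 2.0              # band speed, m/s
BAND_WIDTH = 0.5         # width of the illuminated band, m

def band_center(t: float) -> float:
    """Band position at time t, sweeping back and forth between C and C'."""
    span = C_PRIME - C
    phase = (t * SPEED) % (2 * span)
    return C + (phase if phase <= span else 2 * span - phase)

def light_kind(t: float, target_pos: float) -> str:
    """'marking' while the band covers the target, else 'dummy'."""
    if abs(band_center(t) - target_pos) <= BAND_WIDTH / 2:
        return "marking"
    return "dummy"

print(light_kind(t=2.0, target_pos=4.0))   # band at 4.0 m -> "marking"
print(light_kind(t=0.5, target_pos=4.0))   # band at 1.0 m -> "dummy"
```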
The control device 140 then notifies the shutter glasses 600 of the timing at which the marking light 300 is irradiated onto the detection target 210. The shutter glasses 600 open the shutter 65 at the timing when the marking light 300 is irradiated onto the detection target 210.
FIG. 36 is a conceptual diagram of the band-shaped light emitted from the light irradiation device 800, viewed from an angle different from that of FIG. 35. In the example of FIG. 36, the imaging device 120 images the persons passing through the band-shaped light, and the control device 140 detects each person's face and position in the captured image.
When a person is at position A, the dummy light 303 would be projected onto the person's face if the light irradiation device 800 emitted light. The light irradiation device 800 therefore stops emitting light when the light would strike the position of a person's face. The light irradiation device 800 irradiates light near the shoulders when a person is at position B, near the waist at position C, and near the knees at position D. In other words, the light irradiation device 800 emits light so as to avoid people's faces.
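As an illustrative sketch (not part of the publication), the face-avoiding behavior described above reduces to a mapping from the person's detected position to a body region to aim at, with the position where the light would strike the face mapped to "off"; the mapping table is an assumption for the example:

```python
# Sketch of the face-avoiding irradiation in FIG. 36: given where a person
# is detected (positions A through D), pick a body region for the band of
# light, and emit nothing when the light would land on the face. The table
# is illustrative, not from the publication.

AIM_BY_POSITION = {
    "A": None,         # light would hit the face -> stop emitting
    "B": "shoulders",
    "C": "waist",
    "D": "knees",
}

def irradiation_target(position: str):
    """Body region to irradiate, or None to switch the light off."""
    return AIM_BY_POSITION.get(position)

for pos in "ABCD":
    region = irradiation_target(pos)
    print(pos, "-> off" if region is None else f"-> {region}")
```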
FIG. 37 shows an example in which, of the persons passing between C and C', those other than the detection target 210 are irradiated with band-shaped light from the light irradiation device 800.
At the timing when the shutter glasses 600 are closed, the light irradiation device 800 irradiates persons other than the detection target 210 with band-shaped light, as shown on the left side of FIG. 37. At this time, the dummy light 303 is projected onto the persons irradiated with the band-shaped light, while the light irradiation device 800 does not irradiate the detection target 210 with the band-shaped light. Then, at the timing when the shutter glasses 600 open, the light irradiation device 800 irradiates the detection target 210 with the marking light 300, as shown on the right side of FIG. 37.
Since the detector wearing the shutter glasses 600 can see the marking light 300 at the timing when the shutter glasses 600 open, the detection target 210 can be detected without anyone nearby noticing.
FIG. 38 shows an example in which all persons passing between C and C' are irradiated with band-shaped light from the light irradiation device 800.
At the timing when the shutter glasses 600 are closed, the light irradiation device 800 irradiates all persons passing between positions C and C' with band-shaped light, as shown on the left side of FIG. 38. At this time, the dummy light 303 is projected onto all persons, including the detection target 210. Then, at the timing when the shutter glasses 600 open, the light irradiation device 800 projects onto the detection target 210 a dummy light 304 different from the dummy light 303, as shown on the right side of FIG. 38.
Alternatively, at the timing when the shutter glasses 600 are closed, the light irradiation device 800 may irradiate only the detection target 210 with a dummy light (second dummy light) different from the dummy light 303 (first dummy light), rather than irradiating everyone with the same dummy light. In this case, the information input/output system 110 arranges matters so that combining the marking light 300 with the second dummy light forms an image with a specific meaning. The detector wearing the shutter glasses 600 can then recognize that meaningful image by perceiving the composite of the second dummy light and the marking light 300. With the method of FIG. 38, when an image with a specific meaning is projected onto the detection target 210, that image can be made even harder for a third party to recognize.
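As an illustrative sketch (not part of the publication), the composite-image idea of FIG. 38 can be modeled as splitting a meaningful pattern into two complementary partial patterns, one carried by the second dummy light and one by the marking light, so that only their overlay is readable; the 5x5 bitmaps are invented:

```python
# Sketch of FIG. 38: the second dummy light carries one partial pattern,
# the marking light carries the complement, and only their overlay (what
# a shutter-glasses wearer integrates) shows the full, meaningful image.
# The 5x5 bitmap is invented for illustration.

FULL = [
    "X...X",
    ".X.X.",
    "..X..",
    ".X.X.",
    "X...X",
]

def split(full):
    """Split the full image: even-numbered lit cells go to the dummy
    pattern, odd-numbered lit cells to the marking pattern."""
    dummy, marking, n = [], [], 0
    for row in full:
        d_row, m_row = "", ""
        for ch in row:
            if ch == "X":
                d_row += "X" if n % 2 == 0 else "."
                m_row += "." if n % 2 == 0 else "X"
                n += 1
            else:
                d_row += "."
                m_row += "."
        dummy.append(d_row)
        marking.append(m_row)
    return dummy, marking

def overlay(a, b):
    """Union of two bitmaps: what the wearer perceives over time."""
    return ["".join("X" if (x == "X" or y == "X") else "."
                    for x, y in zip(ra, rb)) for ra, rb in zip(a, b)]

dummy2, marking = split(FULL)
assert overlay(dummy2, marking) == FULL   # only the overlay is readable
```

Each partial pattern carries fewer lit cells than the full image, so neither is meaningful to a third party who sees only one of them.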
(Sixth embodiment)
FIG. 39 is a block diagram showing the configuration of the information recognition device 60-6 of the target identification system according to the sixth embodiment of the present invention. The target identification system according to the present embodiment is configured by adding an alarm means 67 to the information recognition device 60 of the target identification system 1 according to the first embodiment.
When the receiving means 61 of the information recognition device 60-6 receives the notification signal, the alarm means 67 notifies the detector that the detection target 210 has been detected. For example, the alarm means 67 notifies the detector by sound, vibration, smell, heat, physical stimulation, or the like. The alarm means 67 may also notify the detector with light visible only to the detector. Information other than that listed here may also be used to notify the detector that the detection target 210 has been detected.
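As an illustrative sketch (not part of the publication), the alarm behavior can be modeled as a set of non-visual alert callbacks fired when a target-detected notification arrives; the modality functions and signal format are placeholders:

```python
# Sketch of the alarm means 67: when a notification signal arrives at the
# receiving means, trigger one or more non-visual alerts for the detector.
# The modality callbacks stand in for real actuators (buzzer, vibrator).

def beep():
    return "sound alert"

def vibrate():
    return "vibration alert"

class AlarmMeans:
    def __init__(self, modalities):
        self.modalities = modalities   # e.g. [beep, vibrate]

    def on_notification(self, signal: dict):
        """Fire every configured alert when a target-detected signal arrives."""
        if signal.get("target_detected"):
            return [alert() for alert in self.modalities]
        return []

alarm = AlarmMeans([beep, vibrate])
print(alarm.on_notification({"target_detected": True}))
# -> ['sound alert', 'vibration alert']
```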
With the target identification system according to the present embodiment, the detector can be alerted that a detection target has been detected by information other than visual information, making it easier for the detector to grasp the detection target reliably.
(Hardware configuration)
Next, a hardware configuration for realizing the target identification system according to each embodiment of the present invention will be described, taking the computer 90 of FIG. 40 as an example. The computer 90 of FIG. 40 is one configuration example for realizing the information input/output device according to each embodiment, and does not limit the scope of the present invention. When the information input/output device according to each embodiment takes the form of the interface device 100 as in FIG. 2, it is preferably a microcomputer having the functions of the computer 90 of FIG. 40.
As shown in FIG. 40, the computer 90 includes a processor 91, a main storage device 92, an auxiliary storage device 93, an input/output interface 95, and a communication interface 96. The processor 91, main storage device 92, auxiliary storage device 93, input/output interface 95, and communication interface 96 are connected to one another via a bus 99 so as to be able to exchange data. The processor 91, main storage device 92, auxiliary storage device 93, and input/output interface 95 connect to a network such as the Internet or an intranet via the communication interface 96. The computer 90 is connected to a server or computer of a higher-level system over the network, and receives information about the projection image from the higher-level system.
The processor 91 loads a program stored in the auxiliary storage device 93 or the like into the main storage device 92 and executes the loaded program. In the present embodiment, a software program installed in the computer 90 may be used. The processor 91 executes the arithmetic and control processing of the information input/output device according to the present embodiment.
The main storage device 92 has an area into which programs are loaded. The main storage device 92 may be a volatile memory such as a DRAM (Dynamic Random Access Memory). A nonvolatile memory such as an MRAM (Magnetoresistive Random Access Memory) may also be configured or added as the main storage device 92.
The auxiliary storage device 93 is a means for storing data such as the phase distribution of the projection image. The auxiliary storage device 93 is constituted by a local disk such as a hard disk or a flash memory. It is also possible to store the phase distribution of the projection image in the main storage device 92 and omit the auxiliary storage device 93.
The input/output interface 95 is a device that connects the computer 90 to peripheral devices based on the connection standard between the computer 90 and the peripheral devices. The communication interface 96 is an interface for connecting to a network such as the Internet or an intranet based on standards and specifications. In FIG. 40, each interface is abbreviated as I/F (Interface). The input/output interface 95 and the communication interface 96 may be unified as a single interface for connecting to external devices.
The computer 90 may be configured so that input devices such as a keyboard, mouse, and touch panel can be connected as necessary. These input devices are used for entering information and settings. When a touch panel is used as an input device, the display section of the display device may also serve as the input section of the input device. Data exchange between the processor 91 and the input devices may be mediated by the input/output interface 95.
The communication interface 96 is connected over the network to a higher-level system such as another computer or server. The higher-level system transmits the phase distribution of the basic image used in each embodiment of the present invention to the computer 90 via the communication interface 96. The higher-level system also transmits information about the projection image used in each embodiment to the computer 90 via the communication interface 96. The higher-level system may generate the phase distribution of the projection image used in each embodiment by itself, or may acquire it from another device.
The computer 90 may also be equipped with a display device for displaying various kinds of information. When a display device is provided, the computer 90 preferably includes a display control device (not shown) for controlling the display of the display device. The display device may be connected to the computer 90 via the input/output interface 95.
The computer 90 may also be equipped with a reader/writer as necessary. The reader/writer is connected to the bus 99 and mediates, between the processor 91 and a recording medium (program recording medium, not shown), the reading of data and programs from the recording medium and the writing of processing results of the computer 90 to the recording medium. The recording medium can be realized by a semiconductor recording medium such as an SD (Secure Digital) card or a USB (Universal Serial Bus) memory. The recording medium may also be realized by a magnetic recording medium such as a flexible disk, an optical recording medium such as a CD (Compact Disc) or DVD (Digital Versatile Disc), or another recording medium.
The above is one example of a hardware configuration for realizing the target identification system according to each embodiment of the present invention. The hardware configuration of FIG. 40 is one example for realizing the information input/output device according to each embodiment, and does not limit the scope of the present invention. A processing program that causes a computer to execute the processing of the information input/output device according to each embodiment is also within the scope of the present invention, as is a program recording medium on which the processing program according to each embodiment is recorded.
Each embodiment of the present invention described above has been explained assuming a scene in which a detector, such as a police officer or security guard, apprehends a criminal who is the detection target. The method of each embodiment, however, can also be applied to other scenes. Note that the scenes below are examples and do not limit the scope of the present invention.
For example, the method of the embodiments can be applied to a scene in which a detection target suspected of past cheating enters an amusement facility such as a pachinko parlor, casino, or game arcade, and a security guard or clerk acting as the detector marks that detection target. Likewise, in a store such as a department store or showroom, when a customer with a history of damaging merchandise or behaving suspiciously visits, a clerk acting as the detector can mark that customer as the detection target. Similarly, in a public facility such as a park or plaza, when a visitor with a history of nuisance or suspicious behavior arrives, the facility administrator acting as the detector can mark that visitor as the detection target. The method of each embodiment of the present invention can also be applied when the detection target is an animal or an object.
This application claims priority based on Japanese Patent Application No. 2015-085658 filed on April 20, 2015, the entire disclosure of which is incorporated herein.
Description of Symbols

1 Target identification system
10 Information input/output device
20 Imaging means
21 Imaging element
23 Image processor
24 Internal memory
25 Photometric sensor
26 Data output unit
30 Projection means
31 Light source
32 Light source drive unit
35 Phase modulation element
36 Phase modulation element control unit
38 Projection unit
40 Control means
41 Imaging control unit
42 Irradiation condition setting unit
43 Processing unit
44 Storage unit
45 Projection control unit
46 Communication unit
50 Target identification means
51 Database
52 Image acquisition unit
53 Image analysis unit
54 Collation unit
55 Collation result output unit
60 Information recognition device
61 Receiving means
62 Opening/closing timing setting means
63 Opening/closing control means
64 Shutter drive means
65 Shutter
80 Illumination device
81 Illumination means
90 Computer
91 Processor
92 Main storage device
93 Auxiliary storage device
95 Input/output interface
96 Communication interface
99 Bus
100 Interface device
110 Information input/output system
120 Imaging device
140 Control device
800 Light irradiation device
Claims (18)

1. A target identification system comprising:
an information input/output device that includes:
imaging means for imaging a detection area to generate image data, metering the brightness of the detection area to generate photometric data, and outputting the image data and the photometric data,
target identification means including a database that stores feature data of a detection target, the target identification means extracting feature data from the input image data, collating the feature data extracted from the image data with the feature data stored in the database, extracting position coordinates of a detection position of the detection target and, when the detection target is detected, outputting a notification signal including a collation result regarding the detection target and the position coordinates,
control means for controlling the imaging means to image the detection area, setting a control condition for signal light to be irradiated onto the detection target based on the input photometric data and the notification signal, and controlling emission of the signal light based on the control condition, and
projection means for irradiating the signal light toward the detection target under control of the control means; and
an information recognition device that has a shutter for controlling the amount of incident light reaching a detector who detects the detection target, sets an opening/closing condition of the shutter based on the control condition, and controls opening and closing of the shutter in response to the notification signal.

2. The target identification system according to claim 1, wherein the target identification means includes:
an image analysis unit that outputs the position coordinates, on the image data, of the detection target corresponding to the feature data extracted from the image data, in association with the extracted feature data;
a collation unit that collates the feature data output by the image analysis unit with the feature data stored in the database and, when the detection target is detected, outputs information indicating that the detection target has been detected and the position coordinates corresponding to the detection target; and
a collation result output unit that outputs the notification signal including the information indicating that the detection target has been detected and the position coordinates corresponding to the detection target.

3. The target identification system according to claim 2, wherein the control means includes:
an irradiation condition setting unit that sets, based on the illuminance of the detection area included in the photometric data, irradiation conditions including a pulse condition relating to the pulse width and repetition period of the signal light emitted by the projection means and an output condition relating to the output power of the signal light; and
a processing unit that receives the notification signal and the irradiation conditions, calculates an irradiation direction condition of the signal light to be irradiated onto the detection target based on the position coordinates of the detection target included in the notification signal, and sets the combination of the irradiation direction condition and the irradiation conditions as the control condition.
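Claim 3 couples the pulse and output conditions to the metered illuminance of the detection area, and combines them with an irradiation direction derived from the detection target's position coordinates. As a loose illustration of such a mapping, the sketch below picks a pulse condition and output condition from an illuminance reading; every threshold, value, and function name is an assumption invented for the example, not taken from the specification:

```python
def set_irradiation_condition(illuminance_lux):
    """Map metered illuminance to a pulse condition (width, period) and
    an output condition, as an irradiation condition setting unit might.
    All numbers are illustrative assumptions."""
    if illuminance_lux < 100:        # dim scene: weak, short pulses suffice
        pulse = {"width_s": 0.5e-3, "period_s": 20e-3}
        output = 0.2
    elif illuminance_lux < 1000:     # typical indoor lighting
        pulse = {"width_s": 1e-3, "period_s": 20e-3}
        output = 0.5
    else:                            # daylight: maximum output
        pulse = {"width_s": 2e-3, "period_s": 20e-3}
        output = 1.0
    return {"pulse": pulse, "output": output}

def set_control_condition(irradiation, position_xy):
    """Combine the irradiation condition with an irradiation direction
    derived from the detection target's position coordinates (here the
    direction is simply the pixel position, for illustration)."""
    return dict(irradiation, direction=position_xy)
```

A real setting unit would calibrate these thresholds against the photometric sensor and convert pixel coordinates into projector angles; neither calibration is specified by the claims.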
4. The target identification system according to claim 3, wherein the information recognition device includes:
receiving means for receiving the control condition;
opening/closing timing setting means that calculates, based on the control condition, a condition for relatively improving the contrast of the pulsed signal light irradiated onto the detection target with respect to the background light transmitted through the shutter, and sets the opening/closing condition of the shutter based on the calculated condition and the pulse condition;
opening/closing control means for controlling opening and closing of the shutter; and
a shutter drive unit that drives the shutter in accordance with the opening/closing control of the shutter.

5. The target identification system according to claim 4, wherein the opening/closing timing setting means sets the shutter to open in synchronization with the timing at which the signal light is irradiated onto the detection target.

6. The target identification system according to claim 4 or 5, wherein the information recognition device includes a frame for placing the shutter in front of at least one eye of the detector.

7. The target identification system according to any one of claims 1 to 6, wherein the information recognition device includes photometric means for metering the brightness of the detection area through the shutter.
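The shutter gating of claims 4 and 5 admits a simple quantitative reading: if the detector-side shutter is open only while each signal-light pulse is being emitted, the signal passes essentially unattenuated while steady background light is suppressed by the duty cycle of the pulse train. The sketch below illustrates this under assumed units and function names, none of which appear in the specification:

```python
def shutter_windows(pulse_width, period, n_pulses, phase=0.0):
    """Open/close times (seconds) for a shutter gated to a pulse train.

    The shutter opens exactly when each signal-light pulse starts and
    closes when the pulse ends, as in claim 5 (open in synchronization
    with the irradiation timing)."""
    return [(phase + k * period, phase + k * period + pulse_width)
            for k in range(n_pulses)]

def contrast_gain(pulse_width, period):
    """Relative contrast improvement of pulsed signal light over a
    steady background when the shutter passes only the pulse windows.

    Signal energy per period is unchanged, while background energy is
    scaled by the duty cycle pulse_width / period, so contrast improves
    by the inverse duty cycle period / pulse_width."""
    return period / pulse_width

# Example: 1 ms pulses repeated every 20 ms -> roughly 20x contrast gain.
windows = shutter_windows(pulse_width=1e-3, period=20e-3, n_pulses=3)
gain = contrast_gain(pulse_width=1e-3, period=20e-3)
```

This idealizes the shutter as instantaneous; a physical shutter's finite switching time would eat into the gain, which is presumably why the claims derive the opening/closing condition from the received pulse condition rather than fixing it.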
8. The target identification system according to any one of claims 1 to 7, wherein the target identification means is configured on a server connected to the information input/output device and the information recognition device through a network.

9. The target identification system according to claim 4, further comprising illumination means for illuminating the detection area.

10. The target identification system according to claim 9, wherein the illumination means irradiates a specific light onto a target included in the detection area under control of the control means.

11. The target identification system according to claim 10, wherein the control means controls the illumination means to scan the specific light across a target included in the detection area when the target passes a specific point, and controls the projection means to irradiate the signal light onto the detection target when the detection target passes the specific point.

12. The target identification system according to claim 11, wherein the control means sets the irradiation conditions such that the signal light is emitted from the projection means at a timing when the illumination means is not irradiating the specific light.

13. The target identification system according to claim 11, wherein the control means sets the irradiation conditions such that the output of the signal light emitted by the projection means is higher than the output of the specific light emitted by the illumination means.

14. The target identification system according to claim 11, wherein the control means sets the irradiation conditions such that the specific light is irradiated from the illumination means at a different timing for each target included in the detection area, and the signal light is emitted from the projection means at a timing when the illumination means is not irradiating the specific light.

15. The target identification system according to any one of claims 1 to 14, wherein the information recognition device includes alarm means that, upon receiving the notification signal, notifies that the detection target has been detected.

16. The target identification system according to any one of claims 1 to 15, wherein the projection means includes:
a light source that emits the pulsed signal light of a specific wavelength;
a light source drive unit that drives the light source under control of the control means;
a phase modulation element that has a display surface on which the signal light emitted from the light source is incident, and that displays on the display surface a phase distribution of a predetermined image to be projected onto the detection target;
a phase modulation element control unit that causes the display surface of the phase modulation element to display the phase distribution of the predetermined image; and
a projection unit that projects the signal light emitted from the display surface of the phase modulation element onto the detection target.

17. A target identification method comprising:
imaging a detection area to generate image data, and metering the brightness of the detection area to generate photometric data;
extracting feature data from the image data, collating the feature data extracted from the image data with feature data stored in a database that stores feature data of a detection target, extracting position coordinates of a detection position of the detection target and, when the detection target is detected, generating a notification signal including a collation result regarding the detection target and the position coordinates;
setting a control condition for signal light to be irradiated onto the detection target based on the photometric data and the notification signal, controlling emission of the signal light based on the control condition, and emitting the signal light toward the detection target; and
setting, based on the control condition, an opening/closing condition of a shutter that controls the amount of incident light reaching a detector who detects the detection target, and controlling opening and closing of the shutter in response to the notification signal.

18. A program recording medium recording a target identification program that causes a computer to execute:
a process of imaging a detection area to generate image data and metering the brightness of the detection area to generate photometric data;
a process of extracting feature data from the image data, collating the feature data extracted from the image data with feature data stored in a database that stores feature data of a detection target, extracting position coordinates of a detection position of the detection target and, when the detection target is detected, generating a notification signal including a collation result regarding the detection target and the position coordinates;
a process of setting a control condition for signal light to be irradiated onto the detection target based on the photometric data and the notification signal, controlling emission of the signal light based on the control condition, and emitting the signal light toward the detection target; and
a process of setting, based on the control condition, an opening/closing condition of a shutter that controls the amount of incident light reaching a detector who detects the detection target, and controlling opening and closing of the shutter in response to the notification signal.
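Read together, the steps of the method claim form a sense–match–signal pipeline: image and meter the scene, collate extracted features against a database, and, on a match, emit a notification with position coordinates and derive a signal-light control condition from the photometric data. The sketch below is a loose illustration under assumed data shapes (plain dictionaries and tuples); the function names, similarity measure, and illuminance rule are all invented for the example:

```python
def similarity(a, b):
    """Toy similarity: fraction of element-wise matches (illustrative
    stand-in for a real biometric matcher)."""
    return sum(x == y for x, y in zip(a, b)) / max(len(a), 1)

def identify_and_signal(image, illuminance, database, extract_features,
                        match_threshold=0.9):
    """Illustrative pipeline for the claimed method: extract features,
    collate them against the database, produce notification signals with
    position coordinates, and derive a signal-light control condition
    from the photometric data."""
    notifications = []
    for feat, (x, y) in extract_features(image):
        for target_id, ref_feat in database.items():
            score = similarity(feat, ref_feat)
            if score >= match_threshold:
                notifications.append(
                    {"target": target_id, "score": score, "pos": (x, y)})
    # Brighter scenes get longer pulses and higher output (invented rule).
    control = {"pulse_width": 1e-3 if illuminance < 100 else 2e-3,
               "output": min(1.0, illuminance / 1000.0),
               "directions": [n["pos"] for n in notifications]}
    return notifications, control
```

A real system would substitute a proper feature matcher for `similarity` and map the `directions` from pixel coordinates into projector angles; both are outside what the claims specify.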
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017513961A JP6642568B2 (en) | 2015-04-20 | 2016-04-15 | Target identification system, target identification method and program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015-085658 | 2015-04-20 | ||
JP2015085658 | 2015-04-20 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016170765A1 (en) | 2016-10-27 |
Family
ID=57143044
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/002051 WO2016170765A1 (en) | 2015-04-20 | 2016-04-15 | Target identification system, target identification method and program storage medium |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6642568B2 (en) |
WO (1) | WO2016170765A1 (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002013928A (en) * | 2000-06-30 | 2002-01-18 | Topcon Corp | Device for pulse light observation |
JP2012095292A (en) * | 2010-10-28 | 2012-05-17 | Kofukin Seimitsu Kogyo (Shenzhen) Yugenkoshi | System for identifying and tracking suspects and method therefor |
JP2015504616A (en) * | 2011-09-26 | 2015-02-12 | マイクロソフト コーポレーション | Video display correction based on sensor input of transmission myopia display |
- 2016-04-15 JP JP2017513961A patent/JP6642568B2/en active Active
- 2016-04-15 WO PCT/JP2016/002051 patent/WO2016170765A1/en active Application Filing
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019116526A1 (en) * | 2017-12-15 | 2019-06-20 | 日本電気株式会社 | Projection device, interface device, and projection method |
JPWO2019116526A1 (en) * | 2017-12-15 | 2020-12-10 | 日本電気株式会社 | Projection device, interface device and projection method |
US11392014B2 (en) | 2017-12-15 | 2022-07-19 | Nec Corporation | Projection device, interface device, and projection method |
Also Published As
Publication number | Publication date |
---|---|
JP6642568B2 (en) | 2020-02-05 |
JPWO2016170765A1 (en) | 2018-04-05 |
Legal Events

Code | Title | Description
---|---|---
121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 16782789; Country of ref document: EP; Kind code of ref document: A1
ENP | Entry into the national phase | Ref document number: 2017513961; Country of ref document: JP; Kind code of ref document: A
NENP | Non-entry into the national phase | Ref country code: DE
122 | Ep: PCT application non-entry in European phase | Ref document number: 16782789; Country of ref document: EP; Kind code of ref document: A1