WO2017206896A1 - Video surveillance method and device (一种视频监控方法和设备) - Google Patents

Video surveillance method and device (一种视频监控方法和设备)

Info

Publication number
WO2017206896A1
WO2017206896A1 (PCT/CN2017/086543)
Authority
WO
WIPO (PCT)
Prior art keywords
infrared
camera
target
deflection
infrared sensor
Prior art date
Application number
PCT/CN2017/086543
Other languages
English (en)
French (fr)
Inventor
耿立华
任俊媛
李晓宇
严寒
曾起
Original Assignee
京东方科技集团股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Priority to US 15/742,321 (granted as US10776650B2)
Publication of WO2017206896A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 - Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782 - Systems for determining direction or deviation from predetermined direction
    • G01S3/785 - Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786 - Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system, the desired condition being maintained automatically
    • G01S3/7864 - T.V. type tracking systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/66 - Remote control of cameras or camera parts, e.g. by remote control devices
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30232 - Surveillance

Definitions

  • The present invention relates to the field of video surveillance, and in particular to a video surveillance method and device.
  • In a first aspect, the present invention provides a video surveillance method, including:
  • detecting, by an infrared sensor, whether a target appears in a sensing area of the infrared sensor, and generating a deflection command indicating a deflection angle when the target is detected, wherein the infrared sensor is configured to be disposed at a first position of the monitored area;
  • deflecting, according to the deflection command, the shooting direction of a camera used for video surveillance toward the target by the deflection angle, wherein the camera is configured to be disposed at a second position of the monitored area that is different from the first position.
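For illustration only (this code is not part of the patent disclosure), the following Python sketch shows one way the claimed flow could be organized: a triggered infrared sensor produces a deflection command, which is then applied to the camera. All names (`DeflectionCommand`, `rotate_camera`, the angle convention) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DeflectionCommand:
    angle_deg: float  # angle from the camera's current shooting direction toward the target

def make_deflection_command(current_dir_deg: float, sensor_dir_deg: float) -> DeflectionCommand:
    """Deflection angle = direction associated with the triggered sensor minus the current direction."""
    return DeflectionCommand(angle_deg=sensor_dir_deg - current_dir_deg)

def monitor_step(sensor_triggered: bool, current_dir_deg: float,
                 sensor_dir_deg: float, rotate_camera) -> float:
    """One polling step: if the sensor sees a target (S101), deflect the camera toward it (S102)."""
    if sensor_triggered:
        cmd = make_deflection_command(current_dir_deg, sensor_dir_deg)
        rotate_camera(cmd.angle_deg)          # e.g. drive a pan/tilt motor by this angle
        current_dir_deg += cmd.angle_deg
    return current_dir_deg
```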
  • Optionally, the infrared sensor includes a plurality of infrared sensors, and detecting the target by the infrared sensor includes: detecting, by each of the plurality of infrared sensors, whether the target appears in the sensing area of that infrared sensor, and generating the deflection command according to a combination of the detection results of the plurality of infrared sensors.
  • Optionally, the deflection command includes a first deflection command and a second deflection command, and generating the deflection command according to the combination of the detection results of at least two of the plurality of infrared sensors includes: when the plurality of infrared sensors detect the target one after another, generating the first deflection command according to a first control strategy; and when at least two of the plurality of infrared sensors detect the target simultaneously, generating the second deflection command according to a second control strategy different from the first control strategy.
  • Optionally, the plurality of infrared sensors are divided into N groups, each group including at least two infrared sensors, N being a positive integer; generating the first deflection command according to the first control strategy includes: generating one first deflection command for the infrared sensor in each group that last detected the target, the deflection angle indicated by that first deflection command being the angle between the current shooting direction of the camera and the shooting direction of the camera when it faces the infrared sensor that last detected the target; and deflecting the camera according to the deflection command includes: each time a first deflection command is generated, deflecting the shooting direction of the camera toward the target by the deflection angle indicated by that first deflection command.
  • the step of generating the second deflection instruction according to the second control policy comprises: selecting one of the at least two infrared sensors that simultaneously detect the target, and generating a second deflection instruction for the selected one of the infrared sensors
  • the deflection angle indicated by the second deflection command is an angle between a current shooting direction of the camera and a shooting direction when the camera faces the selected one of the infrared sensors.
  • the step of generating the second deflection instruction according to the second control policy comprises: generating a second deflection instruction for the at least two infrared sensors that simultaneously detect the target, the deflection angle indicated by the second deflection instruction The intermediate value of each angle between the current photographing direction of the camera and the photographing direction when the camera faces each of the at least two infrared sensors is the deflection angle.
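A minimal sketch of the two control strategies described above, assuming all angles are measured relative to the camera's current shooting direction and reading "intermediate value" as the midpoint of the extreme angles (a plain mean would be an equally plausible reading); the function names are hypothetical and not from the patent.

```python
def first_strategy(last_triggered_dir_deg: float, current_dir_deg: float) -> float:
    """Sequential detections: deflect toward the sensor that (last) detected the target."""
    return last_triggered_dir_deg - current_dir_deg

def second_strategy(simultaneous_dirs_deg: list, current_dir_deg: float,
                    pick_one: bool = False) -> float:
    """Simultaneous detections: either pick one sensor (e.g. the one near an important
    area such as a door), or use the intermediate value of the deflection angles toward
    all simultaneously triggered sensors."""
    angles = [d - current_dir_deg for d in simultaneous_dirs_deg]
    if pick_one:
        return angles[0]
    return (min(angles) + max(angles)) / 2.0
```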
  • the infrared sensor emits an infrared light beam; when the infrared light beam is blocked, the target is detected to be present in the sensing area of the infrared sensor.
  • Optionally, the infrared beam is formed between a first end point and a second end point; the first end point is disposed in the mounting position area of the camera, and the second end point is fixed relative to the mounting position of the camera.
  • the infrared light beam is formed between the first end point and the second end point; the positions of the first end point and the second end point are set such that the mounting position of the camera is not in the infrared beam The optical path and the extension of the optical path.
  • Optionally, when the infrared beam is blocked, the deflection command is generated according to the angle a between the current shooting direction of the camera and the infrared beam; and deflecting the camera according to the deflection command includes: deflecting the current shooting direction toward the infrared beam by a+x, where x is a delay error.
  • Optionally, a reference angle b relative to the initial shooting direction of the camera is set according to the positions of the first end point and the second end point; when the infrared beam is blocked, the deflection command is generated according to the current shooting direction and the reference angle b, and deflecting the camera according to the deflection command includes: deflecting the current shooting direction of the camera to a direction at angle b+y relative to the initial shooting direction, where y is a delay error.
  • Optionally, the reference angle b is the angle between the initial shooting direction and the line connecting the camera to a point of the infrared beam between the first end point and the second end point.
  • the infrared beams of at least two of the plurality of infrared sensors are sequentially blocked, determining the progress of the target according to a time when the respective infrared beams of the at least two infrared sensors are blocked Speed, thereby determining the delay error based on the travel speed.
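The delay error can be understood as "angular speed of the target multiplied by the control latency". A sketch under that assumption (the patent does not prescribe a particular formula, and the variable names are hypothetical):

```python
def travel_speed_deg_per_s(angle_first_deg: float, t_first_s: float,
                           angle_second_deg: float, t_second_s: float) -> float:
    """Angular speed of the target estimated from two successively blocked infrared beams."""
    return (angle_second_deg - angle_first_deg) / (t_second_s - t_first_s)

def delay_error_deg(speed_deg_per_s: float, latency_s: float) -> float:
    """Extra deflection x (or y): how far the target moves while the camera is still turning."""
    return speed_deg_per_s * latency_s
```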
  • the video image captured by the camera is subjected to OSD processing.
  • the lighting device corresponding to the infrared sensor is turned on.
  • the present invention provides a video monitoring device, including:
  • a detecting device including at least one infrared sensor, configured to detect whether a target appears in the sensing area of the infrared sensor and to generate a detection signal indicating whether a target is present, the infrared sensor being configured to be disposed at a first position of the monitored area;
  • a processing device connected to the detecting device, configured to receive the detection signal, and generate a deflection command for indicating a deflection angle according to the detection signal;
  • control device coupled to the processing device for receiving the deflection command, and deflecting a shooting direction of a camera for video surveillance to the target according to the deflection command;
  • an imaging device connected to the control device and including the camera, configured to capture video images through the camera, the camera being configured to be disposed at a second position of the monitored area that is different from the first position.
  • the detecting device comprises a plurality of infrared sensors, each detecting a target and generating a respective detection signal
  • the processing device is configured to: according to the detection signals received from the plurality of infrared sensors Combine to generate the deflection command.
  • Optionally, the deflection command includes a first deflection command and a second deflection command, and the processing device is configured to:
  • generate the first deflection command according to a first control strategy when detection signals are received one after another, so that one or more first deflection commands are generated for some of the at least two infrared sensors that sequentially provided the detection signals;
  • generate the second deflection command according to a second control strategy different from the first control strategy when detection signals are received simultaneously, so that one second deflection command is generated for the at least two infrared sensors that simultaneously provided the detection signals.
  • Optionally, each of the at least one infrared sensor includes an infrared beam emitter and an infrared beam receiver, so that an infrared beam is formed between the infrared beam emitter and the infrared beam receiver; when the infrared beam is blocked, the corresponding one of the at least one infrared sensor generates a detection signal indicating the appearance of the target;
  • one of the infrared beam emitter and the infrared beam receiver is disposed at a first end point, and the other of the infrared beam emitter and the infrared beam receiver is disposed at a second end point;
  • the first end point is located in the mounting position area of the camera and the second end point is fixed relative to the mounting position of the camera, or the positions of the first end point and the second end point are set such that the mounting position of the video surveillance camera is on neither the infrared beam nor the extension line of the infrared beam.
  • Optionally, the video surveillance device further includes a lighting device connected to the processing device, and the processing device is configured to generate an illumination-on signal upon receiving, from the infrared sensor, a detection signal indicating the appearance of the target, and to send the illumination-on signal to the lighting device, so that the lighting device is turned on.
  • Optionally, the detecting device includes a plurality of infrared sensors, each detecting the target and generating its own detection signal;
  • the lighting device includes a plurality of illuminators disposed in one-to-one correspondence with the plurality of infrared sensors; and
  • when the processing device receives, from an infrared sensor, a detection signal indicating the appearance of the target, it generates an illumination-on signal for the infrared sensor that generated the detection signal, so that the illuminator corresponding to that infrared sensor is turned on.
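Purely as an illustrative sketch, the one-to-one correspondence between sensors and illuminators can be modeled as a lookup table; the identifiers and the `turn_on` callback are hypothetical, not part of the patent.

```python
# Each infrared sensor is paired with exactly one illuminator (hypothetical ids).
ILLUMINATOR_FOR_SENSOR = {"IR1": "LED1", "IR2": "LED2", "IR3": "LED3"}

def on_detection(sensor_id: str, turn_on) -> None:
    """When a sensor reports a target, switch on only the illuminator paired with it."""
    turn_on(ILLUMINATOR_FOR_SENSOR[sensor_id])
```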
  • With the video surveillance method and device provided by the present invention, when a target is detected in the sensing area of the infrared sensor, a deflection command is generated, and the current shooting direction of the video surveillance camera is deflected toward the target area according to the deflection command.
  • In this way, unlike conventional image-based tracking, the invention automatically deflects the shooting direction of the camera based on the detection results of the infrared sensor, which saves computation during target tracking and reduces the cost of the video surveillance device.
  • FIG. 1 is a schematic flowchart of a video monitoring method according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of a relationship between a target and a camera in a video monitoring method according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of a positional relationship between a target and an infrared sensor in a video monitoring method according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of a relationship between an infrared sensor and a camera in a video monitoring method according to an embodiment of the present invention
  • FIG. 5 is a schematic diagram of relationship between multiple infrared beams and a camera position in a video monitoring method according to an embodiment of the present invention
  • FIG. 6 is a schematic diagram showing a setting position of an end point of a plurality of infrared light beams in a video monitoring method according to an embodiment of the present invention
  • FIG. 7 is a schematic diagram showing a setting position of an end point of a plurality of infrared light beams in a video monitoring method according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram showing a setting position of an end point of a plurality of infrared light beams in a video monitoring method according to an embodiment of the present invention.
  • FIG. 9 is a schematic flowchart of another video monitoring method according to an embodiment of the present disclosure.
  • FIG. 10 is a schematic diagram of blocking of multiple infrared beams in a video monitoring method according to an embodiment of the present invention.
  • FIG. 11 is a schematic diagram showing a setting position of a lighting device according to an embodiment of the present invention.
  • FIG. 12 is a schematic block diagram of a video monitoring device according to an embodiment of the present invention.
  • Referring to FIG. 1, a video surveillance method includes:
  • S101: detecting, by an infrared sensor, whether a target appears in the sensing area of the infrared sensor, and, when a target is detected, generating a deflection command indicating a deflection angle, wherein the infrared sensor is configured to be disposed at a first position of the monitored area.
  • In video surveillance, the shooting direction of the camera needs to be deflected to avoid the blind spots that arise when the shooting direction is fixed, so that important image information is not missed.
  • Infrared sensors are typically placed at important indoor locations, such as entrances, windows, and blind corners; when a target (e.g., a suspect or an object of interest) enters the sensing area of an infrared sensor, that infrared sensor detects the appearance of the target.
  • the orientation of the target relative to the current shooting direction of the camera is determined to generate a corresponding deflection command such that the camera is deflected toward the target.
  • the deflection command is used to indicate a deflection angle such that the camera is deflected toward the target at the deflection angle.
  • the deflection angle may be an angle between a current shooting direction of the camera and a shooting direction when the camera faces the infrared sensor.
  • the photographing direction when the camera faces the infrared sensor may be a photographing direction when the camera can capture the target sensed by the infrared sensor.
  • One implementation of this embodiment generates the deflection command by determining the orientation of the target relative to the current shooting direction of the camera.
  • Specifically, the infrared sensor may be used to determine the position of the target relative to the current shooting direction of the camera. For example, the position of the target may be determined by an infrared positioning sensor or by a thermal infrared sensor, or the position of the target relative to the current shooting direction of the camera may be determined when an infrared beam is blocked.
  • Determining the position of the target relative to the current shooting direction of the camera amounts to determining the orientation information of the target with respect to the camera, from which the deflection command is generated. As shown in FIG. 2, the angle between the line connecting the target and the camera and the current shooting direction of the camera (which can be decomposed into a horizontal component and a vertical component) is calculated, and a deflection command is generated indicating a deflection angle corresponding to that angle.
  • In some embodiments, the deflection command is generated by determining the orientation of the sensing area of the infrared sensor relative to the current shooting direction of the camera. As shown in FIG. 3, when a target is detected in the sensing area of the infrared sensor, a deflection command indicating a deflection angle corresponding to the orientation of the sensing area relative to the current shooting direction of the camera is generated according to the angle (which can be decomposed into a horizontal component and a vertical component) between the current shooting direction and the line connecting the camera to a point in the sensing area (for example, the point closest to the camera). Preferably, the orientation of the sensing area relative to the current shooting direction of the camera may be determined from the center of the sensing area.
  • the deflection command is generated by determining an orientation of the position of the infrared sensor relative to a current shooting direction of the camera. As shown in FIG. 4, when the infrared sensor detects the target, according to the angle between the connection between the infrared sensor and the camera and the current shooting direction of the camera (which can be decomposed into a horizontal component and a vertical component), it is generated for indication and A deflection command of a deflection angle corresponding to an angle between a line connecting the infrared sensor and the camera and a current photographing direction of the camera.
  • By placing the infrared sensor at a position different from that of the camera, the target can be detected and tracked over a wider range, improving the performance of video surveillance of the target.
  • Specifically, the current shooting direction refers to the shooting direction of the camera read at the moment a target is detected in the sensing area of the infrared sensor. The deflection command is generated according to the current shooting direction and the orientation information of the target relative to the current shooting direction of the camera.
  • Specifically, the deflection command may indicate the amount of deflection (for example, a deflection angle) of the shooting direction of the camera toward the target, or it may indicate the vertical and horizontal components of that deflection (for example, the horizontal and vertical components of the deflection angle).
  • In some embodiments, the deflection command includes a longitudinal PWM (pulse-width modulation) signal and a lateral PWM signal; when a stepping motor receives the deflection command, it rotates at a prescribed speed for a prescribed time according to the PWM signals, thereby driving the camera to deflect from the current shooting direction toward the target by the indicated amount for shooting.
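As a hedged illustration of how a deflection angle might be turned into a stepper-motor pulse train, the sketch below converts each component of the deflection into a signed step count and a pulse-train duration; the steps-per-revolution, microstepping factor, and step rate are invented example values, not taken from the patent.

```python
def angle_to_steps(angle_deg: float, steps_per_rev: int = 200, microstep: int = 16) -> int:
    """Convert a deflection angle into a signed number of stepper-motor steps."""
    return round(angle_deg / 360.0 * steps_per_rev * microstep)

def pulse_train_duration_s(steps: int, step_rate_hz: float = 3200.0) -> float:
    """Time the PWM pulse train must run so the motor completes the move at a fixed step rate."""
    return abs(steps) / step_rate_hz

# The lateral (pan) and longitudinal (tilt) components are converted separately
# and sent to the corresponding motors.
pan_steps = angle_to_steps(35.0)    # horizontal component of the deflection angle
tilt_steps = angle_to_steps(-10.0)  # vertical component of the deflection angle
```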
  • Specifically, since the deflection angle indicated by the deflection command may correspond to the exact angle of the target relative to the current shooting direction of the camera, to the angle between the current shooting direction and the line connecting the camera to the infrared sensor that detected the target, or to the angle between the current shooting direction and the line connecting the camera to a point in the sensing area of that infrared sensor, deflecting the camera toward the target may either align the deflected shooting direction with the exact target position or align it with an area near the target position.
  • It should be noted that an infrared sensor may have multiple parts (e.g., a transmitter and a receiver) disposed at multiple locations, and the first position includes those locations. As long as one of the parts of the infrared sensor is disposed at a location different from that of the camera, the camera can be considered to be disposed at a second position different from the first position.
  • Through steps S101 and S102 of this embodiment, the shooting direction of the camera is automatically adjusted based on the infrared detection of the target, so that the video surveillance direction is adjusted automatically and important surveillance image information is not missed.
  • the infrared sensor is induced by an infrared light beam, and the infrared light beam forms an angle with the current shooting direction; when the infrared light beam is blocked, the target is detected to be present. In the sensing area of the infrared sensor.
  • the infrared beam may be an infrared beam generated by an infrared sensor of a through-beam type, or may be an infrared beam generated by a reflective infrared sensor.
  • Multiple infrared sensors may be used to generate multiple infrared beams; when a beam is blocked, the target is detected to have entered the sensing area of the corresponding infrared sensor.
  • The case of detection by two infrared beams A and B is described below with reference to FIG. 5. When the intersection of the optical path of an infrared beam C with the current shooting direction of the camera is not at the camera, the required deflection angle of the camera is not equal to a_C; the deflection angle must then be calculated geometrically, or the required deflection angle must be preset according to the optical path of infrared beam C.
  • Optionally, the infrared beam is formed between a first end point and a second end point; the first end point is disposed in the mounting position area of the camera, and the second end point is fixed relative to the mounting position of the camera.
  • the installation location area of the camera refers to a peripheral area of the mounting point of the camera (for example, an area centered on a mounting point of the camera).
  • the mounting location area is sufficiently small relative to the space captured by the surveillance video, so in the calculation of the deflection angle, it can be assumed that the first endpoint coincides with the mounting point of the camera, and the resulting error is negligible.
  • Specifically, the camera may be mounted on the top or bottom surface of the monitored room, or on a side (i.e., a wall surface) of the monitored room.
  • When the camera is mounted at the center of the top surface of the room, the sum of the deflection angles of the camera toward the sides of the room is smallest; the camera may also be mounted facing an important area of the room (such as a door).
  • As shown in FIG. 6, infrared beam A is formed between end point A2 and end point A1; end point A2 is located in the mounting position area of the camera, and end point A1 is fixed relative to the mounting position of the camera.
  • Likewise, infrared beam B is formed between end point B2 and end point B1; end point B2 is located in the mounting position area of the camera, and end point B1 is fixed relative to the mounting position of the camera.
  • The camera and the end points of the infrared beams may also be disposed separately. For example, the camera may be mounted at the center of the top surface of the room, with end points A1 and B1 disposed on the wall surfaces; infrared beams A and B may be produced by reflective infrared sensors or by through-beam infrared sensors.
  • Optionally, when the infrared beam is blocked, the deflection command is generated according to the angle a between the current shooting direction and the infrared beam, wherein the deflection angle indicated by the deflection command corresponds to the angle a; deflecting the camera according to the deflection command includes: deflecting the camera from the current shooting direction toward the infrared beam by a+x, where x is a delay error.
  • For example, in FIG. 6, the counterclockwise angles between infrared beams A and B and the initial shooting direction of the camera are a_A and a_B, and the first beam to be blocked is infrared beam A.
  • The camera then needs to deflect by a_A toward infrared beam A; if infrared beam B is subsequently blocked, the camera needs to deflect toward infrared beam B by (a_B - a_A). When (a_B - a_A) is positive the camera deflects counterclockwise, and when (a_B - a_A) is negative the camera deflects clockwise.
  • the initial shooting direction may refer to a shooting direction when the camera is not deflected, for example, a shooting direction of the camera when it is powered on.
  • the deflection command may indicate a horizontal component and a vertical component of the deflection angle, respectively.
  • For example, suppose the initial shooting direction of the camera is 0 degrees in both the horizontal and vertical directions, the counterclockwise angles between infrared beams A, B and the initial shooting direction in the horizontal direction are α_A and α_B, the counterclockwise angles in the vertical direction are β_A and β_B, and the first beam to be blocked is infrared beam A.
  • The camera then needs to deflect by α_A horizontally and β_A vertically; if infrared beam B is subsequently blocked, the camera needs to deflect by (α_B - α_A) horizontally and (β_B - β_A) vertically.
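A small worked example of the relative deflections described above; the angle values assigned to beams A and B are invented for illustration.

```python
alpha = {"A": 40.0, "B": 75.0}   # horizontal angles of beams A and B vs. the initial direction
beta  = {"A": 10.0, "B": 25.0}   # vertical angles of beams A and B vs. the initial direction

pan, tilt = 0.0, 0.0             # camera starts in its initial shooting direction (0, 0)

# Beam A is blocked first: deflect by (alpha_A, beta_A).
pan, tilt = alpha["A"], beta["A"]

# Beam B is blocked next: deflect by the difference between B's angles and the current pan/tilt.
d_pan, d_tilt = alpha["B"] - pan, beta["B"] - tilt
pan, tilt = pan + d_pan, tilt + d_tilt
print(d_pan, d_tilt)   # 35.0 15.0 -> positive, so counterclockwise in both axes
```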
  • the camera can be deflected by a stepper motor.
  • the angle between each infrared beam and the initial shooting direction of the camera in the horizontal direction and the vertical direction needs to be input to the phase controller in advance.
  • Optionally, the current shooting direction is deflected toward the infrared beam by a+x, where x is a delay error.
  • The delay error is an additional amount of deflection that accounts for the delay from the moment the infrared beam is blocked to the moment the camera is deflected; for example, when the target moves, the delay error may be determined from the motion information of the target, so that it indicates a deflection amount corresponding to how far the target moves during the period from the moment the beam is blocked to the moment the camera is deflected.
  • The value x may be a preset compensation value or may be determined according to the rotation speed of the camera; of course, x may also be 0, i.e., the case in which the delay error is not considered.
  • the camera and the end point of the infrared light beam may be separately disposed, and the end point of the infrared light beam may also be integrated in the image pickup device, that is, the emitter or receiver of the infrared light beam. It can be provided in an image pickup apparatus including the camera.
  • As shown in FIG. 7, suppose there are four infrared beams A, B, C, D (not all shown in FIG. 7) with end points A1, A2, B1, B2, C1, C2, D1, D2; infrared beams A, B, C, D exit from end points A1, B1, C1, D1 into the monitored space at angles to the initial shooting direction of the camera, and end points A2, B2, C2, D2 (not shown in FIG. 7) are disposed at the mounting position of the camera, so that the camera lies on the optical paths (or the extensions of the optical paths) of infrared beams A, B, C, D.
  • In this case, the deflection angle of the camera does not need to be calculated separately. For example, when infrared beam A is blocked, the camera deflects by the angle a_A toward the direction of infrared beam A; thereafter, when infrared beam B is blocked, the camera deflects toward the direction of infrared beam B by (a_B - a_A).
  • In this way, the deflection angle can be determined from the preset exit directions of the infrared beams relative to the initial shooting direction, without calculating the angle between the infrared beam and the current shooting direction during the deflection of the camera, which saves computation and improves precision.
  • Similarly, the delay error x may also be added to the deflection angle, which is not repeated here.
  • the infrared beam is formed between the first end point and the second end point; the positions of the first end point and the second end point may be preset such that The mounting position of the camera is not in the optical path of the infrared beam and the extension line of the optical path.
  • Referring to FIG. 8, an illustration is given with two infrared beams.
  • Infrared beams A and B lie between end points A1 and A2 and between end points B1 and B2, respectively; the positions of end points A1, A2, B1, B2 are set so that the mounting position of the camera is on neither the optical paths of the infrared beams nor their extensions (i.e., the infrared beams do not pass through the camera).
  • For example, end points A1, A2, B1, B2 can be disposed on two perpendicular wall surfaces; when infrared beam A is blocked, the shooting direction of the camera is deflected toward infrared beam A.
  • In this case, a reference angle b relative to the initial shooting direction of the camera may be set according to the positions of the first end point and the second end point.
  • The reference angle b may be the deflection angle b_A required of the camera shooting direction when infrared beam A is blocked, or the deflection angle b_B required of the camera shooting direction when infrared beam B is blocked.
  • The reference angle b (including b_A and b_B) can be preset in a variety of ways. For example, referring to FIG. 8, the initial shooting direction of the camera may be set to 0°; the angle between the initial shooting direction and the line connecting the camera to a point A0 on the segment between end points A1 and A2 (for example, the midpoint of the segment) is the reference angle b_A, i.e., the deflection required of the camera shooting direction is b_A; likewise, the angle between the initial shooting direction and the line connecting the camera to a point B0 on the segment between end points B1 and B2 (for example, the midpoint of the segment) is the reference angle b_B.
  • When infrared beam A is blocked, the deflection command is generated according to the current shooting direction and the reference angle b_A, and the camera is deflected from the current shooting direction by b_A+y, where y is a delay error; if infrared beam B is then blocked, the camera is deflected from the current shooting direction by (b_B - b_A)+y, so that the shooting direction is deflected toward infrared beam B.
  • The delay error y is similar to the delay error x described above and is not repeated here.
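One plausible way to preset the reference angles b_A and b_B is from the room geometry, taking the midpoint of each beam segment as described for FIG. 8. The coordinates, units, and layout below are invented for illustration and are not taken from the patent.

```python
import math

def reference_angle_deg(camera_xy, p1_xy, p2_xy, initial_dir_deg=0.0):
    """Angle between the camera's initial shooting direction and the line from the
    camera to the midpoint of the beam segment p1-p2 (the reference angle b)."""
    mx, my = (p1_xy[0] + p2_xy[0]) / 2.0, (p1_xy[1] + p2_xy[1]) / 2.0
    return math.degrees(math.atan2(my - camera_xy[1], mx - camera_xy[0])) - initial_dir_deg

# Hypothetical layout (camera at the origin, beams A and B on two walls).
b_A = reference_angle_deg(camera_xy=(0.0, 0.0), p1_xy=(3.0, 1.0), p2_xy=(3.0, 4.0))
b_B = reference_angle_deg(camera_xy=(0.0, 0.0), p1_xy=(1.0, 5.0), p2_xy=(4.0, 5.0))
# When beam A is blocked the camera deflects by b_A (+ delay error y);
# if beam B is blocked afterwards it deflects further by (b_B - b_A) (+ y).
```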
  • the infrared sensor includes a plurality of infrared sensors
  • detecting the target by the infrared sensor includes: detecting, by each of the plurality of infrared sensors, whether a target appears in a sensing area of the infrared sensor, and The deflection command is generated according to a combination of detection results of the plurality of infrared sensors.
  • Each of the plurality of infrared sensors detects whether the target appears in its sensing area (for example, S801 of FIG. 9), and the infrared sensor generates a detection signal indicating the detection result.
  • For example, an infrared sensor that detects the target generates a high-level detection signal, and an infrared sensor that does not detect the target generates a low-level detection signal; the detection signal of the infrared sensor is, however, not limited to this.
  • The deflection command includes a first deflection command and a second deflection command, and generating the deflection command according to the combination of the detection results of the plurality of infrared sensors includes: when the plurality of infrared sensors detect the target one after another, generating the first deflection command according to the first control strategy (for example, S803 of FIG. 9); and when at least two of the plurality of infrared sensors detect the target simultaneously, generating the second deflection command according to the second control strategy (for example, S804 of FIG. 9).
  • The control strategy refers to a processing rule set in advance for the case in which the target is detected or the infrared beam is blocked.
  • The first deflection command and the second deflection command are named differently only to correspond to the first control strategy and the second control strategy; they are the same in command format, encoding, and the like, and both are used to indicate a deflection angle toward an object (e.g., an infrared sensor).
  • Specifically, whether at least two of the plurality of infrared sensors detect the target simultaneously can be determined as follows: the detection signals generated by the plurality of infrared sensors are monitored in real time, and when at least two of the detection signals are at a high level at the same time, it is determined that at least two infrared sensors have detected the target simultaneously.
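A minimal sketch of that real-time check, assuming each sensor's detection signal has already been sampled into a boolean level (True = high); the identifiers are hypothetical.

```python
def classify_detections(levels: dict) -> tuple:
    """levels maps a sensor id to its current detection-signal level (True = target seen).
    Two or more high signals -> second control strategy; exactly one -> first strategy."""
    active = [sensor_id for sensor_id, high in levels.items() if high]
    if len(active) >= 2:
        return "simultaneous", active
    if len(active) == 1:
        return "sequential", active
    return "none", []

print(classify_detections({"IR1": True, "IR2": True, "IR3": False}))
# ('simultaneous', ['IR1', 'IR2'])
```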
  • the step of generating the first deflection instruction according to the first control policy may include: sequentially generating a deflection instruction according to a time sequence in which the plurality of infrared sensors sequentially detect the target, so that the shooting direction of the camera is sequentially The directional deflection of the target infrared sensor is detected.
  • each infrared sensor upon detecting a target, generates a deflection command for the infrared sensor for indicating a deflection angle from a current shooting direction to a shooting direction in which the camera faces the infrared sensor.
  • the photographing direction when the camera faces the infrared sensor includes a photographing direction in which the infrared sensor is aligned, and a photographing direction in which one point in the sensing region of the infrared sensor is aligned.
  • Alternatively, the plurality of infrared sensors may be divided into N groups (N being a positive integer), each group including at least two infrared sensors, and generating the first deflection command according to the first control strategy may include: generating a first deflection command for the infrared sensor in each group that last detected the target, the first deflection command indicating the deflection angle from the current shooting direction to the shooting direction in which the camera faces the infrared sensor that last detected the target.
  • Accordingly, deflecting the camera according to the deflection command may include: whenever a first deflection command is generated for a group of infrared sensors, deflecting the shooting direction of the camera toward the target by the deflection angle indicated by that first deflection command.
  • Referring to FIG. 10, the plurality of infrared sensors are divided into two groups (i.e., N is 2), and each infrared sensor is described taking detection by an infrared beam as an example.
  • The infrared beams corresponding to the plurality of infrared sensors are divided into two groups: the first group includes infrared beams C1, C2, and C3, and the second group includes infrared beams D1, D2, and D3; the target travels from the beams of the first group toward those of the second group, so C3 is the last blocked infrared beam of the first group.
  • A deflection command is therefore generated according to the angle of infrared beam C3 relative to the current shooting direction, and the deflection angle corresponds to the angle of infrared beam C3 relative to the current shooting direction.
  • Alternatively, detection may be organized in periods, and within each period a deflection command is generated only for the infrared beam that last detected the target, so that only one deflection command is generated per period. Taking the infrared beams E1 to E6 of six infrared sensors as an example:
  • in the first period, beams E1 and E2 are blocked in sequence, and at the end of the first period a deflection command is generated according to infrared beam E2; in the second period, beams E3, E4, and E5 are blocked in sequence, and at the end of the second period a deflection command is generated according to infrared beam E5; in the third period, beam E6 is blocked, and at the end of the third period a deflection command is generated according to infrared beam E6.
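The per-period behaviour described above can be sketched as a simple aggregation over timestamped beam-block events; the period length and event times below are invented, and reproduce the E1-E6 example.

```python
def command_per_period(events, period_s):
    """events: (timestamp_s, beam_id) pairs for blocked beams. Within each detection
    period, only the last blocked beam yields a deflection command."""
    periods = {}
    for t, beam in sorted(events):
        periods.setdefault(int(t // period_s), []).append(beam)
    return {p: beams[-1] for p, beams in sorted(periods.items())}

events = [(0.2, "E1"), (0.7, "E2"), (1.1, "E3"), (1.5, "E4"), (1.9, "E5"), (2.3, "E6")]
print(command_per_period(events, period_s=1.0))   # {0: 'E2', 1: 'E5', 2: 'E6'}
```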
  • the shooting direction of the camera is deflected toward the target by the deflection angle indicated by the deflection command.
  • the method may further include determining a travel speed of the target based on a time at which each of the infrared sensors of each of the sets of infrared sensors detects the target, thereby determining a delay error.
  • the delay error may correspond to an amount of motion of the target from a time when the infrared beam is blocked to a time when the camera is deflected to the infrared beam, for example, a product of the time period and the angular velocity of the target.
  • The step of generating the second deflection command according to the second control strategy may include: selecting one of the at least two infrared sensors that simultaneously detect the target and generating a second deflection command for the selected infrared sensor, the second deflection command indicating the deflection angle from the current shooting direction to the shooting direction in which the camera faces the selected infrared sensor.
  • Alternatively, the step of generating the second deflection command according to the second control strategy may include: generating a second deflection command for the at least two infrared sensors, the second deflection command indicating as the deflection angle the intermediate value of the deflection angles from the current shooting direction to the shooting directions in which the camera faces each of the at least two infrared sensors.
  • That is, a second deflection command may be generated based on the intermediate value of the angles (or reference angles) of the simultaneously blocked infrared beams relative to the current shooting direction.
  • an infrared sensor may be arbitrarily selected from at least two infrared sensors that simultaneously detect a target, or an infrared sensor near an important area (eg, a door) of the monitored room may be selected, so that the most important surveillance camera is taken. Image information.
  • In the case where the infrared sensor detects by means of an infrared beam, detection may be performed periodically, with the detection of only one infrared beam handled per period, so that only one second deflection command is generated in the same period.
  • A case in which a deflection command is generated based on the intermediate value of the angles or reference angles of several simultaneously blocked infrared beams relative to the current shooting direction is described with reference to FIG. 10.
  • Referring to FIG. 10, if infrared beams C1, C2, and C3 are blocked simultaneously, a deflection command may be generated from the intermediate value of the angles (or reference angles) of infrared beams C1, C2, and C3 relative to the current shooting direction, instructing the camera to deflect by the intermediate value of the deflection angles for infrared beams C1, C2, and C3.
  • The deflection command may also instruct the camera to deflect by the deflection angle, relative to the current shooting direction, of the infrared beam closest to the intermediate direction (infrared beam C2 in the figure).
  • Optionally, the method further includes: when the infrared sensor detects the target, performing OSD (on-screen display) processing on the video image captured by the camera. That is, the video data output by the camera is subjected to OSD processing by the recording device, superimposing information such as the infrared sensor number, the infrared sensor orientation, and the time at which the target appeared, after which the processed video data is encoded and stored.
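OSD here simply means superimposing such text onto each frame before encoding and storage. A sketch using OpenCV, which the patent does not name and which is used here only as one possible implementation; the field names are hypothetical.

```python
import cv2

def osd_overlay(frame, sensor_id, sensor_bearing_deg, timestamp):
    """Superimpose the sensor number, sensor orientation, and target-appearance time on a frame."""
    text = f"IR #{sensor_id}  bearing {sensor_bearing_deg:.0f} deg  {timestamp}"
    cv2.putText(frame, text, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                0.8, (0, 255, 0), 2, cv2.LINE_AA)
    return frame
```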
  • the method further includes: when the infrared sensor detects the target, turning on the illumination device corresponding to the infrared sensor.
  • the illumination device may be integrally disposed in the vicinity of the camera.
  • the camera includes a camera lens and a camera body, and the illumination device (such as a plurality of LED lights) may be integrated with the camera lens in the camera body;
  • the lighting device may also be a separate lighting device (such as an LED lamp, a fluorescent lamp) in a room monitored by the camera, and the control module of the lighting device turns on the illumination after receiving the trigger signal sent by the video monitoring device.
  • The lighting device disposed in correspondence with the infrared sensor may be a lighting device disposed near that infrared sensor, or a lighting device capable of illuminating the sensing area of that infrared sensor (for example, a lighting device disposed together with the camera).
  • a video monitoring device including:
  • a detecting device 1201 comprising at least one infrared sensor, configured to detect whether a target appears in a sensing area of the infrared sensor, and generate a detection signal indicating whether a target appears;
  • a processing device 1202 coupled to the at least one infrared sensor for receiving the detection signal and generating a deflection command for indicating a deflection angle according to the detection signal;
  • control device 1203 connected to the processing device, configured to receive the deflection command, and deflect a shooting direction of the camera toward the target according to the deflection command;
  • an imaging device 1204 connected to the control device, configured to capture video images through the camera 12041.
  • The detecting device 1201 may include a plurality of infrared sensors (such as an infrared sensor 12011, an infrared sensor 12012, ..., an infrared sensor 1201n), each of which can detect the target and generate its own detection signal.
  • Each of the infrared sensors can include an infrared beam emitter and an infrared beam receiver, so that an infrared beam is formed between the infrared beam emitter and the infrared beam receiver; when the infrared beam is blocked, the corresponding infrared sensor generates a detection signal indicating the appearance of the target.
  • One of the infrared beam emitter and the infrared beam receiver is disposed at a first end point, and the other of the infrared beam emitter and the infrared beam receiver is disposed at a second end point;
  • the first end point is located in the mounting position area of the camera 12041 and the second end point is fixed relative to the mounting position of the camera 12041, or the positions of the first end point and the second end point are set such that the mounting position of the camera is on neither the infrared beam nor the extension line of the infrared beam.
  • the video monitoring device further includes a lighting device connected to the processing device 1202, and the processing device 1202 is configured to generate an illumination on signal when the infrared sensor detects the target, and The lighting device transmits the lighting on signal such that the lighting device is turned on.
  • the illumination device includes a plurality of illuminators disposed in one-to-one correspondence with the plurality of infrared sensors 12011-1201n.
  • the processing device 1202 receives the detection signal, an illumination on signal is generated for an infrared sensor that generates the detection signal such that an illuminator corresponding to the infrared sensor that generates the detection signal is turned on.
  • An example of a luminaire can be seen in Figure 11 and its description.
  • Optionally, the processing device may be configured to generate the deflection command according to the angle a between the current shooting direction and the infrared sensor when the infrared sensor detects a target; the control device may be configured to deflect the camera from the current shooting direction toward the infrared beam by a+x, where x is a delay error.
  • the processing device is further configured to set a reference angle b of the shooting direction according to the positions of the first end point and the second end point; and generate the reference according to the current shooting direction and the reference angle b a deflection command; the control means for causing the camera to deflect the current shooting direction by the reference angle b+y, wherein the y is a delay error.
  • the processing device may be configured to generate the deflection command based on a combination of detection signals received from a plurality of infrared sensors. Specifically, the processing device is configured to: when the detection signal indicating the occurrence of the target is sequentially received from at least two of the plurality of infrared sensors, generate the deflection instruction according to the first control strategy; The yaw command is generated according to a second control strategy when at least two of the plurality of infrared sensors simultaneously receive a detection signal indicating the occurrence of the target.
  • The generating of the deflection command by the processing device according to the first control strategy may include: generating deflection commands one by one in the time order in which the plurality of infrared sensors detect the target, so that the shooting direction of the camera is deflected in turn toward the infrared sensors that detect the target; or dividing the plurality of infrared sensors into N groups (N being a positive integer), each group including at least two infrared sensors, and generating a deflection command for the infrared sensor in each group that last detected the target.
  • the processing device may further determine a travel speed of the target according to a time when each of the infrared sensors of each group of the infrared sensors detects a target, thereby determining a delay error.
  • The generating of the deflection command by the processing device according to the second control strategy may include: selecting one of the at least two infrared sensors that simultaneously detect the target and generating a deflection command for the selected infrared sensor, the deflection command indicating the deflection angle from the current shooting direction to the shooting direction in which the camera faces the selected infrared sensor; or generating a deflection command for the at least two infrared sensors, the deflection command indicating as the deflection angle the intermediate value of the deflection angles from the current shooting direction to the shooting directions in which the camera faces each of the at least two simultaneously triggered infrared sensors.
  • the processing device can also periodically monitor whether the plurality of infrared sensors detect the target. For example, in the case where the infrared sensor is detected by the infrared beam, the infrared beam is periodically interrupted for detection so that only one deflection command is generated in the same cycle.
  • processing device is further configured to perform OSD processing on the video image captured by the imaging device when the infrared sensor detects the target.
  • The video surveillance device described above is a device for implementing the video surveillance method provided by the embodiments of the present invention. Therefore, based on the video surveillance method according to the embodiments of the present invention, those skilled in the art can understand the specific implementation of the video surveillance device and its various variations, so how the device implements the method is not described in detail here.
  • Any device used by those skilled in the art to implement the video surveillance method of the present invention falls within the scope of the present application.
  • modules in the devices of the embodiments can be adaptively changed and placed in one or more devices different from the embodiment.
  • the modules or units or components of the embodiments may be combined into one module or unit or component, and further they may be divided into a plurality of sub-modules or sub-units or sub-components.
  • Any combination of the features disclosed in this specification (including the accompanying claims, abstract, and drawings), and of all processes or units of any method or device so disclosed, may be adopted.
  • Each feature disclosed in this specification (including the accompanying claims, the abstract and the drawings) may be replaced by alternative features that provide the same, equivalent or similar purpose.
  • the various component embodiments of the present invention may be implemented in hardware, or in a software module running on one or more processors, or in a combination thereof.
  • In practice, a microprocessor or a digital signal processor may be used to implement some or all of the functionality of some or all of the components of the devices according to embodiments of the present invention.
  • The invention may also be implemented as device or apparatus programs (e.g., computer programs and computer program products) for performing some or all of the methods described herein.
  • Such a program implementing the invention may be stored on a computer readable medium or may be in the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention provides a video surveillance method and device. The video surveillance method includes: detecting, by an infrared sensor, whether a target appears in the sensing area of the infrared sensor, and, when a target is detected, generating a deflection command indicating a deflection angle; and deflecting, according to the deflection command, the shooting direction of a camera toward the target area by the deflection angle. The invention automatically deflects the shooting direction of the camera based on the detection results of the infrared sensor, which saves computation during target tracking and reduces the cost of the video surveillance device.

Description

Video Surveillance Method and Device
Technical Field
The present invention relates to the field of video surveillance, and in particular to a video surveillance method and device.
Background
Video surveillance equipment is now widely used across industries, but in most video surveillance equipment the shooting direction of the camera is fixed, i.e., only a specific direction can be monitored. Such equipment is prone to surveillance blind spots and may miss important image information; to avoid blind spots, multiple cameras must be installed to monitor several directions at the same time, which raises costs. Moreover, even where the monitoring direction of such equipment can be changed, it can only be changed under manual control and cannot be adjusted automatically according to the activity of people or objects in the monitored area.
Summary of the Invention
One object of the present invention is to provide a video surveillance method and device.
In a first aspect, the present invention provides a video surveillance method, including:
detecting, by an infrared sensor, whether a target appears in the sensing area of the infrared sensor, and, when a target is detected, generating a deflection command indicating a deflection angle, wherein the infrared sensor is configured to be disposed at a first position of the monitored area;
deflecting, according to the deflection command, the shooting direction of a camera used for video surveillance toward the target by the deflection angle, wherein the camera is configured to be disposed at a second position of the monitored area that is different from the first position.
Optionally, the infrared sensor includes a plurality of infrared sensors, and detecting the target by the infrared sensor includes: detecting, by each of the plurality of infrared sensors, whether the target appears in the sensing area of that infrared sensor, and generating the deflection command according to a combination of the detection results of the plurality of infrared sensors.
Optionally, the deflection command includes a first deflection command and a second deflection command, and generating the deflection command according to the combination of the detection results of at least two of the plurality of infrared sensors includes: when the plurality of infrared sensors detect the target one after another, generating the first deflection command according to a first control strategy;
and when at least two of the plurality of infrared sensors detect the target simultaneously, generating the second deflection command according to a second control strategy different from the first control strategy.
Optionally, the plurality of infrared sensors are divided into N groups, each group including at least two infrared sensors, N being a positive integer; generating the first deflection command according to the first control strategy includes: generating one first deflection command for the infrared sensor in each group that last detected the target, the deflection angle indicated by that first deflection command being the angle between the current shooting direction of the camera and the shooting direction of the camera when it faces the infrared sensor that last detected the target; and deflecting the camera according to the deflection command includes: each time a first deflection command is generated, deflecting the shooting direction of the camera toward the target by the deflection angle indicated by that first deflection command.
Optionally, generating the second deflection command according to the second control strategy includes: selecting one of the at least two infrared sensors that simultaneously detect the target, and generating a second deflection command for the selected infrared sensor, the deflection angle indicated by the second deflection command being the angle between the current shooting direction of the camera and the shooting direction of the camera when it faces the selected infrared sensor.
Optionally, generating the second deflection command according to the second control strategy includes: generating a second deflection command for the at least two infrared sensors that simultaneously detect the target, the deflection angle indicated by the second deflection command being the intermediate value of the angles between the current shooting direction of the camera and the shooting directions of the camera when it faces each of the at least two infrared sensors.
Optionally, the infrared sensor emits an infrared beam; when the infrared beam is blocked, a target is detected in the sensing area of the infrared sensor.
Optionally, the infrared beam is formed between a first end point and a second end point; the first end point is disposed in the mounting position area of the camera, and the second end point is fixed relative to the mounting position of the camera.
Optionally, the infrared beam is formed between a first end point and a second end point; the positions of the first end point and the second end point are set such that the mounting position of the camera is on neither the optical path of the infrared beam nor the extension of the optical path.
Optionally, when the infrared beam is blocked, the deflection command is generated according to the angle a between the current shooting direction of the camera and the infrared beam; deflecting the camera according to the deflection command includes: deflecting the current shooting direction toward the infrared beam by a+x, where x is a delay error.
Optionally, a reference angle b relative to the initial shooting direction of the camera is set according to the positions of the first end point and the second end point; when the infrared beam is blocked, the deflection command is generated according to the current shooting direction of the camera and the reference angle b; deflecting the camera according to the deflection command includes: deflecting the current shooting direction of the camera to a direction at angle b+y relative to the initial shooting direction, where y is a delay error.
Optionally, the reference angle b is the angle between the initial shooting direction and the line connecting the camera to a point of the infrared beam between the first end point and the second end point.
Optionally, when the infrared beams of at least two of the plurality of infrared sensors are blocked one after another, the traveling speed of the target is determined according to the times at which the respective infrared beams of the at least two infrared sensors are blocked, and the delay error is determined according to the traveling speed.
Optionally, when the infrared sensor detects the target, OSD processing is performed on the video image captured by the camera.
Optionally, when the infrared sensor detects a target in its sensing area, the lighting device disposed in correspondence with that infrared sensor is turned on.
In a second aspect, the present invention provides a video surveillance device, including:
a detecting device including at least one infrared sensor, configured to detect whether a target appears in the sensing area of the infrared sensor and to generate a detection signal indicating whether a target is present, the infrared sensor being configured to be disposed at a first position of the monitored area;
a processing device connected to the detecting device, configured to receive the detection signal and to generate, according to the detection signal, a deflection command indicating a deflection angle;
a control device connected to the processing device, configured to receive the deflection command and to deflect, according to the deflection command, the shooting direction of a camera used for video surveillance toward the target by the deflection angle;
and an imaging device connected to the control device and including the camera, configured to capture video images through the camera, the camera being configured to be disposed at a second position of the monitored area that is different from the first position.
Optionally, the detecting device includes a plurality of infrared sensors, each detecting the target and generating its own detection signal, and the processing device is configured to generate the deflection command according to a combination of the detection signals received from the plurality of infrared sensors.
Optionally, the deflection command includes a first deflection command and a second deflection command, and the processing device is configured to:
when detection signals indicating the appearance of the target are received one after another from at least two of the plurality of infrared sensors, generate the first deflection command according to a first control strategy, so that one or more first deflection commands are generated for some of the at least two infrared sensors that sequentially provided the detection signals;
and when detection signals indicating the appearance of the target are received simultaneously from at least two of the plurality of infrared sensors, generate the second deflection command according to a second control strategy different from the first control strategy, so that one second deflection command is generated for the at least two infrared sensors that simultaneously provided the detection signals.
Optionally, each of the at least one infrared sensor includes an infrared beam emitter and an infrared beam receiver, so that an infrared beam is formed between the infrared beam emitter and the infrared beam receiver; when the infrared beam is blocked, the corresponding one of the at least one infrared sensor generates a detection signal indicating the appearance of the target;
one of the infrared beam emitter and the infrared beam receiver is disposed at a first end point, and the other of the infrared beam emitter and the infrared beam receiver is disposed at a second end point;
the first end point is located in the mounting position area of the camera and the second end point is fixed relative to the mounting position of the camera, or the positions of the first end point and the second end point are set such that the mounting position of the video surveillance camera is on neither the infrared beam nor the extension line of the infrared beam.
Optionally, the video surveillance device further includes a lighting device connected to the processing device, and the processing device is configured to generate an illumination-on signal upon receiving, from the infrared sensor, a detection signal indicating the appearance of the target, and to send the illumination-on signal to the lighting device, so that the lighting device is turned on.
Optionally, the detecting device includes a plurality of infrared sensors, each detecting the target and generating its own detection signal;
the lighting device includes a plurality of illuminators disposed in one-to-one correspondence with the plurality of infrared sensors; and
when the processing device receives, from an infrared sensor, a detection signal indicating the appearance of the target, it generates an illumination-on signal for the infrared sensor that generated the detection signal, so that the illuminator corresponding to that infrared sensor is turned on.
With the video surveillance method and device provided by the present invention, when a target is detected in the sensing area of the infrared sensor, a deflection command is generated, and the current shooting direction of the video surveillance camera is deflected toward the target area according to the deflection command. In this way, unlike conventional image-based tracking methods, the invention automatically deflects the shooting direction of the camera based on the detection results of the infrared sensor, which saves computation during target tracking and reduces the cost of the video surveillance device.
附图说明
通过参考附图会更加清楚的理解本发明的特征和优点,附图是示意性的而不应理解为对本发明进行任何限制,在附图中:
图1为本发明实施例提供的一种视频监控方法流程示意图;
图2为本发明实施例提供的一种视频监控方法中目标与摄像头位置关系示意图;
图3为本发明实施例提供的一种视频监控方法中目标与红外传感器的感应区域位置关系示意图;
图4为本发明实施例提供的一种视频监控方法中红外传感器与摄像头位置关系示意图;
图5为本发明实施例提供的一种视频监控方法中多个红外光束与摄像头位置关系示意图;
图6为示出本发明实施例提供的一种视频监控方法中多个红外光束的端点的设置位置的示意图;
图7为示出本发明实施例提供的一种视频监控方法中多个红外光束的端点的设置位置的示意图;
图8为示出本发明实施例提供的一种视频监控方法中多个红外光束的端点的设置位置的示意图;
图9为本发明实施例提供的另一种视频监控方法流程示意图;
图10为本发明实施例提供的一种视频监控方法中多个红外光束发生阻断的示意图;
图11为示出本发明实施例提供的照明设备的设置位置的示意图;
图12为本发明实施例提供的一种视频监控设备的原理框图。
具体实施方式
为了能够更清楚地理解本发明的上述目的、特征和优点,下面结合附图和具体实施方式对本发明进行进一步的详细描述。需要说明的是,在不冲突的情况下,本申请的实施例及实施例中的特征可以相互组合。
在下面的描述中阐述了很多具体细节以便于充分理解本发明,但是,本发明还可以采用其他不同于在此描述的其他方式来实施,因此,本发明的保护范围并不受下面公开的具体实施例的限制。
在一个方面,提供了一种视频监控的方法,参见图1,该方法包括:
S101、通过红外传感器检测目标是否出现在所述红外传感器的感应区域,当检测到目标时,生成用于指示偏转角度的偏转指令,其中,所述红外传感器被配置为设置在被监控区域的第一位置;
视频监控中,需要摄像头的拍摄方向进行偏转,才能避免所述摄像头拍摄方向固定时产生的特定区域拍摄不到的问题,以防止重要的图像信息遗漏。
红外传感器通常设置在室内的重要位置,例如进门处、窗户处、死角附近;当目标(例如,嫌疑人或感兴趣的物体)进入红外传感器的感应区域时,该红外传感器检测到目标出现。
当红外传感器的感应区域内出现目标时,确定所述目标相对于所述摄像头的当前拍摄方向的方位,以生成对应的偏转指令,使得所述摄像头朝向目标偏转。所述偏转指令用于指示偏转角度,以使得摄像头朝向目标以该偏转角度偏转。所述偏转角度可以是摄像头的当前拍摄方向与所述摄像头面对该红外传感器时的拍摄方向之间的夹角。例如,所述摄像头面对该红外传感器时的拍摄方向可以是摄像头能够拍摄到该红外传感器所感应到的目标时的拍摄方向。本实施例的一种实施方式是通过确定所述目标相对于所述摄像头的当前拍摄方向的方位来生成偏转指令。
具体的,可通过红外传感器来确定所述目标相对于所述摄像头的当前拍摄方向的位置。例如可以是通过红外定位传感器确定所述目标的位置,也可以是通过热红外感应器确定所述目标的位置,还可以是当红外光束发生阻断时确定所述目标相对于所述摄像头的当前拍摄方向的位置。
具体的,确定所述目标相对于所述摄像头的当前拍摄方向的位置即确定了所述目标相对于所述摄像头的方位信息,从而生成偏转指令。如图2所示,计算所述目标与摄像头的连线与摄像头的当前拍摄方向之间的夹角(可以分解为水平分量和竖直分量),从而生成用于指示与所述目标与摄像头的连线相对于所述摄像头的当前拍摄方向的夹角相对应的偏转角度的偏转指令。
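As a rough, non-limiting illustration of the geometry described above (the coordinate convention, function name and numeric values below are assumptions added here, not part of the original disclosure), the horizontal and vertical components of the deflection angle can be computed from the camera and target positions as follows:

    import math

    def deflection_components(camera_pos, target_pos, current_pan_deg, current_tilt_deg):
        """Pan/tilt angles (degrees) that turn the camera's current shooting
        direction toward the target (Fig. 2 geometry, simplified)."""
        dx = target_pos[0] - camera_pos[0]
        dy = target_pos[1] - camera_pos[1]          # horizontal plane
        dz = target_pos[2] - camera_pos[2]          # height difference
        target_pan = math.degrees(math.atan2(dy, dx))                   # horizontal bearing
        target_tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation
        # Deflection = target bearing minus current direction, wrapped to [-180, 180)
        pan = (target_pan - current_pan_deg + 180.0) % 360.0 - 180.0
        tilt = target_tilt - current_tilt_deg
        return pan, tilt

    # Example: camera on the ceiling, target near a door (assumed coordinates).
    print(deflection_components((0.0, 0.0, 2.8), (3.0, 2.0, 1.2), 0.0, 0.0))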
在一些实施例中,通过确定所述红外传感器的感应区域相对于所述摄像头的当前拍摄方向的方位来生成偏转指令。如图3所示,当红外传感器的感应区域内检测到目标时,根据所述红外传感器的感应区域中的某一点(例如,最靠近摄像头的一点)和摄像头的连线与摄像头当前拍摄方向之间的夹角(可以分解为水平分量和竖直分量),生成用于指示与所述感应区域相对于所述摄像头的当前拍摄方向的方位相对应的偏转角度的偏转指令。优选的,可以通过所述红外感应区域的中心位置来确定所述红外感应区域相对于摄像头的当前拍摄方向的方位。
在一些实施例中,通过确定所述红外传感器的位置相对于所述摄像头的当前拍摄方向的方位来生成偏转指令。如图4所示,当红外传感器检测到目标时,根据红外传感器和摄像头的连线与摄像头的当前拍摄方向之间的夹角(可以分解为水平分量和竖直分量),生成用于指示与所述红外传感器和摄像头的连线与所述摄像头的当前拍摄方向之间的夹角相对应的偏转角度的偏转指令。
S102、根据所述偏转指令将摄像头的拍摄方向向所述目标区域偏转所述偏转角度,其中,所述摄像头被配置为设置在被监控区域的不同于第一位置的第二位置。
通过将红外传感器同摄像头设置在不同的位置,能够在更大范围内检测和跟踪目标,改善了对目标进行视频监控的性能。
具体的,所述当前拍摄方向,是指在检测到红外传感器的感应区域内出现目标时刻,读取到的摄像头的拍摄方向。根据所述当前拍摄方向以及所述目标相对于所述摄像头的当前拍摄方向的方位信息,生成所述偏转指令。
具体的,所述偏转指令,可以指示所述摄像头的拍摄方向朝向所述目标的偏转量(例如,偏转角度),也可以指示所述摄像头的拍摄方向朝向所述目标的偏转量在纵向和横向上的偏转分量(例如,偏转角度的水平分量和竖直分量)。在一些实施例中,所述偏转指令包括纵向PWM(脉冲宽度调制)信号和横向PWM信号,当步进电机接收到所述偏转指令时,根据PWM信号,所述步进电机在规定的时间里按规定的转速旋转,从而驱动所述摄像头从当前拍摄方向向所述目标以所述偏转量偏转,进行拍摄。
具体的,由于偏转指令所指示的偏转角度可对应于所述目标相对于所述摄像头的当前拍摄方向的精确角度,可对应于检测到目标的红外传感器和摄像头的连线与当前拍摄方向之间的夹角,还可对应于该红外传感器的感应区域中的一点和摄像头的连线相对于当前拍摄方向的夹角,因此,当所述摄像头向所述目标偏转时,可以是使得所述摄像头偏转后的拍摄方向对准所述目标位置点,也可以是使得所述摄像头偏转后的拍摄方向对准所述目标位置点附近的区域。
需要说明的是,一个红外传感器可具有设置在多个位置的多个部分(例如,发射器和接收器),第一位置包括所述多个位置。只要红外传感器的多个部分之一与摄像头设置在不同位置,就可以认为摄像头设置在不同于第一位置的第二位置。
通过实施例中的上述步骤S101和S102,实现了基于红外感应的对目标的检测情况来自动调整摄像头的拍摄方向,实现了视频监控方向的自动调整,从而防止重要监控图像信息遗漏。
实际场景中,红外感应区域中可能会出现干扰、摄像头拍摄方向相对于目标区域的偏转量会存在误差。可选的,在一些实施例中,所述红外传感器通过红外光束进行感应,所述红外光束与所述当前拍摄方向之间形成夹角;所述红外光束发生阻断时,检测到目标出现在所述红外传感器的感应区域中。
具体的,所述红外光束,可以是对射式的红外传感器产生红外光束,也可以是反射式的红外传感器产生的红外光束,为了提高监控效果,可以使用多个红外传感器产生多个红外光束,这样一旦红外光束发生阻断,则检测到目标进入对应的红外传感器的感应区域。下面通过图5中的示例来说明通过两个红外光束A、B进行检测的情形。
例如,如图5所示,当红外光束A发生阻断时,则检测到目标出现在红外光束A的光路上,此时所述摄像头可从当前拍摄方向朝红外光束A方向,偏转角度aA,来实现对目标的监控拍摄;同样地,当红外光束B发生阻断,则检测到目标出现在红外光束B的光路上,此时所述摄像头可从当前拍摄方向朝红外光束B方向,偏转角度aB。
通过红外光束来检测目标,可以较为精确的确定出所述目标进入到红外传感器的感应区域时相对于摄像头拍摄方向的夹角,从而更容易生成用于指示偏转角度的偏转指令。
在一些实施例中,红外光束的光路与摄像头当前拍摄方向的交叉点不在摄像头处,当拍摄方向与红外光束的夹角为aC时,摄像头所需的偏转角度不等于aC,需要通过几何原理计算出所述偏转角度或根据所述红外光束C的光路预先设置所需的偏转角度。
为了更为准确的设置红外光束的位置以提高监控效率,在一些实施例中,所述红外光束形成在第一端点与第二端点之间;所述第一端点设置在所述摄像头的安装位置区域,所述第二端点设置为相对于所述摄像头的安装位置固定。
具体的,所述摄像头的安装位置区域是指,所述摄像头的安装点的周边区域(例如,以所述摄像头的安装点为中心的区域)。该安装位置区域相对于监控视频所拍摄的空间足够小,所以在偏转角度的计算时,可假设第一端点与所述摄像头的安装点重合,由此引起的误差可忽略。
具体的,所述摄像头可以安装在所拍摄房间的顶面或底面上,也可以安装在所拍摄房间的侧面(即墙面)上。当所述摄像头安装在所述拍摄房间的顶面的中心时,该摄像头相对于所拍摄房间的各个侧面的偏转角度之和最小;摄像头也可以安装在与所拍摄房间重要区域(比如门)相对的墙面上。
例如,以两个红外光束为例,如图6所示,所述红外光束A形成在端点A2与端点A1之间;所述端点A2位于所述摄像头的安装位置区域,所述端点A1相对于所述摄像头的安装位置固定。同样的,所述红外光束B形成在端点B2与端点B1之间;所述端点B2位于所述摄像头的安装位置区域,所述端点B1相对于所述摄像头的安装位置固定。
在一些实施例中,所述摄像头与所述红外光束的端点可以分开设置,例如所述摄像头可以安装在房间顶面中心,端点A1、B1分别设置在墙面上,红外光束A、B可以是由反射式红外传感器产生,也可由对射式的红外传感器产生。
在一些实施例中,所述红外光束发生阻断时,根据所述当前拍摄方向与所述红外光束的夹角a生成所述偏转指令,其中,所述偏转指令所指示的偏转角度对应于所述夹角a;根据所述偏转指令偏转摄像头的步骤包括:将摄像头从所述当前拍摄方向向所述红外光束方向偏转a+x,其中,所述x为延迟误差。
例如,如图6所示,假设所述摄像头的初始拍摄方向为0度,红外光束A、B与摄像头初始拍摄方向的逆时针夹角为aA、aB,首次被阻断的红外光束为红外光束A。当不考虑延迟误差时,所述摄像头需向所述红外光束A偏转aA,而后,如果红外光束B被阻断,则所述摄像头需向所述红外光束B偏转(aB-aA),当(aB-aA)为正值时,所述摄像头逆时针偏转,当(aB-aA)为负值时,所述摄像头顺时针偏转。所述初始拍摄方向可以指摄像头未进行任何偏转时的拍摄方向,例如,摄像头在上电时的拍摄方向。
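A minimal sketch of the relative deflection in this example, assuming the counter-clockwise angles of beams A and B from the initial shooting direction are known in advance; the angle values and function name below are illustrative only:

    # Assumed counter-clockwise beam angles, measured from the initial shooting direction (0 deg).
    BEAM_ANGLE_DEG = {"A": 40.0, "B": 110.0}

    def deflection_for_blocked_beam(beam_id, current_dir_deg):
        """Signed angle the camera must still turn (positive = counter-clockwise)
        so that its shooting direction lines up with the blocked beam."""
        return BEAM_ANGLE_DEG[beam_id] - current_dir_deg

    current = 0.0                        # initial shooting direction
    for blocked in ["A", "B"]:           # beam A is broken first, then beam B
        turn = deflection_for_blocked_beam(blocked, current)
        sense = "counter-clockwise" if turn >= 0 else "clockwise"
        print(f"beam {blocked} blocked: turn {abs(turn):.1f} deg {sense}")
        current += turn                  # camera now faces the blocked beam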
所述偏转指令可分别指示偏转角度的水平分量和竖直分量。例如,如图6所示,假设所述摄像头的初始拍摄方向在水平和垂直方向为0度,红外光束A、B与摄像头初始拍摄方向在水平方向上的逆时针夹角为αA、αB,红外光束A、B与摄像头初始拍摄方向在竖直方向上的逆时针夹角为γA、γB,首次被阻断的红外光束为红外光束A。当不考虑延迟误差时,所述摄像头需偏转的水平和垂直夹角分别为αA和γA,而后,如果红外光束B被阻断,则所述摄像头在水平和垂直方向需偏转的夹角分别为(αB-αA)和(γB-γA)。
具体的,驱动所述摄像头偏转的方式有多种。例如,可通过步进电机驱动摄像头偏转。具体地,可通过以下公式计算出用于使步进电机驱动摄像头旋转的PWM信号中的脉冲数目:脉冲数目=((α或γ)/360)*CYC,CYC为步进电动机旋转360度需要的脉冲数目。每个红外光束与摄像头初始拍摄方向在水平方向和垂直方向上的夹角需提前输入到相位控制器中。
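The pulse-count relation above might be coded as below; the value of CYC, the two-channel interface and the sample angles are assumed for illustration and are not taken from the original text:

    CYC = 4096  # assumed pulses per full 360-degree revolution of the stepper motor

    def pulses_for_angle(angle_deg, cyc=CYC):
        """Number of PWM pulses for a rotation of angle_deg; the sign of
        angle_deg selects the rotation direction (one PWM channel per axis)."""
        pulses = round(abs(angle_deg) / 360.0 * cyc)
        direction = 1 if angle_deg >= 0 else -1
        return pulses, direction

    print(pulses_for_angle(40.0))    # horizontal (pan) component
    print(pulses_for_angle(-12.5))   # vertical (tilt) component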
具体的,将所述当前拍摄方向向所述红外光束偏转a+x,其中,所述x为延迟误差。所述延迟误差是指,考虑所述红外光束被阻断时刻至所述摄像头偏转完毕时刻的延迟,而增加的偏转量;例如,当所述目标运动时,所述延迟误差可根据目标的运动信息而确定,从而可指示与目标在所述红外光束被阻断时刻至所述摄像头偏转完毕时刻的时间段中的运动量相对应的偏转量。所述x可以是预先设置的补偿值,也可以根据所述摄像头的转动速度确定。当然,本实施例中的x可以取0,即为不考虑延迟误差的情形。
在一些实施例中,除了上述将所述摄像头与所述红外光束的端点可以分开设置,所述红外光束的端点也可以集成在所述摄像装置中,即所述红外光束的发射器或接收器可设置在包括所述摄像头的摄像装置中。
以图7举例说明。假设有4个红外光束A、B、C、D(图7未示出),端点分别为A1、A2,B1、B2,C1、C2,D1、D2;红外光束A、B、C、D分别从端点A1、B1、C1、D1出射进入拍摄空间中,并且与所述摄像头的初始拍摄方向夹角固定;端点A2、B2、C2、D2(图7中未标出)设置在所述摄像头的安装位置区域中,以使得所述摄像头处于所述红外光束A、B、C、D的光路(或光路的延长线)中。
具体的,因为红外光束A、B、C、D与所述摄像头的初始拍摄方向夹角aA、aB、aC、aD固定,则摄像头的偏转角度无需额外计算。例如,当红外光束A发生阻断时,摄像头向红外光束A的方向偏转角度aA;此后,当红外光束B发生阻断时,摄像头向红外光束B的方向偏转角度(aB-aA)。在此种将红外光束的一对端点之一集成在所述摄像头上的方案中,可以通过预先设置的红外光束出射方向确定所述红外光束与初始拍摄方向的夹角从而确定偏转角度,无需在所述摄像头偏转过程中计算所述红外光束与当前拍摄方向的夹角,从而节省了计算量和提高了精度。
当然,此种将红外光束的端点集成在所述摄像头上的方案中,也可以在偏转角度中加入延迟误差x,在此不赘述。
为了更灵活的设置红外光束,在一些实施例中,所述红外光束形成在第一端点与第二端点之间;可预设所述第一端点和所述第二端点的位置,使所述摄像头的安装位置不处于所述红外光束的光路和所述光路的延长线上。
例如,参见图8,以两个红外光束举例说明。红外光束A、B分别形成在端点A1和A2、端点B1和B2之间;分别设置端点A1和A2、B1和B2的位置,使得所述摄像头的安装位置不处于所述红外光束的光路和所述光路的延长线上(即红外光束不经过摄像头)。例如,可以将端点A1和A2、B1和B2设置在两个垂直的墙面上,当红外光束A被阻断时,所述摄像头的拍摄方向向所述红外光束A的方向偏转。
可根据所述第一端点和所述第二端点的位置设定相对于所述摄像头的初始拍摄方向的参考角度b。在一些实施例中,参照图8,所述参考角度b可以是红外光束A被阻断时摄像头拍摄方向所需的偏转角度bA,以及红外光束B被阻断时摄像头拍摄方向所需的偏转角度bB。所述偏转角度b(包括bA和bB)可按多种方式预先设置。例如,参照图8,可以设置所述摄像头初始拍摄方向为0°,对于红外光束A,端点A1和A2之间的线段上的点A0(例如,线段中点)与所述摄像头的连线相对于所述摄像头初始拍摄方向的夹角为所述参考角度bA,当红外光束A被阻断时,摄像头拍摄方向所需的偏转角度为bA。类似地,对于红外光束B,端点B1和B2之间的线段上的点B0(例如,线段中点)与所述摄像头的连线相对于所述摄像头初始拍摄方向的夹角为所述参考角度bB
当所述红外光束A发生阻断时,根据所述当前拍摄方向和所述参考角度bA生成所述偏转指令;将摄像头从所述当前拍摄方向偏转至所述参考角度bA+y,其中,所述y为延迟误差;随后当所述红外光束B发生阻断时,将摄像头从所述当前拍摄方向偏转(bB-bA)+y,使得拍摄方向偏转至红外光束B所在方向。延迟误差y与上述延迟误差x类似,在此不再赘述。
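A sketch of how the reference angle b for one beam could be pre-computed from its two endpoints, using the mid-point of the beam segment as the aiming point as suggested above; the coordinates, names and the 2D simplification are assumptions made only for illustration:

    import math

    def reference_angle_deg(endpoint1, endpoint2, camera_pos, initial_dir_deg=0.0):
        """Reference angle b: bearing from the camera to the mid-point of the
        beam segment, measured against the initial shooting direction."""
        mid_x = (endpoint1[0] + endpoint2[0]) / 2.0
        mid_y = (endpoint1[1] + endpoint2[1]) / 2.0
        bearing = math.degrees(math.atan2(mid_y - camera_pos[1], mid_x - camera_pos[0]))
        return (bearing - initial_dir_deg + 180.0) % 360.0 - 180.0

    # Beam A between two wall-mounted endpoints A1 and A2, camera at the origin.
    b_A = reference_angle_deg((4.0, 0.0), (4.0, 3.0), (0.0, 0.0))
    print(b_A)   # when beam A is blocked, turn to b_A (plus the delay error y)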
在一些实施例中,所述红外传感器包括多个红外传感器,并且通过红外传感器检测目标的步骤包括:通过所述多个红外传感器中的每一个检测目标是否出现在该红外传感器的感应区域,并且根据所述多个红外传感器的检测结果的组合,生成所述偏转指令。
例如,所述多个红外传感器中的每一个检测所述目标是否出现在该红外传感器的感应区域(例如,图9的S801)。红外传感器产生用于指示检测结果的检测信号。在一个示例中,检测到目标的红外传感器产生高电平的检测信号,未检测到目标的红外传感器产生低电平的检测信号。但是,红外传感器的检测信号不限于此。
在一些实施例中,所述偏转指令包括第一偏转指令和第二偏转指令,根据所述多个红外传感器的检测结果的组合生成偏转指令的步骤包括:当所述多个红外传感器中的至少两个依次检测到目标时,根据第一控制策略生成所述第一偏转指令(例如,图9的S803);当所述多个红外传感器中的至少两个同时检测到目标时,根据第二控制策略生成所述第二偏转指令(例如,图9的S804)。所述控制策略是指预先存储的处理规则或根据所述红外光束被阻断的情况生成的处理规则。
需要说明的是,第一偏转指令和第二偏转指令仅仅是为了对应于第一控制策略和第二控制策略而在命名方式上进行区别,它们在指令格式、编码方式等方面都是相同的,并且均用于表示针对某一对象(例如,某一红外传感器)的偏转角度。
在一个示例中,可通过以下方式确定所述多个红外传感器中的至少两个是否依次检测到目标:实时监控所述多个红外传感器产生的多个检测信号,当多个检测信号中的至少两个依次变为高电平时,确定所述多个红外传感器中的至少两个依次检测到目标。
在一个示例中,可通过以下方式确定所述多个红外传感器中的至少两个是否同时检测到目标:实时监控所述多个红外传感器产生的多个检测信号,当多个检测信号中的至少两个同时为高电平时,确定所述多个红外传感器中的至少两个同时检测到目标。
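One possible realisation of the two checks above polls the detection levels over time and classifies the result; the 0/1 signal representation and the function name are illustrative assumptions, not part of the original text:

    def detection_mode(history):
        """history: samples over time; each sample is a list of 0/1 detection
        levels, one per infrared sensor (1 = target detected)."""
        latest = history[-1]
        if sum(latest) >= 2:
            return "simultaneous"            # at least two signals high at the same time
        first_high = {}
        for t, sample in enumerate(history): # order in which sensors first went high
            for i, level in enumerate(sample):
                if level == 1 and i not in first_high:
                    first_high[i] = t
        if len(first_high) >= 2 and len(set(first_high.values())) >= 2:
            return "sequential"              # at least two sensors went high one after another
        return "single-or-none"

    print(detection_mode([[0, 0, 0], [1, 0, 0], [0, 1, 0]]))   # -> sequential
    print(detection_mode([[0, 0, 0], [1, 1, 0]]))              # -> simultaneous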
具体的,根据所述第一控制策略生成所述第一偏转指令的步骤可以包括:根据多个红外传感器依次检测到目标的时间顺序依次生成偏转指令,以使得所述摄像头的拍摄方向顺次向检测到目标的红外传感器的方向偏转。例如,每个红外传感器一旦检测到目标,就生成针对该红外传感器的偏转指令,其用于指示从当前拍摄方向到摄像头面对该红外传感器时的拍摄方向的偏转角度。
这里,摄像头面对红外传感器时的拍摄方向包括对准所述红外传感器的拍摄方向,以及对准所述红外传感器的感应区域中的一点的拍摄方向。
在一些实施例中,考虑到多个红外传感器依次检测到目标的情形通常是目标发生了位移,针对此,可将多个红外传感器分为N组(N为正整数),每组至少包括两个红外传感器,并且根据所述第一控制策略生成所述第一偏转指令的步骤可以包括:针对每组红外传感器中最后检测到目标的红外传感器生成一个第一偏转指令,所述第一偏转指令用于指示从当前拍摄方向到摄像头面对所述最后检测到目标的红外传感器时的拍摄方向的偏转角度。这种情况下,根据所述偏转指令偏转摄像头的步骤可包括:每当生成针对一组红外传感器的一个第一偏转指令时,就将所述摄像头的拍摄方向朝向所述目标偏转所述一个第一偏转指令所指示的偏转角度。
例如,以所述多个红外传感器被分为2组(即N取2)、并且每个红外传感器通过红外光束检测目标为例进行说明。如图10所示,对应于多个红外传感器的红外光束分为两组,第一组包括红外光束C1、C2、C3;第二组包括红外光束D1、D2、D3;目标行进方向是自红外光束C1到红外光束C3的方向,当所述红外光束C1、C2、C3依次发生阻断时,C3为最后被阻断的红外光束,根据红外光束C3相对于当前拍摄方向的夹角生成偏转指令,偏转角度对应于红外光束C3相对于当前拍摄方向的夹角。进一步的,在一些实施例中,可以有不止一组红外传感器依次检测到目标。在一个示例中,可以设定以一个时间周期内最后检测到目标的红外光束生成偏转指令,使得同一时间周期内仅生成一个偏转指令,以六个红外传感器的红外光束E1、E2、E3、E4、E5、E6为例,例如,在第一个周期内,光束E1和E2依次被阻断,当第一个周期结束时,根据红外光束E2生成偏转指令;在第二个周期内,光束E3、E4和E5依次被阻断,当第二个周期结束时,根据红外光束E5生成偏转指令;在第三个周期内,光束E6被阻断,当第三个周期结束时,根据红外光束E6生成偏转指令。每当生成偏转指令时,就将摄像头的拍摄方向朝向目标偏转该偏转指令所指示的偏转角度。
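A sketch of the "last beam blocked within each period" rule from the E1–E6 example above; the event format, the period length and the beam-angle table are assumptions added here only for illustration:

    # Assumed angles of beams E1..E6 relative to the initial shooting direction.
    BEAM_ANGLE_DEG = {"E1": 10, "E2": 25, "E3": 40, "E4": 55, "E5": 70, "E6": 85}

    def command_per_period(events, period_s):
        """events: (timestamp_s, beam_id) block events in time order.
        Yields one (period_index, beam_id, angle) per period, based on the
        last beam blocked inside that period."""
        last_in_period = {}
        for t, beam in events:
            last_in_period[int(t // period_s)] = beam
        for idx in sorted(last_in_period):
            beam = last_in_period[idx]
            yield idx, beam, BEAM_ANGLE_DEG[beam]

    events = [(0.2, "E1"), (0.7, "E2"), (1.1, "E3"), (1.4, "E4"), (1.8, "E5"), (2.3, "E6")]
    for idx, beam, angle in command_per_period(events, period_s=1.0):
        print(f"period {idx}: deflect toward {beam} ({angle} deg)")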
在一些实施例中,所述方法还可包括:根据所述每组红外传感器中各红外传感器检测到目标的时间确定所述目标的行进速度,从而确定延迟误差。如图10,根据红外光束C1、C2、C3发生阻断的时间确定所述目标相对于所述摄像头行进的角速度,并考虑所述摄像头偏转所需时间,确定延迟误差,从而生成所述偏转指令,该延迟误差可对应于从某一个红外光束被阻断时刻到摄像头偏转至该红外光束时刻的时间段内目标的运动量,例如,该时间段与所述目标的角速度的乘积。
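The delay error could be estimated roughly as sketched below: the target's angular velocity is derived from the times at which consecutive beams were blocked, then multiplied by the expected reaction delay of the camera; every numeric value here is assumed:

    def delay_error_deg(beam_angles_deg, block_times_s, response_delay_s):
        """Extra deflection to add, based on how fast the target sweeps across
        consecutive beams (e.g. C1, C2, C3 in Fig. 10)."""
        if len(beam_angles_deg) < 2:
            return 0.0
        swept = beam_angles_deg[-1] - beam_angles_deg[0]
        elapsed = block_times_s[-1] - block_times_s[0]
        if elapsed <= 0:
            return 0.0
        angular_velocity = swept / elapsed            # degrees per second
        return angular_velocity * response_delay_s    # motion during the camera's reaction time

    # C1, C2, C3 at 20/35/50 degrees are blocked at t = 0.0, 0.5, 1.0 s;
    # the camera needs about 0.3 s to finish turning.
    print(delay_error_deg([20.0, 35.0, 50.0], [0.0, 0.5, 1.0], 0.3))   # -> 9.0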
具体的,根据所述第二控制策略生成所述第二偏转指令的步骤可包括:选择所述同时检测到目标的至少两个红外传感器中的一个,并且针对所选择的一个红外传感器生成一个第二偏转指令,所述第二偏转指令用于指示从当前拍摄方向到摄像头面对所选择的一个红外传感器时的拍摄方向的偏转角度。
在一些实施例中,根据第二控制策略生成所述第二偏转指令的步骤可包括:针对所述至少两个红外传感器生成第二偏转指令,所述第二偏转指令用于指示从当前拍摄方向分别到摄像头面对所述至少两个红外传感器中的每一个时的拍摄方向的各个偏转角度的中间值作为所述偏转角度。例如,在红外传感器通过红外光束检测目标的情况下,根据多个同时被阻断的红外光束相对于当前拍摄方向的夹角或参考角度的中间值,生成一个第二偏转指令。
在具体实施中,可在同时检测到目标的至少两个红外传感器中任意选择一个红外传感器,也可以是选择被监控房间重要区域(例如门)附近的红外传感器,以使得监控摄像头拍摄下最重要的图像信息。
当然,也可以对多个红外传感器是否检测到目标进行周期性监控。例如,在红外传感器通过红外光束进行检测的情况下,对红外光束是否被阻断进行周期性顺次检测,同一周期内仅检测一个红外光束,以使得同一周期内仅生成一个第二偏转指令;如图10,可以是T1周期内检测红外光束C1、T2周期内检测红外光束C2...T6周期内检测红外光束D3,循环检测C1、C2...D3,这样,一个时刻仅根据一个被阻断的红外光束生成所述偏转指令。
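The periodic round-robin check described here might look like the following, where each loop iteration stands for one detection period (T1, T2, ...); the callback names are assumptions:

    import itertools

    def round_robin_monitor(beams, is_blocked, handle_block, rounds=1):
        """Check one beam per period, cycling C1, C2, ..., D3, C1, ..., so that
        at most one deflection instruction is generated in any period."""
        for beam in itertools.islice(itertools.cycle(beams), rounds * len(beams)):
            if is_blocked(beam):          # one period: poll exactly one beam
                handle_block(beam)        # at most one instruction per period

    beams = ["C1", "C2", "C3", "D1", "D2", "D3"]
    round_robin_monitor(beams,
                        is_blocked=lambda b: b == "D2",
                        handle_block=lambda b: print(f"deflect toward {b}"))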
参照图10说明根据多个同时被阻断的红外光束相对于当前拍摄方向的夹角或参考角度的中间值生成一个偏转指令的情况。当确定所述红外光束C1、C2、C3同时被阻断时,说明红外光束C1、C2、C3方向同时出现多个目标,为了监控到最多的目标,可以根据红外光束C1、C2、C3相对于当前摄像头拍摄方向的夹角或参考角度的中间值来生成偏转指令,所生成的该偏转指令可以指示所述摄像头以针对红外光束C1、C2和C3的各个偏转角度的中间值进行偏转。当然,该偏转指令也可以指示所述摄像头以最接近所述中间值方向的红外光束(如图中的红外光束C2)相对于当前拍摄方向的偏转角度进行偏转。
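For the second control strategy, the "middle value" of the deflection angles of the simultaneously blocked beams could be taken as the midpoint of the spanned range, optionally snapped to the nearest actual beam as mentioned above; this reading of the middle value and the sample angles are assumptions for illustration:

    def middle_deflection(angles_deg, snap_to_beam=False):
        """angles_deg: deflection angles toward each simultaneously blocked beam,
        measured from the current shooting direction."""
        ordered = sorted(angles_deg)
        mid = (ordered[0] + ordered[-1]) / 2.0                  # middle of the range
        if snap_to_beam:
            return min(angles_deg, key=lambda a: abs(a - mid))  # nearest real beam
        return mid

    # Beams C1, C2, C3 blocked at the same time, at 20/35/60 degrees.
    print(middle_deflection([20.0, 35.0, 60.0]))                    # -> 40.0
    print(middle_deflection([20.0, 35.0, 60.0], snap_to_beam=True)) # -> 35.0 (beam C2)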
进一步的,在一些实施例中,所述方法还包括:当所述红外传感器检测到目标时,对所述摄像头拍摄的视频图像进行OSD处理。即通过录像装置对摄像头输入的视频数据进行OSD(on-screen display)处理;例如添加红外传感器编号信息、红外传感器方位信息、所述目标出现时间等信息,并对处理后的视频数据进行编码并存储。
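The OSD step could, for instance, overlay the sensor number, its bearing and the trigger time on each frame before encoding; OpenCV (cv2) is only one assumed way to do this and is not prescribed by the original text:

    import datetime
    import cv2  # OpenCV; an assumed dependency, not required by the original method

    def osd_annotate(frame, sensor_id, sensor_bearing_deg):
        """Draw the infrared sensor number, its bearing and the trigger time
        onto one video frame before the frame is encoded and stored."""
        stamp = datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
        text = f"IR#{sensor_id}  bearing {sensor_bearing_deg:.0f} deg  {stamp}"
        cv2.putText(frame, text, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                    0.7, (255, 255, 255), 2)
        return frame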
进一步的,在一些实施例中,所述方法还包括:当所述红外传感器检测到目标时,开启与该红外传感器对应设置的照明设备。所述照明设备可以集成设置在摄像头附近,例如,如图11所示,所述摄像头包括摄像头镜头和摄像头本体,所述照明设备(如多个LED灯)可与摄像头镜头一同集成在摄像头本体;所述照明设备也可以是所述摄像头所监控的房间中的单独的照明设备(如LED灯、日光灯),所述照明设备的控制模块在接收视频监控设备发送的触发信号后开启照明。在一些实施例中,所述与红外传感器对应设置的照明设备可以是设置在该红外传感器附近的照明设备,也可以是能够照射到该红外传感器的感应区域的照明设备(例如,与所述摄像头一体设置且出光方向与所述摄像头的拍摄方向一致的照明设备)。
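The one-to-one mapping between sensors and illuminators could be kept in a small lookup table; the switch_on callback below stands in for whatever relay or LED-driver interface is actually used and is purely hypothetical:

    class LightingController:
        """Turns on the illuminator paired with the sensor that detected the target."""

        def __init__(self, sensor_to_light, switch_on):
            self._map = sensor_to_light      # sensor id -> illuminator id
            self._switch_on = switch_on      # callable that actually drives the lamp

        def on_detection(self, sensor_id):
            light_id = self._map.get(sensor_id)
            if light_id is not None:
                self._switch_on(light_id)

    ctrl = LightingController({"IR1": "LED1", "IR2": "LED2"},
                              switch_on=lambda lid: print(f"illuminator {lid} on"))
    ctrl.on_detection("IR2")   # -> illuminator LED2 on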
在另一个方面,还提供了一种视频监控设备,包括:
检测装置1201,其包括至少一个红外传感器,用于检测目标是否出现在所述红外传感器的感应区域,并生成指示目标是否出现的检测信号;
与所述至少一个红外传感器相连接的处理装置1202,用于接收所述检测信号,以及根据所述检测信号生成用于指示偏转角度的偏转指令;
与所述处理装置连接的控制装置1203,用于接收所述偏转指令,并根据所述偏转指令将摄像头的拍摄方向向所述目标偏转所述偏转角度;
与所述控制装置连接的摄像装置1204,用于通过所述摄像头12041拍摄视频图像。
进一步可选的,所述检测装置1201可包括多个红外传感器(如红外传感器12011、红外传感器12012...红外传感器1201n),每个红外传感器可分别检测目标并生成各自的检测信号。
在一些实施例中,每个红外传感器可包括红外光束发射器和红外光束接收器,从而在所述红外光束发射器与所述红外光束接收器之间形成红外光束,当所述红外光束发生阻断时,相应的红外传感器生成指示目标出现的检测信号。
所述红外光束发射器和所述红外光束接收器中的一个设置在第一端点处,所述红外光束发射器和所述红外光束接收器中的另一个设置在第二端点处;
所述第一端点位于所述摄像头12041的安装位置区域,所述第二端点相对于所述摄像头12041的安装位置固定,或,所述第一端点和所述第二端点的位置设置为使得所述摄像头的安装位置不处于所述红外光束和所述红外光束的延长线上。
进一步可选的,所述视频监控设备还包括照明设备,其与所述处理装置1202相连,并且所述处理装置1202用于在所述红外传感器检测到目标时生成照明开启信号,并向所述照明设备发送所述照明开启信号,使得所述照明设备开启。
在一些实施例中,所述照明设备包括与所述多个红外传感器12011-1201n一一对应设置的多个照明器。当处理装置1202接收到所述检测信号时,针对生成所述检测信号的红外传感器生成照明开启信号,以使得与生成所述检测信号的红外传感器相对应的照明器开启。照明器的示例可参照图11及其描述。
进一步的,所述处理装置可配置为在所述红外传感器检测到目标时,根据所述当前拍摄方向与所述红外传感器的夹角a生成所述偏转指令;所述控制装置可配置为使得摄像头将所述当前拍摄方向向所述红外光束方向偏转a+x,其中,所述x为延迟误差。
进一步的,所述处理装置还用于根据所述第一端点和所述第二端点的位置设定拍摄方向的参考角度b;以及根据所述当前拍摄方向、所述参考角度b生成所述偏转指令;所述控制装置用于使得摄像头将所述当前拍摄方向偏转所述参考角度b+y,其中,所述y为延迟误差。
进一步的,所述处理装置可配置为:根据从多个红外传感器接收的检测信号的组合,生成所述偏转指令。具体地,所述处理装置配置为:当从所述多个红外传感器中的至少两个红外传感器依次接收到指示目标出现的检测信号时,根据第一控制策略生成所述偏转指令;以及,在从所述多个红外传感器中的至少两个所述红外传感器同时接收到指示目标出现的检测信号时,根据第二控制策略生成所述偏转指令。
进一步的,所述处理装置根据第一控制策略生成所述偏转指令可包括:根据多个红外传感器依次检测到目标的时间顺序依次生成偏转指令,以使得所述摄像头的拍摄方向顺次向检测到目标的红外传感器的方向偏转;或将多个红外传感器分为N组(N为正整数),每组至少包括两个红外传感器,针对每组红外传感器中最后检测到目标的红外传感器生成一个偏转指令。所述处理装置还可根据所述每组红外传感器中各红外传感器检测到目标的时间确定所述目标的行进速度,从而确定延迟误差。
进一步的,所述处理装置根据第二控制策略生成所述偏转指令可包括:选择所述同时检测到目标的至少两个红外传感器中的一个,并且针对所选择的一个红外传感器生成一个偏转指令,所述偏转指令用于指示从当前拍摄方向到摄像头面对所选择的一个红外传感器时的拍摄方向的偏转角度;或,针对所述至少两个红外传感器生成偏转指令,所述偏转指令用于指示从当前拍摄方向分别到摄像头面对所述同时被阻断的至少两个红外传感器中的每一个时的拍摄方向的各个偏转角度的中间值作为所述偏转角度。
当然,所述处理装置也可以对多个红外传感器是否检测到目标进行周期性监控。例如,在红外传感器通过红外光束进行检测的情况下,对红外光束是否被阻断进行周期性顺次检测,以使得同一周期内仅生成一个偏转指令。
进一步的,所述处理装置,还用于在所述红外传感器检测到目标时,对所述摄像装置拍摄的视频图像进行OSD处理。
由于根据本发明实施例的视频监控设备为实施本发明实施例提供的视频监控方法所采用的设备,故而基于根据本发明实施例的视频监控方法,本领域所属技术人员能够了解本发明实施例的视频监控设备的具体实施方式以及其各种变化形式,所以在此对于视频监控设备如何实现本发明中的视频监控方法不再详细介绍。只要本领域所属技术人员实施本发明中视频监控方法所采用的设备,都属于本申请所欲保护的范围。
在此提供的算法和显示不与任何特定计算机、虚拟设备或者其它设备固有相关。各种通用设备也可以与基于在此的示教一起使用。根据上面的描述,构造这类设备所要求的结构是显而易见的。此外,本发明也不针对任何特定编程语言。应当明白,可以利用各种编程语言实现在此描述的本发明的内容,并且上面对特定语言所做的描述是为了披露本发明的最佳实施方式。
在此处所提供的说明书中,说明了大量具体细节。然而,能够理解,本发明的实施例可以在没有这些具体细节的情况下实践。在一些实例中,并未详细示出公知的方法、结构和技术,以便不模糊对本说明书的理解。
类似地,应当理解,为了精简本公开并帮助理解各个发明方面中的一个或多个,在上面对本发明的示例性实施例的描述中,本发明的各个特征有时被一起分组到单个实施例、图、或者对其的描述中。然而,并不应将该公开的方法解释成反映如下意图:即所要求保护的本发明要求比在每个权利要求中所明确记载的特征更多的特征。更确切地说,如下面的权利要求书所反映的那样,发明方面在于少于前面公开的单个实施例的所有特征。因此,遵循具体实施方式的权利要求书由此明确地并入该具体实施方式,其中每个权利要求本身都作为本发明的单独实施例。
本领域那些技术人员可以理解,可以对实施例中的设备中的模块进行自适应性地改变并且把它们设置在与该实施例不同的一个或多个设备中。可以把实施例中的模块或单元或组件组合成一个模块或单元或组件,以及此外可以把它们分成多个子模块或子单元或子组件。除了这样的特征和/或过程或者单元中的至少一些是相互排斥之外,可以采用任何组合对本说明书(包括伴随的权利要求、摘要和附图)中公开的所有特征以及如此公开的任何方法或者设备的所有过程或单元进行组合。除非另外明确陈述,本说明书(包括伴随的权利要求、摘要和附图)中公开的每个特征可以由提供相同、等同或相似目的的替代特征来代替。
此外,本领域的技术人员能够理解,尽管在此的一些实施例包括其它实施例中所包括的某些特征而不是其它特征,但是不同实施例的特征的组合意味着处于本发明的范围之内并且形成不同的实施例。例如,在下面的权利要求书中,所要求保护的实施例的任意之一都可以以任意的组合方式来使用。
本发明的各个部件实施例可以以硬件实现,或者以在一个或者多个处理器上运行的软件模块实现,或者以它们的组合实现。本领域的技术人员应当理解,可以在实践中使用微处理器或者数字信号处理器(DSP)来实现根据本发明实施例的网关、代理服务器、设备中的一些或者全部部件的一些或者全部功能。本发明还可以实现为用于执行这里所描述的方法的一部分或者全部的设备或者装置程序(例如,计算机程序和计算机程序产品)。这样的实现本发明的程序可以存储在计算机可读介质上,或者可以具有一个或者多个信号的形式。这样的信号可以从因特网网站上下载得到,或者在载体信号上提供,或者以任何其他形式提供。
应该注意的是上述实施例对本发明进行说明而不是对本发明进行限制,并且本领域技术人员在不脱离所附权利要求的范围的情况下可设计出替换实施例。在权利要求中,不应将位于括号之间的任何参考符号构造成对权利要求的限制。单词“包含”不排除存在未列在权利要求中的元件或步骤。位于元件之前的单词“一”或“一个”不排除存在多个这样的元件。本发明可以借助于包括有若干不同元件的硬件以及借助于适当编程的计算机来实现。在列举了若干装置的单元权利要求中,这些装置中的若干个可以是通过同一个硬件项来具体体现。单词第一、第二、以及第三等的使用不表示任何顺序。可将这些单词解释为名称。

Claims (20)

  1. 一种视频监控设备,包括:
    检测装置,其包括至少一个红外传感器,用于检测目标是否出现在所述红外传感器的感应区域,并生成指示目标是否出现的检测信号,所述红外传感器被配置为设置在被监控区域的第一位置;
    与所述检测装置相连接的处理装置,用于接收所述检测信号,以及根据所述检测信号生成用于指示偏转角度的偏转指令;
    与所述处理装置连接的控制装置,用于接收所述偏转指令,并根据所述偏转指令将用于视频监控的摄像头的拍摄方向向所述目标偏转所述偏转角度;
    与所述控制装置连接的摄像装置,包括所述摄像头,用于通过所述摄像头拍摄视频图像,所述摄像头被配置为设置在所述被监控区域的不同于第一位置的第二位置。
  2. 根据权利要求1所述的视频监控设备,其中,所述检测装置包括多个红外传感器,每个红外传感器分别检测目标并生成各自的检测信号,并且,所述处理装置配置为:根据从所述多个红外传感器接收的检测信号的组合,生成所述偏转指令。
  3. 根据权利要求2所述的视频监控设备,其中,所述偏转指令包括第一偏转指令和第二偏转指令,所述处理装置配置为:
    当从所述多个红外传感器中的至少两个红外传感器依次接收到指示目标出现的检测信号时,根据第一控制策略生成所述第一偏转指令,以使得针对依次接收到检测信号的至少两个红外传感器中的一部分生成一个或多个第一偏转指令;
    当从所述多个红外传感器中的至少两个红外传感器同时接收到指示目标出现的检测信号时,根据不同于第一控制策略的第二控制策略生成所述第二偏转指令,以使得针对同时接收到检测信号的至少两个红外传感器生成一个第二偏转指令。
  4. 根据权利要求1所述的视频监控设备,其中,
    所述至少一个红外传感器中的每一个包括红外光束发射器和红外光束接收器,从而在所述红外光束发射器与所述红外光束接收器之间形成红外光束,当所述红外光束发生阻断时,所述至少一个红外传感器中的对应一个生成指示目标出现的检测信号;
    所述红外光束发射器和所述红外光束接收器中的一个设置在第一端点处,所述红外光束发射器和所述红外光束接收器中的另一个设置在第二端点处;
    所述第一端点位于所述摄像头的安装位置区域,所述第二端点相对于所述摄像头的安装位置固定,或,所述第一端点和所述第二端点的位置设置为使得所述视频监控摄像头的安装位置不处于所述红外光束和所述红外光束的延长线上。
  5. 根据权利要求1-4任一项所述的视频监控设备,还包括照明设备,其与所述处理装置相连,并且所述处理装置用于在从所述红外传感器接收到目标出现的检测信号时生成照明开启信号,并向所述照明设备发送所述照明开启信号,使得所述照明设备开启。
  6. 根据权利要求5所述的视频监控设备,其中,所述检测装置包括多个红外传感器,每个红外传感器分别检测所述目标并生成各自的检测信号;
    所述照明设备包括与所述多个红外传感器一一对应设置的多个照明器,并且
    当所述处理装置从所述红外传感器接收到目标出现的检测信号时,针对生成所述检测信号的红外传感器生成照明开启信号,以使得与生成所述检测信号的红外传感器相对应的照明器开启。
  7. 一种视频监控方法,包括:
    通过红外传感器检测目标是否出现在所述红外传感器的感应区域,当检测到目标时,生成用于指示偏转角度的偏转指令,其中,所述红外传感器被配置为设置在被监控区域的第一位置;
    根据所述偏转指令将用于视频监控的摄像头的拍摄方向向所述目标偏转所述偏转角度,其中,所述摄像头被配置为设置在被监控区域的不同于第一位置的第二位置。
  8. 根据权利要求7所述的方法,其中,所述红外传感器包括多个红外传感器,并且
    通过红外传感器检测目标的步骤包括:
    通过所述多个红外传感器中的每一个检测目标是否出现在该红外传感器的感应区域,并且根据所述多个红外传感器的检测结果的组合,生成所述偏转指令。
  9. 根据权利要求8所述的方法,其中,所述偏转指令包括第一偏转指令和第二偏转指令,根据所述多个红外传感器的检测结果的组合生成偏转指令的步骤包括:
    当所述多个红外传感器中的至少两个红外传感器依次检测到所述目标时,根据第一控制策略生成所述第一偏转指令;
    当所述多个红外传感器中的至少两个红外传感器同时检测到所述目标时,根据不同于第一控制策略的第二控制策略生成所述第二偏转指令。
  10. 根据权利要求9所述的方法,其中,所述多个红外传感器被分为N组,每组包括至少两个红外传感器,N为正整数;
    根据第一控制策略生成所述第一偏转指令的步骤包括:
    针对每组红外传感器中最后检测到目标的红外传感器生成一个第一偏转指令,所述第一偏转指令所指示的偏转角度是所述摄像头的当前拍摄方向与所述摄像头面对所述最后检测到目标的红外传感器时的拍摄方向之间的夹角;并且
    根据所述偏转指令偏转摄像头的步骤包括:
    每当生成一个第一偏转指令时,将所述摄像头的拍摄方向朝向所述目标偏转所述一个第一偏转指令所指示的偏转角度。
  11. 根据权利要求9所述的方法,其中,根据第二控制策略生成所述第二偏转指令的步骤包括:
    选择同时检测到目标的所述至少两个红外传感器中的一个,并且针对所选择的一个红外传感器生成第二偏转指令,所述第二偏转指令所指示的偏转角度是所述摄像头的当前拍摄方向与所述摄像头面对所选择的一个红外传感器时的拍摄方向之间的夹角。
  12. 根据权利要求9所述的方法,其中,根据第二控制策略生成所述第二偏转指令的步骤包括:
    针对同时检测到目标的所述至少两个红外传感器生成第二偏转指令,所述第二偏转指令所指示的偏转角度是所述摄像头的当前拍摄方向分别与所述摄像头面对所述至少两个红外传感器中的每一个时的拍摄方向之间的各个夹角的中间值。
  13. 根据权利要求7所述的方法,其中,
    所述红外传感器发出红外光束;
    所述红外光束发生阻断时,检测到目标出现在所述红外传感器的感应区域中。
  14. 根据权利要求13所述的方法,其中,
    所述红外光束形成在第一端点与第二端点之间;
    所述第一端点设置在所述摄像头的安装位置区域,所述第二端点设置为相对于所述摄像头的安装位置固定。
  15. 根据权利要求13所述的方法,其中,
    所述红外光束形成在第一端点与第二端点之间;
    设置所述第一端点和所述第二端点的位置,使所述摄像头的安装位置不处于所述红外光束的光路和所述光路的延长线上。
  16. 根据权利要求14所述的方法,其中,
    所述红外光束发生阻断时,根据所述摄像头的当前拍摄方向与所述红外光束的夹角a生成所述偏转指令;
    根据所述偏转指令偏转摄像头的步骤包括:将所述当前拍摄方向向所述红外光束偏转a+x,其中,x为延迟误差。
  17. 根据权利要求15所述的方法,其中,
    根据所述第一端点和所述第二端点的位置设定相对于所述摄像头的初始拍摄方向的参考角度b;
    所述红外光束发生阻断时,根据所述摄像头的当前拍摄方向和参考角度b生成所述偏转指令;
    根据所述偏转指令偏转摄像头的步骤包括:将所述摄像头的拍摄方向偏转至相对于所述初始拍摄方向呈角度b+y的方向,其中,y为延迟误差。
  18. 根据权利要求17所述的方法,其中,所述参考角度b为所述红外光束在第一端点与第二端点之间的一点与所述摄像头的连线与所述初始拍摄方向之间的夹角。
  19. 根据权利要求16或17所述的方法,其中,当所述多个红外传感器中的至少两个红外传感器的红外光束依次被阻断时,根据所述至少两个红外传感器的各个红外光束被阻断的时间确定所述目标的行进速度,从而根据所述行进速度确定所述延迟误差。
  20. 根据权利要求7所述的方法,其中,
    当所述红外传感器检测到该红外传感器的感应区域出现目标时,开启与该红外传感器对应设置的照明设备。
PCT/CN2017/086543 2016-06-03 2017-05-31 一种视频监控方法和设备 WO2017206896A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/742,321 US10776650B2 (en) 2016-06-03 2017-05-31 Method and apparatus for video surveillance

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610391410.2 2016-06-03
CN201610391410.2A CN105828053A (zh) 2016-06-03 2016-06-03 一种视频监控方法和设备

Publications (1)

Publication Number Publication Date
WO2017206896A1 true WO2017206896A1 (zh) 2017-12-07

Family

ID=56532777

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/086543 WO2017206896A1 (zh) 2016-06-03 2017-05-31 一种视频监控方法和设备

Country Status (3)

Country Link
US (1) US10776650B2 (zh)
CN (1) CN105828053A (zh)
WO (1) WO2017206896A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10885755B2 (en) 2018-09-14 2021-01-05 International Business Machines Corporation Heat-based pattern recognition and event determination for adaptive surveillance control in a surveillance system

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105828053A (zh) 2016-06-03 2016-08-03 京东方科技集团股份有限公司 一种视频监控方法和设备
CN107770480B (zh) * 2016-08-15 2020-10-16 保定市天河电子技术有限公司 一种安防监控系统和方法
CN106358022A (zh) * 2016-11-01 2017-01-25 合肥华贝信息科技有限公司 一种基于视频的安防监控方法
CN110298880A (zh) * 2018-03-23 2019-10-01 苏州启铭臻楠电子科技有限公司 一种基于智能单目视觉的舞台摇头灯运动协调性分析方法
CN110767246B (zh) * 2018-07-26 2022-08-02 深圳市优必选科技有限公司 一种噪声处理的方法、装置及机器人
US11521478B2 (en) * 2018-09-27 2022-12-06 Mitsubishi Electric Corporation Left-behind detection device and left-behind detection method
CN111614943A (zh) * 2020-06-02 2020-09-01 苏州金迪智能科技有限公司 一种安防单摄像头监测工位智能控制装置
CN113569825B (zh) * 2021-09-26 2021-12-10 北京国电通网络技术有限公司 视频监控方法、装置、电子设备和计算机可读介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101335879A (zh) * 2008-07-10 2008-12-31 华南理工大学 多点触发定点追踪的监控方法及监控系统
CN102131047A (zh) * 2011-04-18 2011-07-20 广州市晶华光学电子有限公司 一种360度自动跟踪式狩猎相机及其工作方法
US20120013745A1 (en) * 2009-11-13 2012-01-19 Korea Institute Of Science And Technology Infrared sensor and sensing method using the same
CN103051879A (zh) * 2012-12-12 2013-04-17 青岛联盟电子仪器有限公司 一种办公场所监控系统
CN202995309U (zh) * 2012-11-15 2013-06-12 曾敬 智能云台控制器
CN204190875U (zh) * 2014-11-04 2015-03-04 河海大学 一种分区域红外导引智能旋转光学摄像系统
CN105828053A (zh) * 2016-06-03 2016-08-03 京东方科技集团股份有限公司 一种视频监控方法和设备

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101335879A (zh) * 2008-07-10 2008-12-31 华南理工大学 多点触发定点追踪的监控方法及监控系统
US20120013745A1 (en) * 2009-11-13 2012-01-19 Korea Institute Of Science And Technology Infrared sensor and sensing method using the same
CN102131047A (zh) * 2011-04-18 2011-07-20 广州市晶华光学电子有限公司 一种360度自动跟踪式狩猎相机及其工作方法
CN202995309U (zh) * 2012-11-15 2013-06-12 曾敬 智能云台控制器
CN103051879A (zh) * 2012-12-12 2013-04-17 青岛联盟电子仪器有限公司 一种办公场所监控系统
CN204190875U (zh) * 2014-11-04 2015-03-04 河海大学 一种分区域红外导引智能旋转光学摄像系统
CN105828053A (zh) * 2016-06-03 2016-08-03 京东方科技集团股份有限公司 一种视频监控方法和设备

Also Published As

Publication number Publication date
CN105828053A (zh) 2016-08-03
US20180197033A1 (en) 2018-07-12
US10776650B2 (en) 2020-09-15

Similar Documents

Publication Publication Date Title
WO2017206896A1 (zh) 一种视频监控方法和设备
US8027515B2 (en) System and method for real-time calculating location
EP2049308A1 (en) System and method for calculating location using a combination of odometry and landmarks
US8292522B2 (en) Surveillance camera position calibration device
TW201539383A (zh) 利用動作感應的侵入偵測技術
US20040179729A1 (en) Measurement system
RU2017134759A (ru) Система технического зрения транспортного средства
JP6540330B2 (ja) 追跡システム、追跡方法および追跡プログラム
KR20150107506A (ko) 목표 피사체 정밀 추적 및 촬영용 팬틸트 일체형 감시카메라
TWI475311B (zh) 轉向攝影機及自動轉向雲台
BR102015005652A2 (pt) aparelho para controlar operações de formação de imagem de uma câmera óptica, e, sistema para reconhecer um número de veículo de uma placa de licença de um veículo
CN205657800U (zh) 一种视频监控设备
JP2008009053A5 (zh)
US20170365144A1 (en) System and methods of field of view alignment
JP2011120186A (ja) ビデオカメラ撮像用装置
KR101856166B1 (ko) 영상 인식 기반의 동물 감지 장치
KR100673087B1 (ko) 로봇용 사물인식시스템
KR101804309B1 (ko) 위치 추적 감시 카메라
JP3846553B2 (ja) 画像処理装置
US20200177808A1 (en) Monitoring system
JP5332767B2 (ja) 原点検出方法、原点検出装置及び監視カメラ装置
JP2000266985A (ja) 監視カメラの自動焦点調整装置
JP7064109B2 (ja) インテリジェント路側ユニットおよびその制御方法
KR101015040B1 (ko) 이동체에 탑재된 카메라에서의 영상 잡음 제거 장치 및 방법
WO2022217809A1 (zh) 摄像机、拍摄方法、系统及装置

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17805841

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17805841

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 27/06/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17805841

Country of ref document: EP

Kind code of ref document: A1