CN214846174U - Lighting device - Google Patents

Lighting device

Info

Publication number
CN214846174U
CN214846174U (application CN202023285582.2U)
Authority
CN
China
Prior art keywords
illumination
lighting
lighting device
driving device
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202023285582.2U
Other languages
Chinese (zh)
Inventor
孙国涛
林彬
郑天航
张正华
李志军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Opple Lighting Co Ltd
Suzhou Op Lighting Co Ltd
Original Assignee
Opple Lighting Co Ltd
Suzhou Op Lighting Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Opple Lighting Co Ltd, Suzhou Op Lighting Co Ltd filed Critical Opple Lighting Co Ltd
Priority to CN202023285582.2U
Application granted
Publication of CN214846174U
Status: Active

Landscapes

  • Studio Devices (AREA)

Abstract

The utility model discloses a lighting apparatus. The apparatus comprises: a lighting device; a driving device connected to the lighting device; a guide rail to which the driving device is slidably mounted; and an image acquisition device slidably mounted to the guide rail. An ordinary lamp provides fixed illumination: it can usually light only a fixed area and cannot change its illumination range. In the utility model, a horizontal driving device and a vertical driving device rotate the lighting device in the horizontal and vertical directions respectively, so that the lighting apparatus can meet the lighting demands of different scenes.

Description

Lighting device
Technical Field
The utility model relates to lighting technology, and more particularly to a lighting apparatus that can automatically track a performer or other target and illuminate it with a light beam, in places such as banquet halls, halls, or stages.
Background
In venues such as banquet halls, halls, or stages, the illuminated position often needs to change. A common fixed lighting lamp cannot change its illumination range to follow the lighting position, so a lighting apparatus is needed that can change its illumination direction in such scenes.
Existing image monitoring equipment needs good lighting conditions to acquire monitoring images in real time; if the brightness of the monitored environment is insufficient, the definition of the monitored image is poor.
Disclosure of Invention
An embodiment of the utility model provides a lighting apparatus that effectively solves the problems that an ordinary fixed lamp cannot change its illumination direction and that monitoring light is insufficient.
According to an aspect of the utility model, an embodiment provides a lighting apparatus comprising: a lighting device; a driving device connected to the lighting device for changing the illumination direction of the lighting device; a guide rail to which the driving device is slidably mounted; and an image acquisition device slidably mounted to the guide rail.
Further, the illumination device is 20cm to 30cm away from the image acquisition device.
Further, the lighting device is a follow spot lamp.
Further, the driving device includes: a horizontal driving device arranged on the guide rail; and a vertical driving device arranged on the horizontal driving device and connected to the lighting device.
Further, the lighting device is rotatably connected to the vertical driving device.
Further, the driving device is a mechanical arm.
Furthermore, the image acquisition device comprises at least one lens, and an included angle between a central axis of the lens and a horizontal plane is 30-60 degrees.
Further, the illumination area of the illumination device is located within the field of view of the lens.
Furthermore, the image acquisition device is electrically connected to an external server, and the driving device is electrically connected to the server. Further, the driving device is connected to the server through a wireless communication module.
The utility model is advantageous in that, whereas an ordinary lamp provides fixed illumination, usually lighting only a fixed area without being able to change its illumination range, the utility model uses a horizontal driving device and a vertical driving device to rotate the lighting device in the horizontal and vertical directions respectively, so that the lighting apparatus can meet the lighting demands of different scenes. Mounting the image acquisition device and the lighting device on the same guide rail raises the ambient brightness for real-time image acquisition and thus improves the definition of the monitoring images.
Drawings
The technical solution and other advantages of the present invention will become apparent from the following detailed description of the embodiments of the present invention with reference to the accompanying drawings.
Fig. 1 is a schematic structural diagram of an illumination device according to an embodiment of the present invention.
Fig. 2 is a schematic structural view of an installation angle of the lighting device according to an embodiment of the present invention.
Detailed Description
The technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the embodiments described are only some embodiments of the invention, and not all embodiments. Based on the embodiments in the present invention, all other embodiments obtained by those skilled in the art without creative efforts belong to the protection scope of the present invention.
In the description of the utility model, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted", "connected", and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable, or integral; it may be mechanical, electrical, or communicative; and it may be direct, indirect through an intermediate medium, or internal between two elements. The specific meaning of these terms in the utility model can be understood by those skilled in the art according to the specific situation.
As shown in fig. 1, a schematic structural diagram of a lighting apparatus provided by an embodiment of the utility model, the apparatus includes: a lighting device 1, a driving device, an image acquisition device 5, a guide rail 13, a server 6, a through hole 4, and a shaft pin 4a.
The driving device is connected to the lighting device 1 for changing the illumination direction of the lighting device 1. The image acquisition device 5 is used for capturing at least one real-time image of a specific area 10. The lighting device 1 and the image acquisition device 5 are each slidably mounted to the guide rail 13.
In an embodiment, the illumination device 1 is located at a distance of 20cm to 30cm from the image capturing device 5.
The driving device includes: a horizontal driving device 3 and a vertical driving device 2. The horizontal driving device 3 is arranged on the guide rail 13, and the vertical driving device 2 is arranged on the horizontal driving device 3. The vertical driving device 2 is provided with a through hole 4. The lighting device 1 is arranged in the through hole 4 through a shaft pin 4 a.
The driving device is a mechanical arm. The image acquisition device 5 is a camera. The server 6 and the driving device exchange data through an RF433 or ZigBee wireless transmission module.
The illumination area of the lighting device 1 is located within the specific area 10.
The server 6 includes an input end and an output end, wherein the input end is electrically connected to the image capturing device 5, and the output end is electrically connected to the driving device.
In actual use, the illumination direction of the lighting device 1 can be changed by manually controlling the driving device, or controlled automatically by the server 6. The server may be built into the lighting apparatus, or an external device may exchange information with the apparatus.
The server 6 includes: the image processing device comprises an image detection unit, an area detection unit, a proportion calculation unit, a control unit, a calculation unit and a judgment unit.
The image detection unit is used for acquiring pixel coordinate information of the target to be tracked in the real-time image. Specifically, the real-time image captured by the image acquisition device 5 is a two-dimensional plane image. In one embodiment, the real-time image has a fixed resolution of 640 × 480, so the pixel coordinate information of the target to be tracked is expressed in terms of this resolution. To better identify the position of the target to be tracked, the boundary of the target in the environment is acquired.
The image detection unit includes: the image processing device comprises a coordinate acquisition unit, an image partition unit, a boundary acquisition unit and a pixel coordinate acquisition unit.
The coordinate acquisition unit is used for acquiring the centroid pixel coordinate of the target to be tracked. Specifically, in one embodiment, the centroid is the centroid of a color feature of the target to be tracked. For example, the facial features of the target can be identified to determine the centroid pixel coordinate.
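As a hedged sketch of this step, the centroid pixel coordinate of a color feature could be computed as the mean position of the pixels matching that feature. The patent does not specify the segmentation or centroid algorithm, so the binary-mask input below is an assumption:

```python
import numpy as np

def color_centroid(mask: np.ndarray) -> tuple[float, float]:
    """Centroid (x, y) of the pixels flagged True in a boolean mask.

    `mask` marks pixels matching the target's color feature; how the
    mask is produced is not specified in the patent, so any color
    segmentation (e.g. an HSV range test) could feed this step.
    """
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("no target pixels found")
    return float(xs.mean()), float(ys.mean())

# 640x480 frame with a small blob of target-colored pixels near (320, 240)
mask = np.zeros((480, 640), dtype=bool)
mask[238:243, 318:323] = True
print(color_centroid(mask))  # → (320.0, 240.0)
```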
The image partition unit divides the real-time image into a plurality of regions with the centroid pixel coordinate as the origin. Specifically, the centroid pixel coordinate lies in the two-dimensional plane of the real-time image; in one embodiment, a Cartesian coordinate system is established in that plane with the centroid pixel coordinate as the origin, dividing the real-time image into four regions. Each region is scanned outward from the origin in a scatter pattern while searching for the boundary of the target to be tracked. The boundary search satisfies the following condition:
E0 = 1 if G > T, otherwise E0 = 0,
where G denotes the gradient value of the pixel P, T denotes the adaptive gradient threshold, E0 denotes the result of the boundary search for the target to be tracked (0 indicating the pixel is not in the target area, 1 indicating it is), and R0 denotes the target area to be tracked.
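The boundary search described above can be sketched as follows. This is a simplified illustration: it scans in only four axis directions rather than the full outward scatter scan of each region, and `threshold` stands in for the adaptive threshold T, whose computation the text does not give:

```python
import numpy as np

def scan_boundary(gradient: np.ndarray, origin: tuple[int, int],
                  threshold: float) -> list[tuple[int, int]]:
    """Scan outward from the centroid along each axis direction and
    return the first pixel per direction whose gradient exceeds the
    threshold (i.e. E0 = 1, a boundary hit)."""
    cy, cx = origin
    h, w = gradient.shape
    hits = []
    for dy, dx in ((0, 1), (0, -1), (1, 0), (-1, 0)):
        y, x = cy, cx
        while 0 <= y + dy < h and 0 <= x + dx < w:
            y, x = y + dy, x + dx
            if gradient[y, x] > threshold:   # E0 = 1: boundary found
                hits.append((y, x))
                break
    return hits

# Synthetic gradient image: strong edges 10 px from the centroid (240, 320)
g = np.zeros((480, 640))
g[240, 330] = g[240, 310] = g[250, 320] = g[230, 320] = 5.0
print(scan_boundary(g, (240, 320), threshold=1.0))
# → [(240, 330), (240, 310), (250, 320), (230, 320)]
```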
The boundary acquisition unit is used for respectively scanning each region to acquire boundary pixel coordinates of each region. Specifically, when each of the regions is scanned, the centroid pixel coordinate is used as the origin.
The pixel coordinate acquisition unit generates the pixel coordinate information of the target to be tracked from the boundary pixel coordinates of each region. Specifically, the boundary pixel coordinates of each region comprise a plurality of pixel coordinates; in the normal, error-free case, scanning outward from the origin yields pixel coordinate information that forms a complete face contour.
When the pixel coordinate information of the target to be tracked changes, the calculating unit calculates the difference of the change in the pixel coordinate information. Specifically, when the pixel coordinate information is an initial value, the difference does not need to be calculated.
The judgment unit is used for judging whether the absolute value of the difference is larger than a preset threshold, and if so, recalculating the actual position information from the changed pixel coordinate information. Specifically, practical tests show that even if the target to be tracked in the specific area 10 is still, the pixel coordinate information reported by the image acquisition device 5 fluctuates within a certain range, which would cause the tracking device to shake. The pixel coordinate information may therefore vary within a certain range without the orientation of the tracking device being changed; only when the absolute value of the difference exceeds the preset threshold is the actual position information recalculated from the changed pixel coordinate information.
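The judgment unit's anti-jitter test can be illustrated with a minimal sketch; the per-axis absolute-difference comparison and the threshold value are assumptions, since the text only states that the absolute value of the difference is compared with a preset threshold:

```python
def should_move(prev: tuple[float, float], cur: tuple[float, float],
                threshold: float) -> bool:
    """Return True only when the pixel-coordinate change exceeds the
    preset threshold on either axis, suppressing the small fluctuations
    the image acquisition device reports for a stationary target."""
    dx = abs(cur[0] - prev[0])
    dy = abs(cur[1] - prev[1])
    return dx > threshold or dy > threshold

print(should_move((320, 240), (321, 240), threshold=3))  # jitter → False
print(should_move((320, 240), (330, 240), threshold=3))  # real motion → True
```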
The area detection unit is used for calculating the actual size of the specific area 10 according to the position of the image acquisition device 5. Specifically, using the geometry of fig. 2, the length of the specific area is calculated as

L = L1 + L2,

where

L1 = H·tan(θ1),

L2 = H·(tan(θ1 + θ2) − tan(θ1)),

and the width is obtained from θ3 and the slant distance from the image acquisition device to point B as

b = H·tan(θ3) / cos(θ1 + θ2).

From this, the coordinate (x0, y0) is calculated as (b, L).
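Under the geometry of fig. 2, the size calculation might be sketched as below. The formulas for L1 and L2 follow the text; the expression for b is a geometric reconstruction (the original equation images are not available), so treat it as an assumption:

```python
import math

def area_size(H: float, t1: float, t2: float, t3: float) -> tuple[float, float]:
    """Actual size (b, L) of the specific area from the mounting height H
    and the angles t1, t2, t3 (radians) of fig. 2."""
    L1 = H * math.tan(t1)                          # O to A
    L2 = H * (math.tan(t1 + t2) - math.tan(t1))    # A to B
    L = L1 + L2                                    # total length along X
    # Width: reconstructed as slant distance to B times tan(t3)
    b = H * math.tan(t3) / math.cos(t1 + t2)
    return b, L

# Example: camera 3 m high, angles 30°, 15°, 20°
b, L = area_size(3.0, math.radians(30), math.radians(15), math.radians(20))
print(round(b, 2), round(L, 2))  # → 1.54 3.0
```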
The proportion calculation unit is used for calculating the proportional relation between the pixel size of the real-time image and the actual size of the specific area 10. Specifically, the size of the specific area 10 is (b, L), so the actual distance represented by each pixel can be calculated.
The control unit is used for calculating the actual position of the target to be tracked according to the proportion information and the pixel coordinate information, calculating the actual angle required of the tracking device from that actual position, and sending a control command so that the tracking device tracks the target.
Specifically, the pixel coordinate (x, y) of the point P is thus converted into the actual position

(x·L2/640, y·b/480),

i.e., each pixel coordinate is scaled by the actual distance represented by one pixel.
Therefore, the actual position of the point P depends only on the values of H, θ1, θ2, and θ3, and these values are fixed once the apparatus is installed; the end user therefore only needs to adjust the installation height and angles as required, which greatly reduces the difficulty of on-site debugging. The actual angle is recognizable by the driving device, so the driving device can adjust the angle of the tracking device: the control command is sent to the driving device, and the driving device changes the orientation of the tracking device according to the control command.
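The pixel-to-actual conversion can be sketched as follows, assuming (as the text's formula suggests) that the 640-pixel axis of the image spans the distance L2 and the 480-pixel axis spans the width b:

```python
def pixel_to_actual(px: float, py: float, L2: float, b: float,
                    width: int = 640, height: int = 480) -> tuple[float, float]:
    """Convert a pixel coordinate (px, py) in the real-time image to an
    actual position, scaling each axis by the real size it spans.

    Follows the (x*L2/640, y*b/480) mapping in the text; the axis
    assignment is an assumption read from the garbled formula.
    """
    return px * L2 / width, py * b / height

print(pixel_to_actual(320, 240, L2=4.0, b=3.0))  # → (2.0, 1.5)
```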
As shown in fig. 2, with O as the origin, the Z axis along the height H direction, the X axis along the width L direction, and the Y axis in the plane of the specific region 10 and perpendicular to the X axis, the image acquisition device 5 is taken as point C. The height of the image acquisition device is the perpendicular distance H between it and the plane of the specific region (i.e., the distance between point C and point O); L1 is the distance between point O and point A; L2 is the distance between point A and point B; and b is the distance between point B and the coordinate (x0, y0). θ1 is the angle between AC and OC, θ2 is the angle between AC and BC, and θ3 is the angle between BC and the line connecting C and the coordinate (x0, y0).
The driving device is connected to the lighting device 1 for changing the illumination direction of the lighting device 1. In an embodiment, the driving device is a mechanical arm, which can control the position and illumination angle of the lighting device 1 in three dimensions. The lighting device 1 and the image acquisition device 5 are each slidably mounted to the guide rail 13.
Compared with the prior art, the utility model is advantageous in that, whereas an ordinary lamp provides fixed illumination and can usually light only a fixed area without changing its illumination range, the utility model uses a horizontal driving device and a vertical driving device to rotate the lighting device in the horizontal and vertical directions respectively, so that the lighting apparatus can meet the lighting demands of different scenes.
The utility model can also be combined with a server. The server acquires the pixel coordinate information of the target to be tracked in the real-time image, calculates the actual size of the specific area according to the position of the image acquisition device, and calculates the actual position of the target to be tracked according to the proportion information and the pixel coordinate information. The target to be tracked is located by computing on data in a two-dimensional scene, which avoids complex calculation in a three-dimensional scene and improves the cost-effectiveness of the system.
The principle and the implementation of the present invention are explained by applying specific examples, and the above description of the embodiments is only used to help understand the technical solution and the core idea of the present invention; those of ordinary skill in the art will understand that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications or substitutions do not depart from the spirit and scope of the present invention in its various embodiments.

Claims (10)

1. A lighting apparatus, comprising:
a lighting device;
a driving device connected to the lighting device and configured to change the illumination direction of the lighting device;
a guide rail to which the driving device is slidably mounted; and
an image acquisition device slidably mounted to the guide rail.
2. The lighting apparatus according to claim 1, wherein the lighting device is located 20 cm to 30 cm from the image acquisition device.
3. The lighting apparatus according to claim 1, wherein the lighting device is a follow spot lamp.
4. The lighting apparatus according to claim 1, wherein the driving device comprises:
a horizontal driving device arranged on the guide rail; and
a vertical driving device arranged on the horizontal driving device and connected to the lighting device.
5. The lighting apparatus according to claim 4, wherein the lighting device is rotatably connected to the vertical driving device.
6. The lighting apparatus according to claim 1, wherein the driving device is a mechanical arm.
7. The lighting apparatus according to claim 1, wherein the image acquisition device comprises at least one lens, and an included angle between a central axis of the lens and a horizontal plane is 30 to 60 degrees.
8. The lighting apparatus according to claim 7, wherein the illumination area of the lighting device is located within the field of view of the lens.
9. The lighting apparatus according to claim 1, wherein the image acquisition device is electrically connected to an external server, and the driving device is electrically connected to the server.
10. The lighting apparatus according to claim 9, wherein the driving device is connected to the server through a wireless communication module.
CN202023285582.2U 2020-12-30 2020-12-30 Lighting device Active CN214846174U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202023285582.2U CN214846174U (en) 2020-12-30 2020-12-30 Lighting device


Publications (1)

Publication Number Publication Date
CN214846174U true CN214846174U (en) 2021-11-23

Family

ID=78868896



Legal Events

Date Code Title Description
GR01 Patent grant