EP2705656A1 - Camera system for recording images, and associated method - Google Patents
Camera system for recording images, and associated method
- Publication number
- EP2705656A1 (application EP12727098.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- camera system
- images
- individual cameras
- cameras
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6812—Motion detection based on additional sensors, e.g. acceleration sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/50—Constructional details
- H04N23/51—Housings
Definitions
- the invention is directed to a camera system for taking pictures, comprising at least one individual camera.
- the invention is directed to a method for taking pictures
- a camera system comprising at least one individual camera and at least one control unit and a sensor, in particular an acceleration sensor.
- Panoramic photographs capture images that approach the scope of the human visual field. This gives a better overall impression of a place than photos from normal cameras. Panoramic cameras allow the capture of such panoramas using one or more individual cameras; the pictures of several individual cameras can later be combined into a complete overall image.
- Full spherical panoramas can be created by taking still images and then stitching them together (automatically) on a computer.
- the images can be photographed either sequentially with one camera or simultaneously by many cameras.
- camera tossing describes throwing normal cameras into the air so that they take a photo in flight by means of a self-timer, i.e. with a preset delay. There is also a series of design studies for panoramic cameras as well as individual cameras intended to be thrown or shot into the air.
- Triops is the concept of a ball with three fisheye lenses, and the "CTRUS" football is intended to integrate cameras into the surface of a football.
- the "I-Ball" concept consists of two fisheye lenses in a ball to be thrown or shot.
- the invention is therefore based on the object of providing a solution, by means of a number of cameras connected into an overall system, that makes it possible to record a good and sharp image with each individual camera, the images then being combinable into an omnidirectional panoramic image.
- the object is achieved in that the individual cameras are arranged in different directions so that they record a complete overall image comprising the individual images of the individual cameras, and a central control unit is provided which detects the movement profile of the camera system by means of at least one sensor and determines the triggering times of the individual cameras according to a predetermined objective function, the camera system moving autonomously over the entire time course.
- the camera system enables the autonomous triggering of the individual cameras according to a target function in a panoramic camera, e.g. when it is thrown into the air.
- the sensor is an acceleration sensor. This makes it possible, for example, to measure the acceleration occurring when a panoramic camera is thrown into the air and to determine the triggering times of the individual cameras by means of a target function.
- the sensor is a sensor for measuring the relative velocity to the surrounding air.
- the triggering can be effected by means of a target function that depends directly on the currently measured speed of the camera system.
- the invention further provides that the target function is set such that the individual cameras are triggered when the distance to a point of the motion profile falls below a minimum distance d.
- the trigger point is the highest point of a flight profile.
- at this point the speed of the camera system is 0 m/s. The closer to this point the camera system triggers, the slower the camera system moves and the lower the motion blur in the captured image.
- the highest point also provides an interesting perspective and a good overview of the scene, and the smaller relative distance differences between e.g. the ground and the thrower reduce the occurring parallax errors.
- the minimum distance d is at most 20 cm, preferably 5 cm, in particular 1 cm. If the trigger point is the highest point of a flight profile, it is advantageous for the camera system to trigger as close as possible to the point of momentary standstill.
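The relationship between release speed, apex, and the minimum distance d can be sketched as follows (an illustrative Python sketch neglecting air resistance; the function names and example values are not from the patent):

```python
G = 9.81  # gravitational acceleration, m/s^2

def apex_time(v0):
    """Time from release to the apex, where the vertical speed is 0 m/s."""
    return v0 / G

def trigger_window(v0, d):
    """Time interval around the apex during which the camera system is
    within the minimum distance d (metres) of the highest point: the
    height deficit dt seconds away from the apex is 0.5*G*dt**2."""
    dt = (2.0 * d / G) ** 0.5
    t = apex_time(v0)
    return t - dt, t + dt

# e.g. a throw released at 5 m/s with d = 5 cm
start, end = trigger_window(5.0, 0.05)
```

A smaller d narrows this window around the moment of standstill, which is why the text prefers d of 5 cm or even 1 cm.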
- the individual cameras are arranged so that, taken together, they cover a solid angle of 4 Pi sr.
- the camera system is omnidirectional and it is irrelevant in what orientation it is at the time of recording.
- the handling of the camera system is easier than with only partial coverage of the solid angle, because the orientation does not have to be considered.
- the full spherical panorama allows the scene to be viewed in any direction.
- the camera system comprises a supporting structure and recesses in which the individual cameras are mounted, the recesses being designed so that touching the camera lenses with the fingers is unlikely or impossible.
- the recessed individual cameras are protected against soiling or damage to their lenses. Padding can prevent both damage to the individual cameras and damage to the camera system as a whole.
- the padding can be an integral part of the supporting structure, for example through the use of a very soft material for the load-bearing structure.
- the padding can be used to ensure that touching the camera lenses with the fingers is difficult or impossible.
- a small opening angle of the individual cameras is an advantage, since the recesses in which the individual cameras are recessed can then be narrower. However, this requires a larger number of individual cameras to cover the same solid angle than when using individual cameras with a larger opening angle.
- the camera system is characterized in that at least 80%, preferably more than 90%, in particular 100% of the surface of the camera system is formed by the light inlet openings of the individual cameras.
- if the images of several individual cameras are to be combined into one overall image ("stitching"), the different projection centers of the individual cameras cause parallax errors. These can only be completely avoided if the projection centers of all individual cameras lie at the same point; for a covered solid angle of 4 Pi sr this is possible only if the entire surface of the camera system is used for capturing light rays, i.e. if it behaves like a "glass ball". As soon as this principle is deviated from, there are light rays passing through the surface that do not reach the desired common projection center within the camera system, which causes parallax errors. If the largest possible part of the surface of the camera system forms the light inlet openings of the individual cameras, the parallax errors can be kept as small as possible.
- the gravitational vector is determined by means of the acceleration sensor or another orientation sensor, such as a magnetic field sensor, each preferably in a 3-axis version, before the camera system enters its flight phase.
- the change in orientation between the time of determination of the gravitational vector and the time of recording can be determined. If this orientation change is known, the gravitational vector in relation to the camera system at the time of recording can easily be calculated. With a sufficiently accurate and high-resolution acceleration sensor and a sufficiently vertical throw, it may also be possible to determine the gravitational vector at the time of recording, with sufficient accuracy for orienting the overall image, from the acceleration caused by air friction and detected by the accelerometer.
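The described compensation — measuring gravity before launch and rotating it by the orientation change accumulated in flight — might look like this as a sketch (first-order integration of gyro rates; a real implementation would use quaternions; all names are illustrative):

```python
import numpy as np

def rotation_from_gyro(rates, dt):
    """Integrate body-frame angular rates (rad/s, iterable of (wx, wy, wz))
    into a rotation matrix with a first-order small-angle update."""
    R = np.eye(3)
    for wx, wy, wz in rates:
        # skew-symmetric matrix of the angular rate vector
        W = np.array([[0.0, -wz, wy],
                      [wz, 0.0, -wx],
                      [-wy, wx, 0.0]])
        R = R @ (np.eye(3) + W * dt)
    # re-orthonormalize to remove accumulated scale drift
    u, _, vt = np.linalg.svd(R)
    return u @ vt

def gravity_at_capture(g_before, R):
    """Express the gravity vector measured before launch in the camera
    frame at capture time, given the orientation change R."""
    return R.T @ g_before
```

The SVD step projects the drifting product back onto the nearest proper rotation, which keeps long integrations usable.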
- the central control unit suppresses the triggering of the individual cameras when the camera system exceeds a certain rotation rate r, the rotation rate r being calculable from the desired maximum blur and the exposure time used.
- in poorly lit scenes or with less sensitive individual cameras, it may be useful to throw the camera system into the air several times and only trigger if the system is not rotating too quickly.
- the rate of rotation below which at most a certain motion blur is produced can be calculated from the exposure time used. The tolerated blur is set, and the camera system is thrown into the air repeatedly until one attempt stays below the calculated rate of rotation.
- a (ball-shaped) camera system can easily be thrown into the air several times in succession, which increases the chance of a sharp image compared to a single throw.
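A minimal sketch of how the permissible rotation rate r could be derived from the exposure time and a tolerated blur, under a simple pinhole-style angular-resolution assumption (the field-of-view and resolution figures are made-up examples, not values from the patent):

```python
import math

def max_rotation_rate(max_blur_px, exposure_s, fov_rad, width_px):
    """Highest rotation rate (rad/s) that keeps rotational motion blur
    below max_blur_px for the given exposure time. Assumes the blur is
    dominated by rotation, not translation."""
    px_per_rad = width_px / fov_rad  # approximate angular resolution
    return max_blur_px / (px_per_rad * exposure_s)

# e.g. 2 px tolerated blur, 1/100 s exposure, 60 deg FOV over 1600 px
r = max_rotation_rate(2, 0.01, math.radians(60), 1600)
```

Triggering would then be suppressed whenever the measured rate exceeds r.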
- a development provides that the individual cameras record test images with different exposure times at the same aperture and transfer these images to the control unit.
- from the transmitted data the control unit can then infer the available amount of light in the different directions and from this calculate the exposure values for the individual cameras.
- once the exposure values (exposure time and/or aperture, depending on the individual cameras used) are calculated, they are transmitted to the individual cameras.
- the measurement of the exposure and the triggering of the individual cameras for the actual photo can take place either during the same flight or on successive flights. If the measurement of the exposure values and the triggering for the actual photo take place on different flights, it may be necessary to measure the rotation of the camera between these events and to adjust the exposure values accordingly, so that the correct exposure is applied in the correct direction when triggering.
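One simple way such an exposure value could be derived from a test image is to scale the test exposure by the ratio of a target mean luminance to the measured one (a sketch assuming a roughly linear sensor response; the target value 118 and all names are illustrative, not from the patent):

```python
def choose_exposure(test_mean_luma, test_exposure_s, target_luma=118.0):
    """Scale the test exposure time so the mean luminance of the test
    image would reach the target value; clipping and gamma are ignored
    in this sketch."""
    return test_exposure_s * target_luma / max(test_mean_luma, 1e-6)

# a test image that came out half as bright as desired
t = choose_exposure(59.0, 0.01)
```

Each individual camera would run this per direction, giving the direction-dependent exposure values the text describes.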
- the invention therefore also provides a method characterized in that the triggering time for the individual cameras is determined by integrating the acceleration in the time before the onset of free fall with air resistance, and the individual cameras are triggered when the distance to the trigger point of the flight profile falls below a minimum, or when free fall with air resistance is detected, or when the direction of the air resistance changes at the transition from climb to descent, or when the relative velocity to the surrounding air falls below 2 m/s, preferably below 1 m/s, in particular below 0.5 m/s; either a single image, comprising at least one individual image, is recorded by means of the individual cameras, or a temporal series of images, each comprising at least one individual image, is recorded by means of the individual cameras and the control unit evaluates these images and selects one of them depending on the content of the images.
- during free fall with air resistance the accelerometer measures only the acceleration due to air resistance. It is therefore appropriate to use the acceleration measured before the start of free fall to record the flight profile. By integrating this acceleration, the initial speed of the flight and the climb time to a trigger point can be calculated. Triggering can then take place after waiting out this climb time.
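The integration described above can be sketched as follows (a constant sampling interval is assumed and the drag correction is omitted; names and the sampling interval are illustrative):

```python
G = 9.81    # m/s^2
DT = 0.001  # assumed accelerometer sampling interval, s

def launch_speed(accel_samples):
    """Integrate the vertical acceleration measured during the throw
    (i.e. before free fall), with gravity subtracted, to obtain the
    speed at release."""
    return sum(a - G for a in accel_samples) * DT

def climb_time(v0):
    """Waiting time from release to the apex; air resistance is
    neglected here, whereas the text applies a drag correction."""
    return v0 / G
```

After release, the system would simply wait `climb_time(launch_speed(...))` seconds and then trigger.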
- the acceleration vector depends on the current speed and direction of the flight.
- from the time profile of the acceleration vector it is possible to deduce the current position in the flight profile.
- the triggering can be realized at the highest point of a flight.
- when the camera system is triggered, either a single image (composed of the individual camera images) or a series of images, e.g. at a regular interval, is recorded.
- an embodiment of the invention provides that the image is selected from the temporal series of images by calculating the current position of the camera system from the images, or by the sharpness of the images, or by the size of the compressed images.
- by analyzing the image data of a series of images, the motion profile of the camera system can be calculated. This can be used to select an image from the series; for example, the image for which the distance of the camera system to the highest point of the flight was smallest could be selected.
- the invention further provides that the individual cameras are synchronized with each other so that they all trigger at the same time.
- the synchronization ensures that the individual images fit both locally and temporally.
- individual cameras with integrated image stabilization can be used in the camera system. These may, e.g., work with image sensors shifted by piezo actuators.
- for cost saving and/or lower power consumption, the sensors connected to the control unit, in particular rotation rate sensors, can be used to determine the control signals for the image stabilization systems of the individual cameras; such sensors then need not be present in the individual cameras, and the individual cameras can remain switched off for a longer time.
- the sharpness of the images can be analyzed to directly select an image with as little motion blur as possible. Considering the size of the compressed images can lead to a similar result, since sharper images contain more information and therefore occupy more space in the data memory at the same compression rate.
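Both selection criteria can be sketched compactly (illustrative Python; the gradient-energy score stands in for any sharpness measure and is not prescribed by the patent):

```python
def sharpness_score(gray, width):
    """Sum of squared horizontal neighbour differences of a flattened
    grayscale image; higher means more high-frequency detail, i.e.
    sharper. Pairs spanning a row boundary are skipped."""
    return sum((gray[i + 1] - gray[i]) ** 2
               for i in range(len(gray) - 1)
               if (i + 1) % width != 0)

def select_sharpest_by_size(jpeg_blobs):
    """Heuristic from the text: at the same compression settings a
    sharper image compresses to a larger file, so pick the largest."""
    return max(jpeg_blobs, key=len)
```

The size heuristic is attractive on a microcontroller because it needs no decoding of the buffered JPEG data.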
- An embodiment of the invention provides that, when a rotation rate r is exceeded, which can be calculated from the exposure time used and the maximum desired motion blur, the triggering of the individual cameras is suppressed, or the images of several successive flights are cached and the control unit selects only one of these images, the selection being made on the basis of the blur calculated from the image content, or on the basis of the measured rotation rate r, or on the basis of the blur calculated from the measured rotation rate r and the exposure time used.
- a user can simply throw the system into the air multiple times and is likely to get a sharp image.
- this rate of rotation can either be selected manually or calculated from a fixed or user-selected maximum motion blur and the exposure time used. For this calculation one can, for example, consider over how many pixels a point light source would be smeared during the exposure.
- the control unit decides on the end of a flight series. This decision can, for example, be based on a pause in time (e.g., several seconds) in which no further throw occurs.
- the blur caused by the rotation can be determined from the image content by image processing and the sharpest image can be selected.
- the measured rate of rotation r can be used and the image with the lowest rate of rotation r can be selected.
- the blur can be calculated from the measured rate of rotation r and the exposure time used, and the sharpest image can be selected.
- Another possibility is to temporarily store the images of several consecutive flights and to select the sharpest image when the rotation rate is exceeded.
- the selection of the sharpest image can be made either from the content of the images or from the rotation rate measurement. If it is made by means of the rotation rate measurement, an upper maximum permissible rotation rate m can be calculated from a fixed upper maximum motion blur and the exposure time used.
- if no image of the series lies below the specified maximum motion blur, or below the upper maximum permissible rotation rate m, it is also possible that none of the images is selected. This gives the user the opportunity to directly try again to take a picture. It is also possible to record images of the series only if the measured rate of rotation is below the upper maximum permissible rate of rotation m.
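The cache-and-select logic, including the "select nothing" outcome that lets the user retry, can be sketched as (illustrative names; frames and rates are assumed to be recorded pairwise per flight):

```python
def select_from_series(frames, rates, m):
    """frames: images cached from successive flights; rates: rotation
    rate (rad/s) measured for each frame; m: upper maximum permissible
    rotation rate. Returns the frame with the lowest rotation rate, or
    None if every frame exceeds m, so the user can throw again."""
    candidates = [(r, f) for r, f in zip(rates, frames) if r < m]
    if not candidates:
        return None
    return min(candidates, key=lambda rf: rf[0])[1]
```

Selecting by image-content blur instead would only change the key used for the minimum.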
- in the active method, a control with or without feedback takes place via the control unit.
- Reaction Wheels use three orthogonal wheels that are accelerated from a rest position opposite to the ball rotation about the respective axis.
- compressed air from a reservoir can be used, e.g. with 4 nozzles attached in the form of a cross at one point on the outside of the ball and 2 further nozzles attached perpendicular to the 4 nozzles on the camera system surface.
- moving weights which, when activated, increase the moment of inertia of the ball can be used.
- another method would be to use a liquid, a granulate or a solid body, either in a container, in tubes or in a gimbal; these elements would damp the rotation due to friction effects.
- FIG. 2 shows a perspective view of a camera system according to the invention
- FIG. 3 shows a sketch of the integration of the acceleration before the beginning of the free fall with air resistance.
- the camera system consists of a spherical supporting structure 4, e.g. a ball, with 36 cell phone camera modules 1 and the necessary electronics inside.
- the camera modules 1 are arranged in the surface of the spherical supporting structure 4 so as to cover the complete solid angle of 4 Pi sr; that is, the camera modules 1 cover the entire solid angle with their viewing volumes.
- the camera system is thrown vertically into the air by the user; through the throw the camera system receives an acceleration 7, which can be detected by means of an acceleration sensor 2 arranged in the camera system. After integration of the acceleration 7 and determination of the speed, the time of reaching the apex is determined. When the apex is reached, the cell phone camera modules 1 simultaneously trigger an image acquisition.
- the further structure of the camera can be described as follows.
- the camera system has 36 cell phone camera modules 1, which buffer their image data after recording in a first-in first-out RAM-IC (FIFO-RAM-IC).
- the cell phone camera modules 1 and the FIFO RAM ICs are mounted on small boards under the surface of the ball on a supporting structure 4. Inside the supporting structure 4 is a mainboard with a central microcontroller and other components that together form the control unit 3.
- the cell phone camera modules 1 are connected via a bus to the central microcontroller, which forwards the image data to a PC via a USB cable connected after the flight.
- the flight of the camera system can be divided into four phases: 1. rest, 2. discharge, 3. flight, 4. catch.
- in phase 1 the sensor 2 measures only the acceleration of gravity, while in phase 2 the acceleration of gravity plus the discharge acceleration 7 is measured by the sensor 2.
- in Fig. 3 the beginning of the discharge phase 8 and the end of the discharge phase 9 are shown.
- in phase 3, the flight phase, the sensor 2 measures no or only a very small acceleration, since its proof mass falls (and rises) just as fast as the camera system.
- in phase 4 the deceleration of the catch adds to the acceleration of gravity.
- the microcontroller constantly stores the last n acceleration values in a FIFO.
- the flight phase is detected as soon as the measured acceleration falls below the 0.3 g threshold for 100 ms.
- going backwards through the FIFO, the end of the discharge phase 9 is detected first, as soon as the acceleration value rises above 1.3 g; the search then continues backwards through the FIFO until the acceleration 7 has fallen below 1.2 g again.
- the discharge speed can now be determined by integrating the acceleration 7 between these two points in the FIFO, the gravitational acceleration being subtracted.
- the integrated area 10 is shown in FIG. 3. From the speed, the climb time to the apex is calculated directly, taking into account the air resistance.
- the cell phone camera modules 1 are triggered after the climb time has elapsed.
- the delay of the cell phone camera module 1 is taken into account and subtracted as a correction factor from the climb time. In addition, the 100 ms after which free fall is detected, as described above, are subtracted.
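The backward scan through the FIFO with the 1.3 g / 1.2 g hysteresis can be sketched as follows (illustrative Python; the FIFO is modelled as a list of acceleration magnitudes in m/s²):

```python
G = 9.81  # m/s^2

def discharge_bounds(fifo):
    """Walk backwards through the buffered acceleration magnitudes,
    mirroring the text: first find where the value exceeds 1.3 g
    (end of discharge phase 9), then keep going backwards until it
    drops below 1.2 g again (start of the discharge phase).
    Returns (start, end) sample indices."""
    i = len(fifo) - 1
    while i >= 0 and fifo[i] <= 1.3 * G:   # skip the free-fall samples
        i -= 1
    end = i
    while i >= 0 and fifo[i] > 1.2 * G:    # span the discharge burst
        i -= 1
    return i + 1, end
```

Integrating the samples between these two indices (minus gravity) then yields the discharge speed described above.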
- a mobile phone camera with fixed focus is used as mobile phone camera module 1. With this type of lens, the entire scene is in sharp focus from a certain distance onward and no time is needed for focusing. Although most mobile phone cameras have relatively low
- the small opening angle of the camera modules keeps the recesses in the camera system narrow. This makes it less likely that the lenses are accidentally touched during the throw.
- the camera system uses direct JPEG compression of the image data in hardware. This allows many pictures to be held in the FIFO
- in order for the camera system to be thrown, the spherical supporting structure 4 must be kept small. It is therefore necessary to get by with as few camera modules as possible.
- the virtual cameras are placed with their projection centers at the center of a unit sphere and thus each cover a part of the spherical surface with their viewing volumes.
- the coverage of the solid angle by the camera modules for a particular combination of camera orientations can be evaluated by distributing test points evenly on the sphere surface.
- the cost function uses the number of test points that are not covered by any virtual camera.
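Such a coverage cost function can be sketched directly (illustrative Python; the even test-point distribution uses a Fibonacci spiral, which is one common choice and not prescribed by the patent):

```python
import math

def fibonacci_sphere(n):
    """Approximately evenly distributed test points on the unit sphere."""
    pts = []
    golden = math.pi * (3.0 - math.sqrt(5.0))  # golden angle
    for i in range(n):
        z = 1.0 - 2.0 * (i + 0.5) / n
        r = math.sqrt(1.0 - z * z)
        pts.append((r * math.cos(golden * i), r * math.sin(golden * i), z))
    return pts

def uncovered(test_points, cam_dirs, half_angle_rad):
    """Cost function as described: the number of test points that lie
    inside no camera's viewing cone; all cameras sit at the sphere
    center and look along the unit vectors in cam_dirs."""
    cos_limit = math.cos(half_angle_rad)
    misses = 0
    for p in test_points:
        if not any(sum(a * b for a, b in zip(p, d)) >= cos_limit
                   for d in cam_dirs):
            misses += 1
    return misses
```

Minimizing `uncovered` over candidate orientation sets then drives the cover toward the full 4 Pi sr with as few modules as possible.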
- the supporting structure 4 was manufactured by means of selective laser sintering with the material PA 2200.
- a clip which is fastened with hooks on both sides, pushes the board towards the outside of the supporting structure 4.
- the stop in this direction consists of several small protrusions, which come to rest in free places on the board.
- Each cell phone camera module 1 is mounted behind a recess in the surface of the camera system.
- This depression is adapted to the shape of the viewing volume of the mobile phone camera module 1. It therefore has the shape of a truncated pyramid.
- next to this recess are, on one side, the outlet of the LED channel and, on the other side, the number of the cell phone camera module 1, recessed during laser sintering.
- foam is glued to the outside of the supporting structure 4, which forms a padding 5.
- It is a closed-cell cross-linked polyethylene foam with a density of 33 kg/m³, which is available on the market under the brand name "Plastazote® LD33".
- FIG. 2 shows the external view of the camera system with the padding 5, the supporting structure 4, the depressions 6 and the mobile phone camera modules 1.
- Each cell phone camera module 1 sits on a small board. All camera boards are connected to the mainboard by a single long ribbon cable. Via this cable, both the data to the mainboard by means of a parallel bus, and the control commands are transmitted via a serial bus to the camera boards.
- the motherboard supplies the camera boards with the required voltages via a power cable.
- on the mainboard are, in addition to a Bluetooth module, the power supply, the battery protection circuit, a microSD socket, an A/D converter as well as acceleration and yaw rate sensors.
- on the camera board, in addition to the VS6724 camera module, there is an AL460 FIFO IC for data buffering and an ATtiny24 microcontroller.
- the camera module is mounted in the center of the 19.2 mm x 25.5 mm x 1.6 mm board in a socket. This is located exactly in the middle of the symmetrical board to facilitate the alignment in the design of the supporting structure 4.
- the FIFO IC has been placed on the back so that the total size of the board does not significantly exceed the dimensions of the FIFO IC.
- a microcontroller handles communication with the mainboard and controls the FIFO and camera.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102011100738 | 2011-05-05 | ||
DE102011109990A DE102011109990A1 (en) | 2011-05-05 | 2011-08-08 | Camera system for taking pictures and associated method |
PCT/DE2012/000464 WO2012149926A1 (en) | 2011-05-05 | 2012-04-30 | Camera system for recording images, and associated method |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2705656A1 true EP2705656A1 (en) | 2014-03-12 |
Family
ID=47019647
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12727098.1A Ceased EP2705656A1 (en) | 2011-05-05 | 2012-04-30 | Camera system for recording images, and associated method |
Country Status (7)
Country | Link |
---|---|
US (1) | US9531951B2 (en) |
EP (1) | EP2705656A1 (en) |
JP (1) | JP2014519232A (en) |
KR (1) | KR20140022056A (en) |
CN (1) | CN103636190A (en) |
DE (2) | DE102011109990A1 (en) |
WO (1) | WO2012149926A1 (en) |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10757308B2 (en) | 2009-03-02 | 2020-08-25 | Flir Systems, Inc. | Techniques for device attachment with dual band imaging sensor |
US8237787B2 (en) | 2009-05-02 | 2012-08-07 | Steven J. Hollinger | Ball with camera and trajectory control for reconnaissance or recreation |
US9144714B2 (en) | 2009-05-02 | 2015-09-29 | Steven J. Hollinger | Ball with camera for reconnaissance or recreation and network for operating the same |
US9237317B2 (en) | 2009-05-02 | 2016-01-12 | Steven J. Hollinger | Throwable camera and network for operating the same |
JP5866913B2 (en) * | 2011-09-19 | 2016-02-24 | 株式会社リコー | Imaging device |
US9426430B2 (en) * | 2012-03-22 | 2016-08-23 | Bounce Imaging, Inc. | Remote surveillance sensor apparatus |
US9479697B2 (en) | 2012-10-23 | 2016-10-25 | Bounce Imaging, Inc. | Systems, methods and media for generating a panoramic view |
US9497380B1 (en) | 2013-02-15 | 2016-11-15 | Red.Com, Inc. | Dense field imaging |
US9563105B1 (en) * | 2013-04-10 | 2017-02-07 | Ic Real Tech Inc. | Screw coupler enabling direct secure fastening between communicating electronic components |
WO2015051344A1 (en) * | 2013-10-03 | 2015-04-09 | Flir Systems, Inc. | Durable compact multisensor observation devices |
US9507129B2 (en) * | 2013-10-31 | 2016-11-29 | Duke University | Apparatus comprising a compact catadioptric telescope |
US11297264B2 (en) | 2014-01-05 | 2022-04-05 | Teledyne Fur, Llc | Device attachment with dual band imaging sensor |
WO2016029283A1 (en) * | 2014-08-27 | 2016-03-03 | Muniz Samuel | System for creating images, videos and sounds in an omnidimensional virtual environment from real scenes using a set of cameras and depth sensors, and playback of images, videos and sounds in three-dimensional virtual environments using a head-mounted display and a movement sensor |
US10187555B2 (en) * | 2014-10-17 | 2019-01-22 | Amaryllis Innovation Gmbh | Camera system for capturing images and methods thereof |
CN105635635A (en) | 2014-11-19 | 2016-06-01 | 杜比实验室特许公司 | Adjustment for space consistency in video conference system |
CN104869312B (en) * | 2015-05-22 | 2017-09-29 | 北京橙鑫数据科技有限公司 | Intelligent tracking shooting device |
CN104869313A (en) * | 2015-05-27 | 2015-08-26 | 华南理工大学 | Panoramic image photographing method and panoramic image detection system |
KR102249946B1 (en) | 2015-09-04 | 2021-05-11 | 삼성전자주식회사 | Apparatus and method for controlling a image capture and a image output |
GB2544058A (en) * | 2015-11-03 | 2017-05-10 | Rolls Royce Plc | Inspection apparatus and methods of inspecting gas turbine engines |
CN105554400B (en) * | 2016-02-26 | 2019-11-12 | 南开大学 | A method of realizing that automatic jump is taken pictures by Intelligent bracelet |
US10057487B1 (en) * | 2016-03-25 | 2018-08-21 | Scott Zhihao Chen | Panoramic imaging systems based on normal-lens cameras |
CN105721788B (en) * | 2016-04-07 | 2019-06-07 | 福州瑞芯微电子股份有限公司 | A kind of multi-cam electronic equipment and its image pickup method |
CN106021803B (en) * | 2016-06-06 | 2019-04-16 | 中国科学院长春光学精密机械与物理研究所 | A kind of method and system of the optimal arrangement of determining image capture device |
CN106162000B (en) * | 2016-07-08 | 2019-03-15 | 上海芯仑光电科技有限公司 | Pixel Acquisition Circuit, imaging sensor and image capturing system |
CN106773511A (en) * | 2016-12-27 | 2017-05-31 | 未来现实(武汉)科技有限公司 | A kind of many mesh panorama cameras of combined type |
US10603525B2 (en) | 2017-03-20 | 2020-03-31 | Uniqative LLC | Impact tools |
DE102017108053A1 (en) | 2017-04-13 | 2018-10-18 | Bernd-Jan Krasowski | Device for inspection and / or maintenance in walk-in or walk-in sewer systems with a digital camera |
JP7199071B2 (en) | 2017-07-06 | 2023-01-05 | 国立研究開発法人宇宙航空研究開発機構 | Mobile imaging device |
JP7016644B2 (en) * | 2017-08-25 | 2022-02-07 | キヤノン株式会社 | Imaging device |
CN112840634B (en) * | 2018-10-18 | 2023-04-18 | 三星电子株式会社 | Electronic device and method for obtaining image |
CN110639180A (en) * | 2019-09-27 | 2020-01-03 | 佛山市木记信息技术有限公司 | Multifunctional golf ball |
CN113077413A (en) * | 2020-01-06 | 2021-07-06 | 苏州宝时得电动工具有限公司 | Self-moving equipment and control method thereof |
KR102253237B1 (en) * | 2020-06-08 | 2021-05-17 | 한국해양과학기술원 | Photograping system for sea sediment and controlling method of photograping system for sea sediment |
KR102499401B1 (en) * | 2020-11-11 | 2023-02-10 | 국방과학연구소 | Device for analyzing explosives media propagation characteristic image, drone therefor and method thereof |
US11199404B1 (en) * | 2020-12-09 | 2021-12-14 | Baker Hughes Holdings Llc | Camera triggering and multi-camera photogrammetry |
CN113642555B (en) * | 2021-07-29 | 2022-08-05 | 深圳市芯成像科技有限公司 | Image processing method, computer readable medium and system |
Family Cites Families (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3046792A (en) | 1958-12-05 | 1962-07-31 | James J Morgan | Accelerometer with velocity and distance integrating means |
US3505465A (en) | 1967-04-21 | 1970-04-07 | Us Army | Panoramic television viewing system |
US5099694A (en) * | 1987-05-19 | 1992-03-31 | Canon Kabushiki Kaisha | Vibration detecting apparatus |
US5023725A (en) | 1989-10-23 | 1991-06-11 | Mccutchen David | Method and apparatus for dodecahedral imaging system |
US5864360A (en) | 1993-08-26 | 1999-01-26 | Canon Kabushiki Kaisha | Multi-eye image pick-up apparatus with immediate image pick-up |
US6192196B1 (en) | 1994-10-11 | 2001-02-20 | Keller James Mcneel | Panoramic camera |
US6466254B1 (en) * | 1997-05-08 | 2002-10-15 | Be Here Corporation | Method and apparatus for electronically distributing motion panoramic images |
JP2001042420A (en) * | 1999-07-29 | 2001-02-16 | Fuji Photo Film Co Ltd | Digital camera |
JP2001320736A (en) | 2000-05-12 | 2001-11-16 | Minolta Co Ltd | All-around stereoscopic camera |
US6947059B2 (en) | 2001-08-10 | 2005-09-20 | Micoy Corporation | Stereoscopic panoramic image capture device |
US7463280B2 (en) | 2003-06-03 | 2008-12-09 | Steuart Iii Leonard P | Digital 3D/360 degree camera system |
GB2407725B (en) * | 2003-11-03 | 2009-08-26 | Elizabeth Anna Sykes | A Play ball |
JP2007043225A (en) | 2005-07-29 | 2007-02-15 | Univ Of Electro-Communications | Picked-up processing apparatus and picked-up processing method |
US20070171042A1 (en) * | 2005-12-22 | 2007-07-26 | Petru Metes | Tactical surveillance and threat detection system |
WO2008068456A2 (en) | 2006-12-06 | 2008-06-12 | Sony United Kingdom Limited | A method and an apparatus for generating image content |
GB0701300D0 (en) * | 2007-01-24 | 2007-03-07 | Dreampact Ltd | An inspection device which may contain a payload device |
AU2008283053A1 (en) | 2007-04-20 | 2009-02-05 | Lucid Dimensions, Inc. | Curved sensor array apparatus and methods |
JP2010136058A (en) * | 2008-12-04 | 2010-06-17 | Nikon Corp | Electronic camera and image processing program |
JP2010252027A (en) * | 2009-04-15 | 2010-11-04 | Mitsubishi Electric Corp | Video camera with turntable |
US8237787B2 (en) * | 2009-05-02 | 2012-08-07 | Steven J. Hollinger | Ball with camera and trajectory control for reconnaissance or recreation |
US9144714B2 (en) | 2009-05-02 | 2015-09-29 | Steven J. Hollinger | Ball with camera for reconnaissance or recreation and network for operating the same |
US8750645B2 (en) * | 2009-12-10 | 2014-06-10 | Microsoft Corporation | Generating a composite image from video frames |
JP2011135246A (en) | 2009-12-24 | 2011-07-07 | Sony Corp | Image processing apparatus, image capturing apparatus, image processing method, and program |
US9065982B2 (en) * | 2010-01-07 | 2015-06-23 | Northrop Grumman Systems Corporation | Reconfigurable surveillance apparatus and associated method |
US20120152790A1 (en) | 2011-03-28 | 2012-06-21 | Physical Apps, Llc | Physical interaction device for personal electronics and method for use |
WO2014066405A1 (en) | 2012-10-23 | 2014-05-01 | Bounce Imaging, Inc. | Remote surveillance sensor apparatus |
- 2011
  - 2011-08-08 DE DE102011109990A patent/DE102011109990A1/en not_active Ceased
  - 2011-08-08 DE DE202011111046.3U patent/DE202011111046U1/en not_active Expired - Lifetime
- 2012
  - 2012-04-30 JP JP2014508682A patent/JP2014519232A/en active Pending
  - 2012-04-30 EP EP12727098.1A patent/EP2705656A1/en not_active Ceased
  - 2012-04-30 KR KR1020137029215A patent/KR20140022056A/en not_active Application Discontinuation
  - 2012-04-30 WO PCT/DE2012/000464 patent/WO2012149926A1/en active Application Filing
  - 2012-04-30 CN CN201280021753.XA patent/CN103636190A/en active Pending
  - 2012-04-30 US US14/113,924 patent/US9531951B2/en active Active
Non-Patent Citations (1)
Title |
---|
See references of WO2012149926A1 * |
Also Published As
Publication number | Publication date |
---|---|
DE102011109990A1 (en) | 2012-11-08 |
JP2014519232A (en) | 2014-08-07 |
US20140111608A1 (en) | 2014-04-24 |
KR20140022056A (en) | 2014-02-21 |
WO2012149926A1 (en) | 2012-11-08 |
DE202011111046U1 (en) | 2018-10-02 |
US9531951B2 (en) | 2016-12-27 |
CN103636190A (en) | 2014-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2705656A1 (en) | Camera system for recording images, and associated method | |
US20140049601A1 (en) | Camera system for capturing images and methods thereof | |
US11188101B2 (en) | Method for controlling aircraft, device, and aircraft | |
WO2016059470A1 (en) | Camera system for capturing images and methods thereof | |
KR101783545B1 (en) | Camera Gimbal Syatem of Unmanned Flight Vehicle for VR 360 degree Omnidirectional Photographing | |
EP3371653A1 (en) | Camera mounting for stereoscopic panoramic recordings | |
CN112650267A (en) | Flight control method and device for aircraft and aircraft | |
CN109313455B (en) | Intelligent glasses, method for controlling holder of intelligent glasses, holder, control method and unmanned aerial vehicle | |
CN105721788B (en) | A kind of multi-cam electronic equipment and its image pickup method | |
DE112015001226T5 (en) | Shooting system, firearm and data processing device | |
JP2006317701A5 (en) | ||
DE102011001268A1 (en) | Method for measuring deformation of hub with rotating object e.g. blades of propeller of e.g. aircraft, involves fixing camera arrangement in hub, such that camera arrangement is rotated around rotational axis | |
DE102012012817A1 (en) | Camera mount, camera and support structure for shooting spherical panoramas | |
DE202019100930U1 (en) | Reduction of microphone audio noise from a gimbal motor | |
DE102011078901A1 (en) | Camera mount for a skier | |
DE102017004658A1 (en) | Segmented recording system for the automatic acquisition of panoramic images | |
DE202011003574U1 (en) | camera device | |
DE102015115405A1 (en) | Stabilization arrangement for cameras | |
WO2017072354A1 (en) | Support device for a flying device, and flying device | |
CN107618672B (en) | Shooting assembly and unmanned aerial vehicle with same | |
DE102015000873A1 (en) | Seeker head for a guided missile | |
DE102019105225A1 (en) | System and method for taking at least one image of an observation area | |
CN214824212U (en) | Inspection triaxial nacelle integrating identification tracking technology | |
DE102011008166A1 (en) | Apparatus and method for shooter location | |
Zhang et al. | Large field-of-view stitching with a single lens and its implementation |
Legal Events
Code | Title | Description |
---|---|---|
PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the European phase | Free format text: ORIGINAL CODE: 0009012 |
17P | Request for examination filed | Effective date: 20131121 |
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
DAX | Request for extension of the european patent (deleted) | |
REG | Reference to a national code | Ref country code: HK; Ref legal event code: DE; Ref document number: 1195837; Country of ref document: HK |
17Q | First examination report despatched | Effective date: 20150223 |
APBK | Appeal reference recorded | Free format text: ORIGINAL CODE: EPIDOSNREFNE |
APBN | Date of receipt of notice of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA2E |
APBR | Date of receipt of statement of grounds of appeal recorded | Free format text: ORIGINAL CODE: EPIDOSNNOA3E |
APAF | Appeal reference modified | Free format text: ORIGINAL CODE: EPIDOSCREFNE |
19U | Interruption of proceedings before grant | Effective date: 20170701 |
19W | Proceedings resumed before grant after interruption of proceedings | Effective date: 20180903 |
RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: AMARYLLIS INNOVATION GMBH |
REG | Reference to a national code | Ref country code: DE; Ref legal event code: R003 |
APBT | Appeal procedure closed | Free format text: ORIGINAL CODE: EPIDOSNNOA9E |
STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION HAS BEEN REFUSED |
REG | Reference to a national code | Ref country code: HK; Ref legal event code: WD; Ref document number: 1195837; Country of ref document: HK |
18R | Application refused | Effective date: 20201030 |