WO2018124688A1 - Drone control device for collision avoidance - Google Patents

Drone control device for collision avoidance

Info

Publication number
WO2018124688A1
WO2018124688A1 (PCT/KR2017/015463)
Authority
WO
WIPO (PCT)
Prior art keywords
drone
image
omnidirectional
collision avoidance
distance
Prior art date
Application number
PCT/KR2017/015463
Other languages
English (en)
Korean (ko)
Inventor
이선구
이충구
Original Assignee
이선구
이충구
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 이선구 and 이충구
Publication of WO2018124688A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G5/00Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/04Anti-collision systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography

Definitions

  • the present invention relates to a collision avoidance drone control device that provides a function of avoiding collision with surrounding objects using an omnidirectional camera system.
  • Recently, drones (unmanned aerial vehicles) have been commercialized, and drones are now used in various fields such as aerial photography.
  • These drones are small unmanned aircraft and are generally operated manually, receiving the operator's control signals wirelessly.
  • This method of operating a drone is inefficient in that the operator must be present, and dangerous in that an operator's mistake may cause an accident.
  • To address this, a drone control system has been developed that installs a distance sensor and a camera on the drone to detect objects around it and sends an alarm to the remote controller when an object with a risk of collision is detected during remote control, and an unmanned drone control system has been developed that controls the drone's flight to avoid surrounding objects.
  • Meanwhile, omnidirectional camera technology has been applied to vehicle-related fields such as vehicle periphery monitoring and to content production for virtual reality, and research and development on it are being actively conducted.
  • Korean Patent Publication No. 10-2015-0025452 (July 12, 2005) is disclosed as a prior art document related to the present invention; this document discloses a flight-capable omnidirectional photographing system.
  • In that system, the drone photographs from the air using a plurality of single-viewing-angle cameras, and the captured images are then combined into a single 360-degree omnidirectional video, so that a wide photographing range can be expressed as a single picture and positions that are difficult to access can be photographed easily, allowing a variety of videos to be produced.
  • The photographing unit of the flight-capable omnidirectional photographing system may be provided with six narrow-viewing-angle camera modules installed in the flying unit, with modules disposed in the front, rear, left, and right directions, so that four lateral shooting areas and two further shooting areas can be photographed simultaneously. The system then overlaps the common photographing areas of the omnidirectional photographs to generate one 360-degree omnidirectional photograph.
  • However, the prior invention requires a plurality of cameras installed in different directions to photograph all directions, and the boundaries between images captured by the different cameras appear unnatural to the observer when the images are stitched together.
  • Moreover, omnidirectional photographing systems currently mounted on drones must use a plurality of cameras, or a camera including a plurality of external lenses, and merely transmit the captured omnidirectional image to the user's terminal without analyzing it.
  • The present invention addresses the above problems: it provides a collision avoidance drone control device that uses a sensor unit capturing all directions with a single lens to accurately detect objects around the drone, and that autonomously controls the drone to avoid a detected object.
  • a drive unit for driving the rotary blades of the drone;
  • a sensor unit including at least one omnidirectional camera that captures omnidirectional images with a single external lens; and a processor that image-processes the omnidirectional image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone is less than a predetermined distance, controls the drive unit so that the drone makes an emergency avoidance of the object.
  • the processor may divide the omnidirectional image at intervals along the arc direction and detect objects around the drone by image-processing the divided images.
  • the processor may divide the omnidirectional image so that the division angle gradually increases with distance from the image region that matches the moving direction of the drone.
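The variable-angle division above can be sketched as follows. This is an illustrative reading, not the patent's implementation: sectors nearest the heading get the smallest angular width, and each sector farther away is a fixed factor wider; `base_width` and `growth` are assumed values.

```python
def sector_bounds(heading_deg, base_width=15.0, growth=1.5):
    """Boundary angles (degrees) of image sectors fanning out from the
    drone's heading. Sectors nearest the heading are narrowest; each
    sector farther from the moving direction is `growth` times wider."""
    offsets = [0.0]
    width = base_width
    while offsets[-1] + width < 180.0:
        offsets.append(offsets[-1] + width)
        width *= growth
    offsets.append(180.0)  # the sector opposite the heading closes the circle
    # mirror to the other side of the heading and normalise to [0, 360)
    both = [-o for o in reversed(offsets[1:])] + offsets
    return [(heading_deg + o) % 360.0 for o in both]
```

Calling `sector_bounds(0.0)` yields boundaries that are dense around 0 degrees (the heading) and sparse near 180 degrees, matching the idea of fine resolution ahead and coarse resolution behind.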
  • the sensor unit may include a first omnidirectional camera disposed under the drone and a second omnidirectional camera disposed under the first omnidirectional camera.
  • the first omnidirectional camera and the second omnidirectional camera may be stereo cameras.
  • the processor may include an image preprocessor that preprocesses the omnidirectional images to obtain a stereo image, a disparity calculator that stereo-matches the stereo image to obtain disparity information, a segmentation unit that separates the background and foreground of the stereo image based on the disparity information, and an object identification unit that detects at least one object in the separated foreground.
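A minimal sketch of the segmentation and object-identification steps: nearby objects produce large disparity, so foreground pixels are those whose disparity exceeds a threshold, and each connected group of foreground pixels becomes one candidate object. The threshold value, 4-connectivity, and function names are assumptions, not from the patent.

```python
def detect_objects(disparity_map, thresh=8):
    """Separate foreground by disparity threshold, then group foreground
    pixels into 4-connected components (the candidate objects)."""
    h, w = len(disparity_map), len(disparity_map[0])
    fg = [[disparity_map[y][x] > thresh for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    objects = []
    for y in range(h):
        for x in range(w):
            if fg[y][x] and not seen[y][x]:
                stack, comp = [(y, x)], []
                seen[y][x] = True
                while stack:  # flood fill one connected component
                    cy, cx = stack.pop()
                    comp.append((cy, cx))
                    for ny, nx in ((cy+1, cx), (cy-1, cx), (cy, cx+1), (cy, cx-1)):
                        if 0 <= ny < h and 0 <= nx < w and fg[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                objects.append(comp)
    return objects
```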
  • the processor calculates a first disparity from the first image of the first omnidirectional camera and a second disparity from the second image of the second omnidirectional camera, and the distance between the object and the drone can be calculated in inverse proportion to the sum of the first and second disparities.
  • the processor may correct the first disparity according to the distance and direction by which the point at which the subject is projected in the first image is offset from the focus position, correct the second disparity according to the distance and direction by which the point at which the subject is projected in the second image is offset from the focus position, and calculate the distance between the object and the drone based on the corrected first and second disparities.
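The inverse-proportion distance rule with the radial correction can be written compactly. This is a sketch under stated assumptions: `k` (the proportionality constant) and `alpha` (the correction gain applied per unit of radial offset from the focus position) are hypothetical calibration constants the patent does not specify.

```python
def object_distance(d1, d2, k=1.0, r1=0.0, r2=0.0, alpha=0.0):
    """Distance between object and drone, inversely proportional to the
    sum of the two disparities. r1, r2 are the radial offsets of the
    subject's projection from each image's focus position, used to
    correct each disparity before summing."""
    d1c = d1 * (1.0 + alpha * r1)  # corrected first disparity
    d2c = d2 * (1.0 + alpha * r2)  # corrected second disparity
    total = d1c + d2c
    return float("inf") if total <= 0 else k / total
```

Doubling both disparities halves the estimated distance, which is the behaviour the inverse-proportion claim describes.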
  • the processor may also convert the first image of the first omnidirectional camera into a first rectangular panoramic image and the second image of the second omnidirectional camera into a second rectangular panoramic image, calculate the first disparity from the first rectangular panoramic image and the second disparity from the second rectangular panoramic image, and calculate the distance between the object and the drone based on the first disparity and the second disparity.
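Converting the circular (donut-shaped) omnidirectional frame into a rectangular panorama is a polar-to-cartesian unwrapping. The sketch below uses nearest-neighbour sampling over plain nested lists; a real implementation would interpolate, and the centre, radii, and output size are assumed calibration inputs.

```python
import math

def unwrap_to_panorama(img, cx, cy, r_in, r_out, width=360, height=60):
    """Map a donut-shaped omnidirectional frame to a rectangular panorama.

    img is indexed img[y][x]; the donut is centred at (cx, cy) with inner
    radius r_in (the reflective-coating blind spot) and outer radius r_out.
    Each panorama column is one viewing angle; each row is one radius.
    """
    pano = []
    for row in range(height):
        r = r_in + (r_out - r_in) * row / max(height - 1, 1)
        line = []
        for col in range(width):
            theta = 2 * math.pi * col / width
            x = int(round(cx + r * math.cos(theta)))
            y = int(round(cy + r * math.sin(theta)))
            line.append(img[y][x])  # nearest-neighbour sample
        pano.append(line)
    return pano
```

After both camera images are unwrapped this way, disparity can be measured as a plain vertical shift between the two panoramas.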
  • The collision avoidance control apparatus captures all directions around the drone with a single external omnidirectional lens to obtain an omnidirectional image, processes the image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone is less than a predetermined distance, controls the propellers so that the drone makes an emergency avoidance of the object, providing an emergency collision avoidance function that keeps the drone flying safely.
  • Because such a collision avoidance control device can monitor all directions around the drone at once using only one, or at most two, omnidirectional cameras, it needs no multiple cameras or distance sensors and its manufacturing cost is reduced. Moreover, since the entire omnidirectional image is captured through a single external omnidirectional lens, there is little distortion, so the position of an object can be detected accurately.
  • In addition, the collision avoidance control apparatus may analyze the omnidirectional image in consideration of the characteristics of an image photographed through a single external omnidirectional lens, detecting the position of a photographed object more accurately.
  • FIG. 1 is a side view of a drone equipped with a collision avoidance drone control device according to an exemplary embodiment of the present invention.
  • FIG. 2 is a block diagram of a collision avoidance drone control device according to an embodiment of the present invention.
  • FIG. 3 is a cross-sectional view of the omni-directional camera of the sensor unit according to an embodiment of the present invention.
  • FIG. 4 is a cross-sectional view of the omni-directional camera of the sensor unit according to another embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a process of providing an emergency object avoidance function by a collision avoidance drone control device according to an exemplary embodiment of the present invention.
  • FIG. 6A is an example of a segmented omnidirectional image according to an embodiment of the present invention
  • FIG. 6B is an example of a segmented omnidirectional image according to another embodiment of the present invention.
  • FIG. 7 is an internal block diagram of a processor according to an embodiment of the present invention.
  • FIG. 8 is a conceptual diagram illustrating a method of measuring a distance between an object and a drone by a processor according to an embodiment of the present invention.
  • FIG. 9 is a flowchart illustrating a process of controlling the focus of the omnidirectional camera by the processor according to an embodiment of the present invention.
  • FIG. 10 is a cross-sectional view of the omni-directional camera of the sensor unit according to another embodiment of the present invention.
  • FIG. 11 is a side view of a drone equipped with a collision avoidance drone control device according to another embodiment of the present invention.
  • FIG. 13 is a block diagram of a collision avoidance drone control device according to another embodiment of the present invention.
  • FIG. 16 shows how a processor sets an optimal movement path of a drone according to an embodiment of the present invention.
  • FIG. 1 is a side view of a drone equipped with a collision avoidance drone control device according to an exemplary embodiment of the present invention.
  • a drone may include a body 10 forming the outer shape of the drone, propellers 20 rotatably disposed on the body 10, and a collision avoidance drone control device that detects objects around the drone and controls the driving of the propellers 20 to avoid a detected object.
  • the collision avoidance control apparatus captures all directions around the drone with a single external lens to obtain an omnidirectional image, image-processes the omnidirectional image to detect objects around the drone, measures the distance between a detected object and the drone based on at least two omnidirectional images, and, when the distance between the object and the drone is less than a predetermined distance, controls the propellers 20 so that the drone makes an emergency avoidance of the object, thereby providing an emergency collision avoidance function.
  • Because such a collision avoidance control device can monitor all directions around the drone at once using only one, or at most two, omnidirectional cameras 200, there is no need for multiple cameras or distance sensors and the manufacturing cost is reduced. Since the entire omnidirectional image is captured through the single external omnidirectional lens 200a, there is little distortion, so the position of an object can be detected accurately.
  • In addition, the collision avoidance control apparatus can analyze the omnidirectional image in consideration of the characteristics of an image captured through the single external omnidirectional lens 200a, detecting the position of a photographed object more accurately.
  • FIG. 2 is a block diagram of a collision avoidance drone control device according to an embodiment of the present invention.
  • a collision avoidance control apparatus includes an input unit 110, a communication unit 120, a driver 130, a memory 140, a power supply unit 150, a sensor unit 160, and a processor 170; in particular, the sensor unit 160 may include at least one omnidirectional camera 200 and other sensors.
  • the units of the collision avoidance control device shown in FIG. 2 are not all essential for implementing the device; according to an embodiment, some of the units may be omitted from the device, and other units not shown may be further included.
  • the collision avoidance drone control apparatus may include an input unit 110 that detects a user input.
  • For example, the input unit 110 may detect an execution input that turns the power of the collision avoidance drone control device on or off, or an input that turns the emergency object avoidance function on or off.
  • the input unit 110 may include at least one of a gesture input unit that detects a user gesture (for example, an optical sensor), a touch input unit that detects a touch (for example, a touch sensor, a touch key, or a push key), a microphone that detects a voice input, and a remote controller or mobile terminal such as the user terminal 600 that detects a remote user input.
  • the collision avoidance drone control device may include a communication unit 120 for wireless communication with the user's terminal.
  • the communicator 120 may transmit to the terminal the omnidirectional image captured by the omnidirectional camera 200 and the drone-surroundings information obtained by analyzing that image.
  • the communication unit 120 may receive a remote drone control signal input by a user through the terminal, and the processor 170 may control the driver 130 according to the received signal, providing a remote control function. However, if there is a risk of collision with an object during remote control, the processor 170 may execute collision avoidance control in preference to the remote control signal.
  • In this way, the collision avoidance drone control apparatus may transmit the omnidirectional image so that the entire surroundings of the drone can be confirmed in a single image, improving user convenience.
  • the communication unit 120 may exchange data with a terminal in a wireless manner.
  • the wireless data communication method may be any of various methods such as Bluetooth, WiFi, WiFi Direct, APiX, LTE, or NFC.
  • the collision avoidance drone control device may include a driving unit 130 that supplies power to the propellers 20 and controls their direction of rotation, thereby controlling the drone's direction and speed of movement.
  • Although the collision avoidance drone control device has been described as directly including the drive unit 130, in another embodiment the drone may be provided with a separate drive unit 130, and the collision avoidance drone control device may control that drive unit 130 by transmitting control signals to it through an interface.
  • the driver 130 includes a power source driver 130, and the power source driver 130 may perform electronic control of a power source in the drone.
  • the power source driver 130 may control the motor. Thereby, the rotation speed, torque, etc. of a motor can be controlled.
  • the steering driver 130 may perform electronic control of a steering apparatus in the drone.
  • the steering driver 130 may change the direction of travel of the drone by tilting the propellers 20 and by controlling the power of the individual propellers 20 differently.
  • when a collision-dangerous object is detected, the processor 170 may determine a traveling direction that avoids the object and control the steering driver 130 to advance the drone in that direction.
  • the collision avoidance drone controller may include a memory 140, and the memory 140 can store various data for the overall operation of the controller, such as programs for processing or control by the processor 170.
  • the memory 140 may store a plurality of application programs or applications that are driven by the collision avoidance drone controller, data for operating the collision avoidance drone controller, and instructions.
  • the application program may be stored in the memory 140 and may be driven by the processor 170 to perform an operation (or function) of the collision avoidance drone controller.
  • the memory 140 may store data for identifying an object included in the omnidirectional camera 200 image.
  • the memory 140 may store data for identifying what the object corresponds to by a predetermined algorithm when a predetermined object is detected in the image acquired through the camera.
  • For example, the memory 140 may store data for identifying, by a predetermined algorithm, what an object corresponds to when a predetermined object such as a tall building, a flying vehicle, or a bird is included in the image acquired by the camera.
  • the collision avoidance drone control apparatus includes a power supply unit 150, and the power supply unit 150 may supply power required for operation of each component under the control of the processor 170.
  • the power supply unit 150 may be a battery inside the drone.
  • the collision avoidance drone control device may include a sensor unit 160 composed of at least one omnidirectional camera 200.
  • FIG. 3 is a cross-sectional view of the omnidirectional camera 200 of the sensor unit 160 according to an embodiment of the present invention.
  • the sensor unit 160 may include one omnidirectional camera 200.
  • the omnidirectional camera 200 includes an omnidirectional lens 200a, and a barrel 280 at whose front the omnidirectional lens 200a is installed and inside which a lens array 290 is formed. The omnidirectional lens 200a includes a refraction part 240 formed so that incident light is refracted; a horizontal part 260 formed horizontally at the end of the refraction part 240; a reflection coating 250 formed on the horizontal part 260, which reflects the light arriving from the inner reflection coating 230 and provides it to the inner recess 210; an inner recess 210 that refracts the light reflected via the reflection coating 250 toward the lens array 290; an inner refractive portion 220 formed concavely at the end of the inner recess 210; and an inner reflection coating 230 formed on the inner refractive portion 220, which reflects the incident light entering through the refraction part 240 toward the reflection coating 250.
  • the refracting part 240 is formed so that incident light is refracted, and the horizontal part 260 may be formed horizontally at the end of the refracting part 240.
  • the reflection coating 250 may be formed on the horizontal portion 260 to reflect the incident light arriving from the inner reflection coating 230 and provide it to the inner recess 210.
  • an inner recess 210 may be formed on the inner surface of the omnidirectional lens 200a to refract the incident light reflected via the reflective coating 250 toward the lens array 290.
  • An inner refractive portion 220 may be formed concavely at the end of the inner recess 210.
  • the inner reflection coating 230 may be formed on the inner refractive portion 220 to reflect the incident light entering from the refraction part 240 toward the reflection coating 250.
  • In this way, the omnidirectional lens 200a may be configured with two refractive surfaces and two reflective coatings 230 and 250.
  • Since the omnidirectional lens 200a itself is spherical rather than aspherical, processing is easy and the manufacturing process is simple, providing convenience of manufacture and a reduction in manufacturing cost.
  • That is, giving the refraction portion 240, the inner refractive portion 220, and the inner recess 210 spherical surfaces resolves the machining difficulty of aspherical surfaces while at the same time providing an omnidirectional photographing effect.
  • the omnidirectional lens 200a may be installed at the front of the barrel 280, and a lens array 290 may be formed inside the barrel 280.
  • the inclination of the inner refractive portion 220 with respect to the virtual central axis 270 of the omnidirectional lens 200a may be steeper than that of the refraction part 240. The inclination angle of the inner refractive portion 220 is formed steeper than that of the refraction part 240 to satisfy a viewing range of 35 to 85 degrees.
  • the inner recess 210 is formed in the inner central portion of the omnidirectional lens 200a for focusing: when the incident light passes through the inner recess 210 and enters the lens array 290, the light is collected and focused.
  • Since the image is formed according to the amount and reflection of light, the optical system can be designed and manufactured as a single complete body. That is, because the reflections of light produced through the single overall shape are interrelated, the lens can be made as a complete body from the initial design stage.
  • FIG. 4 is a cross-sectional view of the omnidirectional camera 200 of the sensor unit 160 according to another embodiment of the present invention.
  • the sensor unit 160 may include a first omnidirectional camera 200 and a second omnidirectional camera 201 disposed to be spaced apart from each other.
  • the description of each of the first and second omnidirectional cameras 200 and 201 overlaps with the description of the omnidirectional camera 200 described above, and thus a detailed description thereof will be omitted.
  • the first omnidirectional camera 200 may be disposed under the drone to photograph all directions. In this case, the upper area blocked by the drone body and the lower area blocked by the reflective coating 250 may not be photographed.
  • the second omnidirectional camera 201 may be disposed to extend to the reflective coating 250 of the first omnidirectional camera 200.
  • one end of the barrel 281 of the second omnidirectional camera 201 may be disposed to overlap with the reflective coating 250 of the first omnidirectional camera 200.
  • the first omnidirectional camera 200 and the second omnidirectional camera 201 are disposed spaced apart by a predetermined distance in the vertical (up-and-down) direction, so the two cameras can capture two omnidirectional images having vertical parallax with each other.
  • That is, the sensor unit 160 may be configured as an omnidirectional stereo camera, with the first omnidirectional camera 200 and the second omnidirectional camera 201 positioned at different locations so as to photograph omnidirectional images with parallax.
  • In this way, the sensor unit 160 may photograph all directions around the drone with the two omnidirectional cameras, and the distance between the drone and an object captured in the images can be measured accurately.
  • the sensor unit 160 of the collision avoidance drone control device may further include at least one sensor of a motion sensor, an altitude sensor, a distance sensor, a balance sensor, or a vision sensor.
  • the motion sensor may be at least one of a magnetometer, an accelerometer, a gyroscope, or a combined acceleration-gyro sensor.
  • the motion information of the drone measured by the motion sensor may be used for drone control for collision avoidance.
  • the magnetometer is a sensor that functions as a compass, measuring magnetic north to obtain the drone's bearing information. The processor can combine GPS position information, the magnetometer's bearing information, and the accelerometer's motion information to obtain the drone's motion information.
  • the accelerometer may be a three-axis accelerometer; it can measure acceleration along the x, y, and z axes to obtain motion information such as the drone's speed, position, and tilt. Such motion information may be used by the processor 170 to keep the drone in a stable posture.
  • the gyroscope may acquire the inclination information of the drone by measuring the angular velocity in the x, y, z axis direction with a three-axis gyroscope.
  • the acceleration gyro sensor may be a sensor in which the three-axis acceleration sensor and the three-axis gyroscope are combined, and it may further include a temperature sensor, to acquire motion information.
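The patent only states that these sensor readings are combined; one standard way to fuse a gyroscope's rate with an accelerometer's absolute tilt is a complementary filter, sketched below as an illustration. The filter form and the `alpha` value are assumptions, not taken from the patent.

```python
def complementary_tilt(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One update of a complementary filter: integrate the gyroscope's
    angular rate (smooth but drifting) and pull the estimate toward the
    accelerometer's absolute tilt reading (drift-free but noisy)."""
    return alpha * (prev_angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

With a stationary drone (zero gyro rate), repeated updates converge to the accelerometer's tilt, which is exactly the drift correction such a fusion provides.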
  • the altitude sensor may acquire the altitude of the drone by measuring atmospheric pressure, and may supplement the altitude information of the drone measured through GPS.
  • the GPS sensor may measure the position, coordinates, and altitude of the drone through the satellite signal. It can be understood that such a GPS sensor is included in the communication unit.
  • the apparatus may further include a distance sensor measuring a distance between the drone and the external object.
  • the distance sensor may measure the distance between the drone and the external object through an ultrasonic or laser output and return time. This distance information may be further used by the processor 170 to control the drone.
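For the ultrasonic case, the output-and-return-time measurement reduces to simple time-of-flight arithmetic; the speed-of-sound constant below is a typical room-temperature value, not a figure from the patent.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ultrasonic_distance(round_trip_s):
    """Distance from an ultrasonic ping's round-trip time: the pulse
    travels to the obstacle and back, so the one-way distance is half
    the total path."""
    return SPEED_OF_SOUND * round_trip_s / 2.0
```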
  • the apparatus may further include a vision sensor for capturing an external image and analyzing the photographed image pattern to determine the presence or absence of an obstacle.
  • This vision sensor has a function similar to the omnidirectional camera, but may be driven when hovering at an altitude of 10 m or less to check for obstacles and prevent collisions.
  • the collision avoidance drone control device may include a processor 170 that controls the overall operation of each unit in the collision avoidance drone control device.
  • the processor 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electrical units for performing functions.
  • the sensor unit 160 may photograph an omnidirectional image around the drone. (S101)
  • In the case of one omnidirectional camera 200, the camera detects the light incident through the single external omnidirectional lens 200a to obtain an omnidirectional image around the drone.
  • In the case of two omnidirectional cameras, the two cameras each detect light incident through a single external omnidirectional lens to obtain two omnidirectional images having parallax.
  • the obtained omnidirectional image may be formed in a circular shape, and may have a donut shape due to the reflective coating 250.
  • the processor 170 may divide the obtained omnidirectional image. (S102)
  • This division prevents an excessive load on the processor 170 that would occur if the entire high-volume omnidirectional image were image-processed at once.
  • the processor 170 may divide the omnidirectional image into a plurality of images and sequentially process the divided images to minimize the processing load occurring at the moment.
  • the processor 170 may precisely detect an object disposed between the divided images, thereby improving resolution of sensing information around the drone.
  • the processor 170 may convert the sector-shaped split image into a square image, match the square images, and convert the omnidirectional image into a panoramic image.
  • the processor 170 may divide the omnidirectional image to have equal intervals in the arc direction.
  • the processor 170 may determine the priorities of the divided images, sequentially process the divided images according to the priorities, and analyze the objects included in the divided images.
  • the processor 170 may analyze the split image on the drone's moving-direction side at the highest priority, and analyze split images adjacent to the moving-direction split image at lower priority.
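The priority ordering described above can be sketched as follows: the sector matching the drone's moving direction is processed first, then its neighbours alternating outward on either side, with the sector directly behind the drone last. The alternating scheme and function name are illustrative assumptions.

```python
def processing_order(n_sectors, heading_idx):
    """Priority order for the divided images: the moving-direction sector
    first, then neighbours fanning outward on both sides."""
    order = [heading_idx]
    for step in range(1, n_sectors // 2 + 1):
        order.append((heading_idx + step) % n_sectors)
        if step != n_sectors - step:  # avoid adding the opposite sector twice
            order.append((heading_idx - step) % n_sectors)
    return order
```

For six sectors with the heading in sector 0, the order is 0, 1, 5, 2, 4, 3: the rear sector (3) is analyzed last.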
  • the processor 170 may divide the omnidirectional image into different sizes.
  • the processor 170 may designate the main surveillance region in the omnidirectional image, divide the main surveillance region into a plurality of images, improve resolution, and accurately detect an object in the primary surveillance region.
  • the processor 170 may divide the omnidirectional image so that the region matching the direction D in which the drone moves has a small division angle θ1, and so that regions farther from the moving-direction split image have progressively larger division angles, up to θ5.
  • That is, the processor 170 finely divides the image region on the moving-direction D side to increase resolution and accurately detect objects in the moving direction, while coarsely dividing image regions farther from the moving direction, detecting dangerous objects there only roughly.
  • the processor 170 may divide the area of the corresponding omnidirectional image into small portions to accurately detect the dangerous object.
  • the processor 170 may extract an object photographed in the omnidirectional image. (S103)
  • when the sensor unit 160 is composed of a single omnidirectional camera 200, a photographed object may be sensed as follows.
  • when the processor 170 extracts an object using one omnidirectional camera 200, an object entering the surveillance target area causes a change in the video frames photographed by the omnidirectional camera 200; the change appears in the pixel values of the frame region where the object is located.
  • the processor 170 may therefore compare the pixel values of corresponding pixels in the previous image frame and the current image frame, and determine the region in which the pixel values changed to be the object image.
  • a criterion for extracting an object from the current image frame may also be set using a plurality of previous image frames. That is, the processor 170 generates a background image from a plurality of previous image frames temporally preceding the current image frame, and extracts the object based on the differences between the pixel values of the background image and those of the current image frame. Since a region whose pixel values do not change over the previous image frames can be regarded as background containing no object, the processor 170 builds the background image from such regions. By treating regions whose pixel values differ between the background image and the current image frame as the object image, the object can be extracted more accurately than by comparing only the previous and current image frames.
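The background-image method above can be sketched as follows; using the per-pixel median as the "unchanging pixel value" estimate and a threshold of 25 are my assumptions, not values from the patent.

```python
import numpy as np

def extract_object_mask(prev_frames, current_frame, threshold=25):
    """Estimate the background as the per-pixel median of the previous
    frames (pixels that do not change across frames keep their value),
    then flag current-frame pixels that differ from the background by
    more than the threshold as belonging to an object."""
    background = np.median(np.stack(prev_frames).astype(np.int32), axis=0)
    diff = np.abs(current_frame.astype(np.int32) - background)
    return diff > threshold

# Toy example: a static 4x4 scene of value 10; a bright 2x2 object enters.
prev = [np.full((4, 4), 10, dtype=np.uint8) for _ in range(5)]
cur = np.full((4, 4), 10, dtype=np.uint8)
cur[1:3, 1:3] = 200
mask = extract_object_mask(prev, cur)
```

The returned boolean mask marks only the four pixels the object occupies, while the unchanged background is rejected.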
  • when the sensor unit 160 is configured as an omnidirectional stereo camera, it may detect objects even more precisely.
  • a method of extracting an object photographed in the omnidirectional stereo image will be described in detail with reference to FIG. 7.
  • the processor 170 may include an image preprocessor 410, a disparity calculator 420, an object detector 434, an object tracking unit 440, and an application unit 450.
  • although an image may be processed in the order of the image preprocessor 410, the disparity calculator 420, the object detector 434, the object tracking unit 440, and the application unit 450, the processing order is not limited thereto.
  • the image preprocessor 410 may receive two images from the omnidirectional stereo camera and perform preprocessing.
  • the image preprocessor 410 may perform noise reduction, rectification, calibration, color enhancement, color space conversion (CSC), interpolation, and gain control of the omnidirectional camera 200 on the two images. Accordingly, an image sharper than the stereo image captured by the omnidirectional camera 200 may be obtained.
  • the disparity calculator 420 receives the images signal-processed by the image preprocessor 410, performs stereo matching on the two received images, and obtains a disparity map. That is, disparity information on the stereo image of the drone's surroundings may be obtained.
  • the stereo matching may be performed in units of pixels of stereo images or in units of predetermined blocks.
  • the disparity map may refer to a map in which the parallax information between the stereo images, that is, the left and right images, is represented numerically.
  • the segmentation unit 432 may perform segmentation and clustering on at least one of the images based on the disparity information from the disparity calculator 420.
  • the segmentation unit 432 may separate the background and the foreground from at least one of the stereo images based on the disparity information.
  • a region in which the disparity information is equal to or less than a predetermined value in the disparity map may be computed as the background, and the corresponding part excluded. The foreground can thereby be separated relatively.
  • alternatively, a region in which the disparity information is greater than or equal to a predetermined value in the disparity map may be computed as the foreground and the corresponding portion extracted directly.
  • by separating foreground and background in this way, signal processing speed can be increased and the signal processing load reduced in the subsequent object detection.
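A minimal sketch of the disparity-threshold separation described above; the threshold value and array shapes are assumptions for illustration.

```python
import numpy as np

def separate_by_disparity(disparity_map, min_disparity=10):
    """Large disparity = near the cameras = foreground candidate;
    small disparity = far away = background, excluded from the
    subsequent object-detection step."""
    foreground = disparity_map >= min_disparity
    return foreground, ~foreground

# Toy 3x3 disparity map: three near pixels, the rest far background.
disp = np.array([[ 2,  3, 15],
                 [ 2, 18, 20],
                 [ 1,  2,  3]])
fg, bg = separate_by_disparity(disp)
```

Only the foreground mask is passed on, which is what shrinks the detection workload.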
  • the processor 170 may calculate a positional relationship (eg, distance) between the detected object and the drone based on the disparity information extracted from the stereo image, and a detailed method will be described later.
  • the object detector 434 may detect the object based on the image segment from the segmentation unit 432.
  • the object detector 434 may detect an object with respect to at least one of the images based on the disparity information.
  • an object can be detected from the foreground separated by image segments.
  • the object verification unit 436 may classify and verify the separated object.
  • the object verification unit 436 may use an identification method based on a neural network, a support vector machine (SVM) scheme, identification by AdaBoost using Haar-like features, or a histograms of oriented gradients (HOG) technique.
  • the object verification unit 436 may verify objects by comparing the detected objects with objects stored in the memory 140.
  • the object verification unit 436 can identify the surrounding aircraft, buildings, dangerous areas, and the like located near the drone.
  • the object tracking unit 440 may perform tracking on the identified object. For example, it may sequentially identify the object in the obtained stereo images, calculate the motion or motion vector of the identified object, and track the movement of the object based on the calculated motion or motion vector. As a result, surrounding aircraft, high-rise buildings, dangerous areas, and the like located around the drone can be tracked.
  • the application unit 450 may calculate the risk of collision and the like based on the various objects located around the drone, for example surrounding aircraft, high-rise buildings, and dangerous areas, including the possibility of collision with an aircraft or a building.
  • based on the calculated risk, collision possibility, and the like, the application unit 450 may output a message or similar notification informing the user, as drone surrounding information.
  • a control signal for attitude control or running control of the drone may be generated as the drone control information.
  • the image preprocessor 410, disparity calculator 420, segmentation unit 432, object detector 434, object checker 436, object tracking unit 440, and application unit 450 may be internal components of an image processing unit within the processor 170.
  • according to an embodiment, the processor 170 may include only some of the image preprocessor 410, disparity calculator 420, segmentation unit 432, object detector 434, object checker 436, object tracking unit 440, and application unit 450.
  • the processor 170 may calculate a positional relationship between the detected object and the drone. This step may be executed simultaneously with the step of extracting the object from the omnidirectional image.
  • the processor 170 may compare the first omnidirectional image photographed by the first omnidirectional camera 200 with the second omnidirectional image photographed by the second omnidirectional camera 201, and calculate the separation distance to an object.
  • the processor 170 may obtain the disparity information generated due to the parallax difference in the first omnidirectional image and the second omnidirectional image.
  • the processor 170 may correct the disparity information in consideration of the distortion generated by the omnidirectional image captured by the single external omnidirectional lens 200a.
  • in the omnidirectional image, an object photographed near the center appears closer than its actual separation distance, and an object photographed near the outer periphery appears farther than its actual separation distance.
  • the processor 170 may accurately measure the distance between the object and the drone by correcting the disparity information of the stereo image by reflecting the distortion characteristic of the omnidirectional image.
  • Equation 1 is a formula for measuring the distance between the drone and an object based on a stereo image: Z = (f × L) / (dr + dl), where
  • f: focal length
  • L: baseline between the first omnidirectional camera 200 and the second omnidirectional camera 201
  • dr: distance from the point at which the subject is projected to the focal position of the first image plane (first disparity)
  • dl: distance from the point at which the subject is projected to the focal position of the second image plane (second disparity)
  • the processor 170 may calculate the first disparity from the first image 200i of the first omnidirectional camera 200 and the second disparity from the second image 201i of the second omnidirectional camera 201, and may calculate the distance between the object and the drone in inverse proportion to the sum of the first and second disparities.
  • the first disparity means the distance from the point at which the subject is projected to the focal position in the plane of the first image 200i, and the second disparity means the corresponding distance in the plane of the second image 201i.
  • since the first disparity and the second disparity contain the distortion generated by the single external omnidirectional lens of the omnidirectional camera 200, correction may be performed.
  • the processor 170 may correct the first disparity according to the distance and direction by which the subject's projection point in the first image 200i is spaced from the focal position, correct the second disparity likewise according to the projection point's distance and direction from the focal position in the second image 201i, and calculate the distance between the object and the drone based on the corrected first and second disparities.
  • for example, if the position where the subject is projected in the first image 200i is close to the outer periphery of the image, the object may have been photographed as farther than its actual position, so the processor 170 can correct the first disparity by adding a correction variable (a negative number) proportional to the distance between the focal position and the projection position.
  • conversely, if the position where the subject is projected in the second image 201i is close to the center of the image, the object may have been photographed as closer than its actual position, so the processor 170 can correct the second disparity by adding a correction variable (a positive number) proportional to the distance between the focal position and the projection position.
  • alternatively, the processor 170 may convert the first image 200i of the first omnidirectional camera 200 into a first rectangular panoramic image and the second image 201i of the second omnidirectional camera 201 into a second rectangular panoramic image, calculate the first disparity from the first rectangular panoramic image and the second disparity from the second rectangular panoramic image, and calculate the distance between the object and the drone based on the two disparities. That is, the distortion of the omnidirectional image may be corrected in advance in the step of converting it into a rectangular image.
  • the processor 170 may accurately calculate the distance between the object and the drone in consideration of the distortion generated in the omnidirectional stereo image.
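The corrected distance computation above can be sketched as follows. The function name and the additive form of the correction terms follow the text (a negative variable added to the first disparity for periphery distortion, a positive one to the second for centre distortion); treating them as plain parameters rather than computed quantities is a simplifying assumption.

```python
def object_distance(f, L, d1, d2, alpha=0.0, beta=0.0):
    """Distance between drone and object, inversely proportional to the
    sum of the two disparities: Z = f * L / (d1' + d2'), where
    d1' = d1 + alpha (alpha <= 0 corrects periphery distortion) and
    d2' = d2 + beta  (beta >= 0 corrects centre distortion)."""
    return (f * L) / ((d1 + alpha) + (d2 + beta))

# Uncorrected case: f = 2 mm, baseline L = 30 mm, disparities 1 mm and 2 mm.
z = object_distance(2.0, 30.0, 1.0, 2.0)
```

Doubling either disparity halves the computed distance, matching the inverse-proportionality stated above.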
  • the processor 170 may detect an object at risk of collision according to the positional relationship between the drone and the object. (S105)
  • for example, when the distance between the drone and an object is less than or equal to a predetermined distance, the processor 170 may determine that object to be a collision risk object.
  • in addition, the processor 170 may determine the collision risk object by considering whether the rate of change of the distance between the drone and the object is negative or positive.
  • the processor 170 may calculate a collision avoidance direction and control to make an emergency turning in the collision avoidance direction of the drone through the driver 130.
  • at this time, the processor 170 may control the drone in preference to the remote control signal.
  • that is, the processor 170 may generate an emergency drone control signal that takes precedence over the remote control signal, transmit it to the driver 130, and control the propeller 20, thereby better protecting the drone.
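The risk rule and emergency turn above can be sketched as follows; the safe-distance value and the turn-directly-away avoidance rule are illustrative assumptions, not choices specified in the patent.

```python
def is_collision_risk(distance, prev_distance, safe_distance=5.0):
    """An object is a collision risk when it is within the safe distance
    and the distance rate of change is negative (the object is closing)."""
    return distance <= safe_distance and (distance - prev_distance) < 0

def avoidance_heading(object_bearing_deg):
    """Minimal avoidance rule: turn directly away from the object."""
    return (object_bearing_deg + 180.0) % 360.0
```

An object 4 m away and closing triggers avoidance; the same object moving away, or a distant closing object, does not.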
  • FIG. 9 is a flowchart illustrating a process of controlling the focus of the omnidirectional camera 200 by the processor 170 according to an embodiment of the present invention.
  • the processor 170 may store the drone's predetermined ascent speed in the memory 140 and automatically control the focus of the omnidirectional camera 200 according to the altitude, which changes after takeoff. (S203)
  • in detail, the processor 170 calculates the altitude of the drone from the elapsed time since takeoff and the predetermined speed, calculates the focus for photographing the ground side according to that altitude, and controls the omnidirectional camera 200 according to the calculated focus.
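The altitude-based focus step can be sketched as follows; mapping the focus distance of the ground-facing view directly to the altitude is an assumption for illustration.

```python
def estimated_altitude(ascent_speed_mps, seconds_since_takeoff):
    """Altitude from the stored constant ascent speed and the elapsed
    time since takeoff, as described above."""
    return ascent_speed_mps * seconds_since_takeoff

def ground_focus_distance(altitude_m):
    """When photographing the ground, the subject distance is roughly
    the altitude, so focus the omnidirectional camera there."""
    return altitude_m

alt = estimated_altitude(2.0, 15.0)   # 2 m/s ascent for 15 s
```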
  • to control the focus, the omnidirectional camera 200 may further include a focus lens and a focus controller for controlling the focus lens, in addition to the configuration of the above-described embodiment.
  • the omnidirectional camera 200 may include a fluid focus lens 320 disposed in the inner concave portion and containing two different liquids, and a focus controller 310 including a current injection portion that contracts or expands the fluid focus lens 320.
  • the fluid focus lens may include two liquids that do not mix with each other.
  • the two liquids may have different refractive indices, one being conductive and the other non-conductive.
  • the focus controller 310 injects current into one of the two liquids through the current injection portion; the volume of the conductive liquid into which the current is injected changes, so the lens contracts and expands like the cornea of the human eye, and the focus of the omnidirectional camera 200 can thereby be controlled.
  • the fluid focus lens 320 is disposed in a shape corresponding to the inner concave portion 210 of the single external lens (the omnidirectional lens 200a) of the omnidirectional camera 200 to uniformly control the focus of the entire omnidirectional image.
  • the processor 170 may detect a tracking object, calculate the distance to it, and then control the focus according to changes in the calculated distance. (S205)
  • through this, the processor 170 may photograph a moving object accurately, analyze the photographed image, and calculate the distance to the object precisely, further improving the performance of the emergency collision avoidance function.
  • FIG. 11 is a side view of a drone equipped with a collision avoidance drone control device according to another exemplary embodiment of the present invention
  • FIG. 12 is an internal configuration diagram of the lower and upper omnidirectional cameras.
  • a collision avoidance drone control device mounted on a drone may include a lower omnidirectional camera 160a for detecting the lower peripheral area of the drone and an upper omnidirectional camera 160b for detecting the upper peripheral area of the drone.
  • the upper omnidirectional camera 160b may have the same configuration as the omnidirectional camera of the above-described embodiment, except that a window-tinting process is applied to the surface of the single external omnidirectional lens 200a.
  • the processor detects an object photographed by the upper omnidirectional camera 160b, measures the distance between the object and the drone, and, when the detected object is within a predetermined distance, treats it as a dangerous object and performs emergency avoidance control to move the drone downward.
  • likewise, the processor detects an object photographed by the lower omnidirectional camera 160a, measures the distance between the object and the drone, and, when the detected object is within a predetermined distance, treats it as a dangerous object and performs emergency avoidance control to move the drone upward.
  • FIG. 13 is a block diagram of a collision avoidance drone control device according to still another embodiment of the present invention
  • FIGS. 14 and 15 are cross-sectional views of a liquid lens.
  • a collision avoidance drone control apparatus may include an input unit 110, a communication unit 120, a driving unit 130, a memory 140, a power supply unit 150, a sensor unit 160, and a processor 170.
  • the sensor unit 160 may include at least one omnidirectional camera 200 and a liquid lens 121.
  • the liquid lens 121 includes a first protective glass 121a and a second protective glass 121b; an oil layer 121c and an aqueous solution layer 121d stacked between the first and second protective glasses 121a and 121b; first and second electrode portions 121e and 121f disposed at the periphery; and first and second insulating portions 121g and 121h disposed between the first and second electrode portions 121e and 121f to insulate the applied voltage.
  • when power is supplied from the outside to the first and second electrode portions 121e and 121f, the radius of curvature and the thickness of the oil layer 121c of the liquid lens 121 change, shifting the focal position of the light passing through the liquid lens 121.
  • as the applied voltage increases, the radius of curvature and the thickness of the oil layer 121c may increase, so the focal length can be shortened by increasing the voltage.
  • the liquid lens 121 does not require a separate servo motor to move the lens, which greatly reduces manufacturing cost, and it enables precise changes of refractive index.
  • the liquid lens 121 may be located between the lens array 290 and the omnidirectional lens 200a, at the rear end of the lens array 290, or between any two of the plurality of lenses in the lens array 290.
  • the processor 170 may control the power applied to the liquid lens 121 to vary its refractive index. By shifting focus from far to near, an approaching object can be checked precisely; when focused at a far distance, the angle of view narrows toward the direction of travel, and focusing is performed about 500 times per second so that objects can be identified by image analysis.
  • the collision avoidance drone control apparatus may calculate an optimal movement path based on an image photographed through a camera and map information received through the communicator 120.
  • the processor 170 may include a movement path setting unit 171 for calculating the optimum movement path.
  • an obstacle 1 may be photographed within the area that the drone's camera can capture, so the collision avoidance drone controller can set an optimal route to the destination while avoiding obstacle 1. However, since the area the drone can photograph is limited, it may be difficult to calculate a movement path outside the photographing area.
  • the collision avoidance drone control device may receive a map image matching the image through the communication unit 120.
  • the collision avoidance drone controller may match the captured image with the received map image to determine the drone's current location in the map image, and then set an optimal route to the destination while avoiding obstacles identified through the map image.
  • through this, an obstacle 2 can be identified in advance from the map information, and an optimal movement path can be set even for areas outside the photographing area.
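A minimal stand-in for route setting over the matched map is breadth-first search on an occupancy grid; the grid representation and the choice of BFS are assumptions, since the patent does not specify a planning algorithm.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest route on a map grid (0 = free, 1 = obstacle) by BFS,
    avoiding obstacles known from the map even when they lie outside
    the camera's photographing area. Returns the list of (row, col)
    cells from start to goal, or None if no route exists."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None

# A wall (an obstacle known only from the map) blocks the direct route.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
route = plan_route(grid, (0, 0), (0, 2))
```

The planner routes around the wall even though a camera at the start cell could not see past it, which is the point of fusing map information with the image.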
  • the present invention can also be embodied as computer readable codes on a computer readable recording medium.
  • computer-readable recording media include all kinds of recording devices that store data readable by a computer system. Examples include ROM, RAM, CD-ROM, magnetic tape, floppy disks, and optical data storage, and the medium may also be implemented in the form of a carrier wave (for example, transmission over the Internet).
  • the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.
  • functional programs, codes, and code segments for implementing the present invention can be easily inferred by programmers in the art to which the present invention belongs.
  • the present invention relates to a device mounted on a drone that controls the drone's movement to avoid collisions while it moves.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Traffic Control Systems (AREA)

Abstract

A drone control device for collision avoidance according to one embodiment comprises: a driving unit for driving a rotor blade of a drone; a sensor unit including an omnidirectional camera that acquires an omnidirectional image by photographing all directions around the drone through a single external omnidirectional lens; and a processor that performs image processing on the omnidirectional image to detect an object positioned around the drone, measures the distance between the drone and the detected object on the basis of at least two omnidirectional images, and controls the driving unit so that the drone urgently avoids the object when the distance between the object and the drone is less than or equal to a predetermined distance.
PCT/KR2017/015463 2016-12-26 2017-12-26 Dispositif de commande de drone pour l'évitement d'une collision WO2018124688A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020160179069A KR101895343B1 (ko) 2016-12-26 2016-12-26 충돌 회피용 드론 제어장치
KR10-2016-0179069 2016-12-26

Publications (1)

Publication Number Publication Date
WO2018124688A1 true WO2018124688A1 (fr) 2018-07-05

Family

ID=62709810

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/015463 WO2018124688A1 (fr) 2016-12-26 2017-12-26 Dispositif de commande de drone pour l'évitement d'une collision

Country Status (2)

Country Link
KR (1) KR101895343B1 (fr)
WO (1) WO2018124688A1 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102526857B1 (ko) * 2018-07-16 2023-04-28 한국전자통신연구원 스테레오 카메라와 추가 카메라를 이용한 무인비행체 장애물 탐지 장치 및 방법
WO2020080911A1 (fr) 2018-10-19 2020-04-23 안병열 Drone ayant une fonction de prévention de collision et de récupération après collision
KR102171043B1 (ko) * 2019-02-28 2020-10-28 한국과학기술원 영상 처리를 기반으로 한 비행체의 충돌 방지 방법
KR102281164B1 (ko) 2019-11-20 2021-07-23 한국생산기술연구원 선형적 특성정보의 상호 관계를 이용한 드론의 전선 인식장치 및 방법
KR102305307B1 (ko) 2019-11-27 2021-09-27 김민호 에어분사 기반의 드론용 비행 장애물 회피기동 제어시스템
KR20210065459A (ko) 2019-11-27 2021-06-04 김민호 드론용 비행 장애물 회피기동 제어 방법
KR102391771B1 (ko) 2020-04-02 2022-04-27 함영국 이진화된 3차원 공간맵 기반의 자율 이동체 운영 방법
KR102131377B1 (ko) 2020-04-17 2020-07-08 주식회사 파블로항공 감시를 위한 무인 이동체 및 이를 포함하는 시스템
KR102316012B1 (ko) * 2020-05-26 2021-10-22 (주)파이온시스템즈 드론에 구비된 카메라 영상을 이용하여 드론 전방의 비행물체와의 충돌 가능성 판정 장치 및 방법
KR20230157192A (ko) 2022-05-09 2023-11-16 청주대학교 산학협력단 군집 무인비행체 이용 지능형 방어 방법, 장치 및 시스템
KR102638951B1 (ko) 2023-08-09 2024-02-21 (주)모빌리티원 이기종 드론/로봇 통합 원격제어 시스템 및 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070106473A1 (en) * 2005-01-24 2007-05-10 Bodin William K Navigating a uav with obstacle avoidance algorithms
US20110164108A1 (en) * 2009-12-30 2011-07-07 Fivefocal Llc System With Selective Narrow FOV and 360 Degree FOV, And Associated Methods
KR20130037697A (ko) * 2013-03-12 2013-04-16 노인철 무인항공기의 충돌 방지 시스템 및 방법
KR20150113586A (ko) * 2014-03-31 2015-10-08 세종대학교산학협력단 비전센서가 결합된 다중회전익 무인비행체 및 다중회전익 무인비행체의 자율비행 제어방법, 그 방법을 수행하기 위한 프로그램이 기록된 기록매체
KR20160083774A (ko) * 2015-01-02 2016-07-12 (주)창조인프라 피사체 추적형 셀카용 드론과 이를 이용한 범죄취약자 보호장치 및 방법

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110544009A (zh) * 2019-07-26 2019-12-06 中国人民解放军海军航空大学青岛校区 基于数字图像处理的航空有机涂层老化损伤量化评估方法
CN110544009B (zh) * 2019-07-26 2022-12-09 中国人民解放军海军航空大学青岛校区 基于数字图像处理的航空有机涂层老化损伤量化评估方法

Also Published As

Publication number Publication date
KR20180075111A (ko) 2018-07-04
KR101895343B1 (ko) 2018-09-05

Similar Documents

Publication Publication Date Title
WO2018124688A1 (fr) Dispositif de commande de drone pour l'évitement d'une collision
WO2020071839A1 (fr) Dispositif et procédé de surveillance de port et de navires
US10904430B2 (en) Method for processing image, image processing apparatus, multi-camera photographing apparatus, and aerial vehicle
WO2017008224A1 (fr) Procédé de détection de distance à un objet mobile, dispositif et aéronef
WO2018110848A1 (fr) Procédé de fonctionnement de véhicule aérien sans pilote et dispositif electronique pour sa prise en charge
US11030760B2 (en) Image processing device, ranging device and method
WO2017066927A1 (fr) Systèmes, procédés et dispositifs de réglage de paramètres de caméra
WO2016106715A1 (fr) Traitement sélectif de données de capteur
WO2017008206A1 (fr) Système à double lentille ayant un séparateur de lumière
WO2015093828A1 (fr) Caméra stéréo et véhicule comportant celle-ci
WO2019017592A1 (fr) Dispositif électronique déplacé sur la base d'une distance par rapport à un objet externe et son procédé de commande
WO2019132283A1 (fr) Ensemble de lentilles optiques, et dispositif électronique comportant cet ensemble
WO2016143983A1 (fr) Procédé et dispositif pour émettre une lumière utilisée pour capturer un iris
WO2020171512A1 (fr) Dispositif électronique de recommandation de composition et son procédé de fonctionnement
WO2020091262A1 (fr) Procédé de traitement d'image à l'aide d'un réseau neuronal artificiel, et dispositif électronique le prenant en charge
WO2015093823A1 (fr) Dispositif d'assistance à la conduite de véhicule et véhicule le comportant
US11107245B2 (en) Image processing device, ranging device, and method
WO2020141827A1 (fr) Système optique et module de caméra le comprenant
WO2020071823A1 (fr) Dispositif électronique et son procédé de reconnaissance de geste
WO2019143050A1 (fr) Dispositif électronique et procédé de commande de mise au point automatique de caméra
WO2023008791A1 (fr) Procédé d'acquisition de distance à au moins un objet situé dans une direction quelconque d'un objet mobile par réalisation d'une détection de proximité, et dispositif de traitement d'image l'utilisant
WO2016186319A1 (fr) Dispositif d'assistance à la conduite d'un véhicule et véhicule
WO2016105074A1 (fr) Système optique de lentilles
WO2020189909A2 (fr) Système et procédé de mise en oeuvre d'une solution de gestion d'installation routière basée sur un système multi-capteurs 3d-vr
WO2018135745A1 (fr) Procédé et dispositif pour générer une image pour indiquer un objet sur la périphérie d'un véhicule

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17888914

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27/09/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17888914

Country of ref document: EP

Kind code of ref document: A1