WO2020215271A1 - Unmanned vehicle capable of displaying images on a remote plane - Google Patents


Info

Publication number
WO2020215271A1
WO2020215271A1 (PCT/CN2019/084265; CN2019084265W)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
vehicle
unmanned vehicle
unmanned
Application number
PCT/CN2019/084265
Other languages
French (fr)
Inventor
Xiao Zhu
Original Assignee
Powervision Tech (Suzhou) Ltd.
Application filed by Powervision Tech (Suzhou) Ltd.
Priority to PCT/CN2019/084265
Publication of WO2020215271A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 Type of UAV
    • B64U10/10 Rotorcrafts
    • B64U10/13 Flying platforms
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00 Constructional aspects of UAVs
    • B64U20/80 Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87 Mounting of imaging devices, e.g. mounting of gimbals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3141 Constructional details thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00 Details of colour television systems
    • H04N9/12 Picture reproducers
    • H04N9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191 Testing thereof
    • H04N9/3194 Testing thereof including sensor feedback
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 UAVs specially adapted for particular uses or applications
    • B64U2101/30 UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41407 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a portable device, e.g. video client on a mobile phone, PDA, laptop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/414 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
    • H04N21/41422 Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance located in transportation means, e.g. personal vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/4223 Cameras

Definitions

  • the present invention relates to an unmanned vehicle, and more particularly, to an unmanned vehicle with capability of displaying images on a remote plane.
  • UAV: unmanned aerial vehicle
  • a conventional unmanned aerial vehicle is equipped with a camera module for capturing images of the surroundings where the UAV is located, and the surrounding images are transmitted back to the remote control operated by a user.
  • thus the user may steer the UAV, via the remote control, to a destination that is difficult for human beings to reach, and view the surrounding images captured by the UAV on the remote control or on an external electronic device, such as a cell phone, if needed.
  • however, the conventional UAV equipped with the camera is incapable of displaying images at the location or destination the UAV is remotely directed to, which results in a lack of convenience, especially in cases such as an outdoor rescue scenario or an indoor meeting scenario.
  • an unmanned vehicle capable of displaying images on a remote plane comprising a vehicle main body, a vehicle rotor unit, an image capture unit, a projection unit, a processing unit, a gimbal unit and a storage unit.
  • the vehicle rotor unit is connected with the vehicle main body and configured to provide moving force to the unmanned vehicle.
  • the image capture unit is configured to capture at least one image.
  • the projection unit is configured to project at least one display image on the remote plane.
  • the processing unit is configured to control operation of the vehicle rotor unit and process the image captured by the image capture unit and the display image.
  • the gimbal unit has at least two rotation axes.
  • the gimbal unit is configured to support the projection unit and connected to the vehicle main body.
  • the storage unit is configured to store the captured image and the display image.
  • the present invention further discloses an unmanned vehicle capable of displaying images on a remote plane comprising a vehicle main body, an image capture unit, a projection unit, a gimbal unit and a storage unit.
  • the image capture unit is configured to capture at least one image.
  • the projection unit is configured to project the image on the remote plane.
  • the gimbal unit has at least two rotation axes.
  • the gimbal unit is configured to support the projection unit and connected to the vehicle main body.
  • the storage unit is configured to store the captured image.
  • the unmanned vehicle is equipped with the projection unit configured to project an image onto a reflecting surface to present a display image on that surface. Therefore, the unmanned vehicle of the present invention is able to display a display image showing the content of a data file stored in the cloud server, of preloaded data stored in the storage unit, or of an external data file stored in an external electronic device, according to the apparatus with which the unmanned vehicle communicates.
  • the mobile display image can offer versatile display applications and timely, supplementary support for the scenario at hand.
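The dependence on "the apparatus with which the unmanned vehicle communicates" amounts to a simple dispatch over the active communication peer. The following sketch is illustrative only; the `Peer` enum and `select_display_source` name are hypothetical stand-ins, not part of the disclosure:

```python
# Hypothetical sketch: choosing the display-image source based on which
# apparatus the unmanned vehicle is currently communicating with.
from enum import Enum, auto

class Peer(Enum):
    CLOUD_SERVER = auto()
    EXTERNAL_DEVICE = auto()   # e.g. a phone or PC running companion software
    NONE = auto()              # no link: fall back to preloaded data

def select_display_source(peer: Peer) -> str:
    """Return which store the display image should be read from."""
    if peer is Peer.CLOUD_SERVER:
        return "cloud data file"
    if peer is Peer.EXTERNAL_DEVICE:
        return "external data file"
    return "preloaded data in storage unit"
```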
  • FIG. 1 is a block diagram of an unmanned vehicle (UV) according to an embodiment of the present invention.
  • FIG. 2 is an exemplary diagram of an unmanned aerial vehicle (UAV) according to an embodiment of the invention.
  • FIG. 3 is an exemplary diagram of an unmanned underwater vehicle according to an embodiment of the invention.
  • FIG. 4 is a diagram of a gimbal unit and a projection unit according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram illustrating a general remote projection scenario with a UAV according to an embodiment of the invention.
  • FIG. 6 is a schematic diagram illustrating an indoor remote projection scenario with the UAV according to an embodiment of the invention.
  • FIG. 7 is a schematic diagram illustrating an outdoor remote projection scenario with the UAV according to an embodiment of the invention.
  • FIG. 8 is a flow chart illustrating how a display image is projected onto a remote plane with the UV according to an embodiment of the invention.
  • FIG. 1 is a functional block diagram of an unmanned vehicle (UV) 1000 according to an embodiment of the present invention.
  • the invention discloses the unmanned vehicle 1000 capable of displaying images on a remote plane which comprises a vehicle main body 10, a vehicle rotor unit 11, a gimbal unit 12, a processing unit 16 , a projection unit 13, an image capture unit 14, and a storage unit 17.
  • the image capture unit 14 is configured to capture at least one image, such as an environmental image of a spot the unmanned vehicle 1000 reaches or an image of an object at that spot; the environmental image can be a human's facial image or an image of the surrounding environment.
  • the vehicle rotor unit 11 is connected or coupled to the vehicle main body 10 and configured to provide moving force for the unmanned vehicle 1000.
  • the processing unit 16 is configured to control operation of the vehicle rotor unit 11 and process the image captured by the image capture unit 14.
  • the storage unit 17 is configured to store the captured image.
  • the vehicle rotor unit 11 is equipped with at least one motor. The motor is controlled by the processing unit 16, so as to provide proper moving force to the unmanned vehicle 1000 to move at a desired speed and in a desired direction.
  • the gimbal unit 12 is connected or coupled to the vehicle main body 10.
  • the projection unit 13 is supported by the gimbal unit 12 or on the vehicle main body 10 depending on practical demands.
  • the projection unit 13 is configured to project a display image onto a remote plane or a reflecting surface so as to present the display image on the remote plane, wherein the remote plane is separated and spaced apart from the vehicle main body 10.
  • the unmanned vehicle 1000 of the present invention is able to remotely display desired content via the projection unit 13 for some specific purposes, such as a demo meeting, a rescue task, and so on.
  • the unmanned vehicle 1000 of the present invention can remotely project the display image without utilizing an external display device, such as a cell phone, an external projector, a display panel carried by the unmanned vehicle 1000, and the like. It should be noticed that the display image can be processed by the processing unit 16 and stored by the storage unit 17 of the unmanned vehicle 1000.
  • FIG. 2 is an exemplary diagram of an unmanned aerial vehicle (UAV) 2000 according to an embodiment of the invention.
  • the unmanned vehicle 1000 of the present invention can be implemented as the unmanned aerial vehicle 2000 as shown in FIG. 2, but the present invention is not limited thereto.
  • FIG. 3 is an exemplary diagram of an unmanned underwater vehicle 3000 according to an embodiment of the invention.
  • the unmanned vehicle 1000 of the present invention can be implemented as the unmanned underwater vehicle 3000 as shown in FIG. 3.
  • the unmanned vehicle 1000 implemented as a remotely operated vehicle, an unmanned surface vehicle, or an autonomous robot is within the scope of the present invention.
  • the gimbal unit 12 is connected or coupled to the vehicle main body 10.
  • the projection unit 13 is supported by the gimbal unit 12 or on the vehicle main body 10 depending on practical demands.
  • the gimbal unit 12 can be configured to have at least two rotation axes for controlling a projection direction of the projection unit 13. Further details of the structure of the gimbal unit 12 and the projection unit 13 are described in the following paragraphs.
  • FIG. 4 is a diagram of a gimbal unit 12 and a projection unit 13 according to an embodiment of the present invention.
  • the projection unit 13 comprises a projection casing 130 and a projector member 131.
  • the projection casing 130 and the vehicle main body 10 are two independent members.
  • the projector member 131 is disposed on the projection casing 130 and configured to project the display image.
  • the projector member 131 can be arranged outside the vehicle main body 10, but not limited to this. For example, the projector member 131 can be disposed right on the vehicle main body 10.
  • the gimbal unit 12 connects the vehicle main body 10 and the projection casing 130.
  • the gimbal unit 12 comprises a first motor assembly 120, a second motor assembly 121 and a third motor assembly 122.
  • the first motor assembly 120 has a first axis X1 and is configured to drive the projection unit 13 to rotate about the first axis X1.
  • the second motor assembly 121 has a second axis X2 and is configured to drive the projection unit 13 to rotate about the second axis X2.
  • the third motor assembly 122 has a third axis X3 and is configured to drive the projection unit 13 to rotate about the third axis X3.
  • the gimbal unit 12 is able to activate the projection unit 13 to rotate about at least one of the three independent axes, i.e., the first axis X1, the second axis X2 and the third axis X3.
  • the gimbal unit 12 further comprises a mounting mechanism 123, a first frame mechanism 124 and a second frame mechanism 125.
  • the mounting mechanism 123 connects the first motor assembly 120 and the vehicle main body 10.
  • the first frame mechanism 124 connects the mounting mechanism 123 and the second motor assembly 121.
  • the second frame mechanism 125 connects the second motor assembly 121 and the third motor assembly 122.
  • the first frame mechanism 124 has a substantially L-shaped structure and the second frame mechanism 125 has a substantially U-shaped structure, which facilitates optimizing the structure of the gimbal unit 12 and further saves mechanical space and weight.
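The three motor assemblies give the gimbal three independent rotation axes whose rotations compose into one orientation. As a rough illustration only (the assignment of X1/X2/X3 to yaw/roll/pitch and all function names are assumptions, not from the patent), the projector's pointing direction can be obtained by multiplying one rotation matrix per axis:

```python
# Illustrative sketch: composing three motor rotations of a 3-axis gimbal
# to get the projector's boresight direction in the vehicle frame.
import math

def rot_z(a):  # assumed yaw about the first axis X1
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def rot_x(a):  # assumed roll about the second axis X2
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # assumed pitch about the third axis X3
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def projector_direction(yaw, roll, pitch):
    """Rotate the projector's body-frame boresight (+x) by all three axes."""
    R = matmul(matmul(rot_z(yaw), rot_x(roll)), rot_y(pitch))
    fwd = [1.0, 0.0, 0.0]
    return [sum(R[i][j] * fwd[j] for j in range(3)) for i in range(3)]
```

With all angles zero the boresight stays at +x; a yaw of pi turns it to -x, which is the "substantially opposite" orientation used in the scenarios below.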
  • the unmanned vehicle 1000 can further comprise a communication unit 18 configured to transmit data, such as the captured image, to and receive data from a remote external unit via a wired or wireless connection, which enables two-way communication between the unmanned vehicle 1000 and the remote external unit.
  • the remote external unit can be, but not limited to, a UV remote controller 7000 of the unmanned vehicle 1000, a mobile device (such as a smart phone 6000) with corresponding application software, a computing device (such as a personal computer 5000) with corresponding application software, or a remote cloud server 4000.
  • since the unmanned vehicle 1000 can wirelessly communicate with the UV remote controller 7000 through the communication unit 18, it can be controlled by a user located at a remote spot apart from the unmanned vehicle 1000 to perform a remote projection task, such as the demo meeting or the rescue task.
  • the processing unit 16 can be further configured to generate the display image according to data received from the remote external unit through the communication unit 18 and to provide the display image to the projection unit 13.
  • the processing unit 16 can be further configured to control the communication unit 18 to establish communication with the remote external unit, but is not limited to this.
  • FIG. 5 is a schematic diagram illustrating a general remote projection scenario with a UAV 2000 according to an embodiment of the invention.
  • a user utilizes a remote external unit 9000 to control the unmanned aerial vehicle 2000 to reach a desired spot.
  • the user can further control the projection unit 13 of the unmanned aerial vehicle 2000 through the remote external unit 9000 so as to project a display image onto a remote plane 81 at the spot.
  • content shown in the display image can be content of a file transmitted from the remote external unit 9000, but the present invention is not limited thereto.
  • the content shown in the display image can be content of a preloaded data file stored in the storage unit 17 of the unmanned vehicle 1000.
  • with the projected content on the remote plane 81, such as a demo video, a demo PowerPoint file, or an exercise tape, the user is able to hold activities on the spot even if the user is not present there.
  • the processing unit 16 of the unmanned vehicle 1000 is configured to generate the display image according to the image captured by the image capture unit 14 and provide the display image to the projection unit 13.
  • the processing unit 16 can be further configured to identify an object in the captured image.
  • the processing unit 16 can comprise a control module 20 and an image processing module 19.
  • the control module 20 is capable of controlling operation of the unmanned vehicle 1000, the vehicle rotor unit 11, the gimbal unit 12, and the image capture unit 14.
  • the image processing module 19 is capable of processing the display image and the image captured by the image capture unit 14 and/or identifying the object in the captured image.
  • FIG. 6 is a schematic diagram illustrating an indoor remote projection scenario with the UAV 2000 according to the embodiment of the invention.
  • when a user practices a speech in an indoor exercise room, the UAV 2000 can be driven by the vehicle rotor unit 11 to move in the air or to stay in the air, and the image capture unit 14 can capture an image of the user.
  • the image processing module 19 can identify a figure of the user in the captured image.
  • the image processing module 19 can calculate position information for locating the user according to the identified figure, which enables the control module 20 to control the vehicle rotor unit 11 to drive the unmanned vehicle 1000 to keep tracking and following the identified user, and to control the image capture unit 14 to continuously capture images of the identified user.
  • the control module 20 can control the gimbal unit 12 and a projection direction of the projection unit 13 according to a direction of the identified user with respect to the unmanned vehicle 1000. Therefore, as shown in FIG. 6, the projection unit 13 can be automatically oriented in a direction substantially opposite to the image capture unit 14 so as to project and stabilize a display image on a designated projection area. It should be noticed that the content of the display image can be directly extracted from the image capture unit 14 in the aforementioned embodiment.
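One way to picture this automatic orientation, simplified to planar yaw headings: the camera gimbal is steered toward the identified user's bearing while the projector gimbal is commanded to the substantially opposite bearing. A minimal sketch, assuming angles in radians and hypothetical function names:

```python
# Minimal sketch: camera aims at the identified user, projector aims
# the opposite way, with all headings wrapped to a common range.
import math

def wrap_pi(a):
    """Wrap an angle to the interval (-pi, pi]."""
    return math.atan2(math.sin(a), math.cos(a))

def aim_commands(user_bearing):
    """Return (camera_yaw, projector_yaw) gimbal setpoints."""
    camera_yaw = wrap_pi(user_bearing)
    projector_yaw = wrap_pi(user_bearing + math.pi)  # opposite direction
    return camera_yaw, projector_yaw
```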
  • FIG. 7 is a schematic diagram illustrating an outdoor remote projection scenario with the UAV 2000 according to the embodiment of the invention.
  • the rescue staff 93 can transmit an external content from the remote external unit 9000 to the unmanned aerial vehicle 2000 so that the projection unit 13 of the unmanned aerial vehicle 2000 can project a display image containing the external content showing tips or guidelines for helping the person 92 escape from the valley.
  • a facial model of the trapped person 92 can be extracted from a cloud database storing multiple facial models or can be constructed by utilizing the cloud server to analyze a previous facial image of the trapped person 92.
  • the processing unit 16 of the unmanned aerial vehicle 2000 can not only identify the trapped person 92 by identifying a figure of the trapped person 92 in an image captured by the image capture unit 14, but also recognize the trapped person 92 by verifying a captured facial image of the trapped person 92 against the transmitted facial model. Therefore, the unmanned aerial vehicle 2000 of the present application can be configured not only to passively project information on a designated projection area but also to actively search for an object to be identified.
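The verification step can be pictured as comparing feature embeddings of the captured face and the transmitted facial model. This is an illustrative sketch only: the embedding representation, the cosine-similarity metric, and the 0.6 threshold are assumptions, not details of the patent:

```python
# Illustrative sketch: verify a captured face against a transmitted
# facial model, modelled here as comparing embedding vectors.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def verify_face(captured_embedding, model_embedding, threshold=0.6):
    """True if the captured face matches the transmitted facial model."""
    return cosine_similarity(captured_embedding, model_embedding) >= threshold
```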
  • the unmanned vehicle 1000 of the present application can be used in object tracking and/or projection direction adjustment.
  • the image capture unit 14 is disposed and fixed on a side of the vehicle main body 10 of the unmanned aerial vehicle 2000 or the unmanned underwater vehicle 3000, but is not limited to this.
  • the image capture unit 14 can be disposed on the projection casing 130 with the projector member 131.
  • the image capture unit 14 is supported by the gimbal unit 12 and disposed in an opposite direction to the projection unit 13. That is, the image capture unit 14 and the projector member 131 can be disposed on opposite sides of the projection casing 130, and the description of the abovementioned exemplary embodiments is intended to be illustrative and not to limit the scope of the invention.
  • the invention further discloses another unmanned vehicle 1000’ capable of displaying images on a remote plane which comprises a vehicle main body 10, an image capture unit 14, a projection unit 13, a gimbal unit 12, and a storage unit 17.
  • the image capture unit 14 is configured to capture at least one image.
  • the projection unit 13 is configured to project the image on the remote plane.
  • the gimbal unit 12 has at least two rotation axes.
  • the gimbal unit 12 is configured to support the projection unit 13 and connected to the vehicle main body 10.
  • the storage unit 17 is configured to store the captured image and the display image.
  • the unmanned vehicle 1000’ can further comprise a communication unit 18.
  • the communication unit 18 is configured to transmit the captured image to an external device and receive data from the external device. Therefore, the projection unit 13 can be further configured to display the received data on the remote plane, and the image capture unit 14 can be further configured to capture images according to the received data.
  • the gimbal unit 12 of the unmanned vehicle 1000’ can be further configured to support the image capture unit 14 and thereby control a shooting direction of the image capture unit 14. Besides, the gimbal unit 12 can be further configured to control a projection direction of the projection unit 13 according to a direction of an object identified in the image captured by the image capture unit with respect to the unmanned vehicle.
  • the unmanned vehicle is equipped with the projection unit configured to project an image onto a reflecting surface to present a display image on that surface. Therefore, the unmanned vehicle of the present application is able to display a display image showing the content of a data file stored in the cloud server, of preloaded data stored in the storage unit, or of an external data file stored in an external electronic device, according to the apparatus with which the unmanned vehicle communicates.
  • the mobile display image can offer versatile display applications and timely, supplementary support for the scenario at hand.
  • FIG. 8 is a flow chart illustrating how a display image is projected onto a remote plane with the UV 1000 according to the embodiment of the invention.
  • the detailed procedure of projecting the display image onto the remote plane comprises the following steps:
  • S102: Capture an image via the image capture unit 14.
  • S204: Track the identified object with the image capture unit 14 to capture an image of the identified object.
  • S205: Transmit the captured image of the identified object to the remote external unit.
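The full procedure of FIG. 8 can be sketched as an ordered step table with a small runner. The step descriptions below paraphrase the scenarios in this disclosure, and the `run` helper and handler mechanism are hypothetical stand-ins, not part of the patent:

```python
# Sketch of the FIG. 8 procedure as an ordered step table.
STEPS = [
    ("S101", "mount projection unit on the gimbal unit"),
    ("S102", "capture image via image capture unit"),
    ("S103", "identify object in captured image"),
    ("S204", "track identified object and keep capturing"),
    ("S205", "transmit captured image to remote external unit"),
    ("S304", "adjust projection direction toward remote plane"),
    ("S305", "receive display content from remote external unit"),
    ("S106", "project display image onto remote plane"),
]

def run(steps, handlers):
    """Execute each step's handler if one is registered; return step ids."""
    done = []
    for step_id, description in steps:
        handler = handlers.get(step_id)
        if handler:
            handler(description)
        done.append(step_id)
    return done
```

Registering handlers only for some steps mirrors how individual scenarios (the demo meeting, the indoor practice, the rescue) use different subsets of the flow.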
  • the unmanned aerial vehicle 2000 described in the following paragraphs is intended to be illustrative and not to limit the scope of the invention.
  • the exemplary unmanned vehicle can be an unmanned underwater vehicle 3000 or other type of unmanned vehicle suitable for particular occasions.
  • the vehicle main body 10 of the unmanned aerial vehicle 2000 can be equipped with the projection unit 13 by installing the projection unit 13 on the gimbal unit 12 mounted on the unmanned aerial vehicle 2000 (S101), and the user can utilize a remote external unit 9000 to control the unmanned aerial vehicle 2000 to fly to a specific spot where there might be a remote plane 81 for reflecting and presenting a projected image.
  • the user can control the projection unit 13 to project a display image onto the remote plane 81 via the remote external unit 9000.
  • the projection direction of the projection unit 13 can be adjusted and aimed at the remote plane 81 (S304).
  • since the unmanned aerial vehicle 2000 is in communication with the remote external unit 9000, content to be presented in the display image can further be transmitted from the remote external unit 9000 to the unmanned aerial vehicle 2000 (S305). Therefore, after receiving the content data transmitted from the remote external unit 9000, the unmanned aerial vehicle 2000 can project the display image (with the content) onto the remote plane 81 (S106) so that the user can deliver messages to an audience at the specific spot via the display image even if the user is not present there.
  • the vehicle main body 10 can be equipped with the projection unit 13 via the gimbal unit 12 (S101).
  • the UAV 2000 can be driven by the vehicle rotor unit 11 to move in the air or to hover in the air, and the image capture unit 14 of the UAV 2000 can be controlled to capture an image of the lecturer (S102).
  • the image processing module of the processing unit 16 can identify the figure of the lecturer in the captured image through image processing (S103) .
  • the processing unit 16 can further calculate a real-time position of the lecturer based on the identified figure, which enables the vehicle rotor unit 11, controlled by the processing unit 16, to drive the UAV 2000 to keep tracking the lecturer so that the image capture unit 14 can be controlled to aim at the lecturer and capture real-time images of the lecturer (S204). Therefore, the UAV 2000 can keep tracking the lecturer (the identified object) according to the real-time images captured by the image capture unit 14. Besides, the captured images of the lecturer can be transmitted to the remote external unit 9000 operated by a user (S205), which enables the user at a remote spot to audit the speech or to further interact with the lecturer.
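One simple way such tracking can be realized, sketched here under assumptions not stated in the patent (a detected bounding box in the camera frame, proportional gains, and the specific target size), is to turn the figure's image-plane position into follow commands:

```python
# Hedged sketch: derive follow commands from the identified figure's
# bounding box. Gains and the apparent-size heuristic are illustrative.
def follow_commands(bbox_center_x, frame_width, bbox_height,
                    target_height=0.4, k_yaw=1.0, k_fwd=1.0):
    """Return (yaw_rate, forward_speed) to keep the subject centred
    and at a roughly constant apparent size in the frame."""
    # Horizontal offset of the figure from the image centre, in [-1, 1].
    offset = (bbox_center_x - frame_width / 2) / (frame_width / 2)
    yaw_rate = -k_yaw * offset            # turn toward the subject
    # Apparent-size error: too small -> move closer, too big -> back off.
    size_error = target_height - bbox_height
    forward_speed = k_fwd * size_error
    return yaw_rate, forward_speed
```

A subject centred in the frame at the target apparent size yields zero commands, which corresponds to the UAV holding its position relative to the lecturer.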
  • the projection unit 13 can be automatically oriented in a direction substantially opposite to the image capture unit 14 (S304), which enables the projection unit 13 to project and stabilize a display image on a remote plane 81 facing towards the lecturer (S106). Therefore, the lecturer could view a real-time image of himself/herself on the remote plane 81 so as to improve his/her posture, or the user could present instructions projected on the remote plane 81 by transmitting the instructions from the remote external unit 9000 to the UAV 2000 (S305).
  • the vehicle main body 10 can be equipped with the projection unit 13 via the gimbal unit 12 (S101).
  • a rescue staff 93 trying to rescue the person 92 can operate a remote external unit 9000 to direct the UAV 2000 to where the person 92 is.
  • the image capture unit 14 can be controlled to capture real-time images of the person 92 or facial images of the person 92 through face tracking (S102).
  • the processing unit 16 can identify the figure of the person 92 in the captured real-time images through image processing (S103), and then the UAV 2000 can track the person 92 (the identified object) according to the real-time images of the person 92 captured by the image capture unit 14 (S204).
  • the UAV 2000 can be in communication with a cloud server, and a facial model of the person 92 can be extracted from a cloud database storing multiple facial models or can be constructed by utilizing the cloud server to analyze a previous facial image of the trapped person 92.
  • the processing unit 16 of the UAV 2000 can not only recognize the trapped person 92 by verifying a captured facial image of the trapped person 92 against the transmitted facial model but also identify the trapped person 92 by identifying a figure of the trapped person 92 in an image captured by the image capture unit 14 (S102, S103).
  • the real-time images of the person 92 can be transmitted to the remote external unit 9000 (S205), which enables the rescue staff 93 to perceive the trapped person’s condition.
  • the projection unit 13 can be automatically oriented in a direction substantially opposite to the image capture unit 14 (S304), which enables the projection unit 13 to project and stabilize a display image on a remote plane facing towards the person 92.
  • the rescue staff 93 can transmit external content from the remote external unit 9000 to the unmanned vehicle 1000 (S305) so that the projection unit 13 can project a display image containing the external content showing tips or guidelines for helping the person 92 escape from the valley.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

An unmanned vehicle is disclosed. The unmanned vehicle is equipped with a projection unit configured to project a display image onto a remote plane so that the display image is visible on the remote plane's surface. Accordingly, the unmanned vehicle is able to display a display image showing the content of a data file stored in a cloud server, of preloaded data stored in a storage unit of the unmanned vehicle, or of an external data file stored in an external electronic device, according to the device with which the unmanned vehicle is in communication.

Description

UNMANNED VEHICLE CAPABLE OF DISPLAYING IMAGES ON A REMOTE PLANE

Background of the Invention
1. Field of the Invention
The present invention relates to an unmanned vehicle, and more particularly, to an unmanned vehicle with capability of displaying images on a remote plane.
2. Description of the Prior Art
Recently, the unmanned aerial vehicle (UAV) has gradually been applied to many aspects of human life, such as building mapping, spraying pesticides, water rescue and so on. A conventional unmanned aerial vehicle is equipped with a camera module for capturing images of the surroundings where the UAV is located, and the surrounding images are transmitted back to the remote control operated by a user.
As a result, the user may steer the UAV, via the remote control, to a destination that is difficult for human beings to reach, and view the surrounding images captured by the UAV on the remote control or on an external electronic device, such as a cell phone, if needed. However, the conventional UAV equipped with the camera is incapable of displaying images at the location or destination the UAV is remotely directed to, which results in a lack of convenience, especially in cases such as an outdoor rescue scenario or an indoor meeting scenario.
Summary of the Invention
Therefore, it is an objective of the present invention to provide an unmanned vehicle capable of displaying images on a remote plane to solve the above drawbacks.
To achieve the aforementioned objective, the present invention discloses an unmanned vehicle capable of displaying images on a remote plane comprising a vehicle main body, a vehicle rotor unit, an image capture unit, a projection unit, a processing unit, a gimbal unit and a storage unit. The vehicle rotor unit is connected with the vehicle main body and configured to provide moving force to the unmanned vehicle. The image capture unit is configured to capture at least one image. The projection unit is configured to project at least one display image on the remote plane. The processing unit is configured to control operation of the vehicle rotor unit and process the image captured by the image capture unit and the display image. The gimbal unit has at least two rotation axes. The gimbal unit is configured to support the projection unit and connected to the vehicle main body. The storage unit is configured to store the captured image and the display image.
To achieve the aforementioned objective, the present invention further discloses an unmanned vehicle capable of displaying images on a remote plane comprising a vehicle main body, an image capture unit, a projection unit, a gimbal unit and a storage unit. The image capture unit is configured to capture at least one image. The projection unit is configured to project the image on the remote plane. The gimbal unit has at least two rotation axes. The gimbal unit is configured to support the projection unit and connected to the vehicle main body. The storage unit is configured to store the captured image.
In contrast to the prior art, the unmanned vehicle is equipped with the projection unit configured to project an image onto a reflecting surface to present a display image on the reflecting surface. Therefore, the unmanned vehicle of the present invention is able to display the display image showing content of one of a data file stored in the cloud server, preloaded data stored in the storage unit, or an external data file stored in an external electronic device, according to the apparatus with which the unmanned vehicle communicates. On the spot of an outdoor rescue scenario or an indoor meeting scenario, the mobile display image can offer versatile display applications and timely, supplementary support for the scenario.
These and other objectives of the present invention will no doubt become obvious to those of ordinary skill in the art after reading the following detailed description of the preferred embodiment that is illustrated in the various figures and drawings.
Brief Description of the Drawings
FIG. 1 is a block diagram of an unmanned vehicle (UV) according to an embodiment of the present invention.
FIG. 2 is an exemplary diagram of an unmanned aerial vehicle (UAV) according to an embodiment of the invention.
FIG. 3 is an exemplary diagram of an unmanned underwater vehicle according to an embodiment of the invention.
FIG. 4 is a diagram of a gimbal unit and a projection unit according to an embodiment of the present invention.
FIG. 5 is a schematic diagram illustrating a general remote projection scenario with a UAV according to an embodiment of the invention.
FIG. 6 is a schematic diagram illustrating an indoor remote projection scenario with the UAV according to an embodiment of the invention.
FIG. 7 is a schematic diagram illustrating an outdoor remote projection scenario with the UAV according to an embodiment of the invention.
FIG. 8 is a flow chart illustrating how a display image is projected on a remote plane with the UV according to an embodiment of the invention.
Detailed Description
In the following detailed description of the embodiments, reference is made to the accompanying drawings which form a part hereof, and in which is shown by way of illustration specific embodiments in which the invention may be practiced. In this regard, directional terminology, such as "top," "bottom," etc., is used with reference to the orientation of the Figure(s) being described. The components of the present invention can be positioned in a number of different orientations. As such, the directional terminology is used for purposes of illustration and is in no way limiting. On the other hand, the drawings are only schematic and the sizes of components may be exaggerated for clarity. It is to be understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms “connected” and “installed” and variations thereof herein are used broadly and encompass direct and indirect connections and installations. Accordingly, the drawings and descriptions will be regarded as illustrative in nature and not as restrictive.
Please refer to FIG. 1. FIG. 1 is a functional block diagram of an unmanned vehicle (UV) 1000 according to an embodiment of the present invention. The invention discloses the unmanned vehicle 1000 capable of displaying images on a remote plane, which comprises a vehicle main body 10, a vehicle rotor unit 11, a gimbal unit 12, a processing unit 16, a projection unit 13, an image capture unit 14, and a storage unit 17. The image capture unit 14 is configured to capture at least one image, such as an environmental image of a spot the unmanned vehicle 1000 reaches or an object image of an object there; the environmental image can be a human’s facial image or an image of the surrounding environment. The vehicle rotor unit 11 is connected or coupled to the vehicle main body 10 and configured to provide moving force for the unmanned vehicle 1000. The processing unit 16 is configured to control operation of the vehicle rotor unit 11 and process the image captured by the image capture unit 14. The storage unit 17 is configured to store the captured image. In practical application, the vehicle rotor unit 11 is equipped with at least one motor. The motor is controlled by the processing unit 16 so as to provide proper moving force for the unmanned vehicle 1000 to move at a desired speed and in a desired direction.
Furthermore, the gimbal unit 12 is connected or coupled to the vehicle main body 10. The projection unit 13 is supported by the gimbal unit 12 or mounted on the vehicle main body 10 depending on practical demands. The projection unit 13 is configured to project a display image onto a remote plane or a reflecting surface so as to present the display image on the remote plane, wherein the remote plane is separated and spaced apart from the vehicle main body 10. In such a manner, the unmanned vehicle 1000 of the present invention is able to remotely display desired content via the projection unit 13 for specific purposes, such as a demo meeting, a rescue task, and so on. In other words, the unmanned vehicle 1000 of the present invention can remotely project the display image without utilizing an external display device, such as a cell phone, an external projector, a display panel carried by the unmanned vehicle 1000, and the like. It should be noted that the display image can be processed by the processing unit 16 and stored by the storage unit 17 of the unmanned vehicle 1000.
Please refer to FIG. 1 and FIG. 2. FIG. 2 is an exemplary diagram of an unmanned aerial vehicle (UAV) 2000 according to an embodiment of the invention. In the present embodiment, the unmanned vehicle 1000 of the present invention can be implemented as the unmanned aerial vehicle 2000 shown in FIG. 2, but the present invention is not limited thereto. For example, referring to FIG. 3, FIG. 3 is an exemplary diagram of an unmanned underwater vehicle 3000 according to an embodiment of the invention. As shown in FIG. 3, the unmanned vehicle 1000 of the present invention can also be implemented as the unmanned underwater vehicle 3000. Furthermore, the unmanned vehicle 1000 implemented as a remotely operated vehicle, an unmanned surface vehicle, or an autonomous robot is within the scope of the present invention. According to the exemplary embodiments of FIG. 2 and FIG. 3, the gimbal unit 12 is connected or coupled to the vehicle main body 10. The projection unit 13 is supported by the gimbal unit 12 or mounted on the vehicle main body 10 depending on practical demands. The gimbal unit 12 can be configured to have at least two rotation axes for controlling a projection direction of the projection unit 13. Further details of the structure of the gimbal unit 12 and the projection unit 13 are described in the following paragraphs.
Please refer to FIG. 1 and FIG. 4. FIG. 4 is a diagram of a gimbal unit 12 and a projection unit 13 according to an embodiment of the present invention. The projection unit 13 comprises a projection casing 130 and a projector member 131. The projection casing 130 and the vehicle main body 10 are two independent members. The projector member 131 is disposed on the projection casing 130 and is configured to project the display image. The projector member 131 can be arranged outside the vehicle main body 10, but is not limited to this. For example, the projector member 131 can be disposed right on the vehicle main body 10.
The gimbal unit 12 connects the vehicle main body 10 and the projection casing 130. The gimbal unit 12 comprises a first motor assembly 120, a second motor assembly 121 and a third motor assembly 122. The first motor assembly 120 has a first axis X1 and is configured to drive the projection unit 13 to rotate about the first axis X1. The second motor assembly 121 has a second axis X2 and is configured to drive the projection unit 13 to rotate about the second axis X2. The third motor assembly 122 has a third axis X3 and is configured to drive the projection unit 13 to rotate about the third axis X3. In such a manner, the gimbal unit 12 is able to activate the projection unit 13 to rotate about at least one of the three independent axes, i.e., the first axis X1, the second axis X2 and the third axis X3.
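The three-axis arrangement above can be sketched in code as follows. This is purely an illustrative sketch, not part of the disclosed embodiments: the patent specifies no coordinate convention, so the x-forward/y-left/z-up axes, the mapping of the three motor assemblies to yaw/pitch/roll, and the angle mathematics are all assumptions.

```python
import math

def aim_angles(target, origin=(0.0, 0.0, 0.0)):
    """Compute (yaw, pitch, roll) in degrees that would point the
    projection unit from `origin` toward `target`, assuming x-forward,
    y-left, z-up coordinates. Roll is held at zero so that the
    projected image stays level on the remote plane."""
    dx = target[0] - origin[0]
    dy = target[1] - origin[1]
    dz = target[2] - origin[2]
    yaw = math.degrees(math.atan2(dy, dx))                     # rotation about the vertical axis
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation above the horizontal
    return yaw, pitch, 0.0
```

In such a sketch each returned angle would be handed to one motor assembly (e.g., yaw to the first motor assembly 120, pitch to the second motor assembly 121), but that mapping is an assumption.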
In another embodiment, the gimbal unit 12 further comprises a mounting mechanism 123, a first frame mechanism 124 and a second frame mechanism 125. The mounting mechanism 123 connects the first motor assembly 120 and the vehicle main body 10. The first frame mechanism 124 connects the mounting mechanism 123 and the second motor assembly 121. The second frame mechanism 125 connects the second motor assembly 121 and the third motor assembly 122. In this embodiment, the first frame mechanism 124 has a substantially L-shaped structure, and the second frame mechanism 125 has a substantially U-shaped structure, which facilitates optimization of the structure of the gimbal unit 12 and further saves mechanical space and weight of the gimbal unit 12.
Please refer to FIG. 1. As shown in FIG. 1, the unmanned vehicle 1000 can further comprise a communication unit 18 configured to transmit data, such as the captured image, to a remote external unit and to receive data from it via a wired or wireless connection, which enables two-way communication between the unmanned vehicle 1000 and the remote external unit. The remote external unit can be, but is not limited to, a UV remote controller 7000 of the unmanned vehicle 1000, a mobile device (such as a smart phone 6000) with corresponding application software, a computing device (such as a personal computer 5000) with corresponding application software, or a remote cloud server 4000. Therefore, since the unmanned vehicle 1000 can wirelessly communicate with the UV remote controller 7000 through the communication unit 18, the unmanned vehicle 1000 can be controlled by a user located at a remote spot apart from the unmanned vehicle 1000 to perform a remote projection task, such as the demo meeting or the rescue task. Besides, the processing unit 16 can be further configured to generate the display image according to data received from the remote external unit through the communication unit 18 and provide the display image to the projection unit 13. The processing unit 16 can be further configured to control the communication unit 18 to establish communication with the remote external unit, but is not limited to this.
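The two-way data exchange through the communication unit 18 could, for example, use a simple length-prefixed framing so that a captured image and control data can be told apart on the same link. The frame layout below is entirely hypothetical; the patent does not specify any wire protocol, message types, or encoding.

```python
import struct

def encode_frame(msg_type: int, payload: bytes) -> bytes:
    """Pack a message as [type: 1 byte][length: 4 bytes, big-endian][payload].
    `msg_type` could distinguish, e.g., a captured image from display data,
    but the numbering is an assumption for illustration only."""
    return struct.pack(">BI", msg_type, len(payload)) + payload

def decode_frame(frame: bytes):
    """Unpack a frame produced by encode_frame back into (type, payload)."""
    msg_type, length = struct.unpack(">BI", frame[:5])
    return msg_type, frame[5:5 + length]
```

A round trip such as `decode_frame(encode_frame(1, data))` recovers the original type and payload regardless of the payload's content.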
Further details of how a display image is projected onto a remote plane 81 by the unmanned vehicle 1000 of the present application are described as follows. Please refer to FIG. 1 and FIG. 5. FIG. 5 is a schematic diagram illustrating a general remote projection scenario with a UAV 2000 according to an embodiment of the invention. As shown in FIG. 5, a user utilizes a remote external unit 9000 to control the unmanned aerial vehicle 2000 to reach a desired spot. The user can further control the projection unit 13 of the unmanned aerial vehicle 2000 through the remote external unit 9000 so as to project a display image onto a remote plane 81 at the spot. Since the unmanned aerial vehicle 2000 is in communication with the remote external unit 9000, the content shown in the display image can be the content of a file transmitted from the remote external unit 9000, but the present invention is not limited thereto. In another embodiment of the present invention, the content shown in the display image can be the content of a preloaded data file stored in the storage unit 17 of the unmanned vehicle 1000. With the projected content on the remote plane 81, such as a demo video, a demo PowerPoint file, or an exercise tape, the user is able to hold activities on the spot even if the user is not present at the spot.
Please refer to FIG. 1. According to a preferred embodiment of the present invention, the processing unit 16 of the unmanned vehicle 1000 is configured to generate the display image according to the image captured by the image capture unit 14 and provide the display image to the projection unit 13. The processing unit 16 can be further configured to identify an object in the captured image. For example, the processing unit 16 can comprise a control module 20 and an image processing module 19. The control module 20 is capable of controlling operation of the unmanned vehicle 1000, the vehicle rotor unit 11, the gimbal unit 12, and the image capture unit 14. The image processing module 19 is capable of processing the display image and the image captured by the image capture unit 14 and/or identifying the object in the captured image.
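The object-identification role of the image processing module 19 can be sketched minimally as follows. This stand-in treats the object as the nonzero pixels of a tiny grayscale image given as nested lists and returns their centroid; a real implementation (e.g., a person detector) is outside the patent's text, so everything here is an illustrative assumption.

```python
def identify_object(image):
    """Return the centroid (row, col) of nonzero pixels in a 2-D
    grayscale image given as nested lists, or None if no object
    is present. A minimal stand-in for the image processing
    module 19's object identification."""
    row_sum = col_sum = count = 0
    for r, row in enumerate(image):
        for c, value in enumerate(row):
            if value:                       # pixel belongs to the "object"
                row_sum += r
                col_sum += c
                count += 1
    if count == 0:
        return None                         # nothing identified
    return row_sum / count, col_sum / count
```

The returned centroid plays the role of the position information that the control module 20 would use to steer the vehicle and the gimbal.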
Please refer to FIG. 1 and FIG. 6. FIG. 6 is a schematic diagram illustrating an indoor remote projection scenario with the UAV 2000 according to an embodiment of the invention. As shown in FIG. 6, when a user practices a speech in an indoor exercise room, the UAV 2000 can be driven by the vehicle rotor unit 11 to move in the air or to stay in the air, and the image capture unit 14 can capture an image of the user. The image processing module 19 can identify a figure of the user in the captured image. Once the figure of the user in the captured image is identified, the image processing module 19 can calculate position information for locating the user according to the identified figure, which enables the control module 20 to control the vehicle rotor unit 11 to drive the unmanned vehicle 1000 to keep tracking and following the identified user and to control the image capture unit 14 to continuously capture images of the identified user. Besides, the control module 20 can control the gimbal unit 12 and a projection direction of the projection unit 13 according to a direction of the identified user with respect to the unmanned vehicle 1000. Therefore, as shown in FIG. 6, the projection unit 13 can be automatically oriented in a substantially opposite direction to the image capture unit 14 so as to project and stabilize a display image on a designated projection area. It should be noted that the content of the display image can be directly extracted from the image capture unit 14 in the aforementioned embodiment.
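The "substantially opposite direction" relationship between the camera and the projector reduces, in its simplest form, to offsetting the projector's heading by a half turn. The sketch below assumes headings expressed as yaw angles in degrees, which the patent does not specify.

```python
def projector_yaw(camera_yaw_deg: float) -> float:
    """Orient the projector substantially opposite the camera: the
    camera faces the identified user while the projector faces the
    projection area behind the vehicle. Returns a yaw normalized
    to the range [0, 360)."""
    return (camera_yaw_deg + 180.0) % 360.0
```

For example, a camera heading of 30 degrees yields a projector heading of 210 degrees; in practice the gimbal unit 12 would refine this with pitch and stabilization, which this sketch omits.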
Please refer to FIG. 1 and FIG. 7. FIG. 7 is a schematic diagram illustrating an outdoor remote projection scenario with the UAV 2000 according to an embodiment of the invention. As shown in FIG. 7, when a person 92 is trapped in a valley and a rescue staff 93 trying to rescue the person 92 operates a remote external unit 9000 to fly the unmanned aerial vehicle 2000 to where the person 92 is, the rescue staff 93 can transmit external content from the remote external unit 9000 to the unmanned aerial vehicle 2000, so that the projection unit 13 of the unmanned aerial vehicle 2000 can project a display image containing the external content showing tips or guidelines for helping the person 92 escape from the valley. Besides, if the unmanned aerial vehicle 2000 is in communication with a cloud server, a facial model of the trapped person 92 can be extracted from a cloud database storing multiple facial models or can be constructed by utilizing the cloud server to analyze a previous facial image of the trapped person 92. Once the extracted or constructed facial model is transmitted to the unmanned aerial vehicle 2000 at the trap site, the processing unit 16 of the unmanned aerial vehicle 2000 can not only identify the trapped person 92 by identifying a figure of the trapped person 92 in an image captured by the image capture unit 14, but also recognize the trapped person 92 by verifying a captured facial image of the trapped person 92 against the transmitted facial model. Therefore, the unmanned aerial vehicle 2000 of the present application can be configured not only to passively project information on a designated projection area but also to actively search for an object to be identified.
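The verification of a captured facial image against the transmitted facial model could be sketched as a similarity check between feature vectors. The patent does not fix a matching method, so the vector embeddings, cosine similarity, and the 0.8 threshold below are all illustrative assumptions.

```python
import math

def verify_face(embedding, model, threshold=0.8):
    """Compare a captured facial embedding against the facial model
    transmitted from the cloud server using cosine similarity.
    Returns True when the similarity meets the (assumed) threshold."""
    dot = sum(a * b for a, b in zip(embedding, model))
    norm_e = math.sqrt(sum(a * a for a in embedding))
    norm_m = math.sqrt(sum(b * b for b in model))
    if norm_e == 0.0 or norm_m == 0.0:
        return False                        # degenerate vectors never match
    return dot / (norm_e * norm_m) >= threshold
```

In such a design the recognition branch (model matching) would complement the identification branch (figure detection), as the paragraph above describes.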
According to the abovementioned exemplary embodiments of FIG. 6 and FIG. 7, it is concluded that the unmanned vehicle 1000 of the present application can be used in object tracking and/or projection direction adjustment. Besides, as shown in FIG. 2 and FIG. 3, the image capture unit 14 is disposed and fixed on a side of the vehicle main body 10 of the unmanned aerial vehicle 2000 or the unmanned underwater vehicle 3000, but is not limited to this. In another embodiment, the image capture unit 14 can be disposed on the projection casing 130 with the projector member 131. According to a preferred embodiment, the image capture unit 14 is supported by the gimbal unit 12 and disposed in an opposite direction to the projection unit 13. That is, the image capture unit 14 and the projector member 131 can be disposed on opposite sides of the projection casing 130, and the description of the abovementioned exemplary embodiments is intended to be illustrative and not to limit the scope of the invention.
The invention further discloses another unmanned vehicle 1000’ capable of displaying images on a remote plane, which comprises a vehicle main body 10, an image capture unit 14, a projection unit 13, a gimbal unit 12, and a storage unit 17. The image capture unit 14 is configured to capture at least one image. The projection unit 13 is configured to project the image on the remote plane. The gimbal unit 12 has at least two rotation axes. The gimbal unit 12 is configured to support the projection unit 13 and is connected to the vehicle main body 10. The storage unit 17 is configured to store the captured image.
The unmanned vehicle 1000’ can further comprise a communication unit 18. The communication unit 18 is configured to transmit the captured image to an external device and receive data from the external device. Therefore, the projection unit 13 can be further configured to display the received data on the remote plane, and the image capture unit 14 can be further configured to capture an image according to the received data.
The gimbal unit 12 of the unmanned vehicle 1000’ can be further configured to support the image capture unit 14 and thereby control a shooting direction of the image capture unit 14. Besides, the gimbal unit 12 can be further configured to control a projection direction of the projection unit 13 according to a direction of an object identified in the image captured by the image capture unit 14 with respect to the unmanned vehicle 1000’.
In contrast to the prior art, the unmanned vehicle is equipped with the projection unit configured to project an image onto a reflecting surface to present a display image on the reflecting surface. Therefore, the unmanned vehicle of the present application is able to display the display image showing content of one of a data file stored in the cloud server, preloaded data stored in the storage unit, or an external data file stored in an external electronic device, according to the apparatus with which the unmanned vehicle communicates. On the spot of an outdoor rescue scenario or an indoor meeting scenario, the mobile display image can offer versatile display applications and timely, supplementary support for the scenario.
Please refer to FIG. 8. FIG. 8 is a flow chart illustrating how a display image is projected on a remote plane with the UV 1000 according to an embodiment of the invention. The detailed procedure of projecting the display image onto the remote plane comprises the following steps:
S101: Equip the vehicle main body 10 with the projection unit 13 via the gimbal unit 12.
S102: Capture an image via the image capture unit 14.
S103: Identify an object in the captured image.
S204: Track the identified object with the image capture unit 14 to capture an image of the identified object.
S205: Transmit the captured image of the identified object to the remote external unit.
S304: Adjust a projection direction of the projection unit 13 via the gimbal unit 12.
S305: Receive data from the remote external unit 9000.
S106: Project the display image on the remote plane 81.
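The step sequence of FIG. 8 can be traced as a small dispatcher: steps S101 through S103 are shared, after which the numbering suggests a tracking branch (S204, S205) and a projection-adjustment branch (S304, S305), both closing with S106. The branching itself is an interpretation of the step numbering, not something the patent states explicitly, and the scenarios below show the branches can also be combined.

```python
def run_projection_procedure(branch: str):
    """Return the ordered step labels of FIG. 8 for the given branch,
    'tracking' (S204, S205) or 'projection' (S304, S305). Labels
    follow the flow chart; the branch split is an assumption."""
    steps = ["S101", "S102", "S103"]        # shared: equip, capture, identify
    if branch == "tracking":
        steps += ["S204", "S205"]           # track object, transmit images
    else:
        steps += ["S304", "S305"]           # adjust projection, receive data
    steps.append("S106")                    # project the display image
    return steps
```

For instance, the indoor scenario described below effectively runs both branches before S106.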
Further detailed description of the steps of the projection method for projecting the display image on the remote plane via the unmanned aerial vehicle 2000 in different scenarios is provided as follows. It should be noted that the unmanned aerial vehicle 2000 described in the following paragraphs is intended to be illustrative and not to limit the scope of the invention. According to other embodiments, the exemplary unmanned vehicle can be an unmanned underwater vehicle 3000 or another type of unmanned vehicle suitable for particular occasions.
Please refer to FIG. 8 and FIG. 5 for an application in a general scenario. First, the vehicle main body 10 of the unmanned aerial vehicle 2000 can be equipped with the projection unit 13 by installing the projection unit 13 on the gimbal unit 12 mounted on the unmanned aerial vehicle 2000 (S101), and the user can utilize a remote external unit 9000 to control the unmanned aerial vehicle 2000 to fly to a specific spot where there might be a remote plane 81 for reflecting and presenting a projected image. Upon the unmanned aerial vehicle 2000 reaching the specific spot, the user can control the projection unit 13 to project a display image onto the remote plane 81 via the remote external unit 9000. Specifically, by using the remote external unit 9000 to control the motion of the gimbal unit 12 via the control module 20 of the processing unit 16, the projection direction of the projection unit 13 can be adjusted and aimed at the remote plane 81 (S304). Since the unmanned aerial vehicle 2000 is in communication with the remote external unit 9000, the content to be presented in the display image can further be transmitted from the remote external unit 9000 to the unmanned aerial vehicle 2000 (S305). Therefore, after receiving data of the content transmitted from the remote external unit 9000, the unmanned aerial vehicle 2000 can project the display image (with the content) onto the remote plane 81 (S106), so that the user can deliver messages to an audience at the specific spot via the display image even if the user is not present at the specific spot.
Please refer to FIG. 8 and FIG. 6 for an application with the tracking function in an indoor scenario. First, the vehicle main body 10 can be equipped with the projection unit 13 via the gimbal unit 12 (S101). During an indoor speech, for example, the UAV 2000 can be driven by the vehicle rotor unit 11 to move in the air or to hover in the air, and the image capture unit 14 of the UAV 2000 can be controlled to capture an image of the lecturer (S102). After the image of the lecturer is captured by the UAV 2000, the image processing module 19 of the processing unit 16 can identify the figure of the lecturer in the captured image through image processing (S103). After the figure of the lecturer is identified, the processing unit 16 can further calculate a real-time position of the lecturer based on the identified figure, which enables the vehicle rotor unit 11, controlled by the processing unit 16, to drive the UAV 2000 to keep tracking the lecturer so that the image capture unit 14 can be controlled to aim at the lecturer and capture real-time images of the lecturer (S204). Therefore, the UAV 2000 can keep tracking the lecturer (the identified object) according to the real-time images of the lecturer captured by the image capture unit 14. Besides, the captured images of the lecturer can be transmitted to the remote external unit 9000 operated by a user (S205), which enables the user at a remote spot to audit the speech or to further interact with the lecturer. In addition, based on the real-time calculated positions of the identified object, the projection unit 13 can be automatically oriented in a substantially opposite direction to the image capture unit 14 (S304), which enables the projection unit 13 to project and stabilize a display image on a remote plane 81 facing the lecturer (S106).
Therefore, the lecturer could view a real-time image of himself/herself on the remote plane 81 so as to improve his/her posture, or the user could present instructions projected on the remote plane 81 by transmitting the instructions from the remote external unit 9000 to the UAV 2000 (S305) .
Please refer to FIG. 8 and FIG. 7 for an application with the tracking function in an outdoor scenario. First, the vehicle main body 10 can be equipped with the projection unit 13 via the gimbal unit 12 (S101). When a person 92 is trapped in a valley, for example, a rescue staff 93 trying to rescue the person 92 can operate a remote external unit 9000 to direct the UAV 2000 to where the person 92 is. After the UAV 2000 reaches the trap site in the valley, the image capture unit 14 can be controlled to capture real-time images of the person 92 or facial images of the person 92 through face tracking (S102). Next, the processing unit 16 can identify the figure of the person 92 in the captured real-time images through image processing (S103), and then the UAV 2000 can track the person 92 (the identified object) according to the real-time images of the person 92 captured by the image capture unit 14 (S204). Besides, in another scenario where the trapped person 92 is yet to be found, the UAV 2000 can be in communication with a cloud server, and a facial model of the person 92 can be extracted from a cloud database storing multiple facial models or can be constructed by utilizing the cloud server to analyze a previous facial image of the trapped person 92. Once the extracted or constructed facial model is transmitted to the UAV 2000 at the trap site, the processing unit 16 of the UAV 2000 can not only recognize the trapped person 92 by verifying a captured facial image of the trapped person 92 against the transmitted facial model but also identify the trapped person 92 by identifying a figure of the trapped person 92 in an image captured by the image capture unit 14 (S102, S103).
After the trapped person is identified or recognized, the real-time images of the person 92 can be transmitted to the remote external unit 9000 (S205), which enables the rescue staff 93 to perceive the trapped person’s condition. At the same time, based on the real-time calculated positions of the identified object, the projection unit 13 can be automatically oriented in a substantially opposite direction to the image capture unit 14 (S304), which enables the projection unit 13 to project and stabilize a display image on a remote plane facing the person 92. In addition, the rescue staff 93 can transmit external content from the remote external unit 9000 to the unmanned vehicle 1000 (S305) so that the projection unit 13 can project a display image containing the external content showing tips or guidelines for helping the person 92 escape from the valley.
Those skilled in the art will readily observe that numerous modifications and alterations of the device and method may be made while retaining the teachings of the invention. Accordingly, the above disclosure should be construed as limited only by the metes and bounds of the appended claims.

Claims (18)

  1. An unmanned vehicle capable of displaying images on a remote plane, comprising:
    a vehicle main body;
    a vehicle rotor unit connected with the vehicle main body and configured to provide moving force to the unmanned vehicle;
    an image capture unit configured to capture at least one image;
    a projection unit configured to project at least one display image on the remote plane;
    a processing unit configured to control operation of the vehicle rotor unit and process the image captured by the image capture unit and the display image;
    a gimbal unit with at least two rotation axes and configured to support the projection unit and connected to the vehicle main body; and
    a storage unit configured to store the captured image and the display image.
  2. The unmanned vehicle of claim 1, wherein the processing unit is further configured to generate the display image according to the image captured by the image capture unit and provide the display image to the projection unit.
  3. The unmanned vehicle of claim 1, further comprising:
    a communication unit configured to transmit the captured image to and receive data from a remote external unit.
  4. The unmanned vehicle of claim 3, wherein the processing unit is further configured to generate the display image according to the data received from the remote external unit and provide the display image to the projection unit.
  5. The unmanned vehicle of claim 1, wherein the processing unit is further configured to identify an object in the captured image.
  6. The unmanned vehicle of claim 5, wherein the processing unit is further configured to control the vehicle rotor unit to drive the unmanned vehicle to follow the identified object and control the image capture unit to capture an image of the identified object.
  7. The unmanned vehicle of claim 5, wherein the processing unit is further configured to control the gimbal unit and a projection direction of the projection unit according to a direction of the identified object with respect to the unmanned vehicle.
  8. The unmanned vehicle of claim 1, wherein the image capture unit is supported by the gimbal unit and disposed in an opposite direction to the projection unit.
  9. The unmanned vehicle of claim 1, wherein the processing unit comprises a control module configured to control operation of the vehicle rotor unit and an image processing module configured to process the image captured by the image capture unit and the display image.
  10. The unmanned vehicle of claim 1, wherein the processing unit comprises an image processing module configured to process the image captured by the image capture unit and identify an object in the captured image.
  11. The unmanned vehicle of claim 1, wherein the unmanned vehicle is one of an unmanned aerial vehicle, a remotely operated vehicle, an unmanned surface vehicle, and an autonomous robot.
  12. An unmanned vehicle capable of displaying images on a remote plane, comprising:
    a vehicle main body;
    an image capture unit configured to capture at least one image;
    a projection unit configured to project the image on the remote plane;
    a gimbal unit with at least two rotation axes, connected to the vehicle main body and configured to support the projection unit; and
    a storage unit configured to store the captured image.
  13. The unmanned vehicle of claim 12, further comprising:
    a communication unit configured to transmit the captured image to an external device and receive data from the external device.
  14. The unmanned vehicle of claim 13, wherein the projection unit is further configured to display the received data on the remote plane.
  15. The unmanned vehicle of claim 13, wherein the image capture unit is further configured to capture an image according to the received data.
  16. The unmanned vehicle of claim 12, wherein the gimbal unit is further configured to support the image capture unit and control a shooting direction of the image capture unit.
  17. The unmanned vehicle of claim 12, wherein the gimbal unit is further configured to control a projection direction of the projection unit according to a direction of an object identified in the image captured by the image capture unit with respect to the unmanned vehicle.
  18. The unmanned vehicle of claim 12, wherein the unmanned vehicle is one of an unmanned aerial vehicle, a remotely operated vehicle, an unmanned surface vehicle, and an autonomous robot.
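Claims 5–7 and 17 describe identifying an object in the captured image and steering the two-axis gimbal so that the projection unit points toward that object's direction relative to the vehicle. As an illustrative sketch only (the function and field names below are hypothetical and not taken from the application), the pan and tilt angles needed to aim at an object offset from the vehicle can be computed with quadrant-aware arctangents:

```python
import math
from dataclasses import dataclass


@dataclass
class GimbalCommand:
    """Hypothetical command for a two-rotation-axis gimbal (claim 12)."""
    pan_deg: float   # rotation about the vehicle's vertical axis
    tilt_deg: float  # rotation about the gimbal's lateral axis


def gimbal_angles_toward(dx: float, dy: float, dz: float) -> GimbalCommand:
    """Return pan/tilt angles pointing at an object offset (dx, dy, dz)
    from the vehicle, in a body frame with x forward, y right, z down.

    atan2 is used so the pan angle is correct in all four quadrants,
    including when the object is directly behind the vehicle (dx < 0).
    """
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return GimbalCommand(pan_deg=pan, tilt_deg=tilt)
```

For example, an object one metre ahead and one metre below the vehicle yields a pan of 0° and a tilt of 45°. A real controller would additionally clamp these angles to the gimbal's mechanical limits and filter the object-direction estimate over successive frames.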
PCT/CN2019/084265 2019-04-25 2019-04-25 Unmanned vehicle capable of displaying images on a remote plane WO2020215271A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/084265 WO2020215271A1 (en) 2019-04-25 2019-04-25 Unmanned vehicle capable of displaying images on a remote plane

Publications (1)

Publication Number Publication Date
WO2020215271A1 (en) 2020-10-29

Family

ID=72941249

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/084265 WO2020215271A1 (en) 2019-04-25 2019-04-25 Unmanned vehicle capable of displaying images on a remote plane

Country Status (1)

Country Link
WO (1) WO2020215271A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017076084A (en) * 2015-10-16 2017-04-20 株式会社プロドローン Information transmission device
CN206171831U (en) * 2016-09-30 2017-05-17 苏州工艺美术职业技术学院 Traffic intelligence unmanned aerial vehicle
WO2018027338A1 (en) * 2016-08-06 2018-02-15 SZ DJI Technology Co., Ltd. Automatic terrain evaluation of landing surfaces, and associated systems and methods
CN207045774U (en) * 2017-08-01 2018-02-27 四川省冶地工程勘察设计有限公司 Photo control point aircraft for orthophoto mapping
WO2018118257A1 (en) * 2016-12-21 2018-06-28 Intel Corporation Unmanned aerial vehicle traffic signals and related methods
CN208306993U (en) * 2018-04-12 2019-01-01 佛山天源创新科技有限公司 Fire-fighting emergency evacuation unmanned aerial vehicle
CN109665111A (en) * 2019-01-29 2019-04-23 李汉高 Ultra-long-endurance artificial intelligence holographic projection aircraft

Similar Documents

Publication Publication Date Title
US10336469B2 (en) Unmanned aerial vehicle movement via environmental interactions
US9552056B1 (en) Gesture enabled telepresence robot and system
JP4257298B2 (en) Device for displaying facial features
US9969080B2 (en) Robotic camera system
US11380332B2 (en) Information processing apparatus, information processing method, and computer program
WO2021098453A1 (en) Target tracking method and unmanned aerial vehicle
CN106249888A (en) A kind of cloud platform control method and device
WO2017028275A1 (en) Flying robot provided with projector
WO2018051222A1 (en) System and method for remotely assisted camera orientation
WO2020215271A1 (en) Unmanned vehicle capable of displaying images on a remote plane
US20200401139A1 (en) Flying vehicle and method of controlling flying vehicle
JP2016208255A (en) Movable projection device
CN111526280A (en) Control method and device of camera device, electronic equipment and storage medium
US20210302922A1 (en) Artificially intelligent mechanical system used in connection with enabled audio/video hardware
TWI696122B (en) Interactive photographic system and method for unmanned aerial vehicle
CN107657893B (en) Scene experience window and system
TWI573104B (en) Indoor monitoring system and method thereof
CN105549618A (en) Real-scene interactive control system
CN105807783A (en) Flight camera
JP6687954B2 (en) Mobile projection device and projection system
US20230040068A1 (en) Autonomous positioning system for interchangeable camera devices
JP2011133176A (en) Target device
US11434002B1 (en) Personal drone assistant
CN110764535B (en) Tour sightseeing system based on unmanned aerial vehicle
CN113498952A (en) Model display device with human-computer interaction function

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19925912

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19925912

Country of ref document: EP

Kind code of ref document: A1