CN113973196A - Mobile projection robot and mobile projection method thereof - Google Patents

Mobile projection robot and mobile projection method thereof

Info

Publication number
CN113973196A
CN113973196A (application CN202111318324.6A)
Authority
CN
China
Prior art keywords
projection
image information
mobile
aerial vehicle
unmanned aerial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111318324.6A
Other languages
Chinese (zh)
Inventor
王红光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Mengtebo Intelligent Robot Technology Co ltd
Original Assignee
Beijing Mengtebo Intelligent Robot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Mengtebo Intelligent Robot Technology Co ltd filed Critical Beijing Mengtebo Intelligent Robot Technology Co ltd
Priority to CN202111318324.6A priority Critical patent/CN113973196A/en
Publication of CN113973196A publication Critical patent/CN113973196A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00: Details of colour television systems
    • H04N 9/12: Picture reproducers
    • H04N 9/31: Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141: Constructional details thereof
    • H04N 9/3179: Video signal processing therefor
    • H04N 9/3191: Testing thereof
    • H04N 9/3194: Testing thereof including sensor feedback
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L 67/125: Protocols specially adapted for proprietary or special-purpose networking environments, involving control of end-device applications over a network

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the disclosure provide a mobile projection robot and a projection method thereof. In the method, an unmanned aerial vehicle collects environment image information along a preset trajectory and sends it to a bearing platform; the bearing platform analyzes the image information, determines the corresponding environment information, generates a corresponding movement strategy, and sends the strategy to a movement module; the movement module selects the corresponding moving part and moves accordingly. After the robot reaches the target area, the unmanned aerial vehicle receives projection content from the bearing platform and projects it onto the target projection area. In this way, user safety is ensured while projection duration and projection stability are guaranteed, improving the user experience.

Description

Mobile projection robot and mobile projection method thereof
Technical Field
Embodiments of the present disclosure relate generally to the field of projection technology, and more particularly, to a mobile projection robot and a mobile projection method thereof.
Background
Projection refers to projecting the shape of an object onto a plane with a set of light rays, and with the development of projection technology, video content on a mobile terminal (e.g., a mobile phone) can be projected onto a plane.
Video projection generally requires dedicated projection equipment: an ordinary mobile terminal has no projection capability of its own and must transmit the video content to a projection device, which then projects it onto a plane.
In the prior art, when projection must take place in a special environment, the user often cannot safely travel to the designated site, so a robot is needed to complete the projection there, for example for a field test involving large animals or projection in another hazardous environment where the user cannot go in person to set up and adjust the equipment.
In such cases an unmanned aerial vehicle can be used to project, but the endurance and stability of an unmanned aerial vehicle alone are difficult to reconcile with the demands of projection.
Disclosure of Invention
According to embodiments of the present disclosure, a mobile projection scheme is provided that guarantees projection duration and projection stability while ensuring user safety.
In a first aspect of the present disclosure, there is provided a mobile projection robot comprising:
the unmanned aerial vehicle is used for acquiring environment image information, establishing communication connection with the bearing table, sending the acquired environment image information to the bearing table, receiving projection content sent by the bearing table and projecting the received projection content to a target projection area;
the bearing platform is used for receiving the environment image information, determining corresponding environment information according to the environment image information, generating a moving strategy, receiving projection contents sent by a user side, bearing the unmanned aerial vehicle and providing power for the moving module;
and the moving module is used for moving in different moving modes according to the moving strategy.
In some embodiments, the bearing platform is further configured to periodically charge the unmanned aerial vehicle.
In some embodiments, the movement module includes a crawler moving part, a limb-and-foot moving part, a roller moving part, and an air-bag turbine moving part.
In a second aspect of the present disclosure, there is provided a mobile projection method applied to the mobile projection robot of the first aspect, including:
the unmanned aerial vehicle collects environment image information according to a preset track and sends the collected environment image information to the bearing table;
the bearing table receives environment image information acquired by the unmanned aerial vehicle, analyzes the environment image information, determines corresponding environment information, generates a corresponding movement strategy according to the environment information, and sends the movement strategy to the movement module;
the mobile module selects a corresponding mobile part to move according to the mobile strategy, and after the mobile module moves to a target area, the unmanned aerial vehicle receives projection content sent by the bearing platform and projects the projection content to the target projection area.
In some embodiments, the receiving station receives environment image information acquired by the unmanned aerial vehicle, analyzes the environment image information, and determines corresponding environment information, including:
the bearing platform receives the environment image information collected by the unmanned aerial vehicle, identifies the environment image information, determines a target area, plans a movement path according to the target area, and determines the road condition information corresponding to different road sections along the movement path.
In some embodiments, the generating a corresponding mobility policy according to the environment information and sending the mobility policy to a mobility module includes:
the corresponding moving direction is determined according to the moving path, the corresponding road condition information of different road sections is determined according to the road condition information on the moving path, the corresponding moving part is determined according to the road condition information, and the moving direction and the information of selecting the moving part for different road sections are sent to the moving module.
In some embodiments, further comprising:
after the mobile projection robot moves to the target area, the bearing platform determines the target projection area and the landing point of the unmanned aerial vehicle according to the environment image information collected by the unmanned aerial vehicle, generates landing point coordinate information, and sends it to the unmanned aerial vehicle.
In some embodiments, further comprising:
the unmanned aerial vehicle moves to the first landing point coordinate, collects first image information of the target projection area, and sends the collected first image information to the bearing platform;
the bearing platform determines from the first image information whether a projection occluder is present and, in response to a projection occluder being present, determines a second landing point coordinate for the unmanned aerial vehicle according to the image information of the target projection area collected by the unmanned aerial vehicle.
In some embodiments, further comprising:
the unmanned aerial vehicle moves to the second landing point coordinate, projects the projection content onto the target projection area, collects second image information of the target projection area, and sends it to the bearing platform; the bearing platform determines an adjustment value for the projection angle of the unmanned aerial vehicle according to the brightness and definition of the projection content in the second image information.
In some embodiments, further comprising:
and the unmanned aerial vehicle adjusts the projection angle according to the adjustment value of the projection angle.
It should be understood that the statements herein reciting aspects are not intended to limit the critical or essential features of the embodiments of the present disclosure, nor are they intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
By the mobile projection method, the safety of the user is guaranteed, and meanwhile the projection time and the projection stability can be guaranteed, so that the user experience is improved.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, like or similar reference characters designate like or similar elements, and wherein:
fig. 1 shows a schematic structural diagram of a mobile projection robot according to a first embodiment of the present disclosure;
fig. 2 shows a flowchart of a mobile projection method according to a second embodiment of the disclosure;
fig. 3 shows a schematic structural diagram of a mobile projection device according to a third embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are some, but not all embodiments of the present disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
In addition, the term "and/or" herein describes only an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship.
According to the mobile projection method of the present disclosure, the user only needs to carry a mobile terminal and send the projection content to the mobile projection robot; the robot moves to a location the user cannot conveniently or safely reach, and the unmanned aerial vehicle projects the content onto the target projection area. User safety is thus ensured while projection duration and projection stability are guaranteed, improving the user experience.
Specifically, as shown in fig. 1, it is a schematic structural diagram of a mobile projection robot according to a first embodiment of the present disclosure. As can be seen from fig. 1, the mobile projection robot 100 of the present embodiment includes:
The unmanned aerial vehicle 101 is configured to collect environment image information, establish a communication connection with the bearing platform 102, send the collected environment image information to the bearing platform 102, receive projection content sent by the bearing platform 102, and project the received content onto a target projection area.
The bearing platform 102 is configured to receive the environment image information, determine the corresponding environment information from it, generate a movement strategy, receive projection content sent by a user terminal, carry the unmanned aerial vehicle 101, and supply power to the movement module 103.
The movement module 103 is configured to move in different movement modes according to the movement strategy.
As a specific implementation of this embodiment, the unmanned aerial vehicle 101 is arranged on the bearing platform 102, which may be provided with an openable fixing device that holds the unmanned aerial vehicle 101 when it is not in use. When the unmanned aerial vehicle 101 needs to take off to collect images or fly to a destination to project, the fixing device opens and the unmanned aerial vehicle 101 lifts off from the bearing platform 102. Furthermore, the unmanned aerial vehicle 101 and the bearing platform 102 may be connected by short-range communication, such as Bluetooth or ZigBee.
In another optional embodiment of the present disclosure, the bearing platform 102 may also be a lifting structure: by raising the platform to different heights, the unmanned aerial vehicle 101 can collect environment image information from different heights, and the platform may also rotate in the horizontal plane to change the shooting angle of the unmanned aerial vehicle 101. The unmanned aerial vehicle 101 then does not need to fly to collect environment image information, which prolongs its endurance.
A charging plug may be provided on the fixing device of the bearing platform 102 for periodically charging the unmanned aerial vehicle 101, and the bearing platform 102 itself can carry a high-capacity power supply. The movement module 103 includes a crawler moving part, a limb-and-foot moving part, a roller moving part, and an air-bag turbine moving part to accommodate different terrains.
The projection robot of the embodiment of the disclosure can adopt different moving modes for different terrains, can move to the position where the user cannot conveniently arrive and project to the designated position area through the unmanned aerial vehicle, and can guarantee the projection duration and the projection stability while guaranteeing the user safety, thereby improving the user experience.
The following describes the technical solution of the embodiment of the present disclosure with reference to a specific working principle of a mobile projection robot. Fig. 2 is a flowchart of a mobile projection method according to a second embodiment of the disclosure. The mobile projection method of the present embodiment, applied to the mobile projection robot of the above embodiment, may include the steps of:
s201: the unmanned aerial vehicle collects environment image information according to a preset track, and sends the collected environment image information to the bearing platform.
In this embodiment, the mobile projection robot may move to a target area that a user does not conveniently reach, and then the unmanned aerial vehicle of the mobile robot may project in the target projection area of the target area.
Specifically, the unmanned aerial vehicle can fly along a preset trajectory to collect environment image information, that is, images of the surrounding environment, and send the collected environment image information to the bearing platform, so that the bearing platform determines the target projection area from the returned environment image information and, in turn, the target area the mobile projection robot needs to move to and the movement path.
S202: the bearing platform receives the environment image information collected by the unmanned aerial vehicle, analyzes it to determine the corresponding environment information, generates a corresponding movement strategy from the environment information, and sends the movement strategy to the movement module.
After receiving the environment image information collected by the unmanned aerial vehicle, the bearing platform can analyze it to determine a plane suitable for projection, that is, the target projection area, such as a wall, a stone face, or another suitable plane, and then determine from it the target area that the mobile projection robot needs to reach.
The target projection area and the target area will typically be separated by a certain distance, i.e. the projection distance. While the mobile projection robot moves toward the target area, either a complete movement path is generated after the environment image information is acquired and the robot moves directly to the target area, or the environment image information of the current position is acquired first, the robot moves to the next position after a path segment is generated, and the environment image information of the next position is then acquired to generate the following segment. This process repeats until the robot reaches the target area.
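The segment-by-segment scheme just described can be sketched as a small loop. This is a hedged toy version, not the patent's implementation: `plan_next_segment` stands in for the image collection and per-segment path planning, and the grid positions are illustrative.

```python
def move_to_target(start, target, plan_next_segment, max_steps=100):
    """Advance segment by segment: from the current position, plan the next
    path segment (stubbed by plan_next_segment) and move along it,
    repeating until the target area is reached."""
    position = start
    for _ in range(max_steps):
        if position == target:
            return position
        position = plan_next_segment(position, target)
    raise RuntimeError("target area not reached within max_steps")

def step_toward(pos, target):
    """Toy per-segment planner: move one unit along each axis toward target."""
    return tuple(p + (1 if t > p else -1 if t < p else 0)
                 for p, t in zip(pos, target))

# e.g. move_to_target((0, 0), (3, -2), step_toward) reaches (3, -2)
```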
During the movement of the mobile projection robot, the unmanned aerial vehicle can collect environment image information within a preset range around the robot along a preset trajectory and send it to the bearing platform, which analyzes the collected images and identifies road conditions and obstacles to determine the corresponding movement mode and movement path. The crawler moving part can travel on a stable road surface (e.g., a paved road) to ensure movement stability; the roller moving part can travel on a stable road surface (e.g., a dirt road) to ensure movement speed; the limb-and-foot moving part can travel on complex surfaces, such as rubble piles or uneven ground; and the air-bag turbine moving part can travel on water.
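The terrain-to-locomotion correspondence above amounts to a lookup table. A minimal sketch follows; the road-condition labels are assumed names for illustration, not claim language:

```python
# Mapping sketched from the description: each recognised road condition
# selects one moving part of the movement module.
LOCOMOTION_BY_TERRAIN = {
    "paved_road": "crawler",       # stable surface: crawler for stability
    "dirt_road": "roller",         # stable surface: rollers for speed
    "rubble": "limb_foot",         # complex/uneven surface: legged movement
    "water": "air_bag_turbine",    # water surface: air-bag turbine
}

def select_moving_part(road_condition: str) -> str:
    """Return the moving part for a recognised road condition."""
    try:
        return LOCOMOTION_BY_TERRAIN[road_condition]
    except KeyError:
        raise ValueError(f"unknown road condition: {road_condition}")
```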
After the overall movement path is determined, the path can be refined segment by segment from the environment image information within the preset range around the robot. On the one hand this reduces the amount of computation needed when determining the overall path; on the other hand it allows the movement path to be planned accurately, improving the stability and safety of the robot while it moves. That is, when determining the overall movement path, a rough path can be chosen that avoids obstacles; then, when determining each path segment, an accurate path can be computed for the specific environment information, which also compensates for changes in the environment over time caused by external factors (e.g., natural factors such as wind and rain).
Through the above process, a corresponding movement policy may be generated.
In some embodiments, when determining the overall movement path and/or determining the movement path in a segmented manner, the environment image information may be enhanced and denoised, then the environment image may be identified, a stereo model of the surrounding environment may be constructed, and then the movement path may be determined and/or the movement path may be determined in a segmented manner.
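As a stand-in for the enhancement and de-noising step mentioned above, here is a deliberately tiny 3x3 mean filter on a grey-scale image held as a nested list. It is only a sketch of the idea; a real system would use a proper image-processing library and more capable filters.

```python
def mean_denoise(img):
    """3x3 mean filter: each interior pixel becomes the average of its
    3x3 neighbourhood, smoothing out isolated noise. Border pixels are
    kept as-is for simplicity."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) / 9.0
    return out
```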
S203: the mobile module selects a corresponding mobile part to move according to the mobile strategy, and after the mobile module moves to a target area, the unmanned aerial vehicle receives projection content sent by the bearing platform and projects the projection content to the target projection area.
In this embodiment, after the movement strategy is generated, the corresponding movement direction is determined from the movement path, the road condition information of the different road sections is determined from the road conditions along the path, and the corresponding moving part is chosen for each section according to its road condition. The movement direction and the moving-part selection for each section are sent to the movement module, which selects the corresponding moving part and moves accordingly. After the robot reaches the target area, the unmanned aerial vehicle receives the projection content sent by the bearing platform and projects it onto the target projection area.
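The per-section strategy message described in this step can be sketched as follows. The field names and the shape of the message are assumptions for illustration; `select_part` stands in for the road-condition analysis above.

```python
def build_movement_strategy(sections, select_part):
    """For each road section, pair the heading derived from the movement
    path with the moving part chosen for its road condition; the resulting
    list models the message handed to the movement module."""
    return [{"heading": heading,
             "road_condition": cond,
             "moving_part": select_part(cond)}
            for heading, cond in sections]

# e.g. with a toy selector:
parts = {"dirt_road": "roller", "water": "air_bag_turbine"}
strategy = build_movement_strategy(
    [("north", "dirt_road"), ("east", "water")], parts.__getitem__)
```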
With the mobile projection method described above, different movement modes can be adopted for different terrains, the robot can move to places the user cannot conveniently reach, and the unmanned aerial vehicle can project onto the designated area; user safety is ensured while projection duration and stability are guaranteed, improving the user experience.
As an optional embodiment of the present disclosure, in the method in the foregoing embodiment, the method may further include:
after the mobile projection robot moves to the target area, the bearing platform determines the target projection area and the landing point of the unmanned aerial vehicle according to the environment image information collected by the unmanned aerial vehicle, generates landing point coordinate information, and sends it to the unmanned aerial vehicle.
Because a certain distance must separate the projection point (the position of the unmanned aerial vehicle) from the target projection area, and a good projection effect is obtained only when the angle between the optical axis of the unmanned aerial vehicle and the perpendicular to the target projection area lies within a certain range, the projection position and projection angle of the unmanned aerial vehicle (that is, its landing point) need to be determined.
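The two geometric constraints just named (a distance band and an angle bound relative to the surface perpendicular) can be checked directly. A sketch under assumed thresholds; the drone's optical axis is assumed to be aimed at the surface point:

```python
import math

def projection_pose_ok(drone_pos, surface_point, surface_normal,
                       min_dist=1.0, max_dist=5.0, max_angle_deg=15.0):
    """Return True if the drone sits within the distance band of the
    projection surface and the angle between its optical axis and the
    surface perpendicular is within bound. Thresholds are illustrative."""
    vec = tuple(s - d for d, s in zip(drone_pos, surface_point))
    dist = math.sqrt(sum(c * c for c in vec))
    if not (min_dist <= dist <= max_dist):
        return False
    # angle between the optical axis (drone -> surface point) and the normal
    dot = sum(v * n for v, n in zip(vec, surface_normal))
    n_len = math.sqrt(sum(n * n for n in surface_normal))
    cos_a = abs(dot) / (dist * n_len)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))
    return angle <= max_angle_deg
```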
The landing point can be determined step by step: the unmanned aerial vehicle first moves to the first landing point coordinate, collects first image information of the target projection area, and sends it to the bearing platform; the bearing platform determines from the first image information whether a projection occluder is present and, if so, determines a second landing point coordinate for the unmanned aerial vehicle from the image information of the target projection area. Repeating these steps determines the projection position of the unmanned aerial vehicle.
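The repeat-until-unoccluded search above reduces to trying candidate coordinates in turn. In this sketch, `occluded` is a stand-in predicate for the bearing platform's image analysis; the candidate list and predicate are assumptions:

```python
def find_landing_point(candidates, occluded):
    """Try each candidate landing coordinate in turn; at each one, imagery
    of the target projection area would be captured and checked for
    occluders. Return the first unoccluded point, or None."""
    for point in candidates:
        if not occluded(point):
            return point
    return None
```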
After the unmanned aerial vehicle moves to the second landing point coordinate, it projects the projection content onto the target projection area, collects second image information of the target projection area, and sends it to the bearing platform; the bearing platform determines an adjustment value for the projection angle of the unmanned aerial vehicle according to the brightness and definition of the projection content in the second image information. Through these steps, the projection angle of the unmanned aerial vehicle can be determined. The unmanned aerial vehicle then adjusts its projection angle according to the adjustment value, receives the projection content sent by the bearing platform, and projects it onto the target projection area.
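The patent does not specify how the adjustment value is computed from brightness and definition; as a hedged illustration only, a simple proportional rule could map the shortfall from target image quality to a correction in degrees. Targets and gain below are invented for the sketch:

```python
def angle_adjustment(brightness, sharpness,
                     target_brightness=0.7, target_sharpness=0.8, gain=10.0):
    """Toy proportional rule: the further the measured brightness and
    definition of the projected content fall below their targets, the
    larger the suggested projection-angle correction (degrees)."""
    error = max(0.0, target_brightness - brightness) + \
            max(0.0, target_sharpness - sharpness)
    return gain * error
```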
It is noted that while for simplicity of explanation, the foregoing method embodiments have been described as a series of acts or combination of acts, it will be appreciated by those skilled in the art that the present disclosure is not limited by the order of acts, as some steps may, in accordance with the present disclosure, occur in other orders and concurrently. Further, those skilled in the art should also appreciate that the embodiments described in the specification are exemplary embodiments and that acts and modules referred to are not necessarily required by the disclosure.
FIG. 3 shows a schematic block diagram of an electronic device 300 that may be used to implement embodiments of the present disclosure. As shown, device 300 includes a Central Processing Unit (CPU)301 that may perform various appropriate actions and processes in accordance with computer program instructions stored in a Read Only Memory (ROM)302 or loaded from a storage unit 308 into a Random Access Memory (RAM) 303. In the RAM 303, various programs and data necessary for the operation of the device 300 can also be stored. The CPU 301, ROM302, and RAM 303 are connected to each other via a bus 304. An input/output (I/O) interface 305 is also connected to bus 304.
Various components in device 300 are connected to I/O interface 305, including: an input unit 306 such as a keyboard, a mouse, or the like; an output unit 307 such as various types of displays, speakers, and the like; a storage unit 308 such as a magnetic disk, optical disk, or the like; and a communication unit 309 such as a network card, modem, wireless communication transceiver, etc. The communication unit 309 allows the device 300 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The methods and processes described above may be implemented as a computer program tangibly embodied on a machine-readable medium, such as the storage unit 308, and executed by the CPU 301. In some embodiments, part or all of the computer program may be loaded and/or installed onto device 300 via ROM 302 and/or communication unit 309. When the computer program is loaded into the RAM 303 and executed by the CPU 301, one or more steps of the method described above may be performed. Alternatively, in other embodiments, the CPU 301 may be configured to perform the above-described method in any other suitable manner (e.g., by means of firmware).
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a System on a Chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Further, while operations are depicted in a particular order, this should be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (10)

1. A mobile projection robot, comprising:
an unmanned aerial vehicle, configured to collect environment image information, establish a communication connection with a bearing platform, send the collected environment image information to the bearing platform, receive projection content sent by the bearing platform, and project the received projection content onto a target projection area;
the bearing platform, configured to receive the environment image information, determine corresponding environment information from the environment image information, generate a movement strategy, receive projection content sent by a user terminal, carry the unmanned aerial vehicle, and supply power to a moving module; and
the moving module, configured to move in different movement modes according to the movement strategy.
2. The mobile projection robot of claim 1, wherein the bearing platform is further configured to periodically charge the unmanned aerial vehicle.
3. The mobile projection robot of claim 2, wherein the moving module comprises a crawler moving part, a limb-foot moving part, a roller moving part, and an airbag turbine moving part.
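The division of labor among the three components in claims 1-3 can be sketched as follows. This is a hypothetical Python sketch, not code from the patent; the terrain labels and the terrain-to-part mapping are illustrative assumptions (the claims do not fix either).

```python
from dataclasses import dataclass
from enum import Enum, auto

class MovingPart(Enum):
    """Movement mechanisms enumerated in claim 3."""
    CRAWLER = auto()         # tracked part, e.g. for rough ground
    LIMB_FOOT = auto()       # legged part, e.g. for steps and obstacles
    ROLLER = auto()          # wheeled part, e.g. for flat ground
    AIRBAG_TURBINE = auto()  # airbag turbine part, e.g. for soft or wet terrain

@dataclass
class MoveStrategy:
    """Strategy the bearing platform sends to the moving module."""
    direction: str
    part: MovingPart

class BearingPlatform:
    """Analyzes environment image information and generates a movement strategy."""
    # hypothetical terrain-label -> moving-part table
    TERRAIN_TO_PART = {
        "flat": MovingPart.ROLLER,
        "rubble": MovingPart.CRAWLER,
        "stairs": MovingPart.LIMB_FOOT,
        "water": MovingPart.AIRBAG_TURBINE,
    }

    def make_strategy(self, terrain: str, direction: str = "forward") -> MoveStrategy:
        part = self.TERRAIN_TO_PART.get(terrain, MovingPart.ROLLER)
        return MoveStrategy(direction=direction, part=part)

class MobileModule:
    """Engages whichever moving part the strategy names (claim 1's moving module)."""
    def move(self, strategy: MoveStrategy) -> MovingPart:
        return strategy.part
```

For example, a strategy generated for terrain labeled "stairs" would engage the limb-foot part.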
4. A mobile projection method applied to the mobile projection robot according to claim 1, comprising:
the unmanned aerial vehicle collects environment image information along a preset track and sends the collected environment image information to the bearing platform;
the bearing platform receives the environment image information collected by the unmanned aerial vehicle, analyzes the environment image information to determine corresponding environment information, generates a corresponding movement strategy according to the environment information, and sends the movement strategy to the moving module; and
the moving module selects a corresponding moving part to move according to the movement strategy, and after the moving module moves to a target area, the unmanned aerial vehicle receives projection content sent by the bearing platform and projects the projection content onto the target projection area.
5. The mobile projection method of claim 4, wherein the bearing platform receiving the environment image information collected by the unmanned aerial vehicle, analyzing the environment image information, and determining the corresponding environment information comprises:
the bearing platform receives the environment image information collected by the unmanned aerial vehicle, identifies the environment image information to determine the target area, plans a movement path to the target area, and determines the road condition information on the movement path.
6. The mobile projection method of claim 5, wherein generating the corresponding movement strategy according to the environment information and sending the movement strategy to the moving module comprises:
determining the corresponding movement direction according to the movement path, determining the road condition information corresponding to each road section from the road condition information on the movement path, determining the corresponding moving part for each road section according to its road condition information, and sending the movement direction and the moving-part selection for each road section to the moving module.
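The per-road-section selection in claims 5-6 can be sketched as a lookup over the planned path. The condition labels and the condition-to-part table below are illustrative assumptions, not specified by the patent.

```python
# hypothetical road-condition label -> moving-part table
PART_FOR_CONDITION = {
    "paved": "roller",
    "gravel": "crawler",
    "steps": "limb_foot",
    "water": "airbag_turbine",
}

def plan_sections(path_sections):
    """For each (section, condition) pair on the planned path, pick a moving part.

    Returns a list of (section, part) instructions for the moving module;
    unrecognized conditions fall back to the roller part.
    """
    return [(section, PART_FOR_CONDITION.get(condition, "roller"))
            for section, condition in path_sections]
```

A path crossing pavement, then steps, would thus switch from the roller part to the limb-foot part at the section boundary.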
7. The mobile projection method of claim 6, further comprising:
after the mobile projection robot moves to the target area, the bearing platform determines the target projection area and a landing point for the unmanned aerial vehicle according to the environment image information collected by the unmanned aerial vehicle, generates landing point coordinate information, and sends the landing point coordinate information to the unmanned aerial vehicle.
8. The mobile projection method of claim 7, further comprising:
the unmanned aerial vehicle moves to a first landing point coordinate, collects first image information of the target projection area, and sends the collected first image information of the target projection area to the bearing platform; and
the bearing platform determines, according to the first image information, whether a projection obstruction exists and, in response to a projection obstruction existing, determines a second landing point coordinate for the unmanned aerial vehicle according to the image information of the target projection area collected by the unmanned aerial vehicle.
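The landing-point re-selection in claim 8 amounts to: keep the first coordinate if the projection path is clear, otherwise pick an alternative coordinate that is. A minimal sketch, under the assumption that occlusion can be tested per candidate coordinate from the drone's images (the predicate and candidate list are hypothetical):

```python
def choose_landing_point(first_point, candidates, is_occluded):
    """Return the first landing coordinate with a clear view of the target area.

    first_point -- the coordinate initially assigned by the bearing platform
    candidates  -- fallback coordinates, tried in order
    is_occluded -- predicate derived from the drone's image of the target area
    Returns None if every coordinate is obstructed.
    """
    for point in [first_point, *candidates]:
        if not is_occluded(point):
            return point
    return None
```

If the first point is clear it is returned unchanged; only when an obstruction is detected does the platform issue a second coordinate, matching the "in response to" condition of the claim.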
9. The mobile projection method of claim 8, further comprising:
the unmanned aerial vehicle moves to the second landing point coordinate, projects the projection content onto the target projection area, collects second image information of the target projection area, and sends the second image information to the bearing platform; and the bearing platform determines an adjustment value for the projection angle of the unmanned aerial vehicle according to the brightness and sharpness of the projected content in the second image information.
10. The mobile projection method of claim 9, further comprising:
the unmanned aerial vehicle adjusts its projection angle according to the adjustment value of the projection angle.
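Claims 9-10 close a feedback loop: the bearing platform scores the captured projection image and returns an angle correction for the drone to apply. One way to realize this (an assumption — the patent specifies neither the scoring function nor the search method) is to score a set of candidate angles by combined brightness/sharpness and return the offset to the best one:

```python
def best_angle(candidate_angles, score_at):
    """Pick the candidate projection angle whose captured image scores highest.

    score_at(angle) -> combined brightness/sharpness score of the second
    image information at that angle (scoring function assumed, not specified).
    """
    return max(candidate_angles, key=score_at)

def angle_adjustment(current_angle, candidate_angles, score_at):
    """Adjustment value (in degrees) the platform sends to the drone (claim 10)."""
    return best_angle(candidate_angles, score_at) - current_angle
```

The drone then applies the returned delta to its projector mount, after which a fresh image could be scored to iterate if needed.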
CN202111318324.6A 2021-11-09 2021-11-09 Mobile projection robot and mobile projection method thereof Pending CN113973196A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111318324.6A CN113973196A (en) 2021-11-09 2021-11-09 Mobile projection robot and mobile projection method thereof


Publications (1)

Publication Number Publication Date
CN113973196A true CN113973196A (en) 2022-01-25

Family

ID=79589438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111318324.6A Pending CN113973196A (en) 2021-11-09 2021-11-09 Mobile projection robot and mobile projection method thereof

Country Status (1)

Country Link
CN (1) CN113973196A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825713A (en) * 2016-04-08 2016-08-03 重庆大学 Vehicular-mounted unmanned aerial vehicle auxiliary driving system and operation mode
CN205770187U (en) * 2016-06-22 2016-12-07 上海顺砾智能科技有限公司 A kind of multifunctional intellectual UAS
US20170261975A1 (en) * 2016-03-08 2017-09-14 Fuji Xerox Co., Ltd. Systems and methods employing coded light to dock aerial drones, self-driving cars and surface robots
CN107833473A (en) * 2017-11-30 2018-03-23 上海孩子国科教设备有限公司 Guiding system and vehicle based on unmanned plane
CN108602189A (en) * 2015-10-28 2018-09-28 巴伊兰大学 Robot cooperated system
CN208412177U (en) * 2018-06-26 2019-01-22 深圳市华讯方舟装备技术有限公司 Unmanned plane functional entity for unmanned boat
CN111278519A (en) * 2017-09-08 2020-06-12 索尼互动娱乐股份有限公司 Second screen projection from space and user perception of a companion robot or device
CN112882477A (en) * 2021-01-26 2021-06-01 汕头大学 Control method and system for separable air-ground amphibious cooperative robot


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114245091A (en) * 2022-01-27 2022-03-25 美的集团(上海)有限公司 Projection position correction method, projection positioning method, control device and robot
CN114245091B (en) * 2022-01-27 2023-02-17 美的集团(上海)有限公司 Projection position correction method, projection positioning method, control device and robot
WO2023142678A1 (en) * 2022-01-27 2023-08-03 美的集团(上海)有限公司 Projection position correction method, projection localization method, control device, and robot


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information

Inventor after: Wang Hongguang

Inventor after: Shi Xuan

Inventor before: Wang Hongguang