CN111605705A - Unmanned aerial vehicle device as intelligent assistant - Google Patents

Unmanned aerial vehicle device as intelligent assistant

Info

Publication number
CN111605705A
CN111605705A (application CN201910131507.3A)
Authority
CN
China
Prior art keywords
information
target object
information interaction
preset
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910131507.3A
Other languages
Chinese (zh)
Inventor
于振东 (Yu Zhendong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dongxi Heyi Zhuhai Data Technology Co ltd
Original Assignee
Dongxi Heyi Zhuhai Data Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dongxi Heyi Zhuhai Data Technology Co ltd filed Critical Dongxi Heyi Zhuhai Data Technology Co ltd
Priority to CN201910131507.3A
Publication of CN111605705A
Legal status: Pending (current)

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C39/00 - Aircraft not otherwise provided for
    • B64C39/02 - Aircraft not otherwise provided for characterised by special use
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 - Type of UAV
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04B - TRANSMISSION
    • H04B1/00 - Details of transmission systems, not covered by a single one of groups H04B3/00 - H04B13/00; Details of transmission systems not characterised by the medium used for transmission
    • H04B1/38 - Transceivers, i.e. devices in which transmitter and receiver form a structural unit and in which at least one part is used for functions of transmitting and receiving
    • H04B1/40 - Circuits

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the present application discloses an unmanned aerial vehicle device serving as an intelligent assistant, and relates to the technical field of unmanned aerial vehicles. One implementation of the device includes: a flight control unit configured to follow a target object and/or to perform patrol flight related to the target object based on a preset route and/or preset rules and/or a preset model; and an information interaction unit configured to perform information interaction with the target object. Using the unmanned aerial vehicle in flight as an intelligent assistant for the target object or user enables efficient information acquisition and information interaction while freeing the user's hands.

Description

Unmanned aerial vehicle device as intelligent assistant
Technical Field
Embodiments of the present application relate to the technical field of unmanned aerial vehicles, and in particular to an unmanned aerial vehicle device serving as an intelligent assistant.
Background
With the development of science and technology, unmanned aerial vehicles are widely used in entertainment, military, agriculture, education, and other fields, and perform various tasks such as program performances, target reconnaissance, agricultural plant protection, animal tracking, and firefighting and disaster relief.
In some scenarios, an unmanned aerial vehicle is also expected to provide more convenient and intelligent services: flying along with a target object, it can serve as an intelligent assistant for the user or the target object, offering efficient information acquisition and information interaction while freeing the user's hands. The development of 5G and artificial intelligence technology, in particular 5G's communication capability together with deep learning and reinforcement learning techniques such as object recognition, image recognition, and speech recognition, makes more intelligent electronic equipment possible; examples include the application of reinforcement learning to autonomous vehicles and the success of the deep-reinforcement-learning-based AlphaGo at the game of Go (weiqi). This provides powerful support for intelligent, efficient information acquisition and information interaction by unmanned aerial vehicles.
Disclosure of Invention
The embodiment of the application provides an unmanned aerial vehicle device serving as an intelligent assistant.
In a first aspect, an embodiment of the present application provides an unmanned aerial vehicle device as an intelligent assistant, including:
a flight control unit configured to follow the target object and/or to perform patrol flight related to the target object based on a preset route and/or preset rules and/or a preset model;
and an information interaction unit configured to perform information interaction with the target object.
In some embodiments, the apparatus further comprises a networked communication unit configured to implement one or any combination of the following features:
networking with at least one third party device;
the method includes communicating with at least one third party device, transmitting data to and/or receiving data from the third party device.
In some embodiments, the information interaction with the target object includes one or any combination of the following features:
directly performing information interaction with a target object;
performing information interaction with the target object through at least one third-party device.
In some embodiments, interacting information with the target object includes:
and pushing or playing information to the target object based on the preset rule and/or the preset model.
In some embodiments, the information interaction comprises the steps of:
acquiring first data, the first data comprising at least one of sound, images, 3D spatial data, sensor data, and data from a third party device;
identifying the collected first data to obtain an identification result;
sending the identification result to at least one third-party device, and/or sending a notification to the target object in a preset manner based on the identification result when the identification result meets a preset condition;
the preset mode comprises one or any combination of the following modes: sound, light, naked eye 3D visualization, image projection screen.
In some embodiments, the information interaction comprises the steps of:
acquiring first information corresponding to a target object;
acquiring second information corresponding to the first information;
acquiring third information corresponding to the second information;
displaying the third information in a mode of image projection, image projection screen, VR, AR or naked eye 3D visualization, and/or playing the third information through sound, and/or carrying out light indication based on the third information;
the first information at least comprises one or any combination of the following information: sound, gestures, expressions, gestures, images, 3D spatial information, information obtained by communicating with a third party.
In some embodiments, the information interaction comprises the steps of:
acquiring first information corresponding to a target object;
acquiring an instruction corresponding to the first information;
acquiring instruction parameters corresponding to the instructions;
and setting corresponding equipment or software according to the instruction and/or the instruction parameter, and/or communicating with the corresponding equipment according to the instruction and sending the instruction and/or the instruction parameter to the equipment, and then pushing the acquired response information of the equipment to the target object.
In some embodiments, the information interaction comprises:
and navigating and/or reminding the target object through information interaction.
In a second aspect, an embodiment of the present application provides an unmanned aerial vehicle including the apparatus of any one of the above embodiments.
The unmanned aerial vehicle device serving as an intelligent assistant provided by the embodiments of the present application performs information interaction with a target object while flying along with the target object and/or while performing patrol flight related to the target object based on a preset route, preset rules, and/or a preset model. A device is thus realized that lets an unmanned aerial vehicle in flight serve as an intelligent assistant for the target object or user, so that the user can achieve efficient information acquisition and information interaction while keeping both hands free.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present application may be applied;
FIG. 2 is a flow diagram of one embodiment of an unmanned aerial device as an intelligent assistant according to the present application;
FIG. 3 is a schematic diagram of an application scenario of an unmanned aerial vehicle device as an intelligent assistant according to the present application;
FIG. 4 is a flow diagram of yet another embodiment of an unmanned aerial device as an intelligent assistant according to the present application;
FIG. 5 is a flow diagram of yet another embodiment of an unmanned aerial device as an intelligent assistant according to the present application;
FIG. 6 is a schematic block diagram of a computer system suitable for use in implementing a server according to embodiments of the present application;
FIG. 7 is a flow diagram of yet another embodiment of an unmanned aerial device as an intelligent assistant according to the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
It should be noted that the term "and/or" merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the objects before and after it are in an "or" relationship.
It should be noted that the term "preset" is used herein to cover both pre-configured and pre-trained. Generally, a preset model refers to a pre-trained model, a preset route refers to a pre-configured route, and a preset rule refers to a pre-configured rule.
FIG. 1 illustrates an exemplary system architecture 100 to which an embodiment of an unmanned aerial device as an intelligent assistant of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include a terminal device 101, a network 102, and a server 103. The network 102 is the medium used to provide a communication link between the terminal device 101 and the server 103, and may include various types of wireless communication links, such as laser, microwave, or RF links. The terminal device 101 may be any of various aircraft or flying devices, such as a drone, a controllable airship, or a balloon, or various controllable levitation devices, such as a magnetic levitation device.
The terminal device 101 may be equipped with radar (e.g., an infrared lidar), audio devices (e.g., a microphone, a speaker), imaging devices (e.g., a display screen, a camera, a projector, a projection screen device, an AR/VR device, or a naked-eye 3D visualization device such as laser imaging), a text input application, a spatial object recognition application, an image object recognition application, a speech recognition application, and the like. A user may use the terminal device 101 to interact with the server 103 over the network 102 to receive or transmit information.
The terminal device 101 may be hardware or software. When the terminal device 101 is hardware, it may be any of various devices with flight or hover capability, including but not limited to a drone. When the terminal device 101 is software, it may be installed in the devices described above and may be implemented as multiple pieces of software or software modules, or as a single piece of software or software module. No specific limitation is imposed here.
The server 103 may be a server providing various services, for example a spatial object recognition server that analyzes and recognizes three-dimensional spatial data transmitted from the terminal device 101 and generates a label, feature tag, or presence state corresponding to the target object or its features; such a server may analyze the acquired three-dimensional spatial data and determine the identifier or presence state corresponding to the target object. The server 103 may also be an information search server that handles information query requests sent by the terminal device 101; such a server can analyze and process an information query request and determine the corresponding query result.
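Purely as an illustration (not anything specified by the application), the two server roles just described can be pictured as stub functions; the names, signatures, and return values below are invented for the sketch:

```python
# A minimal sketch of the two server roles described above. All names and
# return values are invented stand-ins, not the application's API.
from typing import Any


def spatial_object_recognition(points: list[tuple[float, float, float]]) -> dict[str, Any]:
    # Analyze three-dimensional spatial data and return an identifier and a
    # presence/posture state for the target object.
    return {"label": "person", "state": "standing"}


def information_search(query: str) -> str:
    # Analyze an information query request and return the query result.
    return f"top result for '{query}'"


print(spatial_object_recognition([(0.0, 0.0, 1.7)]))
print(information_search("how to make steamed sea bass"))
```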
It should be noted that an unmanned aerial vehicle device as an intelligent assistant provided in the embodiments of the present application is generally executed by the terminal device 101.
It is noted that the terminal device 101 generally acquires the corresponding three-dimensional spatial data by radar scanning, for example using an infrared lidar based on structured-light 3D imaging technology or a radar based on time-of-flight (TOF) technology.
It should be noted that the three-dimensional spatial data or interaction information corresponding to the terminal device 101 or the target object may also be stored locally in the terminal device 101; the terminal device 101 may directly extract the locally stored three-dimensional spatial data or interaction information, or may obtain the relevant data or information through communication with a third party.
It should be noted that the unmanned aerial vehicle device as an intelligent assistant provided in the embodiment of the present application may also be executed by the server 103, or a part of the unmanned aerial vehicle device may be installed in the server 103, and another part of the unmanned aerial vehicle device may be installed in the terminal device 101.
It should be noted that the server 103 or the terminal device 101 may also locally store the interaction information or the preset model; the server 103 or the terminal device 101 may directly extract the locally stored interaction information or preset model, or may obtain the relevant interaction information or preset model through communication with a third party.
The server 103 may be hardware or software. When the server 103 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server is software, it may be implemented as a plurality of software or software modules, or may be implemented as a single software or software module. And is not particularly limited herein.
It should be noted that the model or rule related to the unmanned aerial vehicle device provided by the embodiment of the present application as an intelligent assistant may be stored or run on the server 103, or may be stored or run on the terminal device 101.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to fig. 2, fig. 2 shows a flow 200 of one embodiment of a drone apparatus as an intelligent assistant according to the present application. The apparatus comprises:
a flight control unit 201 configured to follow the target object and/or to perform patrol flights in relation to the target object based on a preset route and/or preset rules and/or a preset model.
In this embodiment, the execution body (for example, the terminal device 101 in fig. 1) may fly along with the target object, or perform patrol flight related to the target object based on a preset route, preset rules, and/or a preset model; it may also receive the relevant flight parameters from the server over a wireless connection and control the flight attitude, speed, and/or acceleration of the unmanned aerial vehicle based on the received parameters.
In this embodiment, the target object may be a human or other animal, or may be other objects such as an electronic device.
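As a loose sketch of the follow behavior only (the application does not specify a control law), a proportional controller that holds the drone at a fixed offset from the target might look like this; `Vector3`, the gain, and the velocity-command interface are all assumptions:

```python
# Illustrative sketch only: a proportional "follow" controller that steers
# the drone toward a fixed offset from the target. Vector3 and the gain are
# invented for this example; real flight control is far more involved.
from dataclasses import dataclass


@dataclass
class Vector3:
    x: float
    y: float
    z: float


def follow_step(drone_pos: Vector3, target_pos: Vector3,
                offset: Vector3, gain: float = 0.8) -> Vector3:
    """Return a velocity command moving the drone toward target + offset."""
    desired = Vector3(target_pos.x + offset.x,
                      target_pos.y + offset.y,
                      target_pos.z + offset.z)
    # Proportional control: command a velocity proportional to position error.
    return Vector3(gain * (desired.x - drone_pos.x),
                   gain * (desired.y - drone_pos.y),
                   gain * (desired.z - drone_pos.z))


# Example: hover 2 m behind and 1.5 m above the target.
cmd = follow_step(Vector3(0, 0, 2), Vector3(5, 0, 0), Vector3(-2, 0, 1.5))
```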
An information interaction unit 202 configured to perform information interaction with the target object.
In this embodiment, the information interaction may take several forms: a voice interaction, such as a voice conversation or question-and-answer, between the execution body and the target object; an action taken by the execution body based on preset rules after analyzing sound collected from the target object; or an action taken by the execution body after receiving information such as the voice or gestures of the target object. Various input and output modes may be combined as needed, for example: a voice question from the target object answered by the drone in voice; a voice question answered with an image; a voice question answered with both voice and image; a gesture question answered with voice and/or image; or a voice and/or gesture command responded to by the drone executing a program. Here, the sound mode of information interaction includes voice.
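As an illustration of how such input/output combinations might be dispatched (the modality names and handlers below are invented, not the application's design):

```python
# Hypothetical dispatch table pairing an input modality with one or more
# output modalities, as described above. All handlers are stand-ins.
def speak(text: str) -> None:
    print(f"[speaker] {text}")


def project(text: str) -> None:
    print(f"[projector] {text}")


OUTPUT_MODES = {
    "voice_question": [speak],                  # voice in -> voice out
    "voice_question_visual": [speak, project],  # voice in -> voice + image out
    "gesture_command": [project],               # gesture in -> image out
}


def respond(input_mode: str, answer: str) -> None:
    for emit in OUTPUT_MODES.get(input_mode, [speak]):
        emit(answer)


respond("voice_question_visual", "Tomorrow: light rain, 18-24 C.")
```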
It is noted that the names of these units do not in some cases constitute a limitation on the units themselves, for example, an information interaction unit may also be described as a "unit configured for information interaction with an object".
In some embodiments, a networked communication unit is also included, configured to implement one or any combination of the following features:
networking with at least one third party device;
the method includes communicating with at least one third party device, transmitting data to and/or receiving data from the third party device.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the drone device as an intelligent assistant according to the present embodiment. In the application scenario of fig. 3, the drone 301 receives a request from the target object 304 for "how to make steamed sea bass" and forwards the request to the server 302. After receiving the request, the server 302 may perform information retrieval through an information source or the preset model and/or preset rules 303 to obtain a retrieval result. The server 302 then sends the retrieval result back to the drone 301, and the drone 301 presents it by image projection and sound playback so that the target object 304 can readily receive it.
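The round trip in fig. 3 could be sketched roughly as follows; `search_server` stands in for the retrieval performed by server 302 against the information source or preset model/rules 303, and the presentation functions are invented:

```python
# Rough sketch of the fig. 3 scenario, with invented stand-in functions.
def search_server(query: str) -> str:
    # Stand-in for server 302: retrieval against information sources
    # and/or a preset model or rules (303 in fig. 3).
    return f"Recipe steps for: {query}"


def handle_request(query: str) -> None:
    result = search_server(query)     # drone 301 forwards the request
    print(f"[projection] {result}")   # present by image projection
    print(f"[speaker] {result}")      # and play by sound


handle_request("how to make steamed sea bass")
```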
With the unmanned aerial vehicle device serving as an intelligent assistant provided by the above embodiment of the present application, the device performs information interaction with the target object while flying along with it and/or while performing patrol flight related to the target object based on a preset route, preset rules, and/or a preset model. In this way it can meet the information needs of the target object or user, complete tasks arranged by them, help ensure their safety, and assist in their daily life, thereby improving the efficiency of information acquisition and the quality of life of the target object while reducing the time the target object spends operating a handheld device with both hands.
With further reference to fig. 4, a flow 400 of yet another embodiment of an unmanned aerial vehicle device as an intelligent assistant according to the present application is shown. The process 400 includes the following steps:
step 401, collecting first data, the first data comprising at least one of sound, image, 3D spatial data, sensor data, and data from a third party device.
In this embodiment, the execution body of the drone-based information interaction method (for example, the terminal device shown in fig. 1) may collect environmental sound or sound/voice from the target object through a microphone, collect images of the target object or the environment through a camera, measure 3D spatial data of the surrounding space through a 3D imaging device such as a lidar, measure environmental parameters or relevant parameters of the drone through sensors, and may also communicate with a third-party device to obtain parameters of the target object, for example the measurement/sensor data of a wearable device worn by the target object. Here, sound includes voice.
In the present embodiment, the first data may be data on temperature, a disaster such as a fire, or the posture, gestures, or physiological characteristics of the target object, and the like.
Step 402, recognizing the collected first data to obtain a recognition result.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) recognizes the collected first data to obtain a recognition result. For example, an acquired image may be analyzed to determine the body posture of the target object, such as standing or fallen; an acquired image may be analyzed to conclude that a suspicious person has entered the room; or the acquired physiological state of the target object may be analyzed to yield a high probability of a life-threatening risk.
Step 403, sending the recognition result to at least one third-party device, and/or sending a notification to the target object in a preset manner based on the recognition result when the recognition result satisfies a preset condition.
The preset manner comprises one or any combination of the following: sound, light, naked-eye 3D visualization, image projection, image projection screen.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) may send the obtained recognition result to a third-party device over a wireless connection, or, when the recognition result satisfies a preset condition, send a notification to the target object in a preset manner, for example by sound, light, naked-eye 3D visualization, image projection, and/or an image projection screen.
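A compressed sketch of steps 401-403 might read as follows; the posture classifier and the notification channels are hypothetical stand-ins, not the implementation of this embodiment:

```python
# Illustrative pipeline for steps 401-403; classifier and channels invented.
from typing import Callable


def classify_posture(frame: dict) -> str:
    # Stand-in for step 402: a real system would run a trained
    # image-recognition model here.
    return "fallen" if frame.get("on_ground") else "standing"


def notify(message: str, channels: list[Callable[[str], None]]) -> None:
    # Step 403: notify the target object in one or more preset ways.
    for send in channels:
        send(message)


frame = {"on_ground": True}               # step 401: collected first data
if classify_posture(frame) == "fallen":   # preset condition met
    notify("Fall detected - are you OK?",
           [lambda m: print(f"[sound] {m}"),
            lambda m: print(f"[light] flash: {m}")])
```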
As can be seen from fig. 4, unlike the embodiment shown in fig. 2, the present embodiment spells out the drone-based information interaction steps, so the information interaction process is specified more precisely.
With further reference to fig. 5, a flow 500 of yet another embodiment of an unmanned aerial vehicle device as an intelligent assistant according to the present application is shown. The process 500 includes the following steps:
step 501, acquiring first information corresponding to a target object.
The first information comprises at least one or any combination of the following: sound, gestures, expressions, postures, images, 3D spatial information, and information obtained by communicating with a third party.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) may collect environmental sound or sound/voice from the target object through a microphone, collect images of the target object or the environment through a camera, measure 3D spatial data of the surrounding space through a 3D imaging device such as a lidar, capture gestures/expressions/postures through the camera or the 3D imaging device, and may also communicate with a third-party device to obtain parameters of the target object, for example the measurement/sensor data of a wearable device worn by the target object.
In this embodiment, the first information may be a voice question from the target object, or a gesture command, expression, or posture from the target object.
Step 502, second information corresponding to the first information is obtained.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) acquires, from the acquired first information, the corresponding second information. For example, speech recognition is performed on a voice question from the target object to obtain corresponding second information such as text.
Step 503, third information corresponding to the second information is acquired.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) acquires, from the acquired second information, the corresponding third information. For example, an information source or database is searched using the text obtained by speech recognition of the target object's voice, yielding answer information related to the question; the answer information may be text, voice, images, and the like.
And 504, displaying the third information in a mode of image projection, VR, AR or naked eye 3D visualization, and/or playing the third information through sound, and/or carrying out light indication based on the third information.
In this embodiment, an executing body (for example, the terminal device shown in fig. 1) of the method for performing information interaction based on the unmanned aerial vehicle displays the third information in a manner of image projection, VR, AR, or naked eye 3D visualization, and/or plays the third information by sound, and/or performs light indication based on the third information. For example, the obtained answer is projected onto a wall by means of image projection, or is presented by naked eye 3D visualization such as laser imaging.
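Condensed into a sketch, the three acquisition stages plus the presentation step of this embodiment might read as below; every function body is a placeholder rather than a specific library's API:

```python
# Sketch of the fig. 5 flow; every function body is a placeholder.
def recognize_speech(audio: bytes) -> str:
    # Step 502: speech recognition yields the second information (text).
    return "what is the weather tomorrow"


def retrieve_answer(text: str) -> str:
    # Step 503: search an information source for the third information.
    return "Tomorrow: light rain, 18-24 C."


def present(answer: str) -> None:
    # Step 504: display and/or play the third information.
    print(f"[projection] {answer}")
    print(f"[speaker] {answer}")


audio = b""  # step 501: captured first information (sound)
present(retrieve_answer(recognize_speech(audio)))
```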
As can be seen from fig. 5, unlike the embodiment shown in fig. 2, the present embodiment spells out the drone-based information interaction steps, so the information interaction process is specified more precisely.
Referring now to FIG. 6, a block diagram of a computer system 600 suitable for use in implementing an electronic device (e.g., the server shown in FIG. 1) of an embodiment of the present application is shown. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601 that can perform various appropriate actions and processes according to a program stored in a Read-Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the system 600. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a microphone, a touch device, buttons, and the like; an output section 607 including a display such as a Liquid Crystal Display (LCD) and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a Wi-Fi card, a modem, or the like. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read therefrom is installed into the storage section 608 as needed.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium of the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present application may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, and may be described as: a processor includes a flight control unit 201 and an information interaction unit 202. The names of the units do not form a limitation on the units themselves under certain conditions, and for example, the information interaction unit may also be described as a "unit configured to perform information interaction with a target object".
With further reference to fig. 7, a flow 700 of yet another embodiment of an unmanned aerial vehicle device as an intelligent assistant according to the present application is shown. The process 700 includes the following steps:
step 701, acquiring first information corresponding to a target object.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) may collect environmental sound or sound/voice from the target object through a microphone, collect images of the target object or the environment through a camera, measure 3D spatial data of the surrounding space through a 3D imaging device such as a lidar, capture gestures/expressions/postures through the camera or the 3D imaging device, and may also communicate with a third-party device to obtain parameters of the target object, for example the measurement/sensor data of a wearable device worn by the target object, as the first information.
In this embodiment, the first information may be a voice command from the target object for controlling a home appliance, or a gesture command from the target object for calling the drone.
Step 702, acquiring an instruction corresponding to the first information.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) obtains the corresponding instruction from the first information, for example "adjust the temperature of the bedroom air conditioner".
Step 703, obtaining instruction parameters corresponding to the instruction.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) obtains the corresponding instruction parameters from the first information, for example "25 degrees Celsius" for adjusting the bedroom air conditioner; the instruction parameters may also be obtained by querying a parameter library according to the acquired instruction, and may be empty.
And 704, setting corresponding equipment or software according to the instruction and/or the instruction parameter, and/or communicating with the corresponding equipment according to the instruction and sending the instruction and/or the instruction parameter to the equipment, and then pushing the acquired response information of the equipment to the target object.
In this embodiment, the execution body (for example, the terminal device shown in fig. 1) sets the corresponding device or software according to the instruction and/or the instruction parameters, and/or communicates with the corresponding device according to the instruction, sends the instruction and/or the instruction parameters to the device, and then pushes the acquired response information of the device to the target object. For example, an instruction to adjust the air conditioner temperature to 25 degrees Celsius is sent to the corresponding air conditioner. In some embodiments, the instruction may instead set the parameters of an alarm clock.
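As a loose sketch of steps 701-704 (the parsing and device-protocol functions are invented for illustration, not a real appliance protocol):

```python
# Invented-name sketch of steps 701-704.
def parse_command(utterance: str) -> tuple[str, dict]:
    # Steps 702-703: map the recognized utterance to an instruction and its
    # parameters (possibly via a parameter library).
    if "air conditioner" in utterance:
        return "set_temperature", {"room": "bedroom", "celsius": 25}
    return "unknown", {}


def send_to_device(instruction: str, params: dict) -> str:
    # Stand-in for communicating with the corresponding device.
    return f"OK: {instruction} {params}"


instr, params = parse_command("set the bedroom air conditioner to 25 degrees")
response = send_to_device(instr, params)  # step 704: send the instruction
print(f"[speaker] {response}")            # push the response to the target
```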
As can be seen from fig. 7, unlike the embodiment shown in fig. 2, the present embodiment spells out the drone-based information interaction steps, so the information interaction process is specified more precisely.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (9)

1. An unmanned aerial vehicle device as an intelligent assistant, comprising:
a flight control unit configured to follow a target object and/or to perform a patrol flight in relation to the target object based on a preset route and/or preset rules and/or a preset model;
and an information interaction unit configured to perform information interaction with the target object.
2. The apparatus of claim 1, further comprising a networking communication unit configured to implement one or any combination of the following features:
networking with at least one third party device;
communicating with at least one third party device, sending data to and/or receiving data from the third party device.
3. The device of claim 1, wherein the information interaction with the target object comprises one or any combination of the following features:
directly performing information interaction with the target object;
performing information interaction with the target object through at least one third-party device.
4. The apparatus of claim 1 or 3, wherein the information interaction with the target object comprises:
and pushing or playing information to the target object based on a preset rule and/or a preset model.
5. The apparatus according to claim 1 or 3, wherein the information interaction comprises the following steps:
acquiring first data, the first data comprising at least one of sound, images, 3D spatial data, sensor data, and data from a third party device;
identifying the collected first data to obtain an identification result;
sending the identification result to at least one third-party device, and/or sending a notification to the target object in a preset mode based on the identification result when the identification result meets a preset condition;
the preset mode comprises one or any combination of the following modes: sound, light, naked eye 3D visualization, image projection screen.
6. The apparatus according to claim 1 or 3, wherein the information interaction comprises the following steps:
acquiring first information corresponding to the target object;
acquiring second information corresponding to the first information;
acquiring third information corresponding to the second information;
displaying the third information in a mode of image projection, image projection screen, VR, AR or naked eye 3D visualization, and/or playing the third information through sound, and/or carrying out light indication based on the third information;
the first information comprises at least one or any combination of the following: sound, gestures, expressions, postures, images, 3D spatial information, and information obtained by communicating with a third party.
7. The apparatus according to claim 1 or 3, wherein the information interaction comprises the following steps:
acquiring first information corresponding to the target object;
acquiring an instruction corresponding to the first information;
acquiring instruction parameters corresponding to the instructions;
and setting corresponding equipment or software according to the instruction and/or the instruction parameters, and/or communicating with the corresponding equipment according to the instruction and sending the instruction and/or the instruction parameters to the equipment, and then pushing the acquired response information of the equipment to the target object.
8. The apparatus of claim 1 or 3, wherein the information interaction comprises:
and navigating and/or reminding the target object through information interaction.
9. A drone comprising the apparatus of any one of claims 1-8.
CN201910131507.3A 2019-02-22 2019-02-22 Unmanned aerial vehicle device as intelligent assistant Pending CN111605705A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910131507.3A CN111605705A (en) 2019-02-22 2019-02-22 Unmanned aerial vehicle device as intelligent assistant

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910131507.3A CN111605705A (en) 2019-02-22 2019-02-22 Unmanned aerial vehicle device as intelligent assistant

Publications (1)

Publication Number Publication Date
CN111605705A true CN111605705A (en) 2020-09-01

Family

ID=72195652

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910131507.3A Pending CN111605705A (en) 2019-02-22 2019-02-22 Unmanned aerial vehicle device as intelligent assistant

Country Status (1)

Country Link
CN (1) CN111605705A (en)

Similar Documents

Publication Publication Date Title
US10831197B2 (en) Personality sharing among drone swarm
US10979625B2 (en) Method for editing image based on artificial intelligent and artificial device
US11041737B2 (en) Method, device and system for processing a flight task
JP2020537262A (en) Methods and equipment for automated monitoring systems
US11605379B2 (en) Artificial intelligence server
US11200075B2 (en) Artificial intelligence apparatus and method for extracting user's concern
CN113177968A (en) Target tracking method and device, electronic equipment and storage medium
US10846326B2 (en) System and method for controlling camera and program
CN110737212A (en) Unmanned aerial vehicle control system and method
US20210118447A1 (en) Artificial intelligence apparatus for generating recipe information and method thereof
US10339381B2 (en) Control apparatus, control system, and control method
US20190392382A1 (en) Refrigerator for managing item using artificial intelligence and operating method thereof
CN108334498A (en) Method and apparatus for handling voice request
CN114092920B (en) Model training method, image classification method, device and storage medium
US20210239338A1 (en) Artificial intelligence device for freezing product and method therefor
US20210004022A1 (en) Method, apparatus and control system for controlling mobile robot
CN111610850A (en) Method for man-machine interaction based on unmanned aerial vehicle
US11449074B2 (en) Robot for providing guidance service using artificial intelligence and method of operating the same
CN109895780A (en) A kind of method and apparatus realizing unmanned equipment autonomously and getting rid of poverty
CN109471437B (en) Method, device and control system for controlling mobile robot
US20210137311A1 (en) Artificial intelligence device and operating method thereof
CN115082690B (en) Target recognition method, target recognition model training method and device
CN111605705A (en) Unmanned aerial vehicle device as intelligent assistant
CN111605706A (en) Unmanned aerial vehicle device as intelligent assistant
CN111605707A (en) Unmanned aerial vehicle device as intelligent assistant

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200901