CN113093811B - Unmanned aerial vehicle control method and system based on intelligent cabin system


Info

Publication number
CN113093811B
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
control
cockpit
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110625300.9A
Other languages
Chinese (zh)
Other versions
CN113093811A (en)
Inventor
陈豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo Joynext Technology Corp
Original Assignee
Ningbo Joynext Technology Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo Joynext Technology Corp filed Critical Ningbo Joynext Technology Corp
Priority to CN202110625300.9A priority Critical patent/CN113093811B/en
Publication of CN113093811A publication Critical patent/CN113093811A/en
Application granted granted Critical
Publication of CN113093811B publication Critical patent/CN113093811B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12 - Target-seeking control
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64D - EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D43/00 - Arrangements or adaptations of instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Selective Calling Equipment (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application relates to an unmanned aerial vehicle control method and system based on an intelligent cabin system. In the method, the cockpit domain controller determines the onboard equipment that the unmanned aerial vehicle needs to activate according to acquired perception information, and generates a control instruction for the onboard equipment according to an acquired control requirement; the cockpit domain controller determines a reference point of the unmanned aerial vehicle during flight and transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module; the flight control communication module generates a flight route according to the reference point, and processes the flight route and the control instruction to obtain uplink data, which the intelligent cabin system sends to the unmanned aerial vehicle. Because neither the onboard equipment to be activated nor the control instruction for that equipment needs to be set manually, control of the unmanned aerial vehicle is more intelligent, and the problem that a manually set flight route may be unsuitable for the unmanned aerial vehicle is avoided.

Description

Unmanned aerial vehicle control method and system based on intelligent cabin system
Technical Field
The application relates to the field of intelligent automobiles, in particular to an unmanned aerial vehicle control method and system based on an intelligent cabin system.
Background
Currently, an intelligent automobile is generally provided with an In-Vehicle Infotainment (IVI) system. The IVI system is a vehicle-mounted integrated information service processing system built on the vehicle body bus and internet services; through a dedicated vehicle-mounted processor and operating system, it coordinates and controls the vehicle-mounted infotainment equipment and provides users with basic services such as active safety alarms, trip navigation, multimedia interaction, online applications and emergency assistance.
In recent years, unmanned aerial vehicle technology has developed rapidly. Organically combining an unmanned aerial vehicle with the IVI system of an intelligent automobile not only makes popular functions of the unmanned aerial vehicle, such as photography and video recording, available to extend the entertainment of vehicle occupants, but also allows suitable sensors to be carried to extend the perception range of the vehicle, improving the intelligence and driving safety of the whole vehicle.
In the prior art, however, the unmanned aerial vehicle associated with the intelligent automobile is usually controlled manually with a remote controller, which places higher demands on the in-vehicle personnel operating the unmanned aerial vehicle and degrades the user experience.
Disclosure of Invention
In order to solve the problems mentioned in the background art, the embodiments of the present application provide an unmanned aerial vehicle control method and system based on an intelligent cabin system. The technical scheme is as follows:
in a first aspect, an unmanned aerial vehicle control method based on an intelligent cabin system is provided, where the intelligent cabin system includes a cockpit domain controller and an unmanned aerial vehicle flight control communication module, and the method includes: the cockpit domain controller determines the onboard equipment that the unmanned aerial vehicle needs to activate according to acquired perception information, and generates a control instruction for the onboard equipment according to an acquired control requirement; the cockpit domain controller determines a reference point of the unmanned aerial vehicle during flight, and transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module; the unmanned aerial vehicle flight control communication module generates a flight route according to the reference point; and the unmanned aerial vehicle flight control communication module processes the flight route and the control instruction to obtain uplink data, so that the intelligent cabin system sends the uplink data to the unmanned aerial vehicle.
In a preferred embodiment, the cockpit domain controller determining the onboard equipment that the unmanned aerial vehicle needs to activate according to the acquired perception information includes: the cockpit domain controller sets a primary task scene of the unmanned aerial vehicle according to the acquired perception information, and determines the onboard equipment that the unmanned aerial vehicle needs to activate in the primary task scene. The cockpit domain controller determining the reference point of the unmanned aerial vehicle during flight includes: the cockpit domain controller determines the reference point of the unmanned aerial vehicle during flight in the primary task scene. The primary task scene is any one of a traffic scene, an entertainment scene and a communication relay scene.
In a preferred embodiment, the cockpit domain controller setting the primary task scene of the unmanned aerial vehicle according to the acquired perception information includes: an IVI system control module or a multi-modal perception decision module in the cockpit domain controller determines a usage requirement for the unmanned aerial vehicle according to the perception information; and an unmanned aerial vehicle control fusion module in the cockpit domain controller sets the primary task scene according to the usage requirement.
In a preferred embodiment, one or more secondary task scenes are provided in each primary task scene, and the cockpit domain controller determining the onboard equipment that the unmanned aerial vehicle needs to activate in the primary task scene includes: the cockpit domain controller determines the secondary task scene corresponding to the unmanned aerial vehicle in the primary task scene, and determines the onboard equipment that the unmanned aerial vehicle needs to activate in the secondary task scene.
In a preferred embodiment, the cockpit domain controller determining the reference point of the unmanned aerial vehicle during flight in the primary task scene includes: if the primary task scene is the traffic scene, the cockpit domain controller determines a position point arranged on the road along the advancing direction of the vehicle as the reference point; if the primary task scene is the entertainment scene, the cockpit domain controller determines a point of interest as the reference point; or, if the primary task scene is the communication relay scene, the cockpit domain controller determines the position of a signal tower near the advancing direction of the vehicle as the reference point.
In a preferred embodiment, generating the control instruction for the onboard equipment according to the acquired control requirement includes: determining a control mode of the onboard equipment, where the control mode includes a full-automatic mode and a human-machine interaction mode; acquiring the control requirement for the onboard equipment in the control mode; and generating the control instruction for the onboard equipment according to the control requirement.
In a preferred embodiment, the unmanned aerial vehicle flight control communication module processing the flight route and the control instruction to obtain the uplink data, so that the intelligent cabin system sends the uplink data to the unmanned aerial vehicle, includes: the unmanned aerial vehicle flight control communication module synthesizes the flight route and the control instruction into the uplink data, and the intelligent cabin system transmits the uplink data to the unmanned aerial vehicle through either of the following two data transmission modes: the unmanned aerial vehicle flight control communication module sends the uplink data to the unmanned aerial vehicle by radio; or, the intelligent cabin system further includes a V2X module, the unmanned aerial vehicle flight control communication module encrypts the uplink data and sends the encrypted uplink data to the cockpit domain controller, and the cockpit domain controller sends the encrypted uplink data to the unmanned aerial vehicle over a cellular network via the V2X module.
In a preferred embodiment, before the unmanned aerial vehicle flight control communication module transmits the uplink data to the unmanned aerial vehicle, the method further includes: the cockpit domain controller obtains the current distance between the vehicle provided with the intelligent cabin system and the unmanned aerial vehicle, as well as the number, positions and signal quality of the cellular signal towers between the vehicle and the unmanned aerial vehicle, and determines the data transmission mode according to the distance and the number, positions and signal quality of the signal towers.
In a preferred embodiment, the method further includes: the unmanned aerial vehicle flight control communication module acquires downlink data sent by the unmanned aerial vehicle, where the downlink data is obtained from data collected by the onboard equipment of the unmanned aerial vehicle; the unmanned aerial vehicle flight control communication module decodes the downlink data and transmits the decoded downlink data to the cockpit domain controller; if the downlink data is image data, the cockpit domain controller dispatches the downlink data to an image application program, and the image application program processes the downlink data and displays it on the human-machine interaction equipment of the intelligent cabin system; if the downlink data is non-image data, the cockpit domain controller dispatches the downlink data to the corresponding non-image application program, and the non-image application program processes the downlink data and displays it on the human-machine interaction equipment.
In a second aspect, an unmanned aerial vehicle control system based on an intelligent cabin system is provided, including a cockpit domain controller and an unmanned aerial vehicle flight control communication module, which are configured to perform the method of the first aspect.
Compared with the prior art, the technical scheme provided by the application at least has the following beneficial effects:
1. The intelligent cabin system includes a cockpit domain controller and an unmanned aerial vehicle flight control communication module for controlling the unmanned aerial vehicle. On the one hand, the cockpit domain controller determines the onboard equipment that the unmanned aerial vehicle needs to activate according to the acquired perception information and generates the control instruction for the onboard equipment according to the acquired control requirement, so neither the onboard equipment to be activated nor the control instruction needs to be set manually, and control of the unmanned aerial vehicle is more intelligent. On the other hand, the cockpit domain controller determines the reference point of the unmanned aerial vehicle during flight and transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module, which generates the flight route from the reference point; in this process the cockpit domain controller only specifies the reference point the unmanned aerial vehicle should pass, while the specific flight route is generated by the flight control communication module according to the reference point, which can produce a route better suited to the unmanned aerial vehicle and avoids the problem that a manually set flight route may be unsuitable for flight. In addition, the flight control communication module processes the flight route and the control instruction into uplink data that the unmanned aerial vehicle can recognize, so the unmanned aerial vehicle can process the uplink data promptly after receiving it and fly along the flight route under the control of the control instruction, improving the efficiency of controlling the unmanned aerial vehicle.
2. The cockpit domain controller sets the primary task scene of the unmanned aerial vehicle according to the acquired perception information, determines the onboard equipment that the unmanned aerial vehicle needs to activate in that primary task scene, and determines the reference point of the unmanned aerial vehicle during flight in that scene; that is, both the onboard equipment and the reference point are determined by the primary task scene. Since the onboard equipment to be controlled and the reference point to be passed differ between task scenes, the control method and system can adapt to a variety of task scenes, meet diversified user requirements, and greatly expand the application range of the unmanned aerial vehicle.
Drawings
Fig. 1 is a system architecture diagram of an unmanned aerial vehicle control system based on an intelligent cabin system according to the present application;
fig. 2 is a flowchart of a method for controlling an unmanned aerial vehicle based on an intelligent cockpit system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a determination process of an onboard device in one embodiment of the present application;
FIG. 4 is a schematic diagram of a control command generation process according to an embodiment of the present application;
fig. 5 is a schematic diagram of an uplink data transmission process in an embodiment of the present application;
fig. 6 is a schematic diagram of a downlink data assignment process according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It is to be understood that, unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise", "comprising", and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is, what is meant is "including, but not limited to".
Furthermore, in the description of the present application, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, "a plurality" means two or more unless otherwise specified.
Fig. 1 shows an unmanned aerial vehicle control system based on an intelligent cabin system provided by the present application, including an intelligent cabin system 100 and an unmanned aerial vehicle 200. The intelligent cabin system 100 includes a cockpit domain controller 101, an information perception system 102 and an unmanned aerial vehicle flight control communication module 103. The cockpit domain controller 101 includes an IVI (In-Vehicle Infotainment) system control module, a multi-modal perception decision module and an unmanned aerial vehicle control fusion module; the information perception system 102 includes an acousto-optic-electric interaction module, an audio-video acquisition module and a V2X module; the unmanned aerial vehicle flight control communication module 103 is used for data interaction between the intelligent cabin system and the unmanned aerial vehicle 200. With further reference to fig. 1, the unmanned aerial vehicle control fusion module is connected to the unmanned aerial vehicle flight control communication module 103 and the IVI system control module. The unmanned aerial vehicle control fusion module includes an application scene setting unit, a flight path planning unit, an onboard equipment control unit and a data transmission control unit. The application scene setting unit sets the primary task scene of the unmanned aerial vehicle according to the acquired perception information and determines, according to that scene, the onboard equipment that the unmanned aerial vehicle needs to activate in the primary task scene; the onboard equipment control unit generates the control instruction for the onboard equipment according to the acquired control requirement; the flight path planning unit determines the reference point of the unmanned aerial vehicle during flight, specifically the reference point of the unmanned aerial vehicle during flight in the primary task scene; the data transmission control unit transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module and determines the data transmission mode between the flight control communication module and the unmanned aerial vehicle. The IVI system control module is connected to the unmanned aerial vehicle control fusion module, the multi-modal perception decision module, and the V2X module and acousto-optic-electric interaction module in the information perception system 102; the multi-modal perception decision module is connected to the audio-video acquisition module in the information perception system 102. The application scene setting in the unmanned aerial vehicle control fusion module is mainly determined from the perception information acquired by the information perception system, which includes acousto-optic-electric information acquired through the acousto-optic-electric interaction module, audio-video information acquired through the audio-video acquisition module, and V2X information acquired through the V2X module.
The acousto-optic-electric interaction module includes the central control screen, other display screens, the audio system, the AR HUD and the AR glasses. It acquires acousto-optic-electric information and transmits it to the IVI system control module, which processes the interaction information and extracts the usage requirement; the unmanned aerial vehicle control fusion module then obtains the usage requirements related to the unmanned aerial vehicle from the IVI system control module, and the application scene setting unit determines the primary task scene of the unmanned aerial vehicle accordingly. For example, when an occupant calls up a navigation map through the central control screen, the application scene setting unit determines from this information that the user currently needs traffic information and sets the current primary task scene of the unmanned aerial vehicle to the traffic scene.
The audio-video acquisition module specifically includes a video acquisition unit and an audio acquisition unit. The video acquisition unit collects image data of the occupants, from which sitting-posture state information and/or face state information of the occupants is perceived and uploaded to the multi-modal perception decision module; for example, a Driver Monitoring System (DMS) infrared camera collects image data of the driver and perceives the driver's sitting-posture and/or face state from it. The audio acquisition unit collects sound data of the occupants and uploads it to the multi-modal perception decision module. The multi-modal perception decision module is configured with a decision algorithm; combining the collected audio-video information with system parameters obtained from the IVI system, it intelligently outputs the usage requirement and transmits it to the IVI system control module. After the unmanned aerial vehicle control fusion module obtains the requirement data related to the unmanned aerial vehicle, the application scene setting unit determines the primary task scene of the unmanned aerial vehicle.
The V2X module collects V2X information by communicating with the V2X vehicle network. The V2X information is sent to the multi-modal perception decision module through the IVI system control module; the multi-modal perception decision module intelligently analyzes the usage requirement and passes the occupant requirement to the unmanned aerial vehicle control fusion module via the IVI system control module, and the unmanned aerial vehicle control fusion module determines the corresponding primary task scene of the unmanned aerial vehicle according to that requirement.
Besides setting the primary task scene, the application scene setting unit also determines the secondary task scene corresponding to the unmanned aerial vehicle in the primary task scene, and the cockpit domain controller determines the onboard equipment that the unmanned aerial vehicle needs to activate in the secondary task scene. A primary task scene includes several secondary task scenes, and the onboard equipment corresponding to each secondary task scene may differ, so the onboard equipment determined from the secondary task scene matches the usage requirement more closely.
The flight path planning unit acquires different reference points in different primary task scenes: if the primary task scene is the traffic scene, the cockpit domain controller determines a position point arranged on the road along the advancing direction of the vehicle as the reference point; if the primary task scene is the entertainment scene, the cockpit domain controller determines a point of interest as the reference point; if the primary task scene is the communication relay scene, the cockpit domain controller determines the position of a signal tower near the advancing direction of the vehicle as the reference point. After the reference point is determined, the flight route is generated by the unmanned aerial vehicle flight control communication module; finally the flight route is displayed on the control screen of the vehicle as a navigation chart and executed after confirmation.
Before the onboard equipment control unit generates the control instruction, it first determines the control mode of the onboard equipment, which includes a full-automatic mode and a human-machine interaction mode; it then acquires the control requirement for the onboard equipment in that control mode and generates the control instruction for the onboard equipment according to the control requirement.
The unmanned aerial vehicle flight control communication module is the module set up in the intelligent cabin system for controlling the unmanned aerial vehicle. It generates the flight route according to the reference point, synthesizes the flight route and the control instruction into uplink data, and the intelligent cabin system then sends the uplink data to the unmanned aerial vehicle. After receiving the uplink data, the unmanned aerial vehicle flies along the flight route under the control of the control instruction, while the onboard equipment collects the corresponding data. The intelligent cabin system transmits the uplink data to the unmanned aerial vehicle through either of the following two data transmission modes: the unmanned aerial vehicle flight control communication module sends the uplink data to the unmanned aerial vehicle by radio; or, using the V2X module also included in the intelligent cabin system, the flight control communication module encrypts the uplink data and sends it to the cockpit domain controller, and the cockpit domain controller sends the encrypted uplink data to the unmanned aerial vehicle over a cellular network via the V2X module. The data transmission mode is determined by the data transmission control unit, which obtains from the IVI system control module the current distance between the vehicle equipped with the intelligent cabin system and the unmanned aerial vehicle, together with the number, positions and signal quality of the cellular signal towers between them, and determines the data transmission mode accordingly.
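The transmission-mode decision described above can be summarized as a small selection routine. The Python sketch below is illustrative only; the distance threshold, the signal-quality cutoff and the CellTower fields are assumptions made for this example, not values taken from the patent.

```python
# Minimal, hedged sketch of choosing between direct radio and the cellular/V2X path.
from dataclasses import dataclass
from enum import Enum

class TransmissionMode(Enum):
    DIRECT_RADIO = "direct_radio"        # flight control communication module -> drone by radio
    CELLULAR_V2X = "cellular_via_v2x"    # encrypted uplink via cockpit domain controller and V2X module

@dataclass
class CellTower:
    position: tuple            # (latitude, longitude); assumed representation
    signal_quality: float      # 0.0 .. 1.0; assumed normalized quality metric

def choose_transmission_mode(distance_m: float, towers: list,
                             radio_range_m: float = 2000.0) -> TransmissionMode:
    """Pick direct radio when the drone is within assumed radio range; otherwise fall
    back to the cellular/V2X path if at least one tower has acceptable signal quality."""
    if distance_m <= radio_range_m:
        return TransmissionMode.DIRECT_RADIO
    usable = [t for t in towers if t.signal_quality >= 0.5]
    if usable:
        return TransmissionMode.CELLULAR_V2X
    # No good cellular coverage: stay on radio and rely on link-layer retries.
    return TransmissionMode.DIRECT_RADIO

print(choose_transmission_mode(3500.0, [CellTower((29.87, 121.55), 0.8)]))
```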
The unmanned aerial vehicle flies along the flight route set by the unmanned aerial vehicle flight control communication module; under the control of the control instruction, its onboard equipment collects the corresponding downlink data and transmits it to the flight control communication module. The flight control communication module decodes the downlink data and transmits it to the data transmission control unit. If the downlink data is image data, the data transmission control unit dispatches it to an image application program, which processes it and displays it on the human-machine interaction equipment of the intelligent cabin system; if the downlink data is non-image data, the data transmission control unit dispatches it to the corresponding non-image application program, which processes it and displays it on the human-machine interaction equipment.
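The downlink dispatch just described amounts to routing decoded packets by type. The sketch below is a hedged illustration; the DownlinkPacket structure, the type labels and the application interfaces are assumptions, not definitions from the patent.

```python
# Hypothetical sketch of downlink-data dispatch by the data transmission control unit.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class DownlinkPacket:
    data_type: str      # e.g. "image", "radar_distance", "signal_quality" (assumed labels)
    payload: bytes

class DataTransmissionControlUnit:
    def __init__(self) -> None:
        # Application programs that render data on the human-machine interaction equipment.
        self.image_app: Callable[[bytes], None] = lambda p: print(f"display image ({len(p)} bytes)")
        self.non_image_apps: Dict[str, Callable[[bytes], None]] = {
            "radar_distance": lambda p: print("display distance overlay"),
            "signal_quality": lambda p: print("display signal-quality indicator"),
        }

    def dispatch(self, packet: DownlinkPacket) -> None:
        """Route decoded downlink data to the image application or the matching non-image application."""
        if packet.data_type == "image":
            self.image_app(packet.payload)
        else:
            handler = self.non_image_apps.get(packet.data_type)
            if handler is None:
                print(f"no application registered for type {packet.data_type!r}")
            else:
                handler(packet.payload)

unit = DataTransmissionControlUnit()
unit.dispatch(DownlinkPacket("image", b"\x89PNG..."))
unit.dispatch(DownlinkPacket("radar_distance", b"\x00\x10"))
```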
The system architecture and implementation environment of the unmanned aerial vehicle control method based on the intelligent cabin system provided by the present application have been introduced above. It should be understood that an implementation of the method may include some or all of the above components and modules, which is not limited in the present application.
The following describes in detail the unmanned aerial vehicle control method and system based on the intelligent cabin system provided in the present application. It should be noted that, in the following embodiments, content that is the same as or similar to the scheme shown in fig. 1 may refer to the description of fig. 1 and is not repeated below.
Example one
The embodiment of the application provides an unmanned aerial vehicle control method based on an intelligent cabin system. As shown in fig. 2, the intelligent cabin system includes a cockpit domain controller and an unmanned aerial vehicle flight control communication module, and the method includes the following steps:
S11, the cockpit domain controller determines the onboard equipment that the unmanned aerial vehicle needs to activate according to the acquired perception information, and generates a control instruction for the onboard equipment according to the acquired control requirement.
As can be seen from fig. 1, the information perception system of the intelligent cabin system is used to collect the perception information and includes the acousto-optic-electric interaction module, the audio-video acquisition module and the V2X module. The acousto-optic-electric interaction module collects acousto-optic-electric interaction information and includes the central control screen, other display screens, the audio system, the AR HUD and the AR glasses, on which buttons are provided, for example physical buttons or virtual buttons; the user produces acousto-optic-electric interaction information by pressing a button, and the interaction type is explicit in such information, for example the user calling map software through the central control screen, a yes/no selection shown on the central control screen, the volume up/down buttons of the audio system, or the mode selection and power buttons of the AR glasses. The audio-video acquisition module may specifically include a video acquisition unit and an audio acquisition unit. The video acquisition unit collects image data of the occupants, from which sitting-posture state information and/or face state information of the occupants is perceived and uploaded to the multi-modal perception decision module; for example, a Driver Monitoring System (DMS) infrared camera collects image data of the driver and perceives the driver's sitting-posture and/or face state from it. The audio acquisition unit collects sound data of the occupants. The information collected by the audio-video acquisition module is more varied than that of the acousto-optic-electric interaction module, and interaction with the occupants is therefore more intelligent and diversified; for example, if the video acquisition unit detects that the driver currently looks bored, the audio acquisition unit can further collect what the driver says, or a conversation can be carried out with the driver. The V2X information collected by the V2X module includes, but is not limited to, the number, positions, speeds and distances of vehicles around the host vehicle, obstacle information, and environment information related to driving such as the road surface flatness and speed limit of the current road; V2X information is objective data that is not influenced by the vehicle occupants.
The cockpit domain controller includes the unmanned aerial vehicle control fusion module, the multi-modal perception decision module and the IVI system control module. After the information perception system obtains the perception information, it transmits the information to the IVI system control module or the multi-modal perception decision module in the cockpit domain controller: the acousto-optic-electric interaction information is transmitted to the IVI system control module for analysis, the audio-video information is transmitted to the multi-modal perception decision module, and the V2X information is transmitted through the IVI system control module to the multi-modal perception decision module for decision analysis. Finally the unmanned aerial vehicle control fusion module determines the onboard equipment that the unmanned aerial vehicle needs to activate.
After the onboard equipment is determined, the unmanned aerial vehicle control fusion module generates the control instruction for the onboard equipment according to the acquired control requirement. The control requirement differs between different onboard equipment: for the camera, it may be multi-angle shooting or zoomed-in shooting; for the radar, it is obtaining the distance to surrounding vehicles, obstacles or pedestrians; for the communication repeater, it is obtaining a signal of better quality.
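The per-device control requirements listed above can be expressed as a small mapping from onboard equipment to a control instruction. The following Python sketch is illustrative only; the instruction fields and device names are assumptions, not definitions from the patent.

```python
# Hedged sketch: turning a control requirement into a device-specific control instruction.
# Field names ("multi_angle", "zoom", "report_interval_s", ...) are assumed for illustration.
from typing import Dict, Any

def build_control_instruction(device: str, requirement: Dict[str, Any]) -> Dict[str, Any]:
    """Generate a device-specific control instruction from an abstract control requirement."""
    if device == "camera":
        # Requirement examples from the description: multi-angle shooting or zoomed-in shooting.
        return {"device": "camera",
                "multi_angle": requirement.get("multi_angle", True),
                "zoom": requirement.get("zoom", 1.0)}
    if device == "radar":
        # Requirement: distances to surrounding vehicles, obstacles or pedestrians.
        return {"device": "radar",
                "targets": ["vehicle", "obstacle", "pedestrian"],
                "report_interval_s": requirement.get("report_interval_s", 0.5)}
    if device == "communication_repeater":
        # Requirement: obtain a signal of better quality.
        return {"device": "communication_repeater",
                "min_signal_quality": requirement.get("min_signal_quality", 0.7)}
    raise ValueError(f"unknown onboard device: {device}")

# Example: a multi-angle shooting requirement for the camera.
print(build_control_instruction("camera", {"multi_angle": True, "zoom": 2.0}))
```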
S12, the cockpit domain controller determines the reference point of the unmanned aerial vehicle during flight.
The IVI system control module holds the driving route information of the vehicle. The unmanned aerial vehicle control fusion module obtains this route information from the IVI system control module and determines the reference point of the unmanned aerial vehicle during flight according to it. A reference point is a position the vehicle passes while driving; it may be a specific point, such as a bus stop whose position is fairly precise, or an approximate range, such as a building or a lake, as long as the unmanned aerial vehicle passes it.
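One way to represent the distinction between an exact location and an approximate range is a single structure with an optional radius, as sketched below; the field names, coordinates and containment test are assumptions made for illustration.

```python
# Illustrative representation of a reference point: either a precise location
# (e.g. a bus stop) or an approximate range (e.g. a building or a lake).
from dataclasses import dataclass
import math

@dataclass
class ReferencePoint:
    name: str
    lat: float
    lon: float
    radius_m: float = 0.0   # 0 means a specific point; > 0 means "anywhere within this range"

    def is_satisfied_by(self, lat: float, lon: float) -> bool:
        """True if a drone position counts as having passed this reference point."""
        # Rough flat-earth distance approximation, adequate for short ranges.
        dx = (lon - self.lon) * 111_320 * math.cos(math.radians(self.lat))
        dy = (lat - self.lat) * 110_540
        return math.hypot(dx, dy) <= max(self.radius_m, 25.0)  # 25 m tolerance for exact points

bus_stop = ReferencePoint("bus stop", 29.8683, 121.5440, radius_m=0.0)
lake = ReferencePoint("lake", 29.8700, 121.5500, radius_m=300.0)
print(bus_stop.is_satisfied_by(29.8684, 121.5441), lake.is_satisfied_by(29.8712, 121.5495))
```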
S13, the cockpit domain controller transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module.
S14, the unmanned aerial vehicle flight control communication module generates a flight route according to the reference point.
After both the control instruction and the reference point are acquired, the unmanned aerial vehicle control fusion module transmits them to the unmanned aerial vehicle flight control communication module, which generates the flight route according to the reference point. The obstacles the unmanned aerial vehicle may encounter during flight, or which route is most convenient, sometimes cannot be judged by a person, so a flight route determined by the flight control communication module from the reference point is better suited to the unmanned aerial vehicle.
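A minimal sketch of how the flight control communication module might expand the reference points into a concrete flight route is given below, assuming simple waypoint interpolation; real route generation with obstacle avoidance and altitude planning is not specified in the patent and is omitted here.

```python
# Hedged sketch: expanding reference points into a denser list of waypoints the drone can follow.
# The interpolation step and the default spacing are assumptions for illustration.
from typing import List, Tuple

Waypoint = Tuple[float, float]  # (lat, lon)

def generate_flight_route(reference_points: List[Waypoint], spacing_deg: float = 0.0005) -> List[Waypoint]:
    """Linearly interpolate between consecutive reference points so the route is smooth."""
    if len(reference_points) < 2:
        return list(reference_points)
    route: List[Waypoint] = [reference_points[0]]
    for (lat0, lon0), (lat1, lon1) in zip(reference_points, reference_points[1:]):
        steps = max(1, int(max(abs(lat1 - lat0), abs(lon1 - lon0)) / spacing_deg))
        for i in range(1, steps + 1):
            t = i / steps
            route.append((lat0 + t * (lat1 - lat0), lon0 + t * (lon1 - lon0)))
    return route

refs = [(29.8683, 121.5440), (29.8700, 121.5500), (29.8725, 121.5555)]
print(len(generate_flight_route(refs)), "waypoints generated")
```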
S15, the unmanned aerial vehicle flight control communication module processes the flight route and the control instruction to obtain uplink data, so that the intelligent cabin system sends the uplink data to the unmanned aerial vehicle.
The unmanned aerial vehicle flight control communication module processes the flight route and the control instruction to obtain the uplink data, and the intelligent cabin system sends the uplink data to the unmanned aerial vehicle. The uplink data is data the unmanned aerial vehicle can recognize, so after receiving it the unmanned aerial vehicle can process it promptly and fly along the flight route under the control of the control instruction, which improves the efficiency of controlling the unmanned aerial vehicle.
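One plausible way to package the flight route and the control instruction into a single uplink payload the drone can parse is sketched below; the JSON framing, the version field and the CRC check are assumptions for illustration, not the patent's actual encoding.

```python
# Illustrative uplink-data synthesis: serialize the flight route and control instruction
# into one length-prefixed, checksummed payload. Encoding details are assumed.
import json
import zlib
from typing import List, Tuple, Dict, Any

def synthesize_uplink(route: List[Tuple[float, float]], instruction: Dict[str, Any]) -> bytes:
    body = json.dumps({"version": 1, "flight_route": route, "control_instruction": instruction},
                      separators=(",", ":")).encode("utf-8")
    checksum = zlib.crc32(body).to_bytes(4, "big")   # lets the drone verify integrity
    return len(body).to_bytes(4, "big") + body + checksum

def parse_uplink(packet: bytes) -> Dict[str, Any]:
    length = int.from_bytes(packet[:4], "big")
    body, checksum = packet[4:4 + length], packet[4 + length:4 + length + 4]
    assert zlib.crc32(body).to_bytes(4, "big") == checksum, "corrupted uplink data"
    return json.loads(body)

pkt = synthesize_uplink([(29.8683, 121.5440), (29.8700, 121.5500)],
                        {"device": "camera", "multi_angle": True})
print(parse_uplink(pkt)["control_instruction"])
```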
In the present application, the unmanned aerial vehicle and the intelligent cabin system are combined on the basis of the cockpit domain controller in the intelligent cabin system. On the one hand, the cockpit domain controller is improved: it determines the onboard equipment that the unmanned aerial vehicle needs to activate according to the acquired perception information and generates the control instruction for the onboard equipment according to the acquired control requirement, so neither the activated onboard equipment nor the control instruction needs to be set manually, making control of the unmanned aerial vehicle more intelligent; the cockpit domain controller also determines the reference point of the unmanned aerial vehicle during flight and transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module, which generates the flight route from the reference point, avoiding the problem that a manually set flight route may be unsuitable for the unmanned aerial vehicle. On the other hand, the unmanned aerial vehicle flight control communication module is provided; it processes the flight route and the control instruction into uplink data that the unmanned aerial vehicle can recognize, so the unmanned aerial vehicle can process the uplink data promptly after receiving it and fly along the flight route under the control of the control instruction, improving control efficiency.
In a preferred embodiment, the cockpit domain controller determining the onboard equipment that the unmanned aerial vehicle needs to activate according to the acquired perception information includes: the cockpit domain controller sets a primary task scene of the unmanned aerial vehicle according to the acquired perception information, and determines the onboard equipment that the unmanned aerial vehicle needs to activate in the primary task scene. The cockpit domain controller determining the reference point of the unmanned aerial vehicle during flight includes: the cockpit domain controller determines the reference point of the unmanned aerial vehicle during flight in the primary task scene. The primary task scene is any one of a traffic scene, an entertainment scene and a communication relay scene.
In this embodiment, determining the primary task scene is the first step of the mission planning of the unmanned aerial vehicle: it must first be determined what type of task the unmanned aerial vehicle needs to execute, and this step directly affects the choice of onboard equipment and reference point. For example, the traffic scene can be used for displaying live traffic, guiding the driving process and exploring routes; in this mode the intelligent cabin system mainly obtains traffic information through the unmanned aerial vehicle, so the main task of the unmanned aerial vehicle is to collect traffic video and the distances to surrounding vehicles, pedestrians or obstacles. In the entertainment scene, the intelligent cabin system mainly obtains image information from different viewing angles through the unmanned aerial vehicle, so the main task is image acquisition. In the communication relay mode, the intelligent cabin system mainly obtains signals through the unmanned aerial vehicle, so the main task of the unmanned aerial vehicle is to act as a signal relay platform.
The application scene setting unit in the unmanned aerial vehicle control fusion module includes a primary task scene setting subunit. The information perception system acquires the perception information and transmits it to the cockpit domain controller, and the cockpit domain controller sets the primary task scene of the unmanned aerial vehicle according to the acquired perception information, for example:
the system comprises an acoustic-optical-electric interaction module, an IVI system control module, a primary task scene setting sub-unit and an unmanned aerial vehicle control sub-unit, wherein the acoustic-optical-electric interaction module comprises a central control screen, other display screens, a sound box, an AR HUD and AR glasses, passengers in the vehicle call a navigation APP through the central control screen, the navigation APP determines that the congestion condition needs to be checked through the navigation APP, an electric signal is generated at the moment, the IVI system control module processes the electric signal, the primary task scene setting sub-unit determines that the user needs to acquire the traffic road condition at the moment according to the electric signal, and then the current primary task scene of the unmanned aerial vehicle is set to be a traffic road condition mode; after a passenger in the vehicle wears the AR glasses, the AR glasses are driven to move through head steering, an electric signal is generated at the moment, the IVI system control module processes the electric signal, then the primary task scene setting subunit determines that the user needs to acquire shooting information of other angles at the moment according to the electric signal, and then the current primary task scene of the unmanned aerial vehicle is set to be an entertainment exploration mode; the passengers in the vehicle determine the use requirements through voice interaction, for example, the signal quality of a mobile phone needs to be improved, the IVI system control module processes the signal according to the signal, and then the primary task scene setting module subunit sets the current primary task scene of the unmanned aerial vehicle into a communication relay mode according to the sound signal.
The audio-video acquisition module may specifically include a video acquisition unit and an audio acquisition unit. The video acquisition unit collects image data of the occupants, from which sitting-posture state information and/or face state information is perceived and uploaded to the multi-modal perception decision module. The multi-modal perception decision module is configured with a decision algorithm and, combining system parameters obtained from the IVI system, intelligently analyzes the occupants' requirements; the IVI system control module processes the requirement and the primary task scene setting subunit then determines the corresponding primary task scene. For example, if the driver's arm rests on the window and the face shows boredom, the system analyzes from the collected video information that the driver is waiting anxiously and asks whether the driver needs to check the road conditions ahead; the driver replies yes, this information is transmitted to the multi-modal perception decision module, the user requirement is analyzed as checking the traffic conditions and uploaded to the IVI system control module, the unmanned aerial vehicle control fusion module obtains this requirement as needing to use the unmanned aerial vehicle, and the primary task scene setting subunit therefore determines that the current scene is a traffic scene. If the video acquisition unit finds the front passenger constantly playing with a mobile phone, it can be inferred that the passenger is bored; the system asks whether the passenger wants to view the scenery along the way, the passenger replies yes, the information is transmitted to the multi-modal perception decision module, the user requirement is analyzed as entertainment exploration and uploaded to the IVI system control module, the unmanned aerial vehicle control fusion module obtains this requirement as needing to use the unmanned aerial vehicle, and the primary task scene setting subunit therefore determines that the current scene is an entertainment scene.
The V2X module collects V2X information by communicating with the V2X vehicle network; the information is sent to the multi-modal perception decision module through the IVI system control module and then passed to the primary task scene setting subunit. The V2X information includes, but is not limited to, the number, positions, speeds and distances of vehicles around the host vehicle, obstacle information, and environment information related to driving such as the road surface flatness and speed limit of the current road. For example, when the V2X module detects that the flatness of the current road surface is poor, a route exploration requirement can be sent to the unmanned aerial vehicle control fusion module through the multi-modal perception decision module, and the primary task scene setting subunit sets the primary task scene to the traffic scene according to that requirement. After the primary task scene is determined, the onboard equipment control unit in the unmanned aerial vehicle control fusion module determines the onboard equipment that the unmanned aerial vehicle needs to activate in that scene. In a traffic scene, the corresponding onboard equipment is the camera and the radar: pictures shot by the camera of the unmanned aerial vehicle let occupants directly perceive live traffic information, which can be used to judge the direction of travel or to explore a new route for the occupants, while the radar of the unmanned aerial vehicle obtains distance information to surrounding vehicles or obstacles, which helps the driver drive safely. In an entertainment scene, the corresponding onboard equipment is the camera: occupants who want more image information can shoot through the camera of the unmanned aerial vehicle, whether for vehicle-mounted shooting or outdoor exploration shooting. In a communication relay scene, the corresponding onboard equipment is the communication repeater: when the current communication signal is weak, the unmanned aerial vehicle is activated as a communication relay to improve signal quality.
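The mapping from primary task scene to onboard equipment described above can be captured in a small lookup table; the identifiers in the sketch below are assumptions made for illustration.

```python
# Hedged sketch of the scene-to-equipment mapping summarized in this paragraph.
PRIMARY_SCENE_EQUIPMENT = {
    "traffic": ["camera", "radar"],                      # live traffic pictures + distances to vehicles/obstacles
    "entertainment": ["camera"],                         # multi-angle image information
    "communication_relay": ["communication_repeater"],   # drone acts as a signal relay platform
}

def onboard_equipment_for(primary_scene: str) -> list:
    try:
        return PRIMARY_SCENE_EQUIPMENT[primary_scene]
    except KeyError:
        raise ValueError(f"unknown primary task scene: {primary_scene}") from None

print(onboard_equipment_for("traffic"))   # ['camera', 'radar']
```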
In the present application, the reference point passed by the unmanned aerial vehicle differs between primary task scenes. In a traffic scene the user mainly wants the traffic conditions on the road ahead, so the unmanned aerial vehicle flies along the travelling direction of the vehicle as much as possible; in an entertainment scene the user wants scenery from more viewing angles, so the flying range of the unmanned aerial vehicle is wider; in a communication relay scene the user wants better communication quality, so the unmanned aerial vehicle preferably flies according to the positions of the signal towers.
In a preferred embodiment, the cockpit domain controller setting the primary task scene of the unmanned aerial vehicle according to the acquired perception information includes: an IVI system control module or a multi-modal perception decision module in the cockpit domain controller determines the usage requirement for the unmanned aerial vehicle according to the perception information, and an unmanned aerial vehicle control fusion module in the cockpit domain controller sets the primary task scene according to the usage requirement. In this embodiment, the setting process of the primary task scene is shown in fig. 3. Specifically, after the information perception system obtains the perception information, it transmits the information to the IVI system control module or the multi-modal perception decision module in the cockpit domain controller: the acousto-optic-electric interaction information is transmitted to the IVI system control module, which analyzes the usage requirement from it; the unmanned aerial vehicle control fusion module obtains the usage requirements related to the unmanned aerial vehicle from the IVI system control module, and the primary task scene setting subunit determines the primary task scene according to the usage requirement. The audio-video information is transmitted to the multi-modal perception decision module, and the V2X information is transmitted to the multi-modal perception decision module through the IVI system control module for decision analysis; the multi-modal perception decision module analyzes the usage requirement from the audio-video or V2X information and transmits it to the IVI system control module, the unmanned aerial vehicle control fusion module obtains the usage requirements related to the unmanned aerial vehicle from the IVI system control module, and the primary task scene setting subunit determines the primary task scene according to the usage requirement.
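The fig. 3 flow, from perception information to usage requirement to primary task scene, can be sketched as follows; the requirement labels and the keyword matching are assumptions made for illustration, not the patent's actual decision algorithm.

```python
# Illustrative sketch of the fig. 3 flow: a usage requirement is extracted from perception
# information, then the primary task scene is set from that requirement.
def analyze_usage_requirement(info_kind: str, content: str) -> str:
    """The IVI system control module handles acousto-optic-electric info; the multi-modal
    perception decision module handles audio-video and V2X info (simplified here)."""
    text = content.lower()
    if "road condition" in text or "congestion" in text or "rough road" in text:
        return "check_traffic"
    if "scenery" in text or "shoot" in text:
        return "entertainment_exploration"
    if "signal" in text:
        return "improve_signal_quality"
    return "none"

def set_primary_task_scene(usage_requirement: str) -> str:
    """Primary task scene setting subunit in the drone control fusion module."""
    return {"check_traffic": "traffic",
            "entertainment_exploration": "entertainment",
            "improve_signal_quality": "communication_relay"}.get(usage_requirement, "none")

# Example: a voice interaction captured by the acousto-optic-electric interaction module.
req = analyze_usage_requirement("acousto_optic_electric", "Please check the road condition ahead")
print(set_primary_task_scene(req))   # traffic
```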
For example, an occupant says directly, through voice interaction on the central control screen, that the road conditions ahead need to be checked. The IVI system control module determines from this voice interaction that the usage requirement is a traffic requirement, namely checking the road conditions ahead; the unmanned aerial vehicle control fusion module obtains this requirement, the primary task scene setting subunit determines that the user currently needs traffic information, and the current primary task scene of the unmanned aerial vehicle is set to the traffic scene.
The audio-video acquisition module picks up a conversation between vehicle occupants in which they mention that the mobile phone signal is poor. The content is uploaded to the multi-modal perception decision module, which analyzes the usage requirement as improving mobile phone signal quality; the unmanned aerial vehicle control fusion module obtains this requirement, the primary task scene setting subunit determines that the user needs better mobile phone signal quality, and the current primary task scene of the unmanned aerial vehicle is set to the communication relay scene.
When the V2X module detects that the flatness of the current road surface is poor, the content is sent to the multi-modal perception decision module, which analyzes that another route is needed and sends a route exploration requirement to the IVI system control module; the unmanned aerial vehicle control fusion module obtains this requirement, the primary task scene setting subunit determines that the user needs route exploration, and the current primary task scene of the unmanned aerial vehicle is set to the traffic scene.
In the present application, the primary task scene of the unmanned aerial vehicle is determined before the control instruction and the reference point, and the onboard equipment and the reference point are then determined according to the primary task scene. The control instruction and the flight route of the unmanned aerial vehicle therefore differ between application scenes, so the user's demands on the unmanned aerial vehicle are distinguished and control of the unmanned aerial vehicle fits the user requirement more closely.
In a preferred embodiment, one or more secondary task scenes are provided in each primary task scene, and the cockpit domain controller determining the onboard equipment that the unmanned aerial vehicle needs to activate in the primary task scene includes: the cockpit domain controller determines the secondary task scene corresponding to the unmanned aerial vehicle in the primary task scene, and determines the onboard equipment that the unmanned aerial vehicle needs to activate in the secondary task scene. In this embodiment, the application scene setting unit further includes a secondary task scene setting subunit; specifically, the secondary task scene setting subunit determines, according to the usage requirement, the secondary task scene corresponding to the unmanned aerial vehicle in the primary task scene and the onboard equipment that needs to be activated in that secondary task scene.
The secondary task scenes in the traffic scene are a traffic live-action display mode, a driving guidance mode and a route exploration mode, meeting the occupants' needs for live traffic display, driving guidance and route exploration; the secondary task scenes in the entertainment scene are a vehicle-mounted shooting mode, an outdoor exploration mode and a game mode, meeting the occupants' entertainment needs for shooting, exploration and games.
The onboard equipment corresponding to each secondary task scene is more specific and better matches the user's needs; the correspondence between secondary task scenes and onboard equipment is shown in Table 1 below:
TABLE 1
(Table 1 is reproduced as an image in the original publication; its recoverable content maps each secondary task scene to its onboard equipment: the A1 traffic live-action display mode uses a camera, the A2 driving guidance mode uses a camera and a radar, the A3 route exploration mode uses a camera, and the communication relay scene uses a communication repeater.)
In this embodiment, in the A1 traffic live-action display mode the user mainly wants the traffic scene displayed, so the corresponding onboard device of the drone is a camera. In the A2 driving guidance mode the user wants the drone to assist driving, so the corresponding onboard devices are a camera and a radar: the camera captures images for the driver to watch, and the drone's radar acquires distance information to surrounding vehicles or obstacles, which helps the driver drive safely. In the A3 route exploration mode the corresponding onboard device is a camera, which captures images of multiple routes ahead for the driver's reference. The onboard device used in the communication relay scene is a communication repeater.
The whole process of determining the onboard equipment is shown in fig. 3. First, the usage requirement of the drone is determined: on the one hand, the IVI system control module determines it from the acousto-optic-electric interaction information, for example touch input obtained through APP interaction on the IVI system or voice input obtained through voice interaction with the IVI system; on the other hand, the multi-modal perception decision module determines it from audio-video information or V2X information. Second, the primary scene setting subunit determines the primary task scene according to the usage requirement, the primary task scene being one of a traffic scene, an entertainment scene and a communication relay scene. Third, the secondary scene setting subunit determines the secondary task scene according to the usage requirement. Finally, the corresponding onboard equipment of the drone is determined from the secondary task scene according to Table 1.
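The following minimal sketch illustrates this two-level selection; it is not part of the original disclosure, and the names (for example `PRIMARY_SCENES`, `ONBOARD_EQUIPMENT`, the string labels for the scenes) are hypothetical stand-ins for the Table 1 mapping:

```python
from enum import Enum, auto

class PrimaryScene(Enum):
    TRAFFIC = auto()
    ENTERTAINMENT = auto()
    COMM_RELAY = auto()

# Hypothetical encoding of the Table 1 mapping: secondary task scene -> onboard equipment.
# (Entertainment modes are assumed to use the camera, as implied by the description.)
ONBOARD_EQUIPMENT = {
    "traffic_live_view":    ["camera"],
    "driving_guidance":     ["camera", "radar"],
    "route_exploration":    ["camera"],
    "in_vehicle_shooting":  ["camera"],
    "outdoor_exploration":  ["camera"],
    "game":                 ["camera"],
    "communication_relay":  ["communication_repeater"],
}

def select_primary_scene(usage_requirement: str) -> PrimaryScene:
    """Map the usage requirement reported by the IVI / multi-modal modules to a primary scene."""
    if usage_requirement in ("traffic_live_view", "driving_guidance", "route_exploration"):
        return PrimaryScene.TRAFFIC
    if usage_requirement == "improve_phone_signal":
        return PrimaryScene.COMM_RELAY
    return PrimaryScene.ENTERTAINMENT

def select_secondary_scene(primary: PrimaryScene, usage_requirement: str) -> str:
    """Pick the secondary task scene under the primary scene (one-to-one here for simplicity)."""
    if primary is PrimaryScene.COMM_RELAY:
        return "communication_relay"
    return usage_requirement  # e.g. "driving_guidance" or "outdoor_exploration"

def onboard_devices_for(usage_requirement: str) -> list[str]:
    primary = select_primary_scene(usage_requirement)
    secondary = select_secondary_scene(primary, usage_requirement)
    return ONBOARD_EQUIPMENT[secondary]

# Example: the V2X module reports a rough road, so the requirement is route exploration.
print(onboard_devices_for("route_exploration"))  # ['camera']
```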
In a preferred embodiment, the cockpit domain controller determining the reference point of the drone during flight under the primary task scene includes: if the primary task scene is a traffic scene, the cockpit domain controller determines position points arranged on the road along the vehicle's direction of travel as the reference points; if the primary task scene is an entertainment scene, the cockpit domain controller determines points of interest as the reference points; or, if the primary task scene is a communication relay scene, the cockpit domain controller determines the positions of signal towers near the vehicle's direction of travel as the reference points.
In this embodiment, the flight route planning unit obtains the navigation route from the navigation system in the IVI system control module, and the reference points obtained under different primary task scenes differ:
In the traffic scene, the current position of the vehicle is the starting point, position points arranged on the road along the vehicle's direction of travel are the reference points, and the expected position of the vehicle when the drone returns is the end point. In the traffic mode the drone needs to capture traffic images or acquire distance information to surrounding obstacles, vehicles and pedestrians, and should follow the vehicle's navigation map as closely as possible. Therefore, by densely selecting position points on the road ahead of the vehicle as reference points and sending them to the drone flight control communication module, the planned flight route deviates little from the vehicle's navigation route.
In the entertainment scene, the current position of the vehicle is the starting point, Points of Interest (POI) selected on the road along the vehicle's direction of travel are the reference points, and the expected position of the vehicle when the drone returns is the end point. In the entertainment exploration mode the drone needs to photograph the points of interest selected by the occupants, and during flight it only needs to pass through the POIs, so the degree of freedom is large. Therefore, after the POIs selected by the occupants are obtained and sent to the drone flight control communication module, the planned flight route can deviate from the vehicle's navigation route relatively flexibly.
In the communication relay scene, the current position of the vehicle is the starting point, the positions of signal towers near the vehicle's direction of travel are the reference points, and the expected position of the vehicle when the drone returns is the end point. In the communication relay mode the drone seeks positions with good communication quality and should pass as close as possible to the signal towers during flight, so the degree of freedom is small. Therefore, after the signal tower positions near the vehicle's direction of travel are obtained and sent to the drone flight control communication module, the planned flight route is constrained by the tower positions and deviates from the vehicle's navigation route only as needed.
After the flight route planning unit acquires the reference points, they are transmitted to the drone flight control communication module through the data transmission control unit. The drone flight control communication module is the module that controls the drone within the drone system; it is a black box provided by the drone system supplier and plans the drone's flight route.
Different reference points are selected under different primary task scenes, so the drone flight control communication module generates different flight routes from the reference points, and the degree of deviation between the flight route and the navigation route differs accordingly. This makes flight route planning more reasonable and flexible, the planned route better fits the user's needs, and the overall user experience is improved.
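As a rough illustration of the scene-dependent reference point selection described above (again not part of the original disclosure, with hypothetical types for the navigation route, POIs and signal towers):

```python
from dataclasses import dataclass

@dataclass
class Point:
    lat: float
    lon: float

def reference_points(primary_scene: str,
                     vehicle_position: Point,
                     navigation_route: list[Point],
                     selected_pois: list[Point],
                     nearby_towers: list[Point],
                     expected_return_position: Point) -> list[Point]:
    """The start point is the current vehicle position and the end point is the expected
    vehicle position at return; the intermediate reference points depend on the scene."""
    if primary_scene == "traffic":
        # Densely sample points on the road ahead so the flight route stays close
        # to the vehicle's navigation route (assumed: all route points are used).
        middle = list(navigation_route)
    elif primary_scene == "entertainment":
        # Only the occupant-selected POIs must be passed; the route is freer.
        middle = list(selected_pois)
    elif primary_scene == "communication_relay":
        # Fly near signal towers ahead of the vehicle to keep communication quality high.
        middle = list(nearby_towers)
    else:
        raise ValueError(f"unknown primary scene: {primary_scene}")
    return [vehicle_position, *middle, expected_return_position]
```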
In a preferred embodiment, generating control instructions for the onboard equipment according to the acquired control requirements includes: determining a control mode of the onboard equipment, the control mode including a fully automatic mode and a human-computer interaction mode; acquiring the control requirement for the onboard equipment in that control mode; and generating the control instruction for the onboard equipment according to the control requirement.
Fig. 4 shows the generation process of a control instruction for an onboard device. Different onboard devices have different control modes. For the camera, the main task is to acquire image data, which allows high flexibility: shooting parameters such as the shooting mode and shooting angle can be changed according to the actual situation and can be set by the occupants without materially affecting shooting accuracy, so the control mode can be either the human-computer interaction mode or the fully automatic mode. The radar mainly acquires distance data, which demands high accuracy; if the occupants set the radar parameters themselves, the acquired data may well be inaccurate and cause errors. Similarly, the communication repeater mainly handles communication signals, and its parameters cannot be set by the occupants. Therefore, only the fully automatic mode is available for the radar and the communication repeater. The control mode is determined mainly by the IVI system control module according to the collected acousto-optic-electric interaction information, or by the multi-modal perception decision module according to the collected audio-video information; in either case the drone control fusion module acquires the drone-related control mode, and the onboard equipment control unit acquires the control requirement in that control mode.
The acquired control requirements for the onboard equipment differ between control modes. In the fully automatic mode, the control requirements are generated automatically from the flight plan, and the camera, radar and communication repeater adjust their parameters accordingly, so the control instructions are also generated automatically; for example, in the entertainment exploration mode the camera automatically adjusts its shooting angle and shooting mode when passing building A, park B and lake C according to the flight plan. In the human-computer interaction mode, the IVI system control module determines the control requirement from the acousto-optic-electric interaction information collected by the acousto-optic-electric interaction module; the control requirement can be input acquired through the central control screen or other display screens, a voice command acquired through an audio device, or rotation information acquired through AR glasses. For example, an occupant may open the map APP on the central control screen and zoom in on building A, so the control requirement is to acquire more detail of building A, and the onboard equipment control unit determines a control instruction that adjusts the corresponding shooting parameters to capture more detail of building A. As another example, an occupant wearing AR glasses views the scenery of park B from multiple angles; turning the head rotates the AR glasses, so the control requirement is to adjust the camera's shooting angle, and the onboard equipment control unit determines a control instruction that adjusts the shooting angle to capture park B from different angles.
The onboard equipment control unit generates the corresponding control instruction according to the control requirement. For example, for the camera, if an occupant opens the map APP on the central control screen and zooms in on building A, the control requirement is to acquire detail of building A, so the control instruction adjusts the camera's magnification. As another example, if an occupant wearing AR glasses views the scenery of park B from multiple angles and head rotation turns the AR glasses, the control instruction rotates the camera.
In this application, the control mode is determined and the user's control requirement for the drone in that control mode is determined, and the control instruction for the onboard equipment is generated accordingly. This ensures that the data required by the user is collected while satisfying the user's various needs for controlling the drone's onboard equipment, making the use of the drone more diverse and more intelligent.
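One way to express this control-mode restriction and instruction generation, sketched here under the assumption of invented requirement and instruction dictionaries (none of which are defined in the patent), is:

```python
FULLY_AUTOMATIC = "fully_automatic"
HMI = "human_computer_interaction"

# Radar and communication repeater accept only the fully automatic mode;
# the camera accepts both (encoding assumed for illustration).
ALLOWED_MODES = {
    "camera": {FULLY_AUTOMATIC, HMI},
    "radar": {FULLY_AUTOMATIC},
    "communication_repeater": {FULLY_AUTOMATIC},
}

def generate_control_instruction(device: str, mode: str, requirement: dict) -> dict:
    if mode not in ALLOWED_MODES[device]:
        raise ValueError(f"{device} does not support mode {mode}")
    if mode == FULLY_AUTOMATIC:
        # Parameters follow the flight plan, e.g. shooting angle per waypoint.
        return {"device": device, "source": "flight_plan", **requirement}
    # HMI mode: requirement comes from the central control screen, voice or AR glasses.
    if requirement.get("action") == "zoom_on_poi":          # e.g. occupant zooms in on building A
        return {"device": "camera", "set_magnification": requirement["zoom_level"]}
    if requirement.get("action") == "ar_glasses_rotation":  # occupant turns head with AR glasses
        return {"device": "camera", "rotate_to": requirement["yaw_deg"]}
    raise ValueError("unsupported control requirement")

# Example: the AR glasses report a 30-degree head turn.
print(generate_control_instruction("camera", HMI,
                                   {"action": "ar_glasses_rotation", "yaw_deg": 30.0}))
```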
In a preferred embodiment, the drone flight control communication module processing the flight route and the control instruction to obtain uplink data, so that the smart cockpit system sends the uplink data to the drone, includes: the drone flight control communication module synthesizes the reference points and the control instruction into uplink data; and the smart cockpit system transmits the uplink data to the drone in either of the following two data transmission modes: the drone flight control communication module sends the uplink data to the drone by radio; or the smart cockpit system further includes a V2X module, the drone flight control communication module encrypts the uplink data and sends it to the cockpit domain controller, and the cockpit domain controller sends the encrypted uplink data to the drone through the V2X module over a cellular network.
In this embodiment, as shown in fig. 5, the flight route planning unit in the drone control fusion module obtains the reference points corresponding to the primary task scene, and the onboard equipment control unit generates the control instruction according to the control requirement; these two steps can be performed simultaneously. The data transmission control unit then sends the reference points and the control instruction to the drone flight control communication module, which synthesizes them into uplink data. The data transmission control unit also needs to determine the data transmission mode between the drone flight control communication module and the drone.
The data transmission mode between the smart cockpit system and the drone is either of the following two modes:
the drone flight control communication module sends the uplink data to the drone by radio; or
the smart cockpit system further includes a V2X module, the drone flight control communication module encrypts the uplink data and sends it to the cockpit domain controller, and the cockpit domain controller sends the encrypted uplink data to the drone through the V2X module over a cellular network.
In the first mode, the conventional radio remote control mode is used: the drone flight control communication module and the antenna in the drone serve as the signal transmitting end and the signal receiving end, realizing direct control of the drone. This is the most direct, traditional remote control mode. The specific process is: after the drone flight control communication module synthesizes the uplink data, it converts the uplink data into a radio signal the drone can receive, so that the drone flies according to the planned flight route and its onboard equipment makes the corresponding adjustments under the control instructions.
In the second mode, the cellular network remote control mode is used, with the vehicle-mounted V2X module as the signal transmitting and receiving end. The specific process is: the uplink data is encrypted by the drone flight control communication module, transmitted by the data transmission control unit to the IVI system control module, and sent by the IVI system control module to the drone over the cellular network.
In this application, the smart cockpit system sends the uplink data to the drone through either of these data transmission modes, making data transmission between the drone and the smart cockpit system more diversified and widening the range of drone applications.
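A simplified sketch of the two uplink paths (radio versus encrypted cellular via the V2X module) follows; the function and object names stand in for the black-box flight control communication module and the vehicle modules, and are assumptions rather than interfaces defined in the patent:

```python
import json

def synthesize_uplink(reference_points: list, control_instructions: list) -> bytes:
    """Pack the reference points and control instructions into one uplink payload."""
    return json.dumps({"ref_points": reference_points,
                       "instructions": control_instructions}).encode("utf-8")

def send_uplink(payload: bytes, mode: str,
                radio_link, flight_control_module, cockpit_domain_controller) -> None:
    if mode == "radio":
        # Mode 1: the flight control communication module transmits directly by radio.
        radio_link.transmit(payload)                              # assumed interface
    elif mode == "cellular":
        # Mode 2: the flight control module encrypts, and the cockpit domain controller
        # forwards the encrypted payload to the drone through the V2X module over cellular.
        encrypted = flight_control_module.encrypt(payload)        # assumed interface
        cockpit_domain_controller.forward_via_v2x(encrypted)      # assumed interface
    else:
        raise ValueError(f"unknown transmission mode: {mode}")
```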
In a preferred embodiment, before the drone flight control communication module transmits the uplink data to the drone, the method further includes: the cockpit domain controller obtains the current distance between the vehicle equipped with the smart cockpit system and the drone, as well as the number, positions and signal quality of the cellular signal towers between the vehicle and the drone, and determines the data transmission mode according to the distance and the number, positions and signal quality of the signal towers.
In this application, which data transmission mode is used is determined by the cockpit domain controller. The cockpit domain controller obtains the current distance between the vehicle equipped with the smart cockpit system and the drone, and the number, positions and signal quality of the cellular signal towers between the vehicle and the drone. Specifically, the data transmission control unit in the drone control fusion module obtains this information from the IVI system control module and determines the data transmission mode according to the distance and the number, positions and signal quality of the signal towers.
The first mode, radio transmission, has a strict distance requirement and can generally guarantee stable control only within about two kilometres, whereas cellular transmission supports long-range transmission; if the distance exceeds two kilometres, the data transmission control unit automatically selects the second mode. The second mode in turn depends on the positions, number and quality of the cellular signal towers; when the towers are sparse and the drone is within radio transmission range, the data transmission control unit automatically selects the first mode. Only when both transmission modes meet their conditions can a human-computer interaction choice be made: the acousto-optic-electric interaction module reminds the user to choose, for example by displaying the choice of data transmission mode on the central control screen. If only one data transmission mode meets the requirements, the data transmission control unit selects it automatically, so the data transmission requirement is always satisfied. Furthermore, transmitting over the cellular network allows the drone to be controlled over long distances or even from another region: the drone can return to a designated location instead of returning to the vehicle, or a drone located elsewhere can be summoned to fly to the vehicle, enabling drone sharing.
In this application, the cockpit domain controller determines the data transmission mode according to the current distance between the vehicle equipped with the smart cockpit system and the drone and the number, positions and signal quality of the cellular signal towers between them, so the data transmission mode is determined more reasonably, the communication quality between the drone and the smart cockpit system is guaranteed, and the drone can fly and collect data according to the uplink data in time.
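The distance and signal-tower decision logic described above could be sketched as follows; the two-kilometre radio limit comes from the description, while the tower-quality threshold is an assumed placeholder:

```python
RADIO_RANGE_KM = 2.0  # stated in the description as the stable radio control range

def choose_transmission_mode(distance_km: float,
                             tower_count: int,
                             min_tower_quality: float,
                             ask_user=None) -> str:
    radio_ok = distance_km <= RADIO_RANGE_KM
    cellular_ok = tower_count > 0 and min_tower_quality >= 0.5  # assumed quality threshold
    if radio_ok and cellular_ok:
        # Both modes qualify: let the occupant choose via the acousto-optic-electric
        # interaction module (e.g. a prompt on the central control screen).
        return ask_user(["radio", "cellular"]) if ask_user else "radio"
    if cellular_ok:
        return "cellular"   # beyond radio range, fall back to the cellular network
    if radio_ok:
        return "radio"      # sparse or poor towers, but within radio range
    raise RuntimeError("no usable transmission mode")
```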
In a preferred embodiment, the method further includes: the drone flight control communication module acquires downlink data sent by the drone, the downlink data being obtained from the data collected by the drone's onboard equipment; the drone flight control communication module decodes the downlink data and transmits the decoded downlink data to the cockpit domain controller; if the downlink data is image data, the cockpit domain controller dispatches it to an image application program, which processes the downlink data and displays it on the human-computer interaction equipment of the smart cockpit system; and if the downlink data is non-image data, the cockpit domain controller dispatches it to the corresponding non-image application program, which processes the downlink data and displays it on the human-computer interaction equipment.
In this embodiment, besides determining the transmission mode, the data transmission control unit has another function: dispatching the downlink data. As shown in fig. 6, after the onboard equipment of the drone collects the corresponding downlink data, the drone sends it to the drone flight control communication module, which passes it to the data transmission control unit. The downlink transmission mode is the same as that of the uplink data: if the uplink data was sent by radio, the drone sends the downlink data to the drone flight control communication module by radio, and the drone flight control communication module decodes it; if the uplink data was sent over the cellular network, the drone sends the downlink data over the cellular network to the vehicle's V2X module, the IVI system control module passes it to the drone flight control communication module, and the drone flight control communication module decodes it. That is, the downlink data is sent in only one of the two modes.
After decoding the downlink data, the drone flight control communication module transmits it back to the data transmission control unit in the cockpit domain controller, which dispatches the data.
In the drone control fusion module, the APPs in the IVI system that need downlink data are all registered in the downlink data dispatch subunit, which provides an internal Application Program Interface (API). After the drone control fusion module receives the downlink data, the downlink data dispatch subunit dispatches it to each APP through the API according to the data dispatch protocol. Each APP then synthesizes and applies the drone perception data it needs and calls the human-computer interaction equipment in the IVI system, finally realizing human-computer interaction in audio-visual form.
The downlink data is mainly divided into image data and non-image data, and both can be dispatched simultaneously. The image data mainly comes from the onboard camera photographing the scene along the planned flight route. For image data, the data transmission control unit in the cockpit domain controller dispatches the downlink data to an image application program, which processes it and displays it on the human-computer interaction equipment of the smart cockpit system, such as the central control screen, other screens or AR glasses. On the one hand, the image data assists driving: it improves the driver's and occupants' intuitive perception of the road conditions, and the photographed traffic images help the driver understand the current traffic situation and ease the driver's anxiety. On the other hand, it improves entertainment: watching the scene outside the vehicle from the drone's viewpoint gives occupants a different in-vehicle sightseeing experience, and combined with AR technology the real scene outside the vehicle can be brought into an in-vehicle game scene, giving occupants an immersive, extraordinary ride experience. The non-image data mainly comes from the onboard millimetre-wave radar detecting vehicles ahead. For non-image data, the data transmission control unit in the cockpit domain controller dispatches the downlink data to the corresponding non-image application program, which processes it and displays it on the human-computer interaction equipment; for example, the navigation APP displays the synthesized result on the central control screen or the AR-HUD through the IVI system. On the one hand, the non-image data extends the detectable distance of the intelligent vehicle: by calculating the relative positions of the target, the drone and the vehicle, the vehicle can sense other vehicles beyond the range of its own sensors. On the other hand, it expands the detectable coverage of the vehicle: the aerial viewpoint can cover, without dead angles, the sensing blind spots that the vehicle's own sensors produce because of the road terrain, especially for vehicles travelling on undulating terrain or through curves.
In this application, the cockpit domain controller processes image data and non-image data separately, so the smart cockpit system handles the drone's downlink data more efficiently, the downlink data is displayed promptly after processing, and the user experience is improved.
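Finally, the downlink dispatch step (decode, then route image data and non-image data to the registered applications through the internal API) might look roughly like the sketch below; the class and callback names are hypothetical and only illustrate the registration-and-dispatch idea:

```python
class DownlinkDispatcher:
    """Downlink data dispatch subunit: APPs register for the data kinds they need."""

    def __init__(self):
        self._subscribers = {"image": [], "non_image": []}

    def register(self, kind: str, app_callback) -> None:
        # e.g. register("image", ar_glasses_app.show) or register("non_image", navigation_app.fuse)
        self._subscribers[kind].append(app_callback)

    def dispatch(self, decoded_frame: dict) -> None:
        kind = "image" if decoded_frame.get("type") == "camera_image" else "non_image"
        for app in self._subscribers[kind]:
            app(decoded_frame["payload"])

# Usage sketch: the flight control communication module decodes a frame and hands it over.
dispatcher = DownlinkDispatcher()
dispatcher.register("image", lambda img: print("show on central control screen,", len(img), "bytes"))
dispatcher.register("non_image", lambda d: print("fuse radar distance into navigation:", d))
dispatcher.dispatch({"type": "camera_image", "payload": b"...jpeg bytes..."})
dispatcher.dispatch({"type": "radar_distance", "payload": {"target_m": 42.5}})
```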
Example two
This embodiment provides a drone control system based on a smart cockpit system, which includes a cockpit domain controller and a drone flight control communication module; the cockpit domain controller and the drone flight control communication module are used to execute the method of the first embodiment.
It can be understood that, for content in this embodiment that is the same as or similar to the solutions shown in fig. 1 to fig. 6, reference may be made to the detailed description of those solutions, which is not repeated here.
The technical features of the above embodiments can be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above-mentioned embodiments express only several implementations of the present application and are described in relative detail, but they should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (9)

1. An unmanned aerial vehicle control method based on an intelligent cabin system is characterized in that the intelligent cabin system comprises a cockpit area controller and an unmanned aerial vehicle flight control communication module, and the method comprises the following steps:
the cockpit domain controller sets a primary task scene of the unmanned aerial vehicle according to the acquired perception information; the cockpit domain controller determines airborne equipment needing to be started by the unmanned aerial vehicle in the primary task scene, and generates a control instruction for the airborne equipment according to the acquired control requirement; the first-level task scene is any one of a traffic scene or a communication relay scene;
the cockpit domain controller determines a reference point of the unmanned aerial vehicle in the flight process under the primary task scene;
the cockpit area controller transmits the reference point and the control instruction to the unmanned aerial vehicle flight control communication module;
the unmanned aerial vehicle flight control communication module generates a flight route according to the reference point;
the unmanned aerial vehicle flight control communication module processes the flight route and the control instruction to obtain uplink data, so that the intelligent cabin system sends the uplink data to the unmanned aerial vehicle.
2. The method of claim 1, wherein the cockpit domain controller sets a primary mission scenario of the drone according to the obtained awareness information, comprising:
an IVI system control module or a multi-mode perception decision module in the cockpit area controller determines the use requirement of the unmanned aerial vehicle according to the perception information;
and an unmanned aerial vehicle control fusion module in the cockpit domain controller sets the primary task scene according to the use requirement.
3. The method of claim 2, wherein one or more secondary mission scenarios are provided under each of the primary mission scenarios, and wherein the cockpit domain controller determines on-board equipment to be enabled by the drone under the primary mission scenario comprises:
the cockpit domain controller determines a secondary task scene corresponding to the unmanned aerial vehicle under the primary task scene;
and the cockpit domain controller determines airborne equipment to be started by the unmanned aerial vehicle in the secondary task scene.
4. The method of claim 2, wherein the cockpit domain controller determining a reference point of the drone during flight in the primary mission scenario comprises:
if the primary task scene is the traffic scene, the cockpit domain controller determines a position point arranged on a road along the advancing direction of the vehicle as the reference point;
or,
and if the primary task scene is the communication relay scene, the cockpit area controller determines the position of a signal tower near the advancing direction of the vehicle as the reference point.
5. The method according to claim 1, wherein generating control instructions for the onboard equipment according to the acquired control requirements comprises:
determining a control mode of the airborne equipment, wherein the control mode comprises a full-automatic mode and a man-machine interaction mode;
acquiring a control demand for the airborne equipment in the control mode;
and generating a control command for the onboard equipment according to the control requirement.
6. The method of claim 1, wherein the processing of the flight route and the control command by the drone flight control communication module to obtain uplink data, so that the smart cabin system sends the uplink data to the drone, comprises:
the unmanned aerial vehicle flight control communication module synthesizes the reference point and the control instruction into the uplink data;
the intelligent cabin system transmits the uplink data to the unmanned aerial vehicle through any one of the following two data transmission modes:
the unmanned aerial vehicle flight control communication module sends the uplink data to the unmanned aerial vehicle in a radio mode;
the intelligent cockpit system further comprises a V2X module, the unmanned aerial vehicle flight control communication module encrypts the uplink data and sends the encrypted uplink data to the cockpit area controller, and the cockpit area controller sends the encrypted uplink data to the unmanned aerial vehicle through a cellular network via the V2X module.
7. The method of claim 6, wherein before the smart cabin system sends the uplink data to the drone, the method further comprises:
the cockpit area controller obtains the current distance between a vehicle provided with the intelligent cockpit system and the unmanned aerial vehicle and the number, the position and the signal quality of cellular signal towers between the vehicle and the unmanned aerial vehicle, and determines the data transmission mode according to the distance, the number, the position and the signal quality of the signal towers.
8. The method according to any one of claims 1 to 7, further comprising:
the unmanned aerial vehicle flight control communication module acquires downlink data sent by the unmanned aerial vehicle, wherein the downlink data is obtained based on data acquired by airborne equipment of the unmanned aerial vehicle;
the unmanned aerial vehicle flight control communication module decodes the downlink data and transmits the decoded downlink data to the cockpit area controller;
if the downlink data are image data, the cockpit domain controller distributes the downlink data to an image application program, and the image application program processes the downlink data and displays the downlink data in the human-computer interaction equipment of the intelligent cockpit system;
and if the downlink data are non-image data, the cockpit domain controller distributes the downlink data to a corresponding non-image application program, and the downlink data are processed by the non-image application program and then displayed in the human-computer interaction equipment.
9. An unmanned aerial vehicle control system based on an intelligent cabin system is characterized by comprising a cockpit area controller and an unmanned aerial vehicle flight control communication module;
wherein the cockpit domain controller and the drone flight control communication module are configured to perform the method of any of claims 1-8.
CN202110625300.9A 2021-06-04 2021-06-04 Unmanned aerial vehicle control method and system based on intelligent cabin system Active CN113093811B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110625300.9A CN113093811B (en) 2021-06-04 2021-06-04 Unmanned aerial vehicle control method and system based on intelligent cabin system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110625300.9A CN113093811B (en) 2021-06-04 2021-06-04 Unmanned aerial vehicle control method and system based on intelligent cabin system

Publications (2)

Publication Number Publication Date
CN113093811A CN113093811A (en) 2021-07-09
CN113093811B true CN113093811B (en) 2021-09-07

Family

ID=76664523

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110625300.9A Active CN113093811B (en) 2021-06-04 2021-06-04 Unmanned aerial vehicle control method and system based on intelligent cabin system

Country Status (1)

Country Link
CN (1) CN113093811B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113433874B (en) * 2021-07-21 2023-03-31 广东工业大学 Unmanned ship integrated control management system based on 5G
CN115941514A (en) * 2022-09-29 2023-04-07 国网浙江省电力有限公司桐乡市供电公司 Unmanned aerial vehicle self-inspection system and method based on 5G slicing technology

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107544493A (en) * 2017-08-03 2018-01-05 汤庆佳 One kind is based on unpiloted intelligent navigation method and its system
CN110347182A (en) * 2019-07-23 2019-10-18 广汽蔚来新能源汽车科技有限公司 Auxiliary driving device, system, unmanned plane and vehicle
CN111240351A (en) * 2018-11-27 2020-06-05 宝沃汽车(中国)有限公司 Method and device for controlling unmanned aerial vehicle based on vehicle, storage medium and vehicle
CN112731926A (en) * 2020-12-21 2021-04-30 东风商用车有限公司 Automatic driving method and system for commercial vehicle in mining area
CN112793481A (en) * 2021-01-11 2021-05-14 宁波均联智行科技股份有限公司 Cabin active safety method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170103659A1 (en) * 2015-10-10 2017-04-13 Xin Jin Method and system for implementing and enforcing a no-fly zone or prohibitive zone for drones and unmanned vehicles


Also Published As

Publication number Publication date
CN113093811A (en) 2021-07-09

Similar Documents

Publication Publication Date Title
CN113093811B (en) Unmanned aerial vehicle control method and system based on intelligent cabin system
CN107147709B (en) Team vehicle management method, vehicle-mounted terminal and server
EP3629309A2 (en) Drone real-time interactive communications system
CN111480194B (en) Information processing device, information processing method, program, display system, and moving object
TW201815096A (en) Method, apparatus and device for communication between vehicle and unmanned aerial vehicle, and operating system
CN106394545A (en) Driving system, unmanned vehicle and vehicle remote control terminal
CN110109639A (en) Multi-screen interaction method and onboard system
US11165955B2 (en) Album generation apparatus, album generation system, and album generation method
CN109445464B (en) Flight control method and flight control system
US11119633B2 (en) Information processing device and method
CN111016905A (en) Interaction method and system for automatic driving vehicle and driving remote control terminal
CN109917799A (en) The manned unmanned plane during flying method and system gone sightseeing in the air applied to scenic spot
JPWO2019116784A1 (en) Information processing equipment, mobiles, control systems, information processing methods and programs
US20170374636A1 (en) Local network for the simultaneous exchange of data between a drone and a plurality of user terminals
JP2020080542A (en) Image providing system for vehicle, server system, and image providing method for vehicle
CN110100461A (en) The method that direct communication is connect is established with other people vehicles in motor vehicle ambient enviroment
US11467572B2 (en) Moving object operation system, operation signal transmission system, moving object operation method, program, and recording medium
JP2020126666A (en) Mobile body operation system, operation signal transmission system, mobile body operation method, program, and recording medium
EP4123563A1 (en) Shared content service provision system using remote control of shared autonomous driving device
CN113774984A (en) Immersive remote control system and method for excavator
US20230269537A1 (en) Information Processing Apparatus, Information Processing Method, And Video Sound Output System
JP7314944B2 (en) Information processing device, information processing method, and video/audio output system
KR101577399B1 (en) System for remote control based on internet
US20230362485A1 (en) Camera service system and method
JP7447399B2 (en) Video distribution control device, video distribution control method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant