WO2023241139A1 - Intelligent carriage control method, controller, intelligent carriage, and storage medium - Google Patents


Info

Publication number
WO2023241139A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
millimeter wave
information
wave unit
augmented reality
Prior art date
Application number
PCT/CN2023/081396
Other languages
French (fr)
Chinese (zh)
Inventor
张永亮 (Zhang Yongliang)
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 (ZTE Corporation)
Publication of WO2023241139A1 publication Critical patent/WO2023241139A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement or adaptations of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/29Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area inside the vehicle, e.g. for viewing passengers or cargo
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • G06F3/1407General aspects irrespective of display type, e.g. determination of decimal point position, display with fixed or driving decimal point, suppression of non-significant zeros
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • This application relates to the field of vehicle equipment control technology, and specifically relates to an intelligent cockpit control method, a controller, an intelligent cockpit and a computer storage medium.
  • Current vehicle head-up display devices have weak processing capabilities, so the vehicle head-up display projector cannot process complex data when providing augmented reality services to users and can only project simple traffic information. Its function is limited to a single purpose, and it cannot further provide users with the driving information or entertainment content they need. Moreover, because it is limited by the processing power of the head-up display device, the projector runs slowly when providing augmented reality services, which degrades the user experience.
  • Embodiments of the present application provide an intelligent cockpit control method, a controller, an intelligent cockpit and a computer storage medium.
  • Embodiments of the present application provide a smart cockpit control method, which is applied to a smart cockpit. The smart cockpit includes a millimeter wave module, and the millimeter wave module is connected to a projection device.
  • The method includes: obtaining fusion information data and communication data, where the fusion information data includes monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes data exchanged between the millimeter wave module and devices inside and outside the smart cockpit; obtaining augmented reality data according to the fusion information data and the communication data; rendering the augmented reality data to obtain projection data; and sending the projection data to the projection device for projection display.
  • Embodiments of the present application provide an intelligent cockpit controller, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor. When the processor executes the computer program, it implements the intelligent cockpit control method described in any embodiment of the first aspect.
  • Embodiments of the present application provide a smart cockpit, including the controller described in the second aspect.
  • Embodiments of the present application provide a computer-readable storage medium that stores computer-executable instructions, where the computer-executable instructions are used to execute the smart cockpit control method described in any embodiment of the first aspect.
  • Figure 1 is a schematic diagram of a smart cockpit proposed by an embodiment of the present application.
  • Figure 2 is a functional block diagram of a smart cockpit proposed by another embodiment of the present application.
  • Figure 3 is an extended functional block diagram of a smart cockpit proposed by another embodiment of the present application.
  • Figure 4 is another extended functional block diagram of the smart cockpit proposed by another embodiment of the present application.
  • Figure 5 is a system architecture diagram of a smart cockpit proposed by another embodiment of the present application.
  • Figure 6 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application.
  • Figure 7 is a flow chart of a method in which the first millimeter wave unit obtains fusion information data in the smart cockpit control method proposed by another embodiment of the present application;
  • Figure 8 is a flowchart of a method corresponding to the second millimeter wave unit acquiring fusion information data in the smart cockpit control method proposed by another embodiment of the present application;
  • Figure 9 is a flow chart of a method for performing federated calculations by the first millimeter wave unit and the second millimeter wave unit in the smart cockpit control method proposed by another embodiment of the present application;
  • Figure 10 is a structural diagram of an intelligent cockpit controller proposed by another embodiment of this application.
  • Although functional modules are divided in the system schematic diagrams and a logical sequence is shown in the flow charts, in some cases the modules may be divided differently, or steps may be performed in an order different from that shown.
  • The terms "first", "second", etc. in the description, claims, and drawings are used to distinguish similar objects, and are not necessarily used to describe a specific order or sequence.
  • This application discloses an intelligent cockpit control method, a controller, an intelligent cockpit and a computer storage medium.
  • Fusion information data is obtained, where the fusion information data includes monitoring information obtained by the millimeter wave module scanning the interior of the intelligent cockpit.
  • Augmented reality data is obtained based on the fusion information data, and projection data is obtained after rendering the augmented reality data.
  • By using the millimeter wave module as the augmented reality generator, obtaining augmented reality data based on multi-sensor fusion data, and rendering that data, the vehicle-mounted augmented reality head-up display projector can provide users with augmented reality services more quickly and stably, display driving information and entertainment content according to user needs, and provide a more intelligent interactive experience.
  • FIG. 1 is a schematic diagram of an intelligent cockpit proposed by an embodiment of the present application.
  • the embodiment of the present application provides an intelligent cockpit.
  • A smart cockpit proposed by an embodiment of the present application includes: a head-up display projector 110, a first millimeter wave unit 106, a second millimeter wave unit 102, and a driver monitoring system camera 107. The head-up display projector 110 is an AR-3D-HUD (Augmented Reality 3D Head-Up Display) projector, the driver monitoring system camera 107 is a DMS (Driver Monitoring System) camera, and the augmented reality glasses 103 are AR (Augmented Reality) glasses.
  • The smart cockpit also includes the augmented reality glasses 103, a handle 104, and a steering wheel 105. The head-up display projector 110 includes a DLP (Digital Light Processing) optical engine 112, a non-adjustable curved lens 111, and an adjustable curved lens 113.
  • The first millimeter wave unit 106 is placed in the cockpit facing the driver and close to the head-up display projector 110, and is called the front-mounted millimeter wave module; it has a beamforming-controllable millimeter wave antenna array 101.
  • The first millimeter wave unit 106 and the second millimeter wave unit 102 monitor the driver's breathing and heartbeat by detecting the phase changes, within a specific range of values, that the target's tiny vibrations cause in frequency-modulated continuous wave (FMCW) signals. Using target parameters such as distance, speed, horizontal angle, and pitch angle, they can perform 4D imaging of the driver's dangerous actions, distraction, and fatigued driving to assist safe driving.
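The phase-based vital sign detection described above can be sketched numerically. The following is an illustrative simulation, not the patent's implementation; the 77 GHz carrier, frame rate, and breathing parameters are assumptions. A chest displacement d(t) shifts the echo phase by phi(t) = 4*pi*d(t)/lambda, so unwrapping the phase of the target's range bin and taking a spectrum recovers the breathing rate.

```python
import numpy as np

# Assumed radar parameters (illustrative, not from the patent)
LAM = 3e8 / 77e9          # wavelength of a hypothetical 77 GHz radar (m)
FRAME_RATE = 20.0         # radar frames per second
T = 30.0                  # observation window (s)

t = np.arange(0, T, 1 / FRAME_RATE)
d = 0.5e-3 * np.sin(2 * np.pi * 0.3 * t)    # 0.3 Hz breathing, 0.5 mm amplitude

phase = 4 * np.pi * d / LAM                  # echo phase of the target range bin
echo = np.exp(1j * phase)                    # idealized complex slow-time signal

# Unwrap the measured phase and find the dominant vibration frequency.
est_phase = np.unwrap(np.angle(echo))
spec = np.abs(np.fft.rfft(est_phase - est_phase.mean()))
freqs = np.fft.rfftfreq(len(t), 1 / FRAME_RATE)
breath_hz = freqs[np.argmax(spec)]
print(f"estimated breathing rate: {breath_hz:.2f} Hz")   # → 0.30 Hz
```

A real radar would first isolate the target's range bin from the fast-time FFT; here that step is skipped by construction.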
  • Because the first millimeter wave unit 106 and the second millimeter wave unit 102 have stronger computing power, they replace the drive board of the conventional head-up display projector 110 as the AR algorithm processing center.
  • The second millimeter wave unit 102 is placed inside the top of the cabin and is connected to an antenna array 101 outside the roof; it is called the top-mounted millimeter wave module.
  • The top-mounted millimeter wave module has beamforming-controllable millimeter wave antenna arrays 101 both inside the cabin and outside the roof. On the one hand, it monitors the breathing and heartbeat of passengers and performs 4D imaging monitoring of passenger behavior.
  • On the other hand, the second millimeter wave unit has streaming media content distribution capabilities: through the antenna array 101 outside the roof, it can perform V2X (vehicle-to-everything) communication with roadside units, the Internet of Vehicles, and the mobile Internet.
  • The front-mounted and top-mounted millimeter wave modules communicate with each other through millimeter waves, typically in frequency bands above 60 GHz.
  • The DMS camera is used for infrared observation of the driver and eye tracking of people in the smart cockpit; as an infrared camera, it also works well at night.
  • Figure 2 is a functional block diagram of a smart cockpit proposed by another embodiment of the present application.
  • In this embodiment, the infotainment host is a virtual functional module, and its model display function is provided by millimeter wave module 1 (i.e., the first millimeter wave unit 106) or millimeter wave module 2 (i.e., the second millimeter wave unit 102). Millimeter wave module 1 or 2 obtains map navigation information from the positioning module, camera information from the DMS camera, vehicle speed, acceleration, and steering angle information from the vehicle body sensors, ADAS information from the ADAS (Advanced Driving Assistance System) module, and 4D imaging plus respiration, heart rate, and other IMS (in-cabin monitoring system, which includes DMS and OMS (occupancy monitoring system)) information.
  • Millimeter wave module 1 or 2 serves as the AR generator, performs data fusion algorithm processing, and extracts effective AR information to feed into the display model.
  • The AR generator of the infotainment host is the first millimeter wave unit 106. The rendered AR display model information is sent via LVDS (Low-Voltage Differential Signaling) to the head-up display projector 110, where the drive board, image generation unit, and optical/structural components (the first two correspond to the digital light processing optical engine 112 in Figure 1; the latter corresponds to the non-adjustable curved lens 111 and the adjustable curved lens 113) project the image into the human eye.
  • Since millimeter wave module 1 itself has to process the 4D imaging point cloud, it has powerful computing capability and can be used for AR generation in the infotainment host.
  • The second millimeter wave unit 102 has both powerful computing and transmission capabilities: it can serve as an AR renderer, and it also has strong data caching capabilities to store data copies close to users, while updating content through V2X communication with external network spaces.
  • The millimeter wave module can sense while communicating: the first millimeter wave unit 106 and the second millimeter wave unit 102 can perform 4D imaging and heart rate and respiration detection while communicating with each other and with the head-up display projector 110, providing full coverage for IMS including DMS and OMS.
  • The second millimeter wave unit 102 connects to the external Internet of Vehicles via V2X communication, serving the XR (extended reality) display in the cockpit, audio-visual entertainment, real-time navigation equipment, and content from mobile terminals such as the user's mobile phone.
  • When safety permits, the 4D imaging of people or objects in the cockpit and the breathing and heartbeat monitoring data of the driver and passengers are shared with the roadside RSU (Road Side Unit), the Internet of Vehicles computing center, and the mobile Internet, so that associated individual or group end users can remotely understand the situation in the car in real time.
  • the large bandwidth and low latency of the millimeter wave module itself can improve AR rendering capabilities, thereby improving the image quality of the head-up display projector.
  • Figure 3 is an extended functional block diagram of an intelligent cockpit proposed by another embodiment of the present application.
  • An embodiment of the present application provides an intelligent cockpit.
  • Conventional car head-up display projectors are only used for AR display of real-time navigation map information and vehicle condition information while driving. However, head-up display projection can also be used for entertainment. This application therefore provides a smart cockpit that uses a millimeter wave module as an augmented reality generator, obtains augmented reality data based on multi-sensor fusion data, and renders the augmented reality data, so that the vehicle head-up display projector provides users with augmented reality services more quickly and stably and can offer entertainment and metaverse experiences during autonomous driving.
  • When the car head-up display projector 110 is used for a metaverse experience, the user's body perception and spatial positioning need to be enhanced. AR glasses and a handle 104 are therefore added at the driver's position: the user gains spatial positioning through the AR glasses and performs azimuth control through the handle 104.
  • The millimeter wave module uses 4D imaging to perceive the user's head degrees of freedom, gestures, expressions, and body movements, and the DMS camera adds refined eye tracking, ensuring accurate calculation of body perception and spatial positioning so that virtual objects can be placed in the real world and integrated with it.
  • FIG. 4 is another extended functional block diagram of an intelligent cockpit proposed by another embodiment of the present application. This embodiment of the present application provides an intelligent cockpit.
  • The smart cockpit in this application performs data interaction based on millimeter wave modules and associates with external networks, which can further improve the computing power available for processing sensor information and can expand the information sources processed by the millimeter wave module toward taste, smell, touch, and consciousness, realizing an interactive virtual-real combined augmented reality and metaverse experience environment.
  • The smart cockpit of the present application can also add windshield-based HUD hardware at the passenger position, so that the vehicle head-up display projector provides passengers with augmented reality services more quickly and stably, showing passengers driving information and entertainment content according to their needs.
  • The smart cockpit of the present application may further include an AR helmet, which provides users with an immersive experience through the cockpit's on-board millimeter wave communication and environmental sensing functions.
  • The fused information data may also include sensing information, and obtaining the augmented reality data according to the fused information data includes obtaining the augmented reality data according to the monitoring information and the sensing information, where the sensing information includes at least one of the following: user eye tracking data obtained by the driver monitoring camera, user keystroke data obtained through the handle, or spatial positioning data obtained by the augmented reality glasses. Together these realize an interactive virtual-real combined augmented reality and metaverse experience environment.
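The "at least one of the following" structure above can be sketched as a record with optional per-source fields. All names below are hypothetical illustrations, not identifiers from the patent:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class SensingInfo:
    gaze: Optional[Tuple[float, float]] = None         # DMS eye-tracking point
    keys: Optional[str] = None                         # handle keystroke data
    pose: Optional[Tuple[float, float, float]] = None  # AR-glasses positioning

def build_ar_input(monitoring: dict, sensing: SensingInfo) -> dict:
    """Combine cabin monitoring info with whichever sensing sources exist."""
    if not any([sensing.gaze, sensing.keys, sensing.pose]):
        raise ValueError("at least one sensing source is required")
    ar = dict(monitoring)
    if sensing.gaze is not None:
        ar["focus_point"] = sensing.gaze    # anchor overlays at the gaze
    if sensing.keys is not None:
        ar["user_command"] = sensing.keys   # azimuth control via the handle
    if sensing.pose is not None:
        ar["viewer_pose"] = sensing.pose    # spatial positioning
    return ar

frame = build_ar_input({"heart_rate": 72}, SensingInfo(gaze=(0.4, 0.6)))
print(frame)   # → {'heart_rate': 72, 'focus_point': (0.4, 0.6)}
```

The optional-field design mirrors the claim language: any single source is sufficient, and additional sources refine the AR input.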
  • The vehicle-mounted AR-HUD display generates a mirror of the real world based on digital twin technology. By adding blockchain technology at the edge computing center of the millimeter wave module, AR-HUD fusion processing of millimeter wave communication and perception information in the cockpit is achieved. The HUD can thus construct an environment in which the virtual world and the real world are closely integrated economically, socially, and in terms of identity, making the car an important place for users to experience the metaverse.
  • The first millimeter wave unit and the second millimeter wave unit of the above millimeter wave module may be two independent physical entities connected through millimeter wave communication, or they may be two virtual functional units within a single common physical entity.
  • In the two-physical-entity embodiment they are placed at the driver's seat and on the roof respectively; in the common-physical-entity embodiment the module is placed at the car's interior rearview mirror.
  • When the common physical entity is located at the interior rearview mirror, this single millimeter wave unit provides an AR-HUD experience for the driver's position by monitoring the gestures, movements, and expressions of the user there; using only one millimeter wave physical entity module saves smart cockpit cost and space.
  • The millimeter wave module must continuously and frequently position and dynamically track the monitored object, and as AR-HUD is gradually upgraded and iterated, the computing power requirements on a single millimeter wave module become very high.
  • Therefore, the smart cockpit of the present application includes two physical millimeter wave units, and the division of labor between them is more conducive to ensuring overall performance: the first millimeter wave unit 106 also serves as the algorithm processing center of the AR generator, while the second millimeter wave unit 102 functions as an edge computing center and also serves as an edge CDN (content delivery network), allowing users to quickly obtain the required content or game data nearby, so that the vehicle head-up display projector can provide users with augmented reality services more quickly and stably, displaying driving information and entertainment content according to their needs.
  • FIG. 5 is a system architecture diagram of a smart cockpit proposed by another embodiment of the present application. This embodiment of the present application provides a smart cockpit.
  • In Figure 5, the millimeter wave modules can coordinate with, substitute for, or compute collaboratively with each other, so a specific function does not refer to one particular millimeter wave module but is uniformly expressed as millimeter wave module Xn.
  • The first millimeter wave unit 106 serves as an AR generator for fusing computing data, AR rendering data, V2X communication data, CDN communication forwarding data, DMS sensing data, OMS sensing data, and edge computing center data to obtain multi-sensor fused data, which allows the second millimeter wave unit 102 to perform subsequent work based on it; each kind of data above interacts through the control entrance of the control bus.
  • In addition to system communication control, system perception control, and system calculation control, the control bus in Figure 5 also includes central control screen user UI control and user AR glasses and handle 104 control.
  • The millimeter wave module of this application is effectively an extension of millimeter wave radar. Having one millimeter wave module also serve as the AR generator and another also serve as the edge computing center and CDN can improve the user's in-vehicle AR-HUD experience by enriching the resources, capabilities, and applications of integrated communication and perception in the smart cockpit.
  • FIG. 6 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application.
  • the embodiment of the present application provides a smart cockpit control method, which is applied to the smart cockpit.
  • the smart cockpit includes a millimeter wave module, and the millimeter wave module is connected to a head-up display screen projector.
  • the method includes but is not limited to the following steps:
  • Step S610: obtain fusion information data and communication data, where the fusion information data includes monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes data exchanged between the millimeter wave module and the inside and outside of the smart cockpit;
  • Step S620: obtain augmented reality data based on the fusion information data and the communication data;
  • Step S630: render the augmented reality data to obtain projection data;
  • Step S640: send the projection data to the projection device for projection display.
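The four steps above can be sketched as a single control cycle. The classes and the trivial renderer below are illustrative stand-ins, not components from the patent:

```python
class StubMmwave:
    """Stand-in for the millimeter wave module (illustrative only)."""
    def scan_cabin(self):                  # S610: in-cabin monitoring info
        return {"heart_rate": 70, "breathing_hz": 0.3}
    def exchange_data(self):               # S610: V2X / CDN / edge data
        return {"nav": "turn left in 200 m"}

class StubProjector:
    """Stand-in for the head-up display projector."""
    def __init__(self):
        self.shown = None
    def display(self, frame):              # S640: projection display
        self.shown = frame

def render(ar_data):                       # S630: rendering (trivial here)
    return f"frame({len(ar_data['overlays'])} overlays)"

def smart_cockpit_cycle(mmwave, projector):
    monitoring = mmwave.scan_cabin()       # S610: fusion information data
    comms = mmwave.exchange_data()         # S610: communication data
    ar_data = {"overlays": monitoring, "content": comms}   # S620
    projection = render(ar_data)           # S630
    projector.display(projection)          # S640
    return projection

proj = StubProjector()
result = smart_cockpit_cycle(StubMmwave(), proj)
print(result)   # → frame(2 overlays)
```

The point of the sketch is the data flow: sensing and communication enter together at S610 and are only merged into AR data at S620, which matches the claim ordering.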
  • The communication data includes at least one of the following: Internet of Vehicles data, content distribution network data, or associated edge computing data acquired by the millimeter wave module inside and outside the smart cockpit. The Internet of Vehicles data is vehicle dynamic information, such as driving road conditions and driving speed, obtained by the vehicle-mounted device from external information network platforms through wireless communication technology; the millimeter wave module implements corresponding data display and intelligent driving functions based on it.
  • The content distribution network data is obtained from roadside units after the millimeter wave module communicates with the outside world through the antenna array outside the roof; based on it, the millimeter wave module can update local rendering resources in real time to better render the augmented reality data.
  • The edge computing data is communication data obtained by the millimeter wave module from the edge computing center inside the car. During rendering, the edge computing center can take over part of the computing work and send the results back as edge computing data, thereby improving the rendering efficiency of the millimeter wave module.
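The sharing of rendering work with the edge computing center might be sketched as follows. The cost-threshold split policy and all function names are invented for illustration; the patent does not specify how work is partitioned:

```python
def render_tile(tile_id):                   # local renderer on the mmWave module
    return f"local:{tile_id}"

def edge_render(tile_ids):                  # stands in for the edge computing center
    return [f"edge:{t}" for t in tile_ids]

def render_frame(tiles, heavy_threshold=100):
    """Render cheap tiles locally; offload expensive ones to the edge center."""
    local = [t for t in tiles if t["cost"] < heavy_threshold]
    heavy = [t for t in tiles if t["cost"] >= heavy_threshold]
    results = [render_tile(t["id"]) for t in local]
    results += edge_render([t["id"] for t in heavy])    # offloaded work returns
    return results

frame = render_frame([{"id": "hud", "cost": 10}, {"id": "scene", "cost": 500}])
print(frame)   # → ['local:hud', 'edge:scene']
```

A real system would overlap the local and offloaded work asynchronously; the synchronous version above only shows the split-and-merge idea.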
  • the projection device that is, the head-up display screen projector, obtains fusion information data and communication data through the millimeter wave module.
  • the fusion information data includes monitoring information obtained by scanning the interior of the smart cockpit by the millimeter wave module.
  • the communication data includes the communication data that the millimeter wave module exchanges with the inside and outside of the smart cockpit.
  • augmented reality data is obtained based on the fusion information data and the communication data, rendering demand information is generated based on the augmented reality data, the augmented reality data is rendered according to the rendering demand information to obtain the projection data, and the projection data is sent to the head-up display projector for projection display. With the millimeter wave module acting as the augmented reality generator, obtaining augmented reality data from multi-sensor fusion data and communication data and rendering that data, the vehicle head-up display projector can provide users with augmented reality and metaverse services more quickly and stably, displaying driving information and entertainment content according to user needs, so that users can quickly obtain and display the information or game data they need.
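The control flow just described (fusion data plus communication data yields augmented reality data, which yields rendering requirements, which yield projection data) can be sketched as a minimal pipeline. All function, class, and field names below are illustrative assumptions, not part of the disclosed system:

```python
# Minimal sketch of the disclosed control flow; every name here is invented.

def build_ar_data(fusion_info, comm_data):
    # Combine in-cabin monitoring with V2X / CDN / edge data (step S620)
    return {"monitoring": fusion_info, "external": comm_data}

def rendering_requirements(ar_data):
    # Decide what must be rendered for this frame (first half of S630)
    needs = ["hud_speed", "hud_navigation"]
    if ar_data["external"].get("media"):
        needs.append("streaming_overlay")
    return needs

def render(ar_data, requirements):
    # Produce projection data for the HUD projector (second half of S630)
    return {req: f"rendered:{req}" for req in requirements}

fusion_info = {"driver_heart_rate": 72, "driver_pose": "attentive"}
comm_data = {"v2x": {"hazard_ahead": False}, "media": ["trailer.mp4"]}
ar = build_ar_data(fusion_info, comm_data)
projection = render(ar, rendering_requirements(ar))
```

The sketch only shows the data dependencies between steps; the actual module performs these stages on sensed radar data rather than dictionaries.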
  • before obtaining the fusion information data, the method further includes: scanning the cabin where the millimeter wave module is located to obtain a positioning calculation value, obtaining target position information based on the positioning calculation value, and obtaining monitoring information based on the target position information.
  • the millimeter wave module is provided with a millimeter wave antenna array.
  • the method further includes: obtaining the phase interference difference value of the millimeter wave antenna array, obtaining the target position information according to the phase interference difference value, and obtaining the monitoring information according to the target position information.
  • a dedicated phase interference array can be selected from the millimeter wave antenna array.
  • the phase interference difference of each element is used to determine the target position information of the driver or passenger; alternatively, the target position information of the driver or passenger can be determined by directly scanning the target with millimeter waves and performing the positioning calculation first. Millimeter wave positioning can reach centimeter-level accuracy; based on this level of precision, various applications in the smart cockpit that require user location information can be realized, so that augmented reality services can be provided to users more accurately through the vehicle head-up display projector, displaying driving information and entertainment content according to the user's location.
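As a hedged illustration of the phase-interferometry positioning described above, the angle of arrival can be recovered from the phase difference between adjacent array elements via the standard interferometer relation Δφ = 2πd·sin(θ)/λ, with element spacing d and wavelength λ. The 60 GHz carrier and half-wavelength spacing below are assumptions for the example, not values taken from the disclosure:

```python
import math

def angle_of_arrival_deg(delta_phi_rad, wavelength_m, spacing_m):
    # Invert delta_phi = 2*pi*spacing*sin(theta)/wavelength
    s = delta_phi_rad * wavelength_m / (2 * math.pi * spacing_m)
    s = max(-1.0, min(1.0, s))  # clamp against measurement noise
    return math.degrees(math.asin(s))

wavelength = 3e8 / 60e9          # ~5 mm at an assumed 60 GHz carrier
spacing = wavelength / 2         # typical half-wavelength element spacing
theta = angle_of_arrival_deg(math.pi / 2, wavelength, spacing)  # → 30.0 degrees
```

With half-wavelength spacing, a measured phase difference of π/2 corresponds to sin(θ) = 0.5, i.e. a target 30 degrees off boresight; centimeter-level positioning as claimed additionally requires combining such angle estimates with range measurements.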
  • after positioning the target and obtaining the target position information, the millimeter wave module performs phase control on the millimeter wave antenna array to align the beam based on the target position information, and repeatedly performs beam scanning and pairing during the alignment process, ensuring that the sensing data is updated in real time when the object monitored by the millimeter wave module changes dynamically.
  • the millimeter wave module performs 4D imaging and breathing and heart rate monitoring respectively for the driver and the passenger.
  • the 4D imaging estimates distance, speed, horizontal angle, and pitch angle by imaging the millimeter wave point cloud reflected by the monitored target.
  • CSI (Channel State Information)
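For the distance and speed dimensions of the 4D imaging above, an FMCW radar recovers range from the beat frequency of the chirp and radial speed from the Doppler shift. The sweep parameters and carrier frequency below are illustrative assumptions, not figures from the disclosure:

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(beat_freq_hz, sweep_bandwidth_hz, sweep_time_s):
    # R = c * f_beat / (2 * slope), where slope = B / T of the chirp
    slope = sweep_bandwidth_hz / sweep_time_s
    return C * beat_freq_hz / (2 * slope)

def radial_speed_mps(doppler_hz, carrier_hz):
    # v = lambda * f_doppler / 2
    return (C / carrier_hz) * doppler_hz / 2

r = fmcw_range_m(1.0e6, 4.0e9, 40e-6)   # ~1.5 m with an assumed 4 GHz / 40 us chirp
v = radial_speed_mps(400.0, 60e9)       # ~1 m/s at an assumed 60 GHz carrier
```

The wider the sweep bandwidth, the finer the range resolution, which is why millimeter wave bands suit in-cabin sensing at short distances.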
  • the millimeter wave module can perform 4D imaging and breathing and heart rate monitoring for the driver and passenger in two ways. The first is time-sharing sensing: the module concentrates on the monitored target for a long period, switching to breathing and heart rate monitoring at regular intervals after 4D imaging of the target; this regular monitoring during time-sharing achieves 4D imaging plus breathing and heart rate monitoring for the driver and passenger while saving computing resources.
  • the second is simultaneous sensing: because simultaneous sensing affects the performance of the millimeter wave module, the module must use advanced MIMO waveform design so that multiple antenna channels work at the same time, use software algorithms to achieve super-resolution below 1 degree, and run short-range, medium-range, and long-range multi-mode sensing concurrently, so that 4D imaging and breathing and heart rate monitoring can be performed quickly and reliably for the driver and passenger. Behavior classification is performed by artificial intelligence models to estimate parameters such as the user's head degrees of freedom, gestures, body posture, and expressions perceived in 4D imaging, enabling application-oriented pattern recognition.
  • the smart cockpit includes a DMS camera, and augmented reality data is obtained based on the fusion information data.
  • DMS monitoring by the DMS camera and the millimeter wave module requires joint calibration, that is, the internal and external parameters of the camera must be registered with the external parameters of the radar, which then form the DMS combined data.
  • data from the vehicle body, ADAS, and positioning modules are added, multi-sensor information fusion calculations are performed on the data layer, function layer, and feature layer, and effective AR information is finally extracted based on the fused information; extracting effective AR information from the fused information includes multi-sensor information fusion processing based on the system's preset multi-sensor fusion algorithm.
  • Multi-sensor fusion algorithms include common multi-sensor fusion algorithms such as Kalman filtering and artificial neural networks.
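As a minimal illustration of the Kalman-style fusion mentioned above, a scalar measurement update fuses a radar estimate with a camera measurement, weighting each by its variance. The distances and variances below are hypothetical:

```python
def kalman_update(x, p, z, r):
    # x, p: prior estimate and its variance; z, r: measurement and its variance
    k = p / (p + r)                 # Kalman gain: trust split between prior and measurement
    return x + k * (z - x), (1 - k) * p

# Radar says the target is 2.00 m away (variance 0.04 m^2);
# the DMS camera measures 2.20 m with lower variance (0.01 m^2).
x, p = kalman_update(2.00, 0.04, 2.20, 0.01)  # fused estimate leans toward the camera
```

Because the camera's variance is smaller here, the fused estimate (2.16 m, variance 0.008) sits closer to the camera measurement; a full tracker would repeat this update per sensor per frame with a motion model in between.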
  • the rendering requirements include information display model rendering requirements, streaming media playback content requirements, or game rendering library requirements.
  • obtaining the projection data by rendering the augmented reality data according to the rendering requirement information includes: according to the rendering requirements, the millimeter wave module performs real-time rendering of the display model corresponding to the augmented reality data, loads streaming media content, or renders the display model corresponding to the augmented reality data through the game rendering library.
  • the augmented reality data is the effective AR information.
  • the fused information data and communication data are obtained through the millimeter wave module.
  • the fused information data includes the monitoring information obtained by the millimeter wave module scanning the inside of the smart cockpit, and the communication data includes the Internet of Vehicles data, content delivery network data, and associated edge computing data obtained by the millimeter wave module inside and outside the smart cockpit.
  • augmented reality data is obtained based on the fusion information data and communication data, which greatly expands the sources of augmented reality data: the data comes not only from inside the car cockpit (the communication and perception fusion data between the two local millimeter wave units, plus local edge computing 3D models, rendering data, content distribution data, and so on), but also from outside the cockpit, extending to the Internet of Vehicles and the mobile Internet (remote rendering data and remote content distribution data from the mobile Internet, and widely distributed augmented reality models from the Internet of Vehicles, and so on).
  • the rendering requirement information is generated based on the augmented reality data, and the augmented reality data is rendered according to the rendering requirement information to obtain the projection data.
  • the head-up display projector driver board can be dedicated to driving the optical part of the projector, overcoming the limitation that the weak processing capability of current vehicle-mounted augmented reality head-up display equipment leaves the head-up display projector with a single function, able to provide users only with simple traffic information; the augmented reality head-up display projector can thus further provide users with the driving information and entertainment content they need.
  • Figure 7 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application.
  • Embodiments of the present application provide a smart cockpit control method.
  • the millimeter wave module includes a first millimeter wave unit and a second millimeter wave unit.
  • the first millimeter wave unit is connected to the second millimeter wave unit, and the first millimeter wave unit is connected to the projection device. The method includes:
  • Step S710 control the first millimeter wave unit to obtain fusion information data, where the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit;
  • Step S720 control the first millimeter wave unit to obtain augmented reality data based on the fusion information data and the communication data obtained by the second millimeter wave unit;
  • Step S730 control the first millimeter wave unit to generate rendering requirement information based on the augmented reality data
  • Step S740 control the first millimeter wave unit to send rendering requirement information to the second millimeter wave unit;
  • Step S750 Control the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and send the projection data to the projection device for projection display.
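Steps S710 through S750 can be sketched as a message flow between the two units: the first (front) unit fuses both units' monitoring data and generates the rendering requirements, while the second (top) unit supplies communication data and performs the rendering. The class, method, and field names are invented for illustration; the disclosure does not specify an API:

```python
# Sketch of the S710-S750 message flow between the two units; all names invented.

class TopUnit:
    """Second millimeter wave unit: senses passengers, holds comm data, renders."""
    def sense(self):                       # second monitoring information (S710)
        return {"breathing": 17, "heart_rate": 75}
    def comm_data(self):                   # V2X / CDN / edge data (stubbed)
        return {"v2x": [], "cdn": []}
    def render(self, req):                 # render per the requirement info (S750)
        return {"projection": req["model"], "source": "top_unit"}

class FrontUnit:
    """First millimeter wave unit: senses the driver and acts as AR generator."""
    def __init__(self, top_unit):
        self.top = top_unit
    def sense(self):                       # first monitoring information (S710)
        return {"breathing": 15, "heart_rate": 70}
    def run(self):
        fused = {"driver": self.sense(), "passenger": self.top.sense()}  # S710
        ar = {"fused": fused, "comm": self.top.comm_data()}              # S720
        req = {"model": "hud_overlay", "ar": ar}                         # S730
        return self.top.render(req)                                      # S740/S750

projection = FrontUnit(TopUnit()).run()
```

The split mirrors the text: sensing and AR generation stay on the front unit near the projector, while the heavier rendering is delegated to the roof-connected unit.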
  • two millimeter wave module devices are provided inside the car cabin, namely a first millimeter wave unit and a second millimeter wave unit.
  • the first millimeter wave unit is placed in the cockpit facing the driver and close to the AR HUD projector, and is called the front millimeter wave module;
  • the second millimeter wave unit is placed inside the top of the car cabin and is connected to the antenna array outside the roof, which is called the top millimeter wave module.
  • the first millimeter wave unit also serves as the AR generator and algorithm processing center, and the second millimeter wave unit also serves as the edge computing center and content distribution network. The two cooperate with each other: on the one hand, they form an integral part of the car's in-cabin communication and monitoring systems for communication and perception; on the other hand, they process the car's multi-sensor fusion information to generate effective AR information and perform AR rendering, as well as multimedia content and game playback and rendering.
  • the first millimeter wave unit obtains augmented reality data based on the fusion information data and generates rendering demand information based on the augmented reality data.
  • the first millimeter-wave unit sends rendering demand information to the second millimeter-wave unit.
  • the rendering demand information carries augmented reality data.
  • the second millimeter wave unit renders the augmented reality data based on the rendering demand information to obtain the projection data, and sends the projection data to the head-up display projector for projection display.
  • the first monitoring information is the driver's breathing, heartbeat, and movement information obtained by the first millimeter wave unit
  • the second monitoring information is the passenger's breathing, heartbeat, and movement information obtained by the second millimeter wave unit.
  • the physical signs and movement information of the driver and passengers are displayed on the head-up display projector, providing a safety warning function for the driver and passengers.
  • the first millimeter wave unit has a millimeter wave antenna array that can be controlled by beamforming. On the one hand, it is used to monitor the driver's breathing and heartbeat (by detecting the phase change, within a specific range of values, of the frequency modulated continuous wave signal caused by tiny vibrations of the target) and to perform 4D imaging of the driver's dangerous actions, distraction, and fatigued driving (by judging the target's distance, speed, horizontal angle, and pitch angle). On the other hand, the first millimeter wave unit has stronger computing power and replaces the traditional HUD driver board as the AR algorithm processing center.
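The breathing monitoring described above recovers a periodic chest displacement from the phase of the FMCW return; a naive frequency scan over the plausible breathing band then yields the rate. The sampling rate and the synthetic phase signal below are assumptions made for the sketch, not measured data:

```python
import math

def dominant_rate_bpm(phase_samples, fs_hz):
    # Scan 0.1-3.0 Hz (6-180 cycles/min) for the strongest spectral component
    best_f, best_mag = 0.0, -1.0
    for step in range(59):                   # 0.10, 0.15, ..., 3.00 Hz
        f = 0.10 + 0.05 * step
        re = sum(p * math.cos(2 * math.pi * f * i / fs_hz)
                 for i, p in enumerate(phase_samples))
        im = sum(p * math.sin(2 * math.pi * f * i / fs_hz)
                 for i, p in enumerate(phase_samples))
        mag = re * re + im * im
        if mag > best_mag:
            best_f, best_mag = f, mag
    return best_f * 60.0

fs = 20.0                                    # assumed phase sampling rate, Hz
breathing = [math.sin(2 * math.pi * 0.25 * i / fs) for i in range(200)]
rate = dominant_rate_bpm(breathing, fs)      # 0.25 Hz chest motion → 15 breaths/min
```

A production system would use an FFT and separate the much weaker, faster heartbeat component (roughly 0.8 to 2.5 Hz) from the breathing fundamental; the scan above only shows the principle.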
  • the second millimeter wave unit is connected to the roadside unit and/or the Internet of Vehicles cloud computing center; the second millimeter wave unit rendering the augmented reality data according to the rendering requirement information to obtain projection data includes: the second millimeter wave unit obtains updated data from the roadside unit and/or the Internet of Vehicles cloud computing center, and renders the augmented reality data based on the updated data and the demand information to obtain the projection data.
  • the second millimeter wave unit has millimeter wave antenna arrays, controllable by beamforming, both inside the car cabin and on the roof; on the one hand, they are used to monitor the passengers' breathing and heartbeat and to perform 4D imaging monitoring of their behavior.
  • on the other hand, as the edge computing center and content distribution network, the second millimeter wave unit provides powerful AR rendering computing power and streaming content distribution capability, and can communicate with the roadside unit, the Internet of Vehicles, and the mobile Internet through the antenna array on the car roof; it then renders the augmented reality data based on the updated data and demand information to obtain the projection data.
  • Figure 8 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application.
  • Embodiments of the present application provide a smart cockpit control method, wherein the millimeter wave module includes a first millimeter wave unit and a second millimeter wave unit, the first millimeter wave unit is connected to the second millimeter wave unit, and the first millimeter wave unit is connected to the projection device. The method includes:
  • Step S810 control the second millimeter wave unit to obtain fusion information data and communication data.
  • the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit;
  • Step S820 control the second millimeter wave unit to obtain augmented reality data based on the fused information data and communication data;
  • Step S830 control the second millimeter wave unit to generate rendering requirement information based on the augmented reality data
  • Step S840 control the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and send the projection data to the projection device for projection display.
  • the communication data includes Internet of Vehicles data, content distribution network data, and associated edge computing data acquired by the second millimeter wave unit.
  • the AR information processing becomes more complicated and requires higher computing power.
  • the first millimeter wave unit is used to accurately calculate body perception elements such as the user's head degrees of freedom, eyeballs, gestures, posture, and expressions and load them into the AR generator.
  • the spatial positioning obtained by the user through AR glasses must also be accurately calculated and loaded into the AR generator.
  • since the first millimeter wave unit also serves as the AR generator, its computing load and power consumption are very large.
  • the second millimeter wave unit obtains the fusion information data, obtains augmented reality data based on the fusion information data, generates rendering demand information based on the augmented reality data, renders the augmented reality data based on the rendering demand information to obtain projection data, and sends the projection data to the head-up display projector for projection display; the vehicle head-up display projector can then provide users with augmented reality services more quickly and stably, displaying driving information and entertainment content according to user needs.
  • Figure 9 is a method flow chart of an intelligent cockpit control method proposed by another embodiment of the present application.
  • Embodiments of the present application provide a smart cockpit control method, wherein the millimeter wave module includes a first millimeter wave unit and a second millimeter wave unit, the first millimeter wave unit is connected to the second millimeter wave unit, and the first millimeter wave unit is connected to the projection device. The method includes:
  • Step S910 control the first millimeter wave unit and the second millimeter wave unit to perform rendering task allocation processing according to their respective computing power consumption data, and obtain the allocation result;
  • Step S920 Control the first millimeter wave unit and the second millimeter wave unit to process the fusion information data and communication data according to the allocation results to obtain augmented reality data.
  • the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit;
  • Step S930 control the second millimeter wave unit to generate rendering requirement information based on the augmented reality data
  • Step S940 control the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and send the projection data to the projection device for projection display.
  • the computing power consumption data represents the processing capabilities of the first millimeter wave unit and the second millimeter wave unit, and the first millimeter wave unit and the second millimeter wave unit perform task allocation processing according to their respective computing power consumption data.
  • the first millimeter wave unit and the second millimeter wave unit collaboratively process the fusion information data and communication data according to the allocation result, that is, federated computing processing, to obtain augmented reality data.
  • the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit. Federated computing allocates computing tasks to the first and second millimeter wave units so that, according to the allocation results, the two units can balance loads, share data files and memory, perform their assigned computing tasks, and collaboratively process the subsequent rendering tasks.
  • the communication data includes Internet of Vehicles data, content distribution network data and associated edge computing data acquired by the first millimeter wave unit and/or the second millimeter wave unit.
  • the division of labor between the units makes the AR information processing more complex and demands higher computing power.
  • the first millimeter wave unit is used to accurately calculate body perception elements such as the user's head degrees of freedom, eyeballs, gestures, body posture, and expressions and load them into the AR generator, while the spatial positioning obtained by the user through AR glasses must also be accurately calculated and loaded into the AR generator.
  • since the first millimeter wave unit also serves as the AR generator, its computing load and power consumption are very large.
  • the second millimeter wave unit participates in part or all of the multi-sensor sensing calculations, that is, federated computing is introduced: the first millimeter wave unit and the second millimeter wave unit perform federated computing to obtain the federated computing result.
  • the first millimeter wave unit and the second millimeter wave unit process the fusion information data according to the federated computing result to obtain augmented reality data.
  • the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit.
  • the first millimeter wave unit and the second millimeter wave unit can balance computing power through mutual communication, ensuring that the user's body perception and spatial positioning are well integrated into the interactive virtual-real XR space, achieving integration of the virtual and real worlds.
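The computing-power-aware task allocation of steps S910 and S920 can be sketched as a greedy balancer that assigns each rendering or sensing task to the unit whose relative load stays lowest. The capacities and task costs below are hypothetical numbers, not values from the disclosure:

```python
# Greedy, capacity-aware task split between the two units; all numbers hypothetical.

def allocate_render_tasks(tasks, capacity):
    """tasks: {name: cost}; capacity: {unit: relative compute power}."""
    load = {u: 0.0 for u in capacity}
    plan = {u: [] for u in capacity}
    # Place heavy tasks first; each goes to the unit whose relative load stays lowest
    for name, cost in sorted(tasks.items(), key=lambda t: -t[1]):
        unit = min(load, key=lambda u: (load[u] + cost) / capacity[u])
        plan[unit].append(name)
        load[unit] += cost
    return plan

plan = allocate_render_tasks(
    {"4d_imaging": 2.0, "vital_signs": 0.5, "ar_render": 3.0, "fusion": 0.5},
    {"front_unit": 1.0, "top_unit": 3.0},   # top unit assumed 3x more capable
)
```

With these numbers the heavier imaging and rendering tasks land on the more capable top unit while the lighter vital-sign and fusion work stays on the front unit; a real federated scheme would also exchange intermediate results and re-balance as the computing power consumption data changes.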
  • an embodiment of the present application also provides a smart cockpit controller 1000, which includes: a memory 1020, a processor 1010, and a computer program stored in the memory and executable on the processor.
  • when the processor 1010 executes the computer program, it implements the smart cockpit control method of any one of the above embodiments, for example performing method steps S610 to S650 in Figure 6, method steps S710 to S750 in Figure 7, method steps S810 to S840 in Figure 8, and method steps S910 to S940 in Figure 9.
  • the embodiment of the present application also provides a smart cockpit, which includes the above-mentioned controller. Since the smart cockpit of the embodiment of the present application has the controller of the above embodiment, and that controller can execute the smart cockpit control method of the above embodiments, the implementation and technical effects of the smart cockpit in this embodiment can be found in the implementation and technical effects of the smart cockpit control method in any of the above embodiments.
  • an embodiment of the present application also provides a computer-readable storage medium that stores computer-executable instructions, and the computer-executable instructions are executed by one or more control processors, for example, execute The method steps S610 to S650 in FIG. 6 , the method steps S710 to S750 in FIG. 7 , the method steps S810 to S840 in FIG. 8 , and the method steps S910 to S940 in FIG. 9 are described above.
  • fusion information data and communication data are obtained.
  • the fusion information data includes monitoring information obtained by the millimeter wave module scanning the interior of the intelligent cockpit.
  • the communication data includes the communication data that the millimeter wave module exchanges with the inside and outside of the smart cockpit; augmented reality data is obtained based on the fusion information data and communication data.
  • the augmented reality data is rendered to obtain projection data, and the projection data is sent to the projection device for projection display.
  • with the millimeter wave module acting as the augmented reality generator, it obtains augmented reality data based on multi-sensor fusion data and communication data, renders the augmented reality data, and provides users with augmented reality and metaverse services more quickly and stably through the vehicle head-up display projector.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disk (DVD) or other optical disk storage, magnetic cassettes, tapes, disk storage or other magnetic storage devices, or may Any other medium used to store the desired information and that can be accessed by a computer.
  • communication media typically embody computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.

Abstract

The present application discloses an intelligent carriage control method, a controller, an intelligent carriage, and a computer storage medium. The intelligent carriage control method comprises: acquiring fusion information data and communication data, the fusion information data comprising monitoring information obtained by a millimeter wave module scanning the inside of the intelligent carriage, and the communication data comprising communication data for interaction between the millimeter wave module and the inside and outside of the intelligent carriage (S610); obtaining augmented reality data according to the fusion information data and the communication data (S620); rendering the augmented reality data to obtain projection data (S630); and sending the projection data to a projection device for projection display (S640).

Description

Intelligent cockpit control method, controller, intelligent cockpit and storage medium
Cross-references to related applications
This application is filed based on the Chinese patent application with application number 202210659957.1 and a filing date of June 13, 2022, and claims priority to that Chinese patent application, the entire content of which is hereby incorporated into this application by reference.
Technical field
This application relates to the technical field of vehicle-mounted equipment control, and specifically to an intelligent cockpit control method, a controller, an intelligent cockpit, and a computer storage medium.
Background technique
In recent years the automobile industry has entered an era of intelligence and electrification, and the development of smart cockpits has become a general trend. Compared with a traditional car cockpit, a smart cockpit can use a vehicle-mounted head-up display projector to show the user information about the road conditions ahead on the car's windshield, assisting the user in driving.
However, in current smart cockpit technology the processing capability of vehicle-mounted head-up display equipment is weak, so the head-up display projector cannot handle more complex data when providing augmented reality services; it can only project simple traffic information, has a single function, and cannot further provide the driving information or entertainment content the user needs. Limited by the processing capability of the head-up display equipment, the projector also runs slowly when providing augmented reality services, degrading the user experience.
Contents of the invention
Embodiments of the present application provide an intelligent cockpit control method, a controller, an intelligent cockpit, and a computer storage medium.
In a first aspect, embodiments of the present application provide a smart cockpit control method applied to the smart cockpit. The smart cockpit includes a millimeter wave module connected to a projection device. The method includes: obtaining fusion information data and communication data, where the fusion information data includes monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes the communication data exchanged between the millimeter wave module and the inside and outside of the smart cockpit; obtaining augmented reality data according to the fusion information data and the communication data; rendering the augmented reality data to obtain projection data; and sending the projection data to the projection device for projection display.
In a second aspect, embodiments of the present application provide an intelligent cockpit controller, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the intelligent cockpit control method of any embodiment of the first aspect is implemented.
In a third aspect, embodiments of the present application provide a smart cockpit, including the controller described in the second aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing computer-executable instructions, the computer-executable instructions being used to execute the smart cockpit control method of any embodiment of the first aspect.
Description of the drawings
Figure 1 is a schematic diagram of a smart cockpit proposed by an embodiment of the present application;
Figure 2 is a functional block diagram of a smart cockpit proposed by another embodiment of the present application;
Figure 3 is an extended functional block diagram of a smart cockpit proposed by another embodiment of the present application;
Figure 4 is another extended functional block diagram of a smart cockpit proposed by another embodiment of the present application;
Figure 5 is a system architecture diagram of a smart cockpit proposed by another embodiment of the present application;
Figure 6 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application;
Figure 7 is a flow chart of the method in which the first millimeter wave unit obtains fusion information data, in a smart cockpit control method proposed by another embodiment of the present application;
Figure 8 is a flow chart of the method in which the second millimeter wave unit obtains fusion information data, in a smart cockpit control method proposed by another embodiment of the present application;
Figure 9 is a flow chart of the method in which the first millimeter wave unit and the second millimeter wave unit perform federated computing, in a smart cockpit control method proposed by another embodiment of the present application;
Figure 10 is a structural diagram of an intelligent cockpit controller proposed by another embodiment of the present application.
Reference signs: 101. Antenna array; 102. Second millimeter wave unit; 103. Augmented reality glasses; 104. Handle; 105. Steering wheel; 106. First millimeter wave unit; 107. Driver monitoring system camera; 110. Head-up display projector; 111. Non-adjustable curved lens; 112. Digital light processing optical engine; 113. Adjustable curved lens.
Detailed Description of the Embodiments
To make the purpose, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the embodiments described here are only intended to explain the present application and are not intended to limit it.
In some embodiments, although functional modules are divided in the system schematic diagrams and a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed with a module division different from that in the system, or in an order different from that in the flowcharts. The terms "first", "second", and so on in the description, the claims, and the above drawings are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence.
The present application discloses a smart cockpit control method, a controller, a smart cockpit, and a computer storage medium. With the smart cockpit control method proposed in this application, fused information data is obtained, where the fused information data includes monitoring information obtained by a millimeter wave module scanning the interior of the smart cockpit; augmented reality data is obtained from the fused information data, and projection data is obtained after rendering the augmented reality data. The millimeter wave module serves as the augmented reality generator: it obtains augmented reality data from multi-sensor fusion data and renders that data, so that augmented reality services can be provided to users more quickly and stably through the in-vehicle augmented reality head-up display projector, driving information and entertainment content can be shown to users according to their needs, and a more intelligent interactive experience is provided.
The embodiments of the present application are further described below with reference to the accompanying drawings.
Referring to Figure 1, Figure 1 is a schematic diagram of a smart cockpit according to an embodiment of the present application. This embodiment provides a smart cockpit.
In one implementation, as shown in Figure 1, the smart cockpit according to an embodiment of the present application includes: a head-up display projector 110, a first millimeter wave unit 106, a second millimeter wave unit 102, and a driver monitoring system camera 107.
In one implementation, the head-up display projector 110 is an AR-3D-HUD (Augmented Reality 3D Head Up Display) projector, the driver monitoring system camera 107 is a DMS (Driver Monitoring System) camera, and the augmented reality glasses 103 are AR (Augmented Reality) glasses.
In some embodiments, the smart cockpit according to an embodiment of the present application further includes augmented reality glasses 103, a handle 104, and a steering wheel 105. The head-up display projector 110 includes a DLP (Digital Light Processing) light engine 112, a non-adjustable curved lens 111, and an adjustable curved lens 113. The first millimeter wave unit 106 is placed in the cockpit facing the driver, near the head-up display projector 110, and is referred to as the front millimeter wave module; the first millimeter wave unit 106 has a millimeter wave antenna array 101 with beamforming control.
In some embodiments, the first millimeter wave unit 106 and the second millimeter wave unit 102 monitor the driver's breathing and heartbeat by detecting phase changes, within a specific range of values, of a frequency-modulated continuous-wave signal caused by the target's tiny vibrations, and use parameters such as target distance, speed, azimuth angle, and elevation angle to perform 4D imaging of dangerous actions, distraction, and fatigued driving states of the driver, assisting the driver in driving safely.
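The vital-sign monitoring described above can be sketched numerically: the unwrapped phase of the FMCW return contains the chest motion, and breathing and heartbeat appear as spectral peaks in separate frequency bands. A minimal illustration (the sampling rate, band limits, and synthetic chest-motion signal are assumptions for the demo, not values from the application):

```python
import numpy as np

def estimate_vital_rates(phase, fs):
    """Estimate breathing and heartbeat rates (Hz) from the unwrapped
    phase of the range bin containing the monitored person."""
    phase = np.unwrap(phase)
    phase = phase - phase.mean()              # remove the static offset
    spectrum = np.abs(np.fft.rfft(phase))
    freqs = np.fft.rfftfreq(len(phase), d=1.0 / fs)

    def peak_in(lo, hi):
        band = (freqs >= lo) & (freqs <= hi)
        return freqs[band][np.argmax(spectrum[band])]

    breathing_hz = peak_in(0.1, 0.5)          # ~6-30 breaths per minute
    heartbeat_hz = peak_in(0.8, 2.0)          # ~48-120 beats per minute
    return breathing_hz, heartbeat_hz

# Synthetic demo: 0.25 Hz breathing plus a weaker 1.2 Hz heartbeat motion
fs = 20.0
t = np.arange(0, 60, 1.0 / fs)
phase = 1.0 * np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)
br, hr = estimate_vital_rates(phase, fs)
print(round(br * 60), round(hr * 60))   # 15 72  (breaths/min, beats/min)
```

Real radar phase would first need range-bin selection and clutter removal; this sketch only shows the band-separated spectral-peak step.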
In some embodiments, because the first millimeter wave unit 106 and the second millimeter wave unit 102 have stronger computing power, they replace the head-up display projector driver board of a conventional head-up display projector 110 as the AR algorithm processing center. The second millimeter wave unit 102 is placed on the inside of the cabin roof and is connected to an antenna array 101 outside the roof; it is referred to as the top-mounted millimeter wave module. The top-mounted millimeter wave module has beamforming-controlled millimeter wave antenna arrays 101 both inside the cabin and outside the roof. On the one hand, these are used to monitor passengers' breathing and heartbeat and to perform 4D imaging monitoring of passenger behavior; on the other hand, the module serves as the cabin edge computing center and content delivery network (CDN), providing powerful AR rendering computing power for the AR data in the smart cockpit. At the same time, the second millimeter wave unit has streaming media content distribution capability and can perform V2X (vehicle-to-everything) communication with roadside units (RSU), the Internet of Vehicles, and the mobile Internet through the antenna array 101 outside the roof.
The front-mounted millimeter wave module and the top-mounted millimeter wave module communicate with each other using millimeter waves, typically in frequency bands above 60 GHz. Thanks to the large bandwidth and low latency of millimeter wave communication, together with its point cloud imaging capability and beam phase difference positioning capability, 4D imaging perception of the driver and passengers as well as breathing and heart rate recognition can be performed simultaneously during communication between the two modules.
In some instances, the DMS camera is used for infrared observation of the driver and for eye tracking of occupants of the smart cockpit; being an infrared camera, the DMS camera also performs better at night.
Referring to Figures 1 and 2, Figure 2 is a functional block diagram of a smart cockpit according to another embodiment of the present application. This embodiment provides a smart cockpit.
In one implementation, as shown in Figure 2, in the functional block diagram of the smart cockpit of another embodiment, the infotainment host is a virtual functional module whose model display function is performed by millimeter wave module 1 (i.e., the first millimeter wave unit 106) or millimeter wave module 2 (i.e., the second millimeter wave unit 102). Millimeter wave module 1 or millimeter wave module 2 obtains map navigation information from the positioning module, DMS camera information from the DMS camera, vehicle speed, acceleration, and steering angle information from the body sensors, ADAS information from the ADAS (Advanced Driving Assistance System) module, and 4D imaging, breathing, heart rate, and other IMS (in-cabin monitoring system, comprising DMS and OMS (occupancy monitoring system)) information from millimeter wave module 1 and millimeter wave module 2. At the same time, millimeter wave module 1 or millimeter wave module 2, acting as the AR generator, performs data fusion algorithm processing and extracts the valid AR information to feed into the display model.
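The multi-source inputs listed above can be pictured as one fused record handed to the AR generator, from which only the AR-relevant fields are extracted for the display model. A minimal sketch (all field names and thresholds are illustrative assumptions, not from the application):

```python
from dataclasses import dataclass

@dataclass
class FusedInfoData:
    """Illustrative container for the inputs the AR generator fuses."""
    nav: dict          # map navigation info from the positioning module
    dms_frame: bytes   # DMS camera information
    body: dict         # vehicle speed, acceleration, steering angle
    adas: dict         # ADAS information
    ims: dict          # 4D imaging + breathing/heart rate (DMS and OMS)

def extract_ar_info(fused: FusedInfoData) -> dict:
    """Toy fusion step: keep only what the display model needs."""
    ar = {"route": fused.nav.get("next_turn"),
          "speed": fused.body.get("speed"),
          "alerts": []}
    if fused.ims.get("heart_rate", 0) > 120:      # assumed alert threshold
        ar["alerts"].append("driver heart rate high")
    if fused.adas.get("collision_warning"):
        ar["alerts"].append("collision warning")
    return ar

sample = FusedInfoData(
    nav={"next_turn": "left 200m"},
    dms_frame=b"",
    body={"speed": 62.0},
    adas={"collision_warning": False},
    ims={"heart_rate": 75},
)
print(extract_ar_info(sample))
```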
In some embodiments, the AR generator of the infotainment host is the first millimeter wave unit 106. The rendered AR display model information is sent via LVDS (Low-Voltage Differential Signaling) to the head-up display projector 110, where it passes through the head-up display projector driver board, the image generation unit, and the optical/structural components (the first two corresponding to the digital light processing light engine 112 in Figure 1, the latter to the non-adjustable curved lens 111 and the adjustable curved lens 113) and is projected into the human eye. From the human eye's point of view, however, what is seen is a virtual image formed by the windshield's mirror surface at a certain distance in front of the car. Since millimeter wave module 1 must itself process 4D imaging point clouds, it has powerful computing power and can also serve as the algorithm processing center of the AR generator in the infotainment host.
In some embodiments, the second millimeter wave unit 102 has both powerful computing power and powerful transmission capability: it can serve not only as an AR renderer but also has a large data caching capacity for storing copies of data, satisfying user demands from nearby, while at the same time updating its content through V2X connectivity with external network space.
In some embodiments, the millimeter wave modules can sense while communicating: both the first millimeter wave unit 106 and the second millimeter wave unit 102 can perform 4D imaging and heart rate and breathing detection while communicating with each other and with the head-up display projector 110, providing full coverage of the IMS, including the DMS and the OMS.
In some embodiments, the second millimeter wave unit 102 also uses V2X communication with the external Internet of Vehicles to update content for the XR (extended reality) display, audio-visual entertainment, and real-time navigation devices in the car cockpit as well as mobile terminals such as the user's mobile phone. At the same time, the 4D imaging of people or objects in the car cockpit performed by the system processor, together with the breathing and heartbeat monitoring data of the driver and passengers, can, where security permits, be shared with the roadside RSU (Road Side Unit), the Internet of Vehicles computing center, and the mobile Internet, so that associated individual or group end users can learn about the situation inside the car remotely and in real time. Meanwhile, the large bandwidth and low latency of the millimeter wave module itself can improve AR rendering capability and thereby improve the image quality of the head-up display projector.
Referring to Figures 1 and 3, Figure 3 is an extended functional block diagram of a smart cockpit according to another embodiment of the present application. This embodiment provides a smart cockpit.
Usually, a car head-up display projector is used only while the car is driving, for AR display based on real-time navigation map information and vehicle condition information. However, as the degree of autonomous driving increases, when the car drives fully autonomously the head-up display projection can also be used for user entertainment. This application therefore provides a smart cockpit that uses a millimeter wave module as the augmented reality generator to obtain augmented reality data from multi-sensor fusion data and render it, so that augmented reality services can be provided to users more quickly and stably through the in-vehicle head-up display projector, offering users entertainment and a metaverse experience during autonomous driving.
In one implementation, as shown in Figure 3, when the car head-up display projector 110 is used for a metaverse experience, the user's body perception and spatial positioning need to be enhanced. At the user level, AR glasses and a handle 104 are added at the driver's position: the user gains spatial positioning through the AR glasses and performs orientation control through the handle 104. At the same time, the millimeter wave module uses 4D imaging to perceive the user's head pose, gestures, expressions, and body movements, and enhances the DMS camera's fine-grained eye tracking, ensuring accurate computation of both body perception and spatial positioning, so that virtual objects can be placed in the real world and blended seamlessly with it. In this process, the large bandwidth and low latency of millimeter waves, together with their strong computing power, build an excellent in-cabin metaverse experience environment, making the virtual image environment based on the in-vehicle head-up display projector grander and the immersive experience provided by extended reality technology better.
Referring to Figures 1 and 4, Figure 4 is another extended functional block diagram of a smart cockpit according to another embodiment of the present application. This embodiment provides a smart cockpit.
In one implementation, as shown in Figure 4, the smart cockpit of this application exchanges data through the millimeter wave modules and connects to external networks, which further increases the computing power available for processing sensor information. This allows the sources of information processed by the millimeter wave modules to be further extended to taste, smell, touch, and consciousness, realizing an interactive virtual-real augmented reality and metaverse experience environment.
In some embodiments, the smart cockpit of this application adds windshield-based HUD hardware at the passenger position, so that augmented reality services can be provided to passengers more quickly and stably through the in-vehicle head-up display projector, showing passengers driving information and entertainment content according to user needs.
In some embodiments, the smart cockpit of this application includes an AR helmet; through the cockpit's on-board millimeter wave communication and environment sensing functions, the AR helmet provides users with an immersive experience.
In some embodiments, the fused information data further includes sensing information, and obtaining augmented reality data from the fused information data includes obtaining augmented reality data from the monitoring information and the sensing information, where the sensing information includes at least one of the following: user eye tracking data obtained through the driver monitoring camera, user button data obtained through the handle, or spatial positioning data obtained by the augmented reality glasses, so as to realize an interactive virtual-real augmented reality and metaverse experience environment.
In some embodiments, the in-vehicle AR-HUD display generates a mirror of the real world based on digital twin technology. By adding blockchain technology at the millimeter wave module's edge computing center, the AR-HUD, which fuses millimeter wave communication and sensing information in the car cockpit, can construct an environment in which the virtual world and the real world are closely integrated economically, socially, and in terms of identity, making the car an important place for users to experience the metaverse.
The first millimeter wave unit and the second millimeter wave unit of the above millimeter wave module may be two independent physical entities connected through millimeter wave communication, or two virtual functional units within a common physical entity. The two-physical-entity embodiment is implemented by placing the units on the dashboard and on the roof of the car respectively; the common-physical-entity embodiment is implemented by placing the module inside the car's interior rearview mirror. The common millimeter wave physical entity module, located at the interior rearview mirror position, provides the AR-HUD experience for the person in the driver's position by monitoring that user's gestures, movements, and expressions. Using only one millimeter wave physical entity module saves cockpit cost and space, but in that case the module must perform continuous position tracking and dynamic beam control of the monitored object very frequently, and as the AR-HUD that fuses millimeter wave communication and sensing information is progressively upgraded, the computing power required of a single millimeter wave module becomes very high.
In some embodiments, the smart cockpit of this application includes two physical millimeter wave units; dividing the work between two units is more conducive to guaranteeing overall performance. With two units, the first millimeter wave unit 106 also serves as the algorithm processing center of the AR generator, while the second millimeter wave unit 102, in addition to functioning as the edge computing center, also serves as an edge CDN (content delivery network), allowing users to quickly obtain the required content or game data from nearby, so that augmented reality services can be provided to users more quickly and stably through the in-vehicle head-up display projector, showing users driving information and entertainment content according to their needs.
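The edge CDN role described above amounts to serving content from a nearby cached copy when possible and fetching over the external link only on a miss. A toy sketch of that behavior (the LRU policy, capacity, and fetch callback are illustrative assumptions, not details from the application):

```python
from collections import OrderedDict

class EdgeCDNCache:
    """Toy LRU cache for the edge CDN role: serve the nearby copy when
    present, otherwise fetch remotely (e.g. over V2X) and store it."""
    def __init__(self, capacity, fetch_remote):
        self.capacity = capacity
        self.fetch_remote = fetch_remote    # stand-in for a V2X/RSU download
        self.store = OrderedDict()

    def get(self, key):
        if key in self.store:               # cache hit: serve nearby copy
            self.store.move_to_end(key)
            return self.store[key], "hit"
        data = self.fetch_remote(key)       # cache miss: go to the network
        self.store[key] = data
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
        return data, "miss"

cache = EdgeCDNCache(capacity=2, fetch_remote=lambda k: f"<content:{k}>")
print(cache.get("map_tile_7")[1])   # miss
print(cache.get("map_tile_7")[1])   # hit
```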
Referring to Figure 5, Figure 5 is a system architecture diagram of a smart cockpit according to another embodiment of the present application. This embodiment provides a smart cockpit.
In one implementation, as shown in Figure 5, considering that the computing power of the millimeter wave modules is balanced, that in practice more millimeter wave modules may exist, and that the functions of the individual modules can be coordinated, substituted for one another, or computed collaboratively, a given function is not attributed to a particular module but is uniformly described as belonging to millimeter wave module Xn.
In some embodiments, the first millimeter wave unit 106, acting as the AR generator, fuses computation data, AR rendering data, V2X communication data, CDN communication forwarding data, DMS sensing data, OMS sensing data, and edge computing center data to obtain multi-sensor fusion data, so that the second millimeter wave unit 102 performs subsequent work based on that fused data; the individual items of the fused data interact through the control entry of the control bus.
In some embodiments, in addition to system communication control, system sensing control, and system computation control, the control bus in Figure 5 also incorporates user UI control on the central control screen and control of the user's AR glasses and handle 104. The first three are implicit controls of the module-internal system, invisible to the user, within a millimeter wave module or between millimeter wave modules; the latter two are controls visible at the user level, giving the user a richer choice of infotainment content in the car cockpit and even raising it to the level of a metaverse experience.
In some embodiments, the millimeter wave module of this application is in fact an extension of millimeter wave radar: beyond ordinary millimeter wave radar, one millimeter wave module additionally serves as the AR generator, and another additionally serves as the edge computing center and CDN. Enriched by the integrated communication-and-sensing resources, capabilities, and applications of the smart cockpit, this can improve the user's in-vehicle AR-HUD experience.
Referring to Figure 6, Figure 6 is a flowchart of a smart cockpit control method according to another embodiment of the present application. This embodiment provides a smart cockpit control method applied to a smart cockpit, where the smart cockpit includes a millimeter wave module connected to a head-up display projector. The method includes, but is not limited to, the following steps:
Step S610: obtain fused information data and communication data, where the fused information data includes monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes data from the millimeter wave module's interaction with the interior and exterior of the smart cockpit;
Step S620: obtain augmented reality data from the fused information data and the communication data;
Step S630: render the augmented reality data to obtain projection data;
Step S640: send the projection data to the projection device for projection display.
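Steps S610 through S640 can be read as a simple acquire-fuse-render-display pipeline. A minimal sketch (the `mmwave` and `projector` interfaces and the render function are illustrative stand-ins, not part of the application):

```python
def control_pipeline(mmwave, projector):
    """Sketch of steps S610-S640 with stand-in device interfaces."""
    fused = mmwave.scan_cabin()             # S610: monitoring info
    comms = mmwave.exchange_comm_data()     # S610: V2X/CDN/edge data
    ar = {"fused": fused, "comms": comms}   # S620: build AR data
    frame = render(ar)                      # S630: render to projection data
    projector.display(frame)                # S640: project
    return frame

def render(ar):
    return f"frame({ar['fused']},{ar['comms']})"

class FakeMmWave:
    def scan_cabin(self): return "cabin"
    def exchange_comm_data(self): return "v2x"

class FakeProjector:
    def __init__(self): self.shown = None
    def display(self, frame): self.shown = frame

proj = FakeProjector()
out = control_pipeline(FakeMmWave(), proj)
print(out)   # frame(cabin,v2x)
```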
In one implementation, the communication data includes at least one of the following: Internet of Vehicles data, content delivery network data, or associated edge computing data obtained by the millimeter wave module from inside and outside the smart cockpit. The Internet of Vehicles data is vehicle dynamic information, such as road conditions and driving speed, obtained by the on-board device from an external information network platform through wireless communication technology; based on the Internet of Vehicles data, the millimeter wave module implements functions such as data display and intelligent driving. The content delivery network data is communication data obtained from roadside units through vehicle-to-everything communication via the antenna array outside the roof; based on the content delivery network data, the millimeter wave module can update local rendering resources in real time and thus render the augmented reality data better.
The edge computing data is communication data obtained by the millimeter wave module from the edge computing center inside the car; while the millimeter wave module performs rendering, the edge computing center can take on part of the computation and send the results back to the millimeter wave module in the form of edge computing data, improving the module's rendering efficiency.
In one implementation, the projection device is the head-up display projector. The millimeter wave module obtains fused information data and communication data, where the fused information data includes monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes Internet of Vehicles data, content delivery network data, and associated edge computing data obtained by the millimeter wave module from inside and outside the smart cockpit. Augmented reality data is obtained from the fused information data and the communication data; rendering requirement information is generated from the augmented reality data; the augmented reality data is rendered according to the rendering requirement information to obtain projection data; and the projection data is sent to the head-up display projector for projection display. In this way, the millimeter wave module acts as the augmented reality generator, obtaining augmented reality data from multi-sensor fusion data and communication data and rendering it, so that augmented reality and metaverse services are provided to users more quickly and stably through the in-vehicle head-up display projector, driving information and entertainment content are shown according to user needs, and the required information or game data is quickly obtained and displayed for the user.
In some embodiments, before the fused information data is obtained, the method further includes: scanning the cabin in which the millimeter wave module is located to obtain a positioning calculation value, obtaining target position information from the positioning calculation value, and obtaining the monitoring information from the target position information.
In some embodiments, the millimeter wave module is provided with a millimeter wave antenna array, and before the fused information data is obtained, the method further includes: obtaining the phase interference difference of the millimeter wave antenna array, obtaining target position information from the phase interference difference, and obtaining the monitoring information from the target position information.
In one implementation, to obtain information representing the actions or vital signs of the driver or a passenger, the position of the driver or passenger must first be determined. This can be done by selecting dedicated phase interference elements in the millimeter wave antenna array and using the phase interference difference between them to determine the target position of the driver or passenger; alternatively, the target position can be determined by millimeter wave scanning of the target followed directly by a positioning calculation. Positioning with millimeter waves can achieve centimeter-level accuracy, and positioning at this level of accuracy enables the various smart cockpit applications that need user position information, so that augmented reality services can be provided to users more precisely through the in-vehicle head-up display projector, showing driving information and entertainment content according to the user's position.
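The phase-interference positioning above rests on classic two-element interferometry: the phase difference between elements spaced d apart satisfies Δφ = 2πd·sin(θ)/λ, so the angle of arrival follows by inversion. A worked sketch (the 60 GHz carrier and half-wavelength spacing are assumptions consistent with the bands mentioned earlier, not prescribed values):

```python
import math

def angle_from_phase_diff(delta_phi, d, wavelength):
    """Angle of arrival (radians) from the phase interference difference
    between two antenna elements with spacing d."""
    s = delta_phi * wavelength / (2 * math.pi * d)
    if abs(s) > 1:
        raise ValueError("phase difference outside the unambiguous range")
    return math.asin(s)

# 60 GHz carrier -> ~5 mm wavelength; half-wavelength element spacing
wavelength = 3e8 / 60e9
d = wavelength / 2

# Forward-simulate a target at 20 degrees, then recover the angle
theta_true = math.radians(20)
delta_phi = 2 * math.pi * d * math.sin(theta_true) / wavelength
theta_est = angle_from_phase_diff(delta_phi, d, wavelength)
print(round(math.degrees(theta_est), 1))   # 20.0
```

With half-wavelength spacing the mapping is unambiguous over the full ±90° field of view; wider spacing improves angular resolution but introduces ambiguities.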
在一些实施例中，进行目标位置定位，获取目标位置信息后，毫米波模块根据目标位置信息针对毫米波天线阵列进行相位控制进行波束对准，在对准过程中反复进行波束扫描、配对动作，确保在毫米波模块监测的对象是动态变化的情况下，实时更新毫米波模块监测对象的感应数据。In some embodiments, after the target is located and the target position information is obtained, the millimeter wave module performs phase control on the millimeter wave antenna array for beam alignment according to the target position information, and repeatedly performs beam scanning and pairing during the alignment process, ensuring that the sensing data of the monitored object is updated in real time even when the object monitored by the millimeter wave module changes dynamically.
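The repeated scan-and-pair loop can be pictured as a sweep over a discrete beam codebook that is re-run every frame so a moving occupant stays covered; the `MovingOccupant` channel model, beam count, and frame count here are invented solely for this sketch.

```python
import numpy as np

def beam_scan(measure_rssi, n_beams=16):
    """One sweep over a discrete beam codebook; return the index of the
    beam with the strongest echo."""
    return int(np.argmax([measure_rssi(b) for b in range(n_beams)]))

class MovingOccupant:
    """Toy channel model: the occupant drifts by one beam per frame, so
    the echo is strongest on the beam currently pointing at them."""
    def __init__(self, beam=3):
        self.beam = beam
    def rssi(self, b):
        return -abs(b - self.beam)
    def step(self):
        self.beam += 1

occupant = MovingOccupant()
trace = []
for _ in range(4):                  # rescan and re-pair each frame
    trace.append(beam_scan(occupant.rssi))
    occupant.step()                 # the monitored target keeps moving
print(trace)  # [3, 4, 5, 6]
```

Each frame's winning beam tracks the target, which is the behavior the repeated scanning/pairing in the paragraph is meant to guarantee.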
在一些实施例中，毫米波模块分别针对驾驶员和乘客进行4D成像和呼吸、心率监测，其中，4D成像通过对监测目标反射的毫米波点云进行距离、速度、水平角和俯仰角的成像分析识别人的动作，呼吸、心率监测为利用毫米波探测目标微小振动所引起的特定范围值的调频连续波信号CSI(Channel State Information,信道状态信息)相位变化获取的驾驶员和乘客的呼吸、心率监测。In some embodiments, the millimeter wave module performs 4D imaging and breathing and heart rate monitoring for the driver and the passengers respectively. The 4D imaging identifies human actions through imaging analysis of the range, velocity, azimuth and elevation of the millimeter wave point cloud reflected by the monitored target; the breathing and heart rate monitoring of the driver and passengers is obtained from the phase changes, within a specific range of values, of the frequency modulated continuous wave CSI (Channel State Information) signal caused by tiny vibrations of the target detected by millimeter waves.
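A minimal sketch of recovering breathing and heart rates from the slow-time phase signal by band-limited spectral peak picking; the sampling rate, window length, and simulated vibration frequencies are assumptions for the example, not values from this application.

```python
import numpy as np

fs = 50.0                       # assumed slow-time sampling rate, Hz
t = np.arange(0, 40, 1 / fs)    # 40 s observation window
# Simulated echo phase: chest displacement from breathing (~0.3 Hz)
# plus a weaker heartbeat component (~1.2 Hz).
phase = 1.0 * np.sin(2 * np.pi * 0.3 * t) + 0.1 * np.sin(2 * np.pi * 1.2 * t)

def dominant_freq(sig, fs, f_lo, f_hi):
    """Strongest spectral line of `sig` inside the band [f_lo, f_hi] Hz."""
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(freqs[band][np.argmax(spec[band])])

breath_hz = dominant_freq(phase, fs, 0.1, 0.6)   # respiration band
heart_hz = dominant_freq(phase, fs, 0.8, 3.0)    # heart-rate band
print(round(breath_hz * 60), round(heart_hz * 60))  # 18 72
```

Searching separate frequency bands lets the weak heartbeat line be found even though the breathing component dominates the overall spectrum.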
在一些实施例中，毫米波模块分别针对驾驶员和乘客进行4D成像和呼吸、心率监测可以采用两种方式：分时感知，集中大段时间内集中进行监测目标4D成像后定时再切换呼吸、心率监测，其中，通过在分时感知过程中定期监测达到分别针对驾驶员和乘客进行4D成像和呼吸、心率监测的目的并节省计算机资源；同时感知，由于通过进行同时感知处理对毫米波模块性能要求提高，故需要毫米波模块采用高级MIMO波形设计多个天线通道同时工作，采用软件算法实现小于1度的超分辨率，并同时短程、中距、长距并发多模感测，进而迅速可靠地分别针对驾驶员和乘客进行4D成像和呼吸、心率监测；其中，通过人工智能模型进行行为分类处理，对在4D成像中感知的用户头部自由度、手势、身姿、表情等进行感知参数估计，实现面向应用的模式识别。In some embodiments, the millimeter wave module can perform 4D imaging and breathing and heart rate monitoring for the driver and the passengers in two ways. In time-shared sensing, the module concentrates on 4D imaging of the monitored target for a long stretch and then switches to breathing and heart rate monitoring at regular intervals; periodic monitoring during the time-shared process achieves 4D imaging and breathing and heart rate monitoring for the driver and the passengers respectively while saving computing resources. In simultaneous sensing, the performance requirements on the millimeter wave module rise, so the module needs an advanced MIMO waveform design in which multiple antenna channels work at the same time, software algorithms achieving super-resolution below 1 degree, and concurrent short-, medium- and long-range multi-mode sensing, so that 4D imaging and breathing and heart rate monitoring can be performed quickly and reliably for the driver and the passengers respectively; the user head pose, gestures, posture, expressions and other elements perceived in 4D imaging are classified by an artificial intelligence model, and perception parameter estimation realizes application-oriented pattern recognition.
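The time-shared sensing mode above can be pictured as a fixed slot schedule that dwells on 4D imaging and periodically switches to vital-sign monitoring; the slot counts and mode names here are arbitrary choices for the sketch.

```python
def time_shared_schedule(n_slots, imaging_slots=3, vital_slots=1):
    """Time-shared sensing: dwell on 4D imaging for a stretch of slots,
    then switch briefly to breathing/heart-rate monitoring, repeating."""
    cycle = ["4d_imaging"] * imaging_slots + ["vital_signs"] * vital_slots
    return [cycle[i % len(cycle)] for i in range(n_slots)]

plan = time_shared_schedule(8)
print(plan)
# ['4d_imaging', '4d_imaging', '4d_imaging', 'vital_signs',
#  '4d_imaging', '4d_imaging', '4d_imaging', 'vital_signs']
```

Simultaneous sensing would instead run both modes per slot at the cost of the higher hardware and algorithmic requirements the paragraph lists.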
在一些实施例中，智能座舱包括DMS摄像头，根据融合信息数据得到增强现实数据，其中，关于DMS摄像头和毫米波模块的DMS监测需要做联合校准，即相机的内外参和雷达外参要配准，然后形成DMS组合数据。在一实施方式中，再加入车身、ADAS、定位模块等数据一起进行数据层、功能层、特征层的多传感信息融合计算，最后根据融合信息提取有效AR信息；其中，根据融合信息提取有效AR信息包括，根据系统预设的多传感器融合算法对融合信息进行多传感信息融合处理得到有效AR信息，多传感器融合算法包括卡尔曼滤波和人工神经网络等常见多传感器融合算法。In some embodiments, the smart cockpit includes a DMS camera, and the augmented reality data is obtained from the fusion information data. DMS monitoring by the DMS camera and the millimeter wave module requires joint calibration, that is, the camera's intrinsic and extrinsic parameters and the radar's extrinsic parameters are registered, and combined DMS data is then formed. In one embodiment, data from the vehicle body, ADAS and positioning modules are further added for multi-sensor information fusion computation at the data, function and feature layers, and effective AR information is finally extracted from the fused information; extracting effective AR information from the fused information includes performing multi-sensor fusion processing on the fused information according to a multi-sensor fusion algorithm preset by the system, where the multi-sensor fusion algorithms include common algorithms such as Kalman filtering and artificial neural networks.
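Since the preset fusion algorithm may include Kalman filtering, a scalar measurement-update sketch shows how a radar range and a jointly calibrated camera range for the same target could be fused; the measurement values and variances are illustrative only.

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update: fuse the state estimate
    (mean x, variance p) with a measurement z of variance r."""
    k = p / (p + r)                       # Kalman gain
    return x + k * (z - x), (1 - k) * p

# Fuse a radar range and a DMS-camera range for the same target
# (after joint calibration both live in a common frame).
x, p = 0.0, 1e6                           # diffuse prior
x, p = kalman_update(x, p, 2.10, r=0.04)  # radar: 2.10 m, var 0.04
x, p = kalman_update(x, p, 2.00, r=0.09)  # camera: 2.00 m, var 0.09
print(round(x, 2))  # 2.07
```

The fused estimate lands between the two measurements, weighted toward the lower-variance radar, which is the behavior a full multi-sensor Kalman fusion generalizes to vectors of states.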
在一些实施例中，渲染需求包括信息显示模型渲染需求或提出流媒体播放内容或游戏渲染库需求，根据渲染需求信息对增强现实数据进行渲染处理得到投影数据包括，毫米波模块根据渲染需求对增强现实数据对应的显示模型进行实时渲染或加载流媒体内容以及游戏渲染库对增强现实数据对应的显示模型进行渲染，其中，增强现实数据即为AR有效信息。In some embodiments, the rendering requirements include information display model rendering requirements, or requests for streaming media playback content or a game rendering library. Rendering the augmented reality data according to the rendering requirement information to obtain the projection data includes: the millimeter wave module renders the display model corresponding to the augmented reality data in real time according to the rendering requirements, or loads streaming media content and uses the game rendering library to render the display model corresponding to the augmented reality data, where the augmented reality data is the effective AR information.
在一些实施例中，通过毫米波模块获取融合信息数据和通信数据，融合信息数据包括毫米波模块对智能座舱内部扫描得到的监测信息，通信数据包括毫米波模块对智能座舱内外部获取的车联网数据、内容分发网络数据以及关联的边缘计算数据。根据融合信息数据和通信数据得到增强现实数据，极大地扩展了增强现实数据来源，使得增强现实的数据不仅来自于汽车座舱内（本地两个毫米波单元之间通信和感知融合信息数据外，还有本地边缘计算的3D模型、渲染数据、内容分发数据等），还来自汽车座舱外并延伸到车联网、移动互联网（来自移动互联网的远程渲染数据、远程内容分发数据以及车联网广布的增强现实模型等）。在一实施方式中，根据增强现实数据生成渲染需求信息，根据渲染需求信息对增强现实数据进行渲染处理得到投影数据，不必额外增加存储计算单元到抬头显示屏投影仪的抬头显示屏投影仪驱动板中，抬头显示屏投影仪驱动板就可以专用于做投影仪光学部分的驱动，进而克服在一些情形中受限于当前的车载增强现实抬头显示屏设备的处理能力较弱，导致车载增强现实抬头显示屏投影仪功能单一，仅能为用户提供一些简单的路况信息的技术问题，使增强现实抬头显示屏投影仪进一步为用户提供用户需要驾驶信息和娱乐内容。In some embodiments, the fusion information data and the communication data are obtained through the millimeter wave module. The fusion information data includes the monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes Internet of Vehicles data, content delivery network data and associated edge computing data obtained by the millimeter wave module inside and outside the smart cockpit. Obtaining augmented reality data from the fusion information data and the communication data greatly expands the sources of augmented reality data: the data comes not only from inside the car cockpit (in addition to the communication and sensing fusion information data between the two local millimeter wave units, there are 3D models, rendering data, content distribution data and the like from local edge computing), but also from outside the cockpit, extending to the Internet of Vehicles and the mobile Internet (remote rendering data and remote content distribution data from the mobile Internet, and augmented reality models widely distributed over the Internet of Vehicles). In one embodiment, rendering requirement information is generated from the augmented reality data, and the augmented reality data is rendered according to the rendering requirement information to obtain the projection data. There is no need to add an extra storage and computing unit to the driver board of the head-up display projector, so the driver board can be dedicated to driving the optical part of the projector. This overcomes the technical problem that, in some cases, the weak processing capability of current vehicle-mounted augmented reality head-up display devices limits the projector to a single function of providing only simple road condition information, and enables the augmented reality head-up display projector to further provide users with the driving information and entertainment content they need.
参考图7,图7为本申请另一实施例提出的智能座舱控制方法的方法流程图,Referring to Figure 7, Figure 7 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application.
本申请实施例提供了一种智能座舱控制方法，毫米波模块包括第一毫米波单元和第二毫米波单元，其中，第一毫米波单元与第二毫米波单元连接，第一毫米波单元与投影设备连接，方法包括：Embodiments of the present application provide a smart cockpit control method. The millimeter wave module includes a first millimeter wave unit and a second millimeter wave unit, where the first millimeter wave unit is connected to the second millimeter wave unit and the first millimeter wave unit is connected to the projection device. The method includes:
步骤S710,控制第一毫米波单元获取融合信息数据,融合信息数据包括第一毫米波单元获取的第一监测信息和第二毫米波单元获取的第二监测信息;Step S710, control the first millimeter wave unit to obtain fusion information data, where the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit;
步骤S720,控制第一毫米波单元根据融合信息数据和第二毫米波单元获取的通信数据得到增强现实数据;Step S720, control the first millimeter wave unit to obtain augmented reality data based on the fusion information data and the communication data obtained by the second millimeter wave unit;
步骤S730,控制第一毫米波单元根据增强现实数据生成渲染需求信息;Step S730, control the first millimeter wave unit to generate rendering requirement information based on the augmented reality data;
步骤S740,控制第一毫米波单元向第二毫米波单元发送渲染需求信息;Step S740, control the first millimeter wave unit to send rendering requirement information to the second millimeter wave unit;
步骤S750,控制第二毫米波单元根据渲染需求信息对增强现实数据进行渲染处理得到投影数据,并将投影数据发送至投影设备进行投影显示。Step S750: Control the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and send the projection data to the projection device for projection display.
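Steps S710 to S750 above can be sketched as a message flow between two toy unit objects and a projector; all class and method names below are hypothetical stand-ins for the roles of the two units, not an implementation from this application.

```python
class FrontUnit:                     # first millimeter wave unit (AR generator)
    def monitor(self):
        return {"driver": "attentive"}
    def fuse(self, first, second):
        return {**first, **second}                               # S710
    def build_ar(self, fused, comm):
        return {"scene": fused, "overlay": comm}                 # S720
    def rendering_request(self, ar_data):
        return {"ar": ar_data, "style": "hud"}                   # S730/S740

class TopUnit:                       # second millimeter wave unit (edge/render)
    def monitor(self):
        return {"passenger": "resting"}
    def communication_data(self):
        return ["v2x-alert"]
    def render(self, request):
        return f"frames<{request['style']}>"                     # S750

class Projector:
    def display(self, frames):
        self.shown = frames

unit1, unit2, hud = FrontUnit(), TopUnit(), Projector()
fused = unit1.fuse(unit1.monitor(), unit2.monitor())             # S710
ar = unit1.build_ar(fused, unit2.communication_data())           # S720
hud.display(unit2.render(unit1.rendering_request(ar)))           # S730-S750
print(hud.shown)  # frames<hud>
```

The point of the split is visible in the flow: the first unit produces AR data and a rendering request, while the heavier rendering runs on the second unit before the frames reach the projection device.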
在一实施方式中，在汽车舱内部设置两个毫米波模块装置，分别为第一毫米波单元和第二毫米波单元，第一毫米波单元置于面向驾驶员的驾驶舱靠近AR HUD投影仪位置，称为前置毫米波模块；第二毫米波单元置于车舱顶部内侧位置并与车顶外部的天线阵列相连，称为顶置毫米波模块。两者分别针对驾驶员和乘客进行4D成像和呼吸、心跳监测的同时，第一毫米波单元兼用于AR生成器算法处理中心，第二毫米波单元兼用于边缘计算中心和内容分发网络；两者相互合作，一方面用作汽车座舱内通信系统和监测系统的组成部分进行通信和感知，另一方面用于处理汽车多传感融合信息生成有效AR信息并进行AR渲染以及多媒体内容和游戏播放与渲染。第一毫米波单元根据融合信息数据得到增强现实数据，第一毫米波单元根据增强现实数据生成渲染需求信息，第一毫米波单元向第二毫米波单元发送渲染需求信息，渲染需求信息携带增强现实数据，第二毫米波单元根据渲染需求信息对增强现实数据进行渲染处理得到投影数据，并将投影数据发送至抬头显示屏投影仪进行投影显示。In one embodiment, two millimeter wave module devices are provided inside the car cabin, namely a first millimeter wave unit and a second millimeter wave unit. The first millimeter wave unit is placed in the cockpit facing the driver, close to the AR HUD projector, and is called the front millimeter wave module; the second millimeter wave unit is placed on the inside of the cabin roof and connected to an antenna array outside the roof, and is called the top millimeter wave module. While the two units perform 4D imaging and breathing and heartbeat monitoring for the driver and the passengers respectively, the first millimeter wave unit also serves as the AR generator algorithm processing center, and the second millimeter wave unit also serves as the edge computing center and content distribution network. Cooperating with each other, they act on the one hand as components of the in-cabin communication and monitoring systems for communication and sensing, and on the other hand process the car's multi-sensor fusion information to generate effective AR information and perform AR rendering as well as multimedia content and game playback and rendering. The first millimeter wave unit obtains augmented reality data from the fusion information data and generates rendering requirement information from the augmented reality data; the first millimeter wave unit sends the rendering requirement information, which carries the augmented reality data, to the second millimeter wave unit; the second millimeter wave unit renders the augmented reality data according to the rendering requirement information to obtain projection data, and sends the projection data to the head-up display projector for projection display.
在一些实施例中，第一监测信息为第一毫米波单元获取的驾驶员呼吸、心跳、动作信息，第二监测信息为第二毫米波单元获取的乘客的呼吸、心跳、动作信息，通过获取驾驶员和乘客的体征和动作信息在抬头显示屏投影仪上显示相应内容，为驾驶员和乘客提供安全预警功能。In some embodiments, the first monitoring information is the driver's breathing, heartbeat and motion information obtained by the first millimeter wave unit, and the second monitoring information is the passengers' breathing, heartbeat and motion information obtained by the second millimeter wave unit. By obtaining the physical sign and motion information of the driver and the passengers and displaying the corresponding content on the head-up display projector, a safety warning function is provided for the driver and the passengers.
在一些实施例中，第一毫米波单元具有可波束赋形控制的毫米波天线阵列，一方面用于监测驾驶员呼吸、心跳（通过探测目标微小振动所引起的特定范围值的调频连续波信号的相位变化来实现）以及针对驾驶员进行危险动作、分心、疲劳驾驶状态的4D成像（通过判断目标物距离、速度、水平角和俯仰角），第一毫米波单元另一方面因自身具备更强的算力替代传统HUD驱动板作为AR算法处理中心。In some embodiments, the first millimeter wave unit has a beamforming-controllable millimeter wave antenna array. On the one hand it monitors the driver's breathing and heartbeat (realized by detecting the phase changes, within a specific range of values, of the frequency modulated continuous wave signal caused by tiny vibrations of the target) and performs 4D imaging of the driver's dangerous actions, distraction and fatigued driving states (by judging the target's range, velocity, azimuth and elevation); on the other hand, having stronger computing power of its own, the first millimeter wave unit replaces the traditional HUD driver board as the AR algorithm processing center.
在一些实施例中，第二毫米波单元与路侧单元和/或车联网云计算中心连接，第二毫米波单元根据渲染需求信息对增强现实数据进行渲染处理得到投影数据包括，第二毫米波单元获取来自路侧单元和/或车联网云计算中心的更新数据，根据更新数据和需求信息对增强现实数据进行增强现实数据渲染处理，得到投影数据，其中，第二毫米波单元在车舱内部和车顶外部均有可波束赋形控制的毫米波天线阵列，一方面用于对乘客进行呼吸、心跳监测和对乘客行为动作进行4D成像监测，另一方面用于车舱边缘计算中心和内容分发网络，提供强大的AR渲染算力和流媒体内容分发能力，并能通过车顶外部的天线阵列进行车对外界通信到路侧单元、车联网以及移动互联网，进而根据更新数据和需求信息对增强现实数据进行增强现实数据渲染处理，得到投影数据。In some embodiments, the second millimeter wave unit is connected to a roadside unit and/or an Internet of Vehicles cloud computing center. The second millimeter wave unit rendering the augmented reality data according to the rendering requirement information to obtain the projection data includes: the second millimeter wave unit obtains update data from the roadside unit and/or the Internet of Vehicles cloud computing center, and renders the augmented reality data according to the update data and the requirement information to obtain the projection data. The second millimeter wave unit has beamforming-controllable millimeter wave antenna arrays both inside the cabin and outside on the roof; on the one hand they monitor the passengers' breathing and heartbeat and perform 4D imaging monitoring of passenger behavior, and on the other hand they serve the cabin edge computing center and content distribution network, providing strong AR rendering computing power and streaming content distribution capability and communicating with the outside world through the roof antenna array to roadside units, the Internet of Vehicles and the mobile Internet, so that the augmented reality data can be rendered according to the update data and the requirement information to obtain the projection data.
参考图8,图8为本申请另一实施例提出的智能座舱控制方法的方法流程图,Referring to Figure 8, Figure 8 is a method flow chart of a smart cockpit control method proposed by another embodiment of the present application.
本申请实施例提供了一种智能座舱控制方法，其中，毫米波模块包括第一毫米波单元和第二毫米波单元，第一毫米波单元与第二毫米波单元连接，第一毫米波单元与投影设备连接，方法包括：Embodiments of the present application provide a smart cockpit control method, where the millimeter wave module includes a first millimeter wave unit and a second millimeter wave unit, the first millimeter wave unit is connected to the second millimeter wave unit, and the first millimeter wave unit is connected to the projection device. The method includes:
步骤S810，控制第二毫米波单元获取融合信息数据和通信数据，融合信息数据包括第一毫米波单元获取的第一监测信息和第二毫米波单元获取的第二监测信息；Step S810: control the second millimeter wave unit to obtain the fusion information data and the communication data, where the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit;
步骤S820,控制第二毫米波单元根据融合信息数据和通信数据得到增强现实数据;Step S820, control the second millimeter wave unit to obtain augmented reality data based on the fused information data and communication data;
步骤S830,控制第二毫米波单元根据增强现实数据生成渲染需求信息;Step S830, control the second millimeter wave unit to generate rendering requirement information based on the augmented reality data;
步骤S840,控制第二毫米波单元根据渲染需求信息对增强现实数据进行渲染处理得到投影数据,并将投影数据发送至投影设备进行投影显示。Step S840, control the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and send the projection data to the projection device for projection display.
在一实施方式中，通信数据包括第二毫米波单元获取的车联网数据、内容分发网络数据以及关联的边缘计算数据，在第一毫米波单元和第二毫米波单元分工合作AR信息处理流程更加复杂化，需求的算力更高的情况下，在抬头显示屏投影仪播放过程中第一毫米波单元用于将用户头部自由度、眼球、手势、身姿、表情等身体感知要素精确计算并加载到AR生成器，同时还要把用户通过AR眼镜获取的空间定位也精确计算并加载到AR生成器，同时第一毫米波单元兼任AR生成器，计算载荷和功耗非常大，故通过第一毫米波单元和第二毫米波单元之间的大带宽高容量低时延通信，由第二毫米波单元获取融合信息数据，根据融合信息数据得到增强现实数据，根据增强现实数据生成渲染需求信息，根据渲染需求信息对增强现实数据进行渲染处理得到投影数据，并将投影数据发送至抬头显示屏投影仪进行投影显示，进而更加快速稳定地通过车载抬头显示屏投影仪为用户提供增强现实服务，根据用户需求为用户展示驾驶信息和娱乐内容。In one embodiment, the communication data includes Internet of Vehicles data, content distribution network data and associated edge computing data acquired by the second millimeter wave unit. When the division of work between the first and second millimeter wave units makes the AR information processing flow more complicated and demands more computing power, during head-up display projector playback the first millimeter wave unit accurately computes body perception elements such as the user's head pose, eyeballs, gestures, posture and expressions and loads them into the AR generator, and also accurately computes and loads the spatial positioning the user obtains through AR glasses; since the first millimeter wave unit simultaneously serves as the AR generator, its computing load and power consumption are very large. Therefore, through the large-bandwidth, high-capacity, low-latency communication between the first and second millimeter wave units, the second millimeter wave unit obtains the fusion information data, obtains augmented reality data from the fusion information data, generates rendering requirement information from the augmented reality data, renders the augmented reality data according to the rendering requirement information to obtain projection data, and sends the projection data to the head-up display projector for projection display, so that the vehicle head-up display projector provides users with augmented reality services more quickly and stably and presents driving information and entertainment content according to user needs.
参考图9,图9为本申请另一实施例提出的智能座舱控制方法的方法流程图,Referring to Figure 9, Figure 9 is a method flow chart of an intelligent cockpit control method proposed by another embodiment of the present application.
本申请实施例提供了一种智能座舱控制方法，其中，毫米波模块包括第一毫米波单元和第二毫米波单元，第一毫米波单元与第二毫米波单元连接，第一毫米波单元与投影设备连接，方法包括：Embodiments of the present application provide a smart cockpit control method, where the millimeter wave module includes a first millimeter wave unit and a second millimeter wave unit, the first millimeter wave unit is connected to the second millimeter wave unit, and the first millimeter wave unit is connected to the projection device. The method includes:
步骤S910,控制第一毫米波单元和第二毫米波单元根据各自的算力功耗数据进行渲染任务分配处理,得到分配结果;Step S910, control the first millimeter wave unit and the second millimeter wave unit to perform rendering task allocation processing according to their respective computing power consumption data, and obtain the allocation result;
步骤S920，控制第一毫米波单元和第二毫米波单元根据分配结果对融合信息数据和通信数据进行处理，得到增强现实数据，融合信息数据包括第一毫米波单元获取的第一监测信息和第二毫米波单元获取的第二监测信息；Step S920: control the first millimeter wave unit and the second millimeter wave unit to process the fusion information data and the communication data according to the allocation result to obtain augmented reality data, where the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit;
步骤S930,控制第二毫米波单元根据增强现实数据生成渲染需求信息;Step S930, control the second millimeter wave unit to generate rendering requirement information based on the augmented reality data;
步骤S940,控制第二毫米波单元根据渲染需求信息对增强现实数据进行渲染处理得到投影数据,并将投影数据发送至投影设备进行投影显示。Step S940, control the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and send the projection data to the projection device for projection display.
在一实施方式中，算力功耗数据代表第一毫米波单元和第二毫米波单元的处理能力，第一毫米波单元和第二毫米波单元根据各自的算力功耗数据进行任务分配处理，得到分配结果，第一毫米波单元和第二毫米波单元根据分配结果对融合信息数据和通信数据进行协同处理，即联邦计算处理，得到增强现实数据，融合信息数据包括第一毫米波单元获取的第一监测信息和第二毫米波单元获取的第二监测信息，其中，联邦计算为对第一毫米波单元和第二毫米波单元进行计算任务分配，使第一毫米波单元和第二毫米波单元能根据分配结果，分配负载、共享数据文件和内存，各自执行被分配的计算任务，进行协同处理后续的渲染处理任务。In one embodiment, the computing power and power consumption data represent the processing capabilities of the first and second millimeter wave units. The two units perform task allocation according to their respective computing power and power consumption data to obtain an allocation result, and then collaboratively process the fusion information data and the communication data according to the allocation result, that is, federated computing, to obtain the augmented reality data; the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit. Federated computing here means distributing computing tasks between the first and second millimeter wave units so that, according to the allocation result, they can distribute load, share data files and memory, each execute their assigned computing tasks, and collaboratively handle the subsequent rendering tasks.
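One way to picture allocation by computing-power/power-consumption data is a greedy split that always hands the next-largest task to the unit with the most remaining budget; the task names, costs, and budgets below are invented purely for this sketch.

```python
def allocate_render_tasks(tasks, capacity):
    """Greedy split of work between the two units in proportion to their
    remaining compute budget (arbitrary units): largest tasks first, each
    assigned to whichever unit currently has the most headroom."""
    budget = dict(capacity)
    assignment = {unit: [] for unit in capacity}
    for name, cost in sorted(tasks.items(), key=lambda kv: -kv[1]):
        unit = max(budget, key=budget.get)   # unit with most headroom
        assignment[unit].append(name)
        budget[unit] -= cost
    return assignment

tasks = {"4d_imaging": 5, "vital_signs": 1, "ar_render": 8, "cdn_fetch": 2}
plan = allocate_render_tasks(tasks, {"unit1": 10, "unit2": 12})
print(plan)  # {'unit1': ['4d_imaging', 'cdn_fetch'], 'unit2': ['ar_render', 'vital_signs']}
```

Re-running the allocation as the reported budgets change is one simple way the two units could keep their computing load and power consumption balanced over time.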
在一实施方式中，通信数据包括第一毫米波单元和/或第二毫米波单元获取的车联网数据、内容分发网络数据以及关联的边缘计算数据，在第一毫米波单元和第二毫米波单元分工合作AR信息处理流程更加复杂化，需求的算力更高的情况下，在抬头显示屏投影仪播放过程中第一毫米波单元用于将用户头部自由度、眼球、手势、身姿、表情等身体感知要素精确计算并加载到AR生成器，同时还要把用户通过AR眼镜获取的空间定位也精确计算并加载到AR生成器，同时第一毫米波单元兼任AR生成器，计算载荷和功耗非常大，故通过第一毫米波单元和第二毫米波单元之间的大带宽高容量低时延通信，由第二毫米波单元参与部分或全部多传感的感知计算，也就是引入联邦计算，第一毫米波单元和第二毫米波单元进行联邦计算，得到联邦计算结果，第一毫米波单元和第二毫米波单元根据联邦计算结果对融合信息数据进行处理，得到增强现实数据，融合信息数据包括第一毫米波单元获取的第一监测信息和第二毫米波单元获取的第二监测信息，达到两个毫米波模块之间算力功耗均衡化，第一毫米波单元和第二毫米波单元可以通过相互间通信实现算力均衡，确保元宇宙的用户身体感知和空间定位很好融入到交互式虚实结合的XR空间，实现虚实世界完美融合。In one embodiment, the communication data includes Internet of Vehicles data, content distribution network data and associated edge computing data acquired by the first millimeter wave unit and/or the second millimeter wave unit. When the division of work between the first and second millimeter wave units makes the AR information processing flow more complicated and demands more computing power, during head-up display projector playback the first millimeter wave unit accurately computes body perception elements such as the user's head pose, eyeballs, gestures, posture and expressions and loads them into the AR generator, and also accurately computes and loads the spatial positioning the user obtains through AR glasses; since the first millimeter wave unit simultaneously serves as the AR generator, its computing load and power consumption are very large. Therefore, through the large-bandwidth, high-capacity, low-latency communication between the first and second millimeter wave units, the second millimeter wave unit participates in part or all of the multi-sensor perception computation, that is, federated computing is introduced: the first and second millimeter wave units perform federated computing to obtain a federated computing result, and process the fusion information data according to the federated computing result to obtain the augmented reality data, where the fusion information data includes the first monitoring information obtained by the first millimeter wave unit and the second monitoring information obtained by the second millimeter wave unit. This balances computing power and power consumption between the two millimeter wave modules; the first and second millimeter wave units can achieve computing power balance through mutual communication, ensuring that the user's body perception and spatial positioning in the metaverse are well integrated into the interactive XR space combining the virtual and the real, achieving a seamless fusion of the virtual and real worlds.
参考图10，本申请实施例还提供了一种智能座舱控制器1000，包括：存储器1020、处理器1010及存储在存储器上并可在处理器上运行的计算机程序，处理器1010执行计算机程序时实现如上述实施例中任意一项的智能座舱控制方法，例如，执行以上描述的图6中的方法步骤S610至S650、图7中的方法步骤S710至S750、图8中的方法步骤S810至S840、图9中的方法步骤S910至S940。Referring to Figure 10, an embodiment of the present application further provides a smart cockpit controller 1000, including a memory 1020, a processor 1010, and a computer program stored in the memory and executable on the processor. When executing the computer program, the processor 1010 implements the smart cockpit control method of any of the above embodiments, for example, performing the above-described method steps S610 to S650 in Figure 6, method steps S710 to S750 in Figure 7, method steps S810 to S840 in Figure 8, and method steps S910 to S940 in Figure 9.
此外，本申请实施例还提供了一种智能座舱，该智能座舱包括上述的控制器，由于本申请实施例的智能座舱具有上述实施例的控制器，并且上述实施例的控制器能够执行上述实施例的智能座舱控制方法，因此，本申请实施例的智能座舱的实施方式和技术效果，可以参照上述任一实施例的智能座舱控制方法的实施方式和技术效果。In addition, an embodiment of the present application further provides a smart cockpit including the controller described above. Since the smart cockpit of the embodiment of the present application has the controller of the above embodiment, and that controller can execute the smart cockpit control method of the above embodiments, for the implementation and technical effects of the smart cockpit of the embodiment of the present application, reference may be made to the implementation and technical effects of the smart cockpit control method of any of the above embodiments.
此外，本申请的一实施例还提供了一种计算机可读存储介质，该计算机可读存储介质存储有计算机可执行指令，该计算机可执行指令被一个或多个控制处理器执行，例如，执行以上描述的图6中的方法步骤S610至S650、图7中的方法步骤S710至S750、图8中的方法步骤S810至S840、图9中的方法步骤S910至S940。In addition, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions which are executed by one or more control processors, for example, to perform the above-described method steps S610 to S650 in Figure 6, method steps S710 to S750 in Figure 7, method steps S810 to S840 in Figure 8, and method steps S910 to S940 in Figure 9.
本申请至少具有以下有益效果：通过本申请提出的智能座舱控制方法，获取融合信息数据和通信数据，融合信息数据包括毫米波模块对智能座舱内部扫描得到的监测信息，通信数据包括毫米波模块与智能座舱内外部进行交互的通信数据，根据融合信息数据和通信数据得到增强现实数据，对增强现实数据进行渲染处理后得到投影数据，将投影数据发送至投影设备进行投影显示。以毫米波模块作为增强现实生成器，根据多传感器融合数据和通信数据获取增强现实数据，并对增强现实数据进行渲染处理，更加快速稳定地通过车载抬头显示屏投影仪为用户提供增强现实和元宇宙服务，进而根据用户需求为用户展示驾驶信息和娱乐内容。This application has at least the following beneficial effects: through the smart cockpit control method proposed in this application, fusion information data and communication data are obtained, where the fusion information data includes monitoring information obtained by the millimeter wave module scanning the interior of the smart cockpit, and the communication data includes communication data exchanged by the millimeter wave module with the inside and outside of the smart cockpit; augmented reality data is obtained from the fusion information data and the communication data, projection data is obtained after rendering the augmented reality data, and the projection data is sent to the projection device for projection display. With the millimeter wave module as the augmented reality generator, augmented reality data is obtained from multi-sensor fusion data and communication data and then rendered, so that the vehicle head-up display projector provides users with augmented reality and metaverse services more quickly and stably, and then presents driving information and entertainment content according to user needs.
本领域普通技术人员可以理解，上文中所公开方法中的全部或某些步骤、系统可以被实施为软件、固件、硬件及其适当的组合。某些物理组件或所有物理组件可以被实施为由处理器，如总处理器、数字信号处理器或微处理器执行的软件，或者被实施为硬件，或者被实施为集成电路，如专用集成电路。这样的软件可以分布在计算机可读介质上，计算机可读介质可以包括计算机存储介质（或非暂时性介质）和通信介质（或暂时性介质）。如本领域普通技术人员公知的，术语计算机存储介质包括在用于存储信息（诸如计算机可读指令、数据结构、程序模块或其他数据）的任何方法或技术中实施的易失性和非易失性、可移除和不可移除介质。计算机存储介质包括但不限于RAM、ROM、EEPROM、闪存或其他存储器技术、CD-ROM、数字多功能盘（DVD）或其他光盘存储、磁盒、磁带、磁盘存储或其他磁存储装置、或者可以用于存储期望的信息并且可以被计算机访问的任何其他的介质。此外，本领域普通技术人员公知的是，通信介质通常包含计算机可读指令、数据结构、程序模块或者诸如载波或其他传输机制之类的调制数据信号中的其他数据，并且可包括任何信息递送介质。Those of ordinary skill in the art will understand that all or some of the steps and systems in the methods disclosed above may be implemented as software, firmware, hardware, or appropriate combinations thereof. Some or all of the physical components may be implemented as software executed by a processor, such as a digital signal processor or a microprocessor, or as hardware, or as an integrated circuit such as an application-specific integrated circuit. Such software may be distributed on computer-readable media, which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer. In addition, as is well known to those of ordinary skill in the art, communication media typically embody computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.

Claims (12)

  1. An intelligent cockpit control method, applied to an intelligent cockpit, wherein the intelligent cockpit comprises a millimeter wave module and the millimeter wave module is connected to a projection device, the method comprising:
    obtaining fusion information data and communication data, wherein the fusion information data comprises monitoring information obtained by the millimeter wave module scanning the interior of the intelligent cockpit, and the communication data comprises data exchanged by the millimeter wave module with the interior and exterior of the intelligent cockpit;
    obtaining augmented reality data according to the fusion information data and the communication data;
    rendering the augmented reality data to obtain projection data;
    sending the projection data to the projection device for projection display.
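The four claimed steps — acquire fusion and communication data, combine them into augmented reality data, render, and project — can be sketched as below. This is an illustrative Python sketch only; the data formats, class names, and the trivial "renderer" are assumptions, not details from the claim.

```python
from dataclasses import dataclass

@dataclass
class FusionData:
    # Monitoring information from the millimeter wave module's in-cabin scan.
    monitoring_info: dict

@dataclass
class CommData:
    # Data the millimeter wave module exchanges with the cockpit's inside/outside.
    payload: dict

def build_ar_data(fusion: FusionData, comm: CommData) -> dict:
    # Merge monitoring and communication data into one AR scene description.
    return {**fusion.monitoring_info, **comm.payload}

def render(ar_data: dict) -> bytes:
    # Stand-in renderer: serialize the scene into projector-ready bytes.
    return repr(sorted(ar_data.items())).encode()

def control_step(fusion: FusionData, comm: CommData, send_to_projector) -> None:
    # The claimed pipeline: acquire, fuse into AR data, render, project.
    send_to_projector(render(build_ar_data(fusion, comm)))
```

A caller would invoke `control_step` once per display update, passing a callable that delivers the rendered bytes to the projection device.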
  2. The intelligent cockpit control method according to claim 1, wherein the communication data comprises at least one of: networking data, content distribution network data, or associated edge computing data.
  3. The intelligent cockpit control method according to claim 1, wherein the millimeter wave module comprises a first millimeter wave unit and a second millimeter wave unit, and the method comprises:
    controlling the first millimeter wave unit to obtain the fusion information data, wherein the fusion information data comprises first monitoring information obtained by the first millimeter wave unit and second monitoring information obtained by the second millimeter wave unit;
    controlling the first millimeter wave unit to obtain augmented reality data according to the fusion information data and the communication data obtained by the second millimeter wave unit;
    controlling the first millimeter wave unit to generate rendering requirement information according to the augmented reality data;
    controlling the first millimeter wave unit to send the rendering requirement information to the second millimeter wave unit;
    controlling the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and to send the projection data to the projection device for projection display.
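The division of labor in claim 3 — first unit fuses data and derives rendering requirements, second unit renders and projects — can be modeled as a small message-passing sketch. The in-process queue stands in for the inter-unit link, and the requirement format is a hypothetical placeholder, not something specified by the claim.

```python
import queue

def first_unit(first_info: dict, second_info: dict, comm_data: dict, link: queue.Queue):
    # First unit: fuse both units' monitoring info with the communication data,
    # derive rendering requirements, and hand everything to the second unit.
    ar_data = {**first_info, **second_info, **comm_data}
    requirements = {"layers": sorted(ar_data)}  # hypothetical requirement format
    link.put((ar_data, requirements))

def second_unit(link: queue.Queue, projector):
    # Second unit: render per the received requirements, then project.
    ar_data, requirements = link.get()
    frame = {layer: ar_data[layer] for layer in requirements["layers"]}
    projector(frame)
```

In a real system the two functions would run on separate radio units and the queue would be the millimeter wave link itself.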
  4. The intelligent cockpit control method according to claim 1, wherein the millimeter wave module comprises a first millimeter wave unit and a second millimeter wave unit, and the method comprises:
    controlling the first millimeter wave unit and the second millimeter wave unit to allocate rendering tasks according to their respective computing power and power consumption data to obtain an allocation result;
    controlling the first millimeter wave unit and the second millimeter wave unit to process the fusion information data and the communication data according to the allocation result to obtain augmented reality data, wherein the fusion information data comprises first monitoring information obtained by the first millimeter wave unit and second monitoring information obtained by the second millimeter wave unit;
    controlling the second millimeter wave unit to generate rendering requirement information according to the augmented reality data;
    controlling the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data, and to send the projection data to the projection device for projection display.
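Claim 4's first step allocates rendering work between the two units based on their computing power and power consumption. One simple way to realize such an allocation is a greedy balancer; the scoring heuristic, unit names, and cost model below are purely illustrative, since the claim does not prescribe an algorithm.

```python
def allocate_rendering(task_costs, unit_stats):
    """Split rendering tasks between the two millimeter wave units.

    unit_stats maps a unit name to (available_compute, power_draw); their
    ratio serves as a crude capacity score. Illustrative only: the claim
    merely requires allocation based on compute and power consumption data.
    """
    score = {u: c / max(p, 1e-9) for u, (c, p) in unit_stats.items()}
    load = {u: 0.0 for u in unit_stats}
    allocation = {u: [] for u in unit_stats}
    # Largest task first, each to the unit with the most normalized headroom.
    for cost in sorted(task_costs, reverse=True):
        target = min(load, key=lambda u: load[u] / score[u])
        allocation[target].append(cost)
        load[target] += cost
    return allocation
```

A unit with a better compute-to-power ratio absorbs proportionally more of the rendering load, which matches the claim's intent of power-aware task distribution.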
  5. The intelligent cockpit control method according to any one of claims 1 to 4, wherein the fusion information data further comprises sensing information, and obtaining augmented reality data according to the fusion information data comprises:
    obtaining augmented reality data according to the monitoring information and the sensing information.
  6. The intelligent cockpit control method according to claim 5, wherein the sensing information comprises at least one of: user eye-tracking data, user key-press data, or spatial positioning data.
  7. The intelligent cockpit control method according to any one of claims 1 to 4, further comprising, before obtaining the fusion information data:
    scanning the cabin in which the millimeter wave module is located to obtain a positioning calculation value;
    obtaining target position information according to the positioning calculation value;
    obtaining the monitoring information according to the target position information.
  8. The intelligent cockpit control method according to any one of claims 1 to 4, wherein the millimeter wave module is provided with a millimeter wave antenna array, and the method further comprises, before obtaining the fusion information data:
    obtaining a phase interference difference of the millimeter wave antenna array;
    obtaining target position information according to the phase interference difference;
    obtaining the monitoring information according to the target position information.
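Claim 8's step of turning a phase interference difference into position information corresponds to standard phase interferometry: the phase difference between two array elements determines the angle of arrival. The patent states no formula, so the relation and the 60 GHz example values below are assumptions based on the textbook model delta_phi = 2*pi*spacing*sin(theta)/wavelength.

```python
import math

def angle_from_phase_difference(delta_phi: float, wavelength: float, spacing: float) -> float:
    """Estimate an angle of arrival (radians) from the phase difference
    measured between two elements of a millimeter wave antenna array."""
    s = delta_phi * wavelength / (2 * math.pi * spacing)
    s = max(-1.0, min(1.0, s))  # clamp against noise pushing |s| past 1
    return math.asin(s)

# Example: 60 GHz carrier -> ~5 mm wavelength, half-wavelength element spacing.
WAVELENGTH = 3e8 / 60e9
theta = angle_from_phase_difference(math.pi / 2, WAVELENGTH, WAVELENGTH / 2)
```

With half-wavelength spacing, a measured phase difference of pi/2 maps to an arrival angle of pi/6 radians (30 degrees); combining such angles from multiple element pairs yields the target position inside the cabin.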
  9. The intelligent cockpit control method according to claim 3 or 4, wherein the second millimeter wave unit is connected to a roadside unit and/or an Internet of Vehicles cloud computing center, the second millimeter wave unit is connected to the mobile Internet, and controlling the second millimeter wave unit to render the augmented reality data according to the rendering requirement information to obtain projection data comprises:
    controlling the second millimeter wave unit to obtain update data from the roadside unit and/or the Internet of Vehicles cloud computing center, and content distribution network data from the mobile Internet;
    controlling the second millimeter wave unit to render the augmented reality data according to the update data, the content distribution network data, and the rendering requirement information to obtain projection data.
  10. An intelligent cockpit controller, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the intelligent cockpit control method according to any one of claims 1 to 9.
  11. An intelligent cockpit, comprising the controller according to claim 10.
  12. A computer-readable storage medium storing computer-executable instructions, the computer-executable instructions being used to execute the intelligent cockpit control method according to any one of claims 1 to 9.
PCT/CN2023/081396 2022-06-13 2023-03-14 Intelligent carriage control method, controller, intelligent carriage, and storage medium WO2023241139A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210659957.1 2022-06-13
CN202210659957.1A CN117261585A (en) 2022-06-13 2022-06-13 Intelligent cabin control method, controller, intelligent cabin and storage medium

Publications (1)

Publication Number Publication Date
WO2023241139A1 true WO2023241139A1 (en) 2023-12-21

Family

ID=89193117

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/081396 WO2023241139A1 (en) 2022-06-13 2023-03-14 Intelligent carriage control method, controller, intelligent carriage, and storage medium

Country Status (2)

Country Link
CN (1) CN117261585A (en)
WO (1) WO2023241139A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030142041A1 (en) * 2002-01-30 2003-07-31 Delphi Technologies, Inc. Eye tracking/HUD system
US20150175068A1 (en) * 2013-12-20 2015-06-25 Dalila Szostak Systems and methods for augmented reality in a head-up display
US20170307881A1 (en) * 2016-04-22 2017-10-26 Electronics And Telecommunications Research Institute Apparatus and method for transforming augmented reality information of head-up display for vehicle
CN108462728A (en) * 2017-02-17 2018-08-28 中兴通讯股份有限公司 A kind of method and device, the vehicle mobile terminals of on-vehicle information processing
CN112640498A (en) * 2019-03-29 2021-04-09 丰田自动车株式会社 Location-based beam scanning for directing vehicles to all-around (V2X) networks
CN113491520A (en) * 2020-04-07 2021-10-12 广州汽车集团股份有限公司 Driving fatigue detection method and device
CN114223008A (en) * 2019-05-15 2022-03-22 罗伯瑞斯有限公司 Meta-universe data fusion system


Also Published As

Publication number Publication date
CN117261585A (en) 2023-12-22

Similar Documents

Publication Publication Date Title
JP2021095117A (en) Neural network based facial analysis using facial landmarks and associated confidence values
US11127373B2 (en) Augmented reality wearable system for vehicle occupants
US10764536B2 (en) System and method for a dynamic human machine interface for video conferencing in a vehicle
WO2022094624A1 (en) Model-based reinforcement learning for behavior prediction in autonomous systems and applications
US20220207756A1 (en) Image composition in multiview automotive and robotics systems
CN114270294A (en) Gaze determination using glare as input
US11626028B2 (en) System and method for providing vehicle function guidance and virtual test-driving experience based on augmented reality content
WO2019181233A1 (en) Image display system, information processing device, information processing method, program, and moving body
WO2022061277A1 (en) Measuring the effects of augmentation artifacts on a machine learning network
CN114258319A (en) Projection method and device, vehicle and AR-HUD
Fan et al. Gazmon: Eye gazing enabled driving behavior monitoring and prediction
WO2021222256A2 (en) Systems and methods for performing operations in a vehicle using gaze detection
US20240143072A1 (en) Personalized calibration functions for user gaze detection in autonomous driving applications
US20230316773A1 (en) Optimized visualization streaming for vehicle environment visualization
JP2022132075A (en) Ground Truth Data Generation for Deep Neural Network Perception in Autonomous Driving Applications
CN112977460A (en) Method and apparatus for preventing motion sickness when viewing image content in a moving vehicle
CN115701623A (en) Belief propagation of range image mapping in autonomous machine applications
WO2023241139A1 (en) Intelligent carriage control method, controller, intelligent carriage, and storage medium
WO2022230995A1 (en) Display control device, head-up display device, and display control method
JP2023165383A (en) Data set generation and augmentation for machine learning model
US20220092317A1 (en) Simulating viewpoint transformations for sensor independent scene understanding in autonomous systems
US11704814B2 (en) Adaptive eye tracking machine learning model engine
US9529193B2 (en) Device for operating one or more optical display devices of a vehicle
US20240135487A1 (en) Image stitching with saccade-based control of dynamic seam placement for surround view visualization
US20230298361A1 (en) Image to world space transformation for ground-truth generation in autonomous systems and applications

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23822711

Country of ref document: EP

Kind code of ref document: A1