Disclosure of Invention
The invention aims to provide a driver live-action interaction system that enables live-action interaction among drivers.
The invention provides a driver live-action interaction system, which comprises an electronic control unit, a client, a server, a surround-view camera system and a vehicle display screen, wherein the electronic control unit comprises an image processing module and an interaction module, and the server comprises a vehicle identification module,
the image processing module of the electronic control unit is used for generating a live-action image around the vehicle body according to the image information acquired by the surround-view camera system;
the vehicle display screen is used for displaying the live-action image and for sending, to the electronic control unit, operation information generated when a user of a first vehicle operates on a second vehicle in the live-action image;
the interaction module of the electronic control unit is used for generating interaction data according to the operation information and sending the interaction data to the client, wherein the interaction data comprises interaction content and relative position information of the first vehicle and the second vehicle in the live-action image;
the client is used for sending the interaction data to the server;
the vehicle identification module of the server is used for identifying the second vehicle from the adjacent vehicles of the first vehicle according to the relative position information, and for sending the interaction data to the electronic control unit of the second vehicle through the client corresponding to the second vehicle, so as to generate an interactive live-action image containing the interaction content.
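For illustration only, the interaction data described above may be organized as in the following sketch; the field names, units and the choice of Python are assumptions introduced for readability and are not required by the invention.

```python
# Illustrative sketch only: field names, units and types are assumptions,
# not a mandated implementation of the claimed interaction data.
from dataclasses import dataclass

@dataclass
class RelativePosition:
    # Offset of the second vehicle in the first vehicle's live-action
    # image frame, with the first vehicle at the origin.
    dx_m: float  # lateral offset in metres (+ = to the right of the first vehicle)
    dy_m: float  # longitudinal offset in metres (+ = ahead of the first vehicle)

@dataclass
class InteractionData:
    sender_vehicle_id: str  # identifies the first vehicle / its bound client
    content: bytes          # graphic identifier or picture containing text
    relative_position: RelativePosition

# Assumed message path: first-vehicle electronic control unit -> client
# (Bluetooth) -> server -> second-vehicle client -> second-vehicle
# electronic control unit, which overlays `content` on its live-action image.
```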
Further, the client is connected with the electronic control unit through Bluetooth signals.
Further, the live-action image is a 360-degree real-time surrounding image around the vehicle body.
Further, the surround-view camera system comprises a front camera, a rear camera, a left camera and a right camera, which are positioned on the front side, the rear side, the left side and the right side of the vehicle body, respectively.
Further, the interactive content comprises a graphic identifier or a picture containing text information.
Further, the image processing module is configured to superimpose the interactive content and the live-action image according to the relative position information to generate an interactive live-action image, where the interactive content is displayed in the interactive live-action image at a position corresponding to the first vehicle and/or the second vehicle.
Further, the client comprises a positioning information acquisition module, the server further comprises a matching module, the positioning information acquisition module is used for acquiring vehicle position information from the positioning module of the corresponding terminal of the client and sending the vehicle position information to the server, and the matching module of the server is used for matching adjacent vehicles of all vehicles according to the acquired vehicle position information.
Further, the positioning module is a GPS positioning module.
Further, the client comprises a real-time communication module for enabling real-time communication between the client and the electronic control unit and between the client and the server.
Further, the relative position information is a coordinate relationship between the first vehicle and the second vehicle in the live-action image, and the vehicle identification module of the server performs coordinate conversion on the relative position information to obtain the relative positions of the first vehicle and the second vehicle in geographic coordinates, so that the second vehicle is identified from adjacent vehicles of the first vehicle according to the relative positions.
According to the driver live-action interaction system of the invention, the electronic control unit generates a live-action image around the vehicle body from the image information acquired by the surround-view camera system, the vehicle display screen displays the live-action image and sends the operation information of the user of the first vehicle on the second vehicle in the live-action image to the electronic control unit, the electronic control unit generates interaction data and sends it to the server through the client, and the server processes the interaction data to identify the second vehicle from the adjacent vehicles of the first vehicle and then sends the interaction data to the electronic control unit of the second vehicle through the client corresponding to the second vehicle to generate an interactive live-action image containing the interaction content. Driver interaction is thereby combined with the 360-degree surround-view system: a social interaction function is added compared with an existing 360-degree surround-view system, and a live-action function is added compared with an existing social interaction system, so that the interaction between drivers is more realistic and drivers seated in relatively enclosed cabins can still communicate and share their feelings with one another.
Detailed Description
In order to further explain the technical means adopted by the invention to achieve its intended objects and the effects thereof, specific implementations, structures, features and effects of the invention are described in detail below with reference to the accompanying drawings and preferred embodiments.
Fig. 1 is a schematic structural diagram of a driver live-action interaction system according to an embodiment of the invention. As shown in fig. 1, the driver live-action interaction system according to the embodiment of the invention includes an electronic control unit 1, a client 2, a server 3, a surround-view camera system 4 and a vehicle display screen 5, wherein the electronic control unit 1 includes an image processing module 11, an interaction module 12 and a conversion module 13, the server 3 includes a vehicle identification module 31 and a matching module 32, and the client 2 includes a real-time communication module 21 and a positioning information acquisition module 22.
The real-time communication module 21 of the client 2 is configured to establish real-time communication between the client 2 and the electronic control unit 1 and between the client 2 and the server 3. After the driver connects and binds the client 2 to the electronic control unit 1 and enables the interactive function, the real-time communication module 21 starts this real-time communication, so that the server 3 can acquire vehicle position information and interaction data in real time, and the electronic control unit 1 can receive the interaction data sent by the server 3 in real time.
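A minimal relay sketch of such a real-time communication module is given below, assuming the Bluetooth link to the electronic control unit and the network link to the server are abstracted as blocking receive/send callables; the class and parameter names are hypothetical.

```python
# Sketch under the stated assumptions; transport details are abstracted away.
import threading

class RealTimeCommunicationModule:
    def __init__(self, recv_from_ecu, send_to_ecu, recv_from_server, send_to_server):
        self._recv_from_ecu = recv_from_ecu        # blocking: next message from the ECU
        self._send_to_ecu = send_to_ecu
        self._recv_from_server = recv_from_server  # blocking: next message from the server
        self._send_to_server = send_to_server
        self._running = False

    def start(self):
        # Called once the client is bound to the ECU and the interactive
        # function is enabled; runs both relay directions concurrently.
        self._running = True
        threading.Thread(target=self._uplink, daemon=True).start()
        threading.Thread(target=self._downlink, daemon=True).start()

    def _uplink(self):
        # Position reports and interaction data: ECU/client -> server.
        while self._running:
            self._send_to_server(self._recv_from_ecu())

    def _downlink(self):
        # Interaction data addressed to this vehicle: server -> ECU.
        while self._running:
            self._send_to_ecu(self._recv_from_server())
```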
The positioning information acquisition module 22 of the client 2 is configured to acquire vehicle position information from a positioning module (e.g., a GPS positioning module) of the terminal corresponding to the client 2 and to send the vehicle position information to the server 3, so that the matching module 32 of the server 3 can match the neighboring vehicles of each vehicle according to the acquired vehicle position information, that is, match the neighboring vehicles of each vehicle in the real-world coordinate system. The neighboring vehicles are the vehicles closest to the host vehicle in the front, rear, left and right directions, and it can be understood that each matched neighboring vehicle likewise corresponds to its own client 2.
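The matching step could, for example, be sketched as follows; the sector geometry, the 50 m search radius and the dictionary-based vehicle records are assumptions introduced only to make the idea concrete.

```python
# Sketch under the stated assumptions: each vehicle record is a dict with
# "vehicle_id", "lat", "lon" (degrees) and "heading" (degrees clockwise from north).
import math

_EARTH_RADIUS_M = 6_371_000

def _local_offset_m(host, other):
    # Equirectangular approximation: metres east (x) and north (y) of the host.
    lat0 = math.radians(host["lat"])
    x = math.radians(other["lon"] - host["lon"]) * _EARTH_RADIUS_M * math.cos(lat0)
    y = math.radians(other["lat"] - host["lat"]) * _EARTH_RADIUS_M
    return x, y

def match_neighbours(host, others, max_range_m=50.0):
    """Return the nearest vehicle id in each of the four sectors around the host."""
    h = math.radians(host["heading"])
    best = {"front": None, "rear": None, "left": None, "right": None}
    best_d = {k: max_range_m for k in best}
    for other in others:
        if other["vehicle_id"] == host["vehicle_id"]:
            continue
        east, north = _local_offset_m(host, other)
        # Rotate map-frame offsets into the host vehicle frame.
        fwd = east * math.sin(h) + north * math.cos(h)
        rt = east * math.cos(h) - north * math.sin(h)
        sector = ("front" if fwd >= abs(rt) else
                  "rear" if -fwd >= abs(rt) else
                  "right" if rt > 0 else "left")
        d = math.hypot(fwd, rt)
        if d < best_d[sector]:
            best_d[sector], best[sector] = d, other["vehicle_id"]
    return best
```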
The surround-view camera system 4 can be the camera system of a 360-degree surround-view system of a vehicle and is used for acquiring image information around the vehicle body. In the present embodiment, the surround-view camera system 4 includes a front camera 41, a rear camera 42, a left camera 43 and a right camera 44, which are located on the front side, the rear side, the left side and the right side of the vehicle body, respectively, so that images in all directions around the vehicle body can be obtained.
The image processing module 11 of the electronic control unit 1 is configured to generate a live-action image around the vehicle body according to the image information acquired by the surround-view camera system 4. In this embodiment, the live-action image is a 360-degree three-dimensional surrounding image around the vehicle generated by the image processing module 11 after stitching and processing the image information; the surrounding image intuitively presents the position of the vehicle and its surroundings, thereby extending the driver's perception of the environment around the vehicle. The process by which the image processing module 11 generates the live-action image, that is, the process by which the 360-degree surround-view system of the vehicle generates the live-action image, is known to those skilled in the art and is not described here.
The vehicle display screen 5 is used for displaying the live-action image and for sending operation information of the user of the first vehicle on the second vehicle in the live-action image to the electronic control unit 1. Specifically, the first vehicle is, for example, the host vehicle, and the second vehicle is, for example, a neighboring vehicle of the host vehicle. After the electronic control unit 1 sends the live-action image generated by the image processing module 11 to the vehicle display screen 5 for display, the driver can intuitively see the position of the vehicle and its surroundings on the vehicle display screen 5. When the driver needs to communicate with the driver of a neighboring vehicle, the driver can click the neighboring vehicle visible in the live-action image on the vehicle display screen 5 and select interactive content to send; the interactive content can be a graphic identifier or a picture containing text information (such as a phrase). The vehicle display screen 5 then receives the operation information of the user and sends it to the electronic control unit 1.
The interaction module 12 of the electronic control unit 1 is configured to generate interaction data according to the operation information sent by the vehicle display screen 5 and to send the interaction data to the server 3 through the client 2, where the interaction data includes the interactive content and the relative position information of the first vehicle and the second vehicle in the live-action image. Specifically, when the user clicks the neighboring vehicle visible in the live-action image on the vehicle display screen 5 and selects interactive content to send, the vehicle display screen 5 generates a click signal at the corresponding position together with data corresponding to the selected interactive content, and sends the click signal and the data to the electronic control unit 1 as the operation information. The interaction module 12 of the electronic control unit 1 processes the received operation information to generate the interaction data: it generates the relative position information (e.g., a coordinate relationship) of the first vehicle and the second vehicle in the live-action image from the click signal, and converts the data corresponding to the interactive content into a format suitable for data transmission. In this embodiment, the client 2 is connected to the electronic control unit 1 via Bluetooth, so the interaction data generated by the electronic control unit 1 is sent to the client 2 over the Bluetooth connection and is then sent by the client 2 to the server 3.
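The conversion from a tap on the display to interaction data might look like the following sketch, which reuses the InteractionData and RelativePosition classes sketched in the disclosure above; the image size and metres-per-pixel scale are assumed values, not parameters fixed by the embodiment.

```python
# Sketch under the stated assumptions; the bird's-eye live-action image is
# assumed to be centred on the host vehicle with a fixed metres-per-pixel scale.
def operation_to_interaction_data(tap_px, content, sender_vehicle_id,
                                  image_size_px=(800, 800),
                                  metres_per_pixel=0.05):
    cx, cy = image_size_px[0] / 2.0, image_size_px[1] / 2.0  # host vehicle at image centre
    dx_m = (tap_px[0] - cx) * metres_per_pixel               # + = right of the host vehicle
    dy_m = (cy - tap_px[1]) * metres_per_pixel               # + = ahead (image y grows downward)
    return InteractionData(
        sender_vehicle_id=sender_vehicle_id,
        content=content,
        relative_position=RelativePosition(dx_m=dx_m, dy_m=dy_m),
    )
```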
The vehicle identification module 31 of the server 3 is configured to identify the second vehicle from the neighboring vehicles of the first vehicle according to the relative position information in the interaction data, and to send the interaction data to the electronic control unit 1 of the second vehicle through the client 2 corresponding to the second vehicle, so as to generate an interactive live-action image containing the interactive content. Specifically, in this embodiment the relative position information is the coordinate relationship between the first vehicle and the second vehicle in the live-action image; in general, the coordinate system of the 360-degree surround-view system calibrates the coordinates of other vehicles with the host vehicle as the center. The vehicle identification module 31 of the server 3 performs a coordinate conversion on the relative position information to obtain the relative positions of the first vehicle and the second vehicle in geographic coordinates, that is, it converts the surround-view coordinates into geographic coordinates and thereby identifies the second vehicle from the neighboring vehicles; the geographic coordinates are the coordinates of a vehicle in the real-world coordinate system, used to determine the position of the vehicle on the earth. The client 2 corresponding to the second vehicle, i.e. the client 2 bound to the second vehicle, sends the interaction data received from the server 3 to the electronic control unit 1 of the second vehicle. After the electronic control unit 1 of the second vehicle receives the interaction data, its image processing module 11 superimposes the interactive content on the live-action image according to the relative position information in the interaction data to generate an interactive live-action image, which is displayed on the vehicle display screen 5; in the interactive live-action image, the interactive content is displayed at the position corresponding to the first vehicle and/or the second vehicle, so that the driver can intuitively see the interactive content.
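One possible form of this coordinate conversion and identification step is sketched below; the vehicle records, the 10 m tolerance and the flat-earth approximation are assumptions, not details fixed by the embodiment.

```python
# Sketch under the stated assumptions: `first_vehicle` is a dict with
# "lat", "lon" (degrees) and "heading" (degrees clockwise from north);
# `neighbours` maps vehicle_id -> (lat, lon) for the matched adjacent vehicles.
import math

_EARTH_RADIUS_M = 6_371_000

def identify_second_vehicle(first_vehicle, relative_position, neighbours,
                            tolerance_m=10.0):
    h = math.radians(first_vehicle["heading"])
    dx, dy = relative_position.dx_m, relative_position.dy_m
    # Rotate the vehicle-frame offset into east/north geographic axes.
    east = dy * math.sin(h) + dx * math.cos(h)
    north = dy * math.cos(h) - dx * math.sin(h)
    lat0 = math.radians(first_vehicle["lat"])
    pred_lat = first_vehicle["lat"] + math.degrees(north / _EARTH_RADIUS_M)
    pred_lon = first_vehicle["lon"] + math.degrees(east / (_EARTH_RADIUS_M * math.cos(lat0)))

    best_id, best_err = None, tolerance_m
    for vid, (lat, lon) in neighbours.items():
        err_north = math.radians(lat - pred_lat) * _EARTH_RADIUS_M
        err_east = math.radians(lon - pred_lon) * _EARTH_RADIUS_M * math.cos(lat0)
        err = math.hypot(err_east, err_north)
        if err < best_err:
            best_id, best_err = vid, err
    return best_id  # None if no matched neighbour lies within the tolerance
```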
The working procedure of the driver live-action interaction system according to the embodiment of the present invention is as follows.
In the running process of the vehicle, the image processing module 11 of the electronic control unit 1 generates a live-action image around the vehicle body according to the image information acquired by the surround-view camera system 4 and sends the live-action image to the vehicle display screen 5 for display;
the real-time communication module 21 of the client 2 starts real-time communication between the client 2 and the electronic control unit 1 and between the client 2 and the server 3 according to the user's operation, and the positioning information acquisition module 22 sends the vehicle position information acquired from the positioning module (e.g., a GPS positioning module) of the corresponding terminal to the server 3;
the matching module 32 of the server 3 matches the adjacent vehicles of each vehicle according to the acquired vehicle position information and updates the adjacent vehicles in real time in combination with map data, the adjacent vehicles being the vehicles closest to the host vehicle in the front, rear, left and right directions;
when the driver clicks an adjacent vehicle visible in the live-action image on the vehicle display screen 5 and selects interactive content to send, the vehicle display screen 5 receives the operation information of the user and sends it to the electronic control unit 1, the interactive content being a graphic identifier or a picture containing text information; on the vehicle display screen 5 of the host vehicle, the interactive content can be displayed at the position corresponding to the selected adjacent vehicle to form an interactive live-action image;
the interaction module 12 of the electronic control unit 1 generates interaction data according to the operation information sent by the vehicle display screen 5 and sends the interaction data to the server 3 through the client 2, the interaction data including the interactive content and the relative position information of the first vehicle (the host vehicle) and the second vehicle (an adjacent vehicle) in the live-action image;
the vehicle identification module 31 of the server 3 identifies the second vehicle from the neighboring vehicles of the first vehicle according to the relative position information in the interaction data, and sends the interaction data to the electronic control unit 1 of the second vehicle through the client 2 corresponding to the second vehicle;
after the electronic control unit 1 of the second vehicle receives the interaction data, the image processing module 11 superimposes the interactive content on the live-action image according to the relative position information in the interaction data to generate an interactive live-action image, which is displayed on the vehicle display screen 5; in the interactive live-action image of the second vehicle, the interactive content is displayed at the position corresponding to the first vehicle, so that the driver of the second vehicle can intuitively see the interactive content sent by the driver of the first vehicle.
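As a final illustration, overlaying the interactive content at the position derived from the relative position information could be sketched as follows; the pixel scale, the image layout and the simple paste-style blending are assumptions, and on the second vehicle the offset from the interaction data is negated under the simplifying assumption that the two vehicles are roughly aligned.

```python
# Sketch under the stated assumptions; images are NumPy arrays of shape (H, W, 3).
import numpy as np

def overlay_interactive_content(live_action_img, content_img, offset_m,
                                metres_per_pixel=0.05):
    h, w = live_action_img.shape[:2]
    cx, cy = w // 2, h // 2                        # own vehicle at the image centre
    px = int(cx + offset_m[0] / metres_per_pixel)  # + = right of the own vehicle
    py = int(cy - offset_m[1] / metres_per_pixel)  # + = ahead (image y grows downward)

    out = live_action_img.copy()
    ch, cw = content_img.shape[:2]
    y0, x0 = max(py - ch // 2, 0), max(px - cw // 2, 0)
    y1, x1 = min(y0 + ch, h), min(x0 + cw, w)
    if y1 > y0 and x1 > x0:
        # Simple paste of the interactive content onto the live-action image.
        out[y0:y1, x0:x1] = content_img[: y1 - y0, : x1 - x0]
    return out

# e.g. on the second vehicle:
# img2 = overlay_interactive_content(
#     live_action_img, badge_img,
#     (-data.relative_position.dx_m, -data.relative_position.dy_m))
```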
According to the driver live-action interaction system of the embodiment of the invention, the electronic control unit generates a live-action image around the vehicle body from the image information acquired by the surround-view camera system, the vehicle display screen displays the live-action image and sends the operation information of the user of the first vehicle on the second vehicle in the live-action image to the electronic control unit, the electronic control unit generates interaction data and sends it to the server through the client, and the server processes the interaction data to identify the second vehicle from the adjacent vehicles of the first vehicle and then sends the interaction data to the electronic control unit of the second vehicle through the client corresponding to the second vehicle to generate an interactive live-action image containing the interaction content. Driver interaction is thereby combined with the 360-degree surround-view system: a social interaction function is added compared with an existing 360-degree surround-view system, and a live-action function is added compared with an existing social interaction system, so that the interaction between drivers is more realistic and drivers seated in relatively enclosed cabins can still communicate with one another, for example to relieve their emotions or to make friends. In addition, in the embodiment of the invention, the client acquires the vehicle position information from the positioning module of the corresponding terminal and sends it to the server, the server matches the adjacent vehicles of each vehicle according to the acquired vehicle position information, and, when interaction data is received, identifies the second vehicle from the adjacent vehicles of the first vehicle according to the conversion relationship between the 360-degree surround-view coordinate system and the geographic coordinate system, so that the 360-degree surround-view systems of different vehicles are associated with one another by combining the surround-view coordinate system with the real-world geographic coordinate system, thereby realizing live-action interaction between drivers.
The present invention is not limited to the above-described embodiments; any modifications, equivalent substitutions and variations made to the above-described embodiments by those skilled in the art without departing from the scope of the present invention shall fall within the protection scope of the present invention.