CN106828320B - Driver live-action interaction system - Google Patents

Driver live-action interaction system

Info

Publication number
CN106828320B
CN106828320B (application CN201710113034.5A)
Authority
CN
China
Prior art keywords
vehicle
live
interaction
control unit
electronic control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710113034.5A
Other languages
Chinese (zh)
Other versions
CN106828320A (en)
Inventor
陈务
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yinjia Electronic Technology Shanghai Co ltd
Original Assignee
Voyager Technology Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Voyager Technology Inc filed Critical Voyager Technology Inc
Priority to CN201710113034.5A priority Critical patent/CN106828320B/en
Publication of CN106828320A publication Critical patent/CN106828320A/en
Application granted granted Critical
Publication of CN106828320B publication Critical patent/CN106828320B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/10 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
    • B60R2300/105 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
    • B60R2300/50 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the display information being shared, e.g. external display, data transfer to other traffic participants or centralised traffic controller

Abstract

The driver live-action interaction system comprises an electronic control unit, a client, a server, a surround-view camera system and a vehicle display screen. An image processing module of the electronic control unit generates a live-action image of the vehicle's surroundings from the image information acquired by the surround-view camera system; the vehicle display screen displays the live-action image and sends operation information, applied by a user of a first vehicle to a second vehicle in the live-action image, to the electronic control unit; an interaction module of the electronic control unit generates interaction data from the operation information and sends it to the server through the client; and a vehicle identification module of the server identifies the second vehicle among the vehicles adjacent to the first vehicle according to the relative position information in the interaction data, and forwards the interaction data, through the client corresponding to the second vehicle, to the electronic control unit of the second vehicle to generate an interactive live-action image containing the interactive content. By generating a live-action image containing the interactive content on another vehicle according to the user's operation, the invention realizes live-action interaction between drivers.

Description

Driver live-action interaction system
Technical Field
The invention relates to the technical field of active safety of automobiles, in particular to a real scene interaction system for drivers.
Background
In daily driving, the vehicle travels at high speed and the driver is enclosed in a relatively sealed cabin. When unpleasant situations arise with other vehicles, the driver has no way to communicate with the drivers of neighboring vehicles to express dissatisfaction; the pent-up emotion cannot be vented, which aggravates road rage.
Therefore, there is a need to provide a driver interaction system to solve the above-mentioned problems.
Disclosure of Invention
The invention aims to provide a real-scene interaction system for drivers, which can realize real-scene interaction among drivers.
The invention provides a driver live-action interaction system, which comprises an electronic control unit, a client, a server, a surround-view camera system and a vehicle display screen, wherein the electronic control unit comprises an image processing module and an interaction module, and the server comprises a vehicle identification module,
the image processing module of the electronic control unit is used for generating a live-action image around the vehicle body according to the image information acquired by the surround-view camera system;
the vehicle display screen is used for displaying the live-action image and sending operation information of a user of the first vehicle on the second vehicle in the live-action image to the electronic control unit;
the interaction module of the electronic control unit is used for generating interaction data according to the operation information and sending the interaction data to the client, wherein the interaction data comprises interaction content and relative position information of the first vehicle and the second vehicle in the live-action image;
the client is used for sending the interaction data to the server;
the vehicle identification module of the server is used for identifying the second vehicle from the adjacent vehicles of the first vehicle according to the relative position information, and sending the interaction data to the electronic control unit of the second vehicle through the client corresponding to the second vehicle so as to generate an interaction live-action image containing the interaction content.
Further, the client is connected with the electronic control unit through Bluetooth signals.
Further, the live-action image is a 360-degree real-time surround-view image around the vehicle body.
Further, the surround-view camera system comprises a front camera, a rear camera, a left camera and a right camera, positioned on the front side, rear side, left side and right side of the vehicle body respectively.
Further, the interactive content comprises a graphic identifier or a picture containing text information.
Further, the image processing module is configured to superimpose the interactive content and the live-action image according to the relative position information to generate an interactive live-action image, where the interactive content is displayed in the interactive live-action image at a position corresponding to the first vehicle and/or the second vehicle.
Further, the client comprises a positioning information acquisition module, the server further comprises a matching module, the positioning information acquisition module is used for acquiring vehicle position information from the positioning module of the corresponding terminal of the client and sending the vehicle position information to the server, and the matching module of the server is used for matching adjacent vehicles of all vehicles according to the acquired vehicle position information.
Further, the positioning module is a GPS positioning module.
Further, the client comprises a real-time communication module for enabling real-time communication between the client and the electronic control unit and between the client and the server.
Further, the relative position information is a coordinate relationship between the first vehicle and the second vehicle in the live-action image, and the vehicle identification module of the server performs coordinate conversion on the relative position information to obtain the relative positions of the first vehicle and the second vehicle in geographic coordinates, so that the second vehicle is identified from adjacent vehicles of the first vehicle according to the relative positions.
According to the driver live-action interaction system of the invention, the electronic control unit generates a live-action image around the vehicle body from the image information acquired by the surround-view camera system; the vehicle display screen displays the live-action image and sends the operation information of the user of the first vehicle on the second vehicle in the live-action image to the electronic control unit; the electronic control unit generates interaction data and sends it to the server through the client; the server processes the interaction data to identify the second vehicle among the vehicles adjacent to the first vehicle, and then sends the interaction data to the electronic control unit of the second vehicle through the client corresponding to the second vehicle to generate an interactive live-action image containing the interactive content. Driver interaction is thereby combined with the 360-degree surround-view system: a social function is added compared with existing 360-degree surround-view systems, and a live-action function is added compared with existing social systems, making interaction between drivers more realistic. Drivers can thus communicate with one another even though each sits in a relatively sealed cabin, for example to vent dissatisfaction or to express friendliness, which adds to the enjoyment of driving.
Drawings
Fig. 1 is a schematic structural diagram of a driver live-action interaction system according to an embodiment of the invention.
Detailed Description
In order to further describe the technical means and effects adopted to achieve the intended aims of the invention, the specific implementations, structures, features and effects of the invention are described in detail below with reference to the accompanying drawings and preferred embodiments.
Fig. 1 is a schematic structural diagram of a driver live-action interaction system according to an embodiment of the invention. As shown in Fig. 1, the driver live-action interaction system includes an electronic control unit 1, a client 2, a server 3, a surround-view camera system 4 and a vehicle display screen 5; the electronic control unit 1 includes an image processing module 11, an interaction module 12 and a conversion module 13, the server 3 includes a vehicle identification module 31 and a matching module 32, and the client 2 includes a real-time communication module 21 and a positioning information acquisition module 22.
The real-time communication module 21 of the client 2 is configured to enable real-time communication between the client 2 and the electronic control unit 1 and between the client 2 and the server 3. After the driver connects and binds the client 2 to the electronic control unit 1 and activates the interactive function, the real-time communication module 21 establishes these connections, so that the server 3 can acquire vehicle position information and interaction data in real time, and the electronic control unit 1 can receive the interaction data sent by the server 3 in real time.
The positioning information acquisition module 22 of the client 2 is configured to acquire vehicle position information from a positioning module (e.g., a GPS positioning module) of the terminal corresponding to the client 2 and send it to the server 3, so that the matching module 32 of the server 3 can match the neighboring vehicles of each vehicle according to the acquired position information, that is, match each vehicle's neighbors in the real-world coordinate system. Here the neighboring vehicles are the vehicles closest to the host vehicle in the front, rear, left and right directions; it will be understood that each matched neighboring vehicle likewise corresponds to a client 2.
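The matching performed by the matching module 32 can be sketched in code. The patent does not disclose an algorithm, so the nearest-vehicle search, the equirectangular distance approximation and every name below are illustrative assumptions rather than the patented method:

```python
import math

def neighbors_by_direction(ego_id, positions, headings):
    """Find the nearest other vehicle in each of the four directions
    (front, rear, left, right) around the vehicle `ego_id`.

    positions: {vehicle_id: (lat_deg, lon_deg)} from each client's GPS
    headings:  {vehicle_id: heading_deg}, 0 = north, clockwise
    """
    R = 6371000.0  # mean Earth radius, metres
    lat0, lon0 = positions[ego_id]
    h = math.radians(headings[ego_id])
    best = {}
    for vid, (lat, lon) in positions.items():
        if vid == ego_id:
            continue
        # Equirectangular approximation: accurate enough at the
        # adjacent-lane distances this system cares about.
        east = math.radians(lon - lon0) * R * math.cos(math.radians(lat0))
        north = math.radians(lat - lat0) * R
        # Rotate into the ego frame: x = right of ego, y = ahead of ego.
        x = east * math.cos(h) - north * math.sin(h)
        y = east * math.sin(h) + north * math.cos(h)
        direction = (("front" if y > 0 else "rear") if abs(y) >= abs(x)
                     else ("right" if x > 0 else "left"))
        dist = math.hypot(x, y)
        if direction not in best or dist < best[direction][1]:
            best[direction] = (vid, dist)
    return {d: vid for d, (vid, _) in best.items()}
```

With three vehicles on a northbound road, a vehicle roughly 100 m ahead of the ego vehicle is matched as the front neighbor and one in the lane to the east as the right neighbor.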
The surround-view camera system 4 may be the camera system of a vehicle's 360-degree surround-view system and is used to acquire image information around the vehicle body. In this embodiment, the surround-view camera system 4 includes a front camera 41, a rear camera 42, a left camera 43 and a right camera 44, located on the front, rear, left and right sides of the vehicle body respectively, so that images in all directions around the vehicle body can be obtained.
The image processing module 11 of the electronic control unit 1 is configured to generate a live-action image around the vehicle body from the image information acquired by the surround-view camera system 4. In this embodiment, the live-action image is a 360-degree three-dimensional surround image of the vehicle's surroundings, produced by the image processing module 11 after stitching and processing the image information; it presents the vehicle's position and surroundings intuitively, extending the driver's perception of the environment. The process by which the image processing module 11 generates the live-action image, i.e. the image generation of a vehicle's 360-degree surround-view system, is known to those skilled in the art and is not described here.
The vehicle display screen 5 displays the live-action image and transmits the operation information of the user of the first vehicle on the second vehicle in the live-action image to the electronic control unit 1. Specifically, the first vehicle is, for example, the host vehicle and the second vehicle a vehicle adjacent to it. After the electronic control unit 1 sends the live-action image generated by the image processing module 11 to the vehicle display screen 5 for display, the driver can intuitively see the position and surroundings of the vehicle on the screen. When the driver wishes to communicate with the driver of an adjacent vehicle, he or she can click that vehicle where it is visible in the live-action image on the vehicle display screen 5 and select interactive content to send; the interactive content may be a graphic identifier or a picture containing text information (such as a short phrase). The vehicle display screen 5 then receives this operation information from the user and sends it to the electronic control unit 1.
The interaction module 12 of the electronic control unit 1 generates interaction data from the operation information sent by the vehicle display screen 5 and sends it to the server 3 through the client 2; the interaction data comprises the interactive content and the relative position information of the first vehicle and the second vehicle in the live-action image. Specifically, when the user clicks the adjacent vehicle visible in the live-action image on the vehicle display screen 5 and selects interactive content to send, the vehicle display screen 5 generates a click signal at the corresponding position together with data for the chosen content, and sends both to the electronic control unit 1 as operation information. The interaction module 12 processes this operation information to produce the interaction data: from the click signal it derives the relative position information (e.g. a coordinate relationship) of the first and second vehicles in the live-action image, and it converts the content data into an interactive-content format suitable for transmission. In this embodiment, the client 2 is connected to the electronic control unit 1 by a Bluetooth link; the interaction data generated by the electronic control unit 1 is sent over Bluetooth to the client 2 and from there to the server 3.
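The patent characterizes the interaction data only as interactive content plus relative position information. Below is a minimal sketch of how the interaction module 12 might package a click into such a payload; the JSON field names, the 1280x720 display and the 0.05 m-per-pixel scale are assumptions for illustration, not taken from the patent:

```python
import json

def build_interaction_data(click_x, click_y, content_id,
                           display=(1280, 720), m_per_px=0.05):
    """Package a click on the vehicle display screen as interaction data.

    The first (host) vehicle is assumed to sit at the centre of the
    surround-view image, so the clicked pixel maps directly to the second
    vehicle's position relative to it, in metres.
    """
    cx, cy = display[0] / 2, display[1] / 2
    return json.dumps({
        "content": content_id,  # graphic identifier or text-picture id
        "rel_pos": {
            "x": (click_x - cx) * m_per_px,  # metres to the right of the ego vehicle
            "y": (cy - click_y) * m_per_px,  # metres ahead (screen y grows downward)
        },
    })
```

A click at pixel (840, 160) on a 1280x720 screen thus encodes a vehicle about 10 m to the right of and 10 m ahead of the host vehicle.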
The vehicle identification module 31 of the server 3 identifies the second vehicle among the neighboring vehicles of the first vehicle according to the relative position information in the interaction data, and sends the interaction data, through the client 2 corresponding to the second vehicle, to the electronic control unit 1 of the second vehicle to generate an interactive live-action image containing the interactive content. Specifically, in this embodiment the relative position information is the coordinate relationship of the first and second vehicles in the live-action image. A 360-degree surround-view system normally calibrates the coordinates of other vehicles with the host vehicle at the center, so the vehicle identification module 31 performs coordinate conversion on the relative position information, i.e. converts the surround-view coordinates into geographic coordinates, to obtain the relative positions of the first and second vehicles on the earth and thereby pick out the second vehicle from among the neighboring vehicles; the geographic coordinates are the vehicles' coordinates in the real-world coordinate system, used to determine their positions on the earth. The client 2 corresponding to the second vehicle, i.e. the client 2 bound to the second vehicle, then forwards the interaction data received from the server 3 to the electronic control unit 1 of the second vehicle.
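The coordinate conversion performed by the vehicle identification module 31 is named but not specified in the patent. One plausible form is sketched below, assuming the surround-view frame is ego-centered with x to the right and y forward, and that heading is measured clockwise from north; the function and parameter names are hypothetical:

```python
import math

def identify_target(rel_x, rel_y, ego_heading_deg, candidates):
    """Identify the second vehicle among the first vehicle's neighbours.

    rel_x, rel_y: clicked vehicle's offset in the surround-view frame
                  (metres; x = right of ego, y = ahead of ego).
    candidates:   {vehicle_id: (east_m, north_m)}, the matched neighbours'
                  real-world offsets from the ego vehicle (e.g. from GPS).
    """
    h = math.radians(ego_heading_deg)  # 0 = north, clockwise
    # Surround-view coordinates -> local geographic (east/north) offsets.
    east = rel_x * math.cos(h) + rel_y * math.sin(h)
    north = rel_y * math.cos(h) - rel_x * math.sin(h)
    # The neighbour closest to the converted position is taken as the target.
    return min(candidates, key=lambda vid: math.hypot(
        candidates[vid][0] - east, candidates[vid][1] - north))
```

For an ego vehicle heading due east, a vehicle 10 m ahead in the surround-view frame converts to an offset of 10 m east, so the neighbor matched at that geographic offset is identified as the second vehicle.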
After the electronic control unit 1 of the second vehicle receives the interaction data, its image processing module 11 superimposes the interactive content on the live-action image according to the relative position information in the interaction data, generating an interactive live-action image that is displayed on the vehicle display screen 5; in this image the interactive content appears at the position corresponding to the first vehicle and/or the second vehicle, so that the driver can see the interactive content at a glance.
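The superposition then reduces to mapping a relative position back to a pixel of the receiving vehicle's live-action image. A sketch assuming an ego-centered 1280x720 image at 0.05 m per pixel and ignoring any heading difference between the two vehicles (all illustrative assumptions):

```python
def overlay_position(rel_x, rel_y, display=(1280, 720), m_per_px=0.05):
    """Pixel at which the interactive content should be drawn.

    rel_x, rel_y are the other vehicle's offset, in metres, from the
    vehicle whose screen is being drawn (x = right, y = forward); on the
    second vehicle's screen the first vehicle sits at roughly the negated
    offset that was clicked on the first vehicle's screen.
    """
    px = round(display[0] / 2 + rel_x / m_per_px)
    py = round(display[1] / 2 - rel_y / m_per_px)  # screen y grows downward
    return px, py
```

If the first vehicle clicked a neighbor 10 m right and 10 m ahead, the second vehicle draws the content at the negated offset (-10, -10), which lands at pixel (440, 560) under these assumed values.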
The working procedure of the driver live-action interaction system according to the embodiment of the invention is described below.
In the running process of the vehicle, the image processing module 11 of the electronic control unit 1 generates a live-action image around the vehicle body from the image information acquired by the surround-view camera system 4 and sends it to the vehicle display screen 5 for display;
the real-time communication module 21 of the client 2 starts real-time communication between the client 2 and the electronic control unit 1 and the server 3 according to user operation, and the positioning information acquisition module 22 transmits the vehicle position information acquired from the positioning module (e.g., GPS positioning module) of the corresponding terminal to the server 3;
the matching module 32 of the server 3 matches the neighboring vehicles of each vehicle according to the acquired vehicle position information, updating them in real time in combination with map data; the neighboring vehicles are the vehicles closest to the host vehicle in the front, rear, left and right directions;
when the driver clicks an adjacent vehicle visible in the live-action image on the vehicle display screen 5 and selects interactive content to send, the vehicle display screen 5 receives the user's operation information and sends it to the electronic control unit 1; the interactive content may be a graphic identifier or a picture containing text information, and on the host vehicle's display screen 5 the interactive content can be shown at the position of the selected adjacent vehicle, forming an interactive live-action image;
the interaction module 12 of the electronic control unit 1 generates interaction data according to the operation information sent by the vehicle display screen 5, and sends the interaction data to the server 3 through the client 2, wherein the interaction data comprises interaction content and relative position information of a first vehicle (own vehicle) and a second vehicle (adjacent vehicle) in a live-action image;
the vehicle identification module 31 of the server 3 identifies a second vehicle from the neighboring vehicles of the first vehicle according to the relative position information in the interactive data, and transmits the interactive data to the electronic control unit 1 of the second vehicle through the client 2 corresponding to the second vehicle;
after the electronic control unit 1 of the second vehicle receives the interaction data, its image processing module 11 superimposes the interactive content on the live-action image according to the relative position information to generate an interactive live-action image displayed on the vehicle display screen 5; in the second vehicle's interactive live-action image the interactive content appears at the position corresponding to the first vehicle, so that the driver can see at a glance the interactive content sent by the driver of the other vehicle.
According to the driver live-action interaction system of the embodiment of the invention, the electronic control unit generates a live-action image around the vehicle body from the image information acquired by the surround-view camera system; the vehicle display screen displays the live-action image and sends the operation information of the user of the first vehicle on the second vehicle in the live-action image to the electronic control unit; the electronic control unit generates interaction data and sends it to the server through the client; the server processes the interaction data to identify the second vehicle among the neighboring vehicles of the first vehicle, and then forwards the interaction data, through the client corresponding to the second vehicle, to the electronic control unit of the second vehicle to generate an interactive live-action image containing the interactive content. Driver interaction is thus combined with the 360-degree surround-view system, adding a social function compared with existing 360-degree surround-view systems and a live-action function compared with existing social systems, so that interaction between drivers becomes more realistic: drivers can communicate with one another from their relatively sealed cabins, for example to vent dissatisfaction or to express goodwill, adding to the enjoyment of driving.
In addition, in the embodiment of the invention, the client acquires the vehicle position information from the positioning module of the corresponding terminal and sends it to the server; the server matches the neighboring vehicles of each vehicle according to the acquired position information and, when interaction data is received, identifies the second vehicle among the neighboring vehicles of the first vehicle using the conversion between 360-degree surround-view coordinates and geographic coordinates. By combining the surround-view coordinate system with the real-world geographic coordinate system in this way, the 360-degree surround-view systems of different vehicles are associated with one another, realizing live-action interaction between drivers.
The present invention is not limited to the above-described embodiments; any modifications, equivalent substitutions and variations made to the above embodiments by those skilled in the art without departing from the scope of the invention fall within the scope of the present invention.

Claims (8)

1. A driver live-action interaction system, characterized by comprising an electronic control unit, a client, a server, a surround-view camera system and a vehicle display screen, wherein the electronic control unit comprises an image processing module and an interaction module, and the server comprises a vehicle identification module,
the image processing module of the electronic control unit is used for generating a live-action image around the vehicle body according to the image information acquired by the surround-view camera system;
the vehicle display screen is used for displaying the live-action image and sending operation information of a user of the first vehicle on the second vehicle in the live-action image to the electronic control unit;
the interaction module of the electronic control unit is used for generating interaction data according to the operation information and sending the interaction data to the client, the interaction data comprise interaction content and relative position information of the first vehicle and the second vehicle in the live-action image, and the relative position information is a coordinate relation of the first vehicle and the second vehicle in the live-action image;
the client is used for sending the interaction data to the server;
the vehicle identification module of the server is used for identifying the second vehicle from the adjacent vehicles of the first vehicle according to the relative position information and sending the interaction data to the electronic control unit of the second vehicle through the client corresponding to the second vehicle;
the image processing module is further configured to superimpose the interactive content and the live-action image according to the relative position information to generate an interactive live-action image, where the interactive content is displayed in the interactive live-action image at a position corresponding to the first vehicle and/or the second vehicle.
2. The driver live-action interaction system of claim 1, wherein the client is connected to the electronic control unit via a Bluetooth signal.
3. The driver live-action interaction system of claim 1, wherein the live-action image is a 360-degree real-time surround-view image of the surroundings of the vehicle body.
4. The driver live-action interaction system of claim 3, wherein the surround-view camera system comprises a front camera, a rear camera, a left camera and a right camera, positioned on the front side, rear side, left side and right side of the vehicle body, respectively.
5. The driver live-action interaction system of claim 1, wherein the interactive content comprises a graphic identifier or a picture containing text information.
6. The driver live-action interaction system of claim 1, wherein the client comprises a positioning information acquisition module, the server further comprises a matching module, the positioning information acquisition module is used for acquiring vehicle position information from a positioning module of a corresponding terminal of the client and sending the vehicle position information to the server, and the matching module of the server is used for matching adjacent vehicles of each vehicle according to the acquired vehicle position information.
7. The driver live-action interaction system of claim 6, wherein the positioning module is a GPS positioning module.
8. The driver live-action interaction system of claim 1, wherein the client comprises a real-time communication module for enabling real-time communication between the client and the electronic control unit and between the client and the server.
CN201710113034.5A 2017-02-28 2017-02-28 Driver live-action interaction system Active CN106828320B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710113034.5A CN106828320B (en) 2017-02-28 2017-02-28 Driver live-action interaction system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710113034.5A CN106828320B (en) 2017-02-28 2017-02-28 Driver live-action interaction system

Publications (2)

Publication Number Publication Date
CN106828320A (en) 2017-06-13
CN106828320B (en) 2023-07-11

Family

ID=59137613

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710113034.5A Active CN106828320B (en) 2017-02-28 2017-02-28 Driver live-action interaction system

Country Status (1)

Country Link
CN (1) CN106828320B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108556757B (en) * 2018-06-13 2023-05-26 重庆第二师范学院 Spliced vehicle-mounted information interaction device

Citations (1)

Publication number Priority date Publication date Assignee Title
CN104266654A (en) * 2014-09-26 2015-01-07 广东好帮手电子科技股份有限公司 Vehicle real scene navigation system and method

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
CN102868970A (en) * 2012-09-21 2013-01-09 上海永畅信息科技有限公司 System and method for actively identifying and communicating with vehicles
JP6451101B2 (en) * 2014-06-30 2019-01-16 株式会社リコー Vehicle communication device
CN104900062A (en) * 2015-06-15 2015-09-09 谭兴奎 Vehicle-mounted interaction information display control system
CN106230895A (en) * 2016-07-18 2016-12-14 乐视控股(北京)有限公司 Interactive approach between a kind of vehicle and system
CN106080393A (en) * 2016-08-08 2016-11-09 浙江吉利控股集团有限公司 Automatic Pilot auxiliary display system
CN206623755U (en) * 2017-02-28 2017-11-10 上海寅喆计算机科技有限公司 Driver's live-action interaction system

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
CN104266654A (en) * 2014-09-26 2015-01-07 广东好帮手电子科技股份有限公司 Vehicle real scene navigation system and method

Also Published As

Publication number Publication date
CN106828320A (en) 2017-06-13

Similar Documents

Publication Publication Date Title
US10679420B2 (en) Augmented reality (AR) remote vehicle assistance
US10682911B2 (en) Active window for vehicle infomatics and virtual reality
US9197863B2 (en) Display system that displays augmented reality image of posted data icons on captured image for vehicle-mounted apparatus
JP5811804B2 (en) Vehicle periphery monitoring device
US11295132B2 (en) Method, a device for assisting driving, an unmanned device and a readable storage medium
WO2013108371A1 (en) Image processing apparatus, image processing server, image processing method, image processing program, and recording medium
US10600234B2 (en) Inter-vehicle cooperation for vehicle self imaging
US11525882B2 (en) Establishing a direct communication link with another vehicle in the vicinity of a motor vehicle
JP7236442B2 (en) Control method for display device in automobile
JP2016095688A (en) On-vehicle information display device
JP2008219559A (en) In-vehicle camera system
WO2018134897A1 (en) Position and posture detection device, ar display device, position and posture detection method, and ar display method
US20130135348A1 (en) Communication device, communication system, communication method, and communication program
CN106828320B (en) Driver live-action interaction system
CN111614931B (en) Vehicle surrounding image synthesis method and system
CN111064936A (en) Road condition information display method and AR equipment
JPWO2016103938A1 (en) Projection display device, electronic device, driver visual image sharing method, and driver visual image sharing program
CN106864372A Live-action Internet ride-hailing assistance system and method
CN206914229U Live-action Internet ride-hailing assistance system
US20150116320A1 (en) Device for Operating One or More Optical Display Devices of a Vehicle
WO2013111423A1 (en) Display device for vehicle
JP2013217808A (en) On-vehicle apparatus
CN206623755U (en) Driver's live-action interaction system
US20210263315A1 (en) Wifi enabled head up display (hud)
CN110979319A (en) Driving assistance method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20171211

Address after: 201203 Shanghai Zhang Heng Road, Lane 666, room 1, No. 202, room

Applicant after: SHANGHAI YINZHE COMPUTER TECHNOLOGY Co.,Ltd.

Address before: 201203 Shanghai Zhang Heng Road, Lane 666, room 1, No. 202, room

Applicant before: YINJIA ELECTRONIC TECHNOLOGY (SHANGHAI) CO.,LTD.

TA01 Transfer of patent application right

Effective date of registration: 20180727

Address after: 201203 202, 1 Lane 666 lane, Zhang Heng Road, Pudong New Area, Shanghai.

Applicant after: YINJIA ELECTRONIC TECHNOLOGY (SHANGHAI) CO.,LTD.

Address before: 201203 202, 1 Lane 666 lane, Zhang Heng Road, Pudong New Area, Shanghai.

Applicant before: SHANGHAI YINZHE COMPUTER TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information

Address after: 201203 202, 1 Lane 666 lane, Zhang Heng Road, Pudong New Area, Shanghai.

Applicant after: VOYAGER TECHNOLOGY Inc.

Address before: 201203 202, 1 Lane 666 lane, Zhang Heng Road, Pudong New Area, Shanghai.

Applicant before: YINJIA ELECTRONIC TECHNOLOGY (SHANGHAI) CO.,LTD.

GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: Driver real scene interaction system

Effective date of registration: 20231117

Granted publication date: 20230711

Pledgee: Jiangsu Bank Co.,Ltd. Shanghai Pudong Branch

Pledgor: VOYAGER TECHNOLOGY Inc.

Registration number: Y2023310000748
