CN212341838U - Driving experience system - Google Patents
Driving experience system
- Publication number
- CN212341838U (application CN202021453537.0U)
- Authority
- CN
- China
- Prior art keywords
- driving
- cabin
- host
- driving experience
- extravehicular
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
The present application provides a driving experience system, the system comprising: an extravehicular device; and a driving experience cabin connected with the extravehicular device. The driving experience cabin comprises a VR device for displaying VR scene images, and a driving device connected with the VR device. The system enables a user to experience the feeling of driving.
Description
Technical Field
The present application relates to the technical field of VR, and in particular to a driving experience system.
Background
Currently, before purchasing a vehicle, a user can experience its driving feel only through a real test drive. This mode of experience is costly, offers limited choice, and a short test drive may not give the user a well-rounded impression.
SUMMARY OF THE UTILITY MODEL
The embodiments of the present application provide a driving experience system through which the feeling of driving can be experienced.
The application provides a driving experience system, the system comprising:
an extravehicular device;
and a driving experience cabin connected with the extravehicular device;
wherein the driving experience cabin comprises:
a VR device for displaying VR scene images;
and a driving device connected with the VR device.
In one embodiment, the extravehicular device comprises:
an extravehicular host;
an extravehicular display screen connected with the extravehicular host;
an extravehicular camera connected with the extravehicular host;
an extravehicular speaker connected with the extravehicular host;
and an extravehicular microphone connected with the extravehicular host.
In one embodiment, the VR device comprises: VR glasses, a VR voice interaction module, and a VR sound effect module, each connected with the driving device.
In one embodiment, the driving device comprises:
a driving host;
a steering wheel connected with the driving host;
a pedal connected with the steering wheel;
a motion platform connected with the driving host;
and a seat positioned above the motion platform.
In one embodiment, the driving device further comprises:
a display screen connected with the driving host;
a gesture recognition module connected with the driving host;
an in-cabin face recognition camera connected with the driving host;
an in-cabin camera connected with the driving host;
an in-cabin sound effect module connected with the driving host;
and a light module connected with the driving host.
In one embodiment, the driving device further comprises:
a positioner connected with the driving host.
In one embodiment, the driving experience system further comprises:
a data center station connected with the driving experience cabin and the extravehicular device.
In one embodiment, the driving experience cabin further comprises:
a code scanning device connected with the data center station, configured to acquire a reservation two-dimensional code from a user and send it to the data center station;
and a cabin door connected with the data center station, configured to open upon receiving a door-opening signal sent by the data center station.
In one embodiment, the driving experience cabin further comprises:
a cabin door switch positioned inside the driving experience cabin, configured to open the cabin door when pressed.
In one embodiment, the driving experience system further comprises:
a Wi-Fi probe connected with the data center station, configured to detect wireless device data within a preset range outside the driving experience cabin and send the wireless device data to the data center station.
According to the technical solution provided by the embodiments of the present application, an extravehicular device and a driving experience cabin comprising a VR device and a driving device are provided, and the extravehicular device is connected with the driving experience cabin, so that a user can experience the feeling of driving.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required to be used in the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic structural diagram of a driving experience system according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an extravehicular apparatus provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a VR device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a driving device according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a driving device according to another embodiment of the present application;
FIG. 6 is a schematic structural diagram of a driving experience system according to another embodiment of the present application;
fig. 7 is a schematic structural diagram of a driving experience system according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
Like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
It is noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between them. The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a …" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises it. Unless expressly stated or limited otherwise, the terms "mounted" and "connected" are to be understood broadly: a connection may, for example, be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intervening medium, or internal between two elements. The specific meaning of these terms in the present utility model can be understood by those skilled in the art according to the specific situation.
Fig. 1 is a schematic structural diagram of a driving experience system 1000 according to an embodiment of the present application. As shown in fig. 1, the system includes: an extravehicular device 100 and a driving experience cabin 200.
The extravehicular device 100 may be a kiosk, a touch device, or the like. It may provide interactive functions related to the driving experience, as well as other entertainment and advertising functions. The extravehicular device 100 may receive a user's input signal, such as a voice or touch signal, and, after signal processing, return an interactive response corresponding to that input. In an embodiment, after receiving the user's preferred vehicle model, the extravehicular device 100 may display similar vehicle models for the user to browse. In addition, the extravehicular device 100 may provide face recognition, games, advertisements, and two-dimensional code generation and recognition. It can collect user information and user data during operation, facilitating the subsequent construction of a big-data-based user information warehouse.
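The input-to-response flow described above can be sketched as follows. This is an illustrative sketch only, not part of the utility model: the signal names, the `SIMILAR_MODELS` lookup, and the response shapes are all hypothetical.

```python
# Hypothetical sketch: an extravehicular host routing user input signals
# (touch / voice / model preference) to interactive responses.

SIMILAR_MODELS = {  # assumed lookup of models similar to a user's preference
    "sedan-a": ["sedan-b", "sedan-c"],
    "suv-x": ["suv-y"],
}

def handle_input(signal_type: str, payload: str) -> dict:
    """Return an interactive response for one processed input signal."""
    if signal_type == "model_preference":
        # Display vehicle models similar to the user's preferred model.
        return {"display": SIMILAR_MODELS.get(payload, [])}
    if signal_type == "voice":
        # Echo a voice interaction response after signal processing.
        return {"speak": f"You said: {payload}"}
    return {"error": "unsupported signal"}
```

A usage example: `handle_input("model_preference", "sedan-a")` would return the list of similar sedans for display on the extravehicular screen.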
The driving experience cabin 200 is connected to the extravehicular device 100 and allows a user to experience the feeling of driving inside the cabin. The driving experience cabin 200 can be closed or semi-closed; after a user enters it, a driving experience for a set vehicle model is carried out. The set vehicle model can be selected by the user or recommended according to the user's preferences. In one embodiment, when the user interacts with the extravehicular device 100 outside the cabin, the vehicle model or models the user will experience in the driving experience cabin 200 may be selected by combining the user's preferences with the recommendations of the extravehicular device 100.
The driving experience cabin 200 includes a VR device 210 and a driving device 220. The VR device 210 is configured to display VR scene images that simulate real driving scenes. The driving device 220 is connected to the VR device 210 and cooperates with the VR scene images to give the user the driving feel corresponding to them. In an embodiment, the driving device 220 may, according to a user's input signal, send the corresponding VR scene image to the VR device 210; the VR device 210 displays the matching VR scene image while the driving device 220 synchronizes the corresponding physical driving experience. In this process, the user experiences the driving feel of the selected vehicle model.
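The coordination between the driving device and the VR device can be sketched as below. This is a minimal sketch under assumed conventions (the frame fields, the tilt gain, and the ±10° clamp are invented for illustration, not specified by the patent):

```python
# Hypothetical sketch: map one user input (vehicle model + steering angle) to
# a VR scene frame for the VR device and a synchronized platform motion.

def select_scene(model: str, steering_angle: float) -> dict:
    """Return the VR frame and the matching motion-platform command."""
    view = "left" if steering_angle < 0 else "right" if steering_angle > 0 else "ahead"
    scene = {"model": model, "view": view}
    # Tilt proportional to steering, clamped to an assumed +/-10 degree range.
    motion = {"roll": max(-10.0, min(10.0, steering_angle / 10.0))}
    return {"vr_frame": scene, "platform": motion}
```

For example, a left turn of the wheel would shift the displayed view left while the platform rolls slightly in the same direction.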
The driving experience system 1000 provided by the above embodiment enables the user to experience the feeling of driving by providing the extravehicular device 100 and the driving experience cabin 200, which comprises the VR device 210 and the driving device 220, and by connecting the extravehicular device 100 with the driving experience cabin 200.
As shown in fig. 2, the extravehicular device 100 includes: an extravehicular host 110, an extravehicular display screen 120, an extravehicular camera 130, an extravehicular speaker 140, and an extravehicular microphone 150. The extravehicular display screen 120, the extravehicular camera 130, the extravehicular speaker 140, and the extravehicular microphone 150 are all connected to the extravehicular host 110.
The extravehicular host 110 may send control signals to the extravehicular display screen 120, the extravehicular camera 130, and the extravehicular speaker 140, and receive information returned by the extravehicular display screen 120, the extravehicular camera 130, the extravehicular speaker 140, and the extravehicular microphone 150. The extravehicular host 110 may also be connected to the driving device 220 of the driving experience cabin 200 to obtain the sound effects of the in-cabin VR experience.
The extravehicular display screen 120 may be a touch screen that displays graphics, video, and other information, and may also display traditional outdoor screen advertisements. The extravehicular camera 130 can acquire the user's face, gender, age, and height information, as well as the color of the user's clothing, and can capture the user's state. User photos can also be shown on the extravehicular display screen 120 via the extravehicular camera 130, enhancing the sense of interactive immersion and personalization. The extravehicular microphone 150 may capture the user's voice signal. The extravehicular speaker 140 may, under the control of the extravehicular host 110, play the sound effects of the in-cabin VR experience, voice interaction responses, game sounds, video sounds, and the like. In one embodiment, the extravehicular speaker 140 may be mounted in a symmetrical, focused arrangement. In an embodiment, the extravehicular device 100 further includes a scanning window, which may support two-dimensional code systems such as QR Code, PDF417, Code 49, Code 16K, and Code One.
As shown in fig. 3, the VR device 210 includes: VR glasses 211, a VR voice interaction module 212, and a VR sound effect module 213.
As shown in fig. 4, the driving device 220 includes: a driving host 223, a steering wheel 222, a pedal 221, a motion platform 224, and a seat 225.
During the experience, the user can simulate real driving operations by turning the steering wheel 222, stepping on the pedal 221, and so on. The steering wheel 222 is connected to the driving host 223 and can receive the user's steering operation signal and transmit it to the driving host 223. The pedal 221 is connected to the steering wheel 222 and can receive the user's pedal signal and transmit it to the steering wheel 222, which forwards it to the driving host 223. The motion platform 224 is a movable platform with multiple degrees of freedom, connected to the driving host 223. The driving host 223 may output a control signal to the motion platform 224 according to the steering and pedal operation signals and the current VR scene, and the motion platform 224 performs the corresponding action under its control. The seat 225 is located above the motion platform 224 and, driven by the platform, can give the user simulated driving sensations such as push-back, braking, and jolting.
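The signal chain above (pedal → steering wheel → driving host → motion platform) can be sketched in a few classes. This is an assumed illustration; the pitch gains and the class interfaces are invented, and a real implementation would run over RS232/USB links rather than method calls:

```python
# Hypothetical sketch of the relay chain: the pedal signal travels through the
# steering wheel to the driving host, which commands the motion platform.

class MotionPlatform:
    """Multi-degree-of-freedom platform; here only pitch is modeled."""
    def __init__(self) -> None:
        self.pitch = 0.0  # degrees; positive = seat tilts back (push-back feel)

    def apply(self, command: float) -> None:
        self.pitch = command

class DrivingHost:
    def __init__(self, platform: MotionPlatform) -> None:
        self.platform = platform

    def on_signals(self, steering: float, throttle: float, brake: float) -> None:
        # Assumed mapping: acceleration tilts the seat back, braking tilts it
        # forward, to simulate push-back and braking sensations.
        self.platform.apply(5.0 * throttle - 8.0 * brake)

class SteeringWheel:
    """Relays its own angle plus the pedal signals on to the driving host."""
    def __init__(self, host: DrivingHost) -> None:
        self.host = host

    def relay(self, angle: float, throttle: float, brake: float) -> None:
        self.host.on_signals(angle, throttle, brake)
```

Usage: a full-throttle pedal signal relayed through the wheel would pitch the platform back; a full-brake signal would pitch it forward.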
As shown in fig. 5, the driving device 220 further includes: a display screen 226, a gesture recognition module 227, an in-cabin face recognition camera 228, an in-cabin camera 229, an in-cabin sound effect module 2210, and a light module 2211.
The display screen 226 is connected to the driving host 223 and can display information such as game activities, person-vehicle matching, and VR vehicle-viewing introductions, and can also serve as a voice interaction interface. The display screen 226 is touch-sensitive, or its screens can be switched by gestures. The gesture recognition module 227, such as a VR gesture recognition box, is connected to the driving host 223 and can recognize user gestures. The in-cabin face recognition camera 228 is connected to the driving host 223 and can capture the user's face and transmit it to the driving host 223. The in-cabin camera 229 is connected to the driving host 223 and can capture in-cabin images of the user's experience and send them to the driving host 223. From these images, the driving host 223 can obtain the number of people entering the cabin and the users' face, gender, age, height, and micro-expression information. The in-cabin sound effect module 2210 is connected to the driving host 223 and can simulate the sound effects of different vehicle models. It may include a microphone, a power amplifier, and speakers. The microphone embeds a voice recognition module, enabling voice recognition and recording the user's voice feedback during the experience. The power amplifier receives the audio output by the driving host 223 and drives the speakers, which may include satellite speakers and a center speaker. The light module 2211, composed of lamps at multiple positions and angles in the cabin, is connected to the driving host 223; it can switch according to the VR scene, create a gaming atmosphere when the user plays games, and create a sense of ceremony when the user enters the cabin.
In one embodiment, the driving device 220 further comprises a positioner 2212 connected to the driving host 223. During the experience, the user can experience not only the driver's seat but also other, non-driver positions in the vehicle. Depending on the vehicle model, the user can select a particular seat through the positioner 2212 and, under the control of the driving host 223, obtain the riding experience at that seat.
As shown in fig. 6, the driving experience system 1000 further includes a data center station 300, which connects the driving experience cabin 200 and the extravehicular device 100. The data center station 300 may pre-store advertisement or game information and send it to the driving experience cabin 200 or the extravehicular device 100. The data center station 300 can also serve as the system's data processing center: user data returned from both the extravehicular device 100 and the driving experience cabin 200 can be sent to it, so that it can generate corresponding interaction and recommendation information from the user's input and return it to the driving experience cabin 200 or the extravehicular device 100. The data center station 300 may also perform statistical analysis on the user experience data to build user profiles, improve recommendation results, and so on. After the experience is completed, the data center station 300 may send the experience results and sharing information to the driving host 223 for display, and the experiencer can share them through the information on the host.
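The statistical-analysis role described for the data center station can be sketched as a simple aggregation. The record shape `(user_id, model, seconds)` and the profile fields are assumptions for illustration, not part of the utility model:

```python
# Hypothetical sketch: aggregate experience records returned from the cabin
# into a minimal user profile for later recommendation.

from collections import Counter

def build_profile(events: list) -> dict:
    """events: list of (user_id, model, seconds) experience records."""
    models = Counter(model for _, model, _ in events)
    total = sum(sec for _, _, sec in events)
    return {
        # Most-experienced vehicle model, usable as a recommendation seed.
        "top_model": models.most_common(1)[0][0] if models else None,
        "total_seconds": total,
    }
```

A profile like this could then feed the recommendation information the data center station returns to the cabin or the extravehicular device.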
In one embodiment, the driving experience cabin 200 further comprises: a code scanning device 230 and a cabin door 240.
Before entering the driving experience cabin 200, the user can make a reservation through a WeChat mini-program, an APP, or the like, generating a corresponding reservation two-dimensional code on the mobile phone.
The code scanning device 230 is connected with the data center station 300 and can be installed at the entrance of the driving experience cabin 200. It acquires the user's reservation two-dimensional code and sends it to the data center station 300, which compares it with the pre-stored two-dimensional code and, when the comparison succeeds, outputs a door-opening signal to the cabin door 240. The cabin door 240 is connected to the data center station 300 and opens upon receiving the door-opening signal sent by the data center station 300.
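The reservation check can be sketched as follows. All details are assumed: the patent does not specify code formats or whether reservations are tied to a time slot, so the hour-slot rule below is purely illustrative:

```python
# Hypothetical sketch: the data center station compares a scanned reservation
# code against its stored reservations and only then signals the door to open.

def check_reservation(scanned: str, reservations: dict, now_hour: int) -> bool:
    """reservations maps code -> reserved hour (assumed scheme).

    Returns True (emit door-opening signal) only when the scanned code is a
    stored reservation booked for the current hour slot.
    """
    return reservations.get(scanned) == now_hour
```

A mismatched or unknown code simply yields no door-opening signal, leaving the cabin door closed.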
In one embodiment, the driving experience cabin 200 further comprises a cabin door switch 250 located inside the cabin. When the experience is finished, the user may press the cabin door switch 250 to open the cabin door 240.
In an embodiment, the driving experience system 1000 further comprises a Wi-Fi probe 400, which can detect wireless device data around the driving experience cabin 200. The Wi-Fi probe 400 is connected to the data center station 300 and transmits the detected wireless device data to it, so that the data center station 300 can analyze visitors according to the wireless device data, such as visitor numbers, the gender ratio, new versus returning customers, and dwell time.
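Deriving such visitor statistics from probe data might look like the sketch below. The data shape (MAC sightings with timestamps) and the "returning customer" rule are assumptions; a real probe would also have to handle MAC randomization:

```python
# Hypothetical sketch: summarize Wi-Fi probe sightings into the visitor
# statistics mentioned above (device count, returning devices, dwell time).

def summarize(sightings: list, known_macs: set) -> dict:
    """sightings: list of (mac, timestamp_seconds) tuples from the probe."""
    first, last = {}, {}
    for mac, ts in sightings:
        first[mac] = min(first.get(mac, ts), ts)
        last[mac] = max(last.get(mac, ts), ts)
    return {
        "devices": len(first),
        # A device seen on a previous visit counts as a returning customer.
        "returning": sum(1 for mac in first if mac in known_macs),
        # Dwell time = span between first and last sighting, averaged.
        "avg_dwell": sum(last[m] - first[m] for m in first) / len(first) if first else 0.0,
    }
```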
Fig. 7 is a schematic structural diagram of a driving experience system 1000 according to another embodiment of the present application. As shown in fig. 7, the system further includes a router 500, which supports the network connections and interactions among the extravehicular device 100, the driving experience cabin 200, and the data center station 300. As shown in fig. 7, the driving host 223 is connected to the face recognition camera, the gesture recognition module 227, the steering wheel 222, and the light module 2211 through the USB1, USB2, USB3, and USB4 interfaces respectively; the pedal 221 is connected with the steering wheel 222 through RS232; the driving host 223 is connected with the VR device 210 through DP1 and with the display screen 226 through DP2; the positioner 2212 is connected to the VR device 210 via infrared; the driving host 223 is connected with a microphone 22103 through AUX and with a power amplifier 22101 with a subwoofer through HDMI, and the power amplifier 22101 is connected with the satellite speakers and center speaker 22102 through the L/R channels. The driving host 223, the in-cabin camera 229, the motion platform 224, the code scanning device 230, the Wi-Fi probe 400, and the extravehicular host are all connected with the router 500 through RJ45. The code scanning device 230 is connected with the cabin door 240 through RS485, and the cabin door 240 is connected with the cabin door switch 250 through a hard wire. The extravehicular host 110 is connected with the extravehicular display screen 120 through MIPI/RGB/LVDS, with the extravehicular camera 130 through USB/HDMI, with the extravehicular speaker 140 through HDMI, and with the extravehicular microphone 150 through AUX.
It should be noted that, in the embodiments of the present utility model, where the computer programs and corresponding methods are implemented by directly applying existing, relatively mature techniques, no improvement to such methods is involved.
Claims (10)
1. A driving experience system, comprising:
an extravehicular device;
and a driving experience cabin connected with the extravehicular device;
wherein the driving experience cabin comprises:
a VR device for displaying VR scene images;
and a driving device connected with the VR device.
2. The driving experience system of claim 1, wherein the extravehicular device comprises:
an extravehicular host;
an extravehicular display screen connected with the extravehicular host;
an extravehicular camera connected with the extravehicular host;
an extravehicular speaker connected with the extravehicular host;
and an extravehicular microphone connected with the extravehicular host.
3. The driving experience system of claim 1, wherein the VR device comprises: VR glasses, a VR voice interaction module, and a VR sound effect module, each connected with the driving device.
4. The driving experience system of claim 1, wherein the driving device comprises:
a driving host;
a steering wheel connected with the driving host;
a pedal connected with the steering wheel;
a motion platform connected with the driving host;
and a seat positioned above the motion platform.
5. The driving experience system of claim 4, wherein the driving device further comprises:
a display screen connected with the driving host;
a gesture recognition module connected with the driving host;
an in-cabin face recognition camera connected with the driving host;
an in-cabin camera connected with the driving host;
an in-cabin sound effect module connected with the driving host;
and a light module connected with the driving host.
6. The driving experience system of claim 4, wherein the driving device further comprises:
a positioner connected with the driving host.
7. The driving experience system of claim 1, further comprising:
a data center station connected with the driving experience cabin and the extravehicular device.
8. The driving experience system of claim 7, wherein the driving experience cabin further comprises:
a code scanning device connected with the data center station, configured to acquire a reservation two-dimensional code from a user and send it to the data center station;
and a cabin door connected with the data center station, configured to open upon receiving a door-opening signal sent by the data center station.
9. The driving experience system of claim 8, wherein the driving experience cabin further comprises:
a cabin door switch positioned inside the driving experience cabin, configured to open the cabin door when pressed.
10. The driving experience system of claim 7, further comprising:
a Wi-Fi probe connected with the data center station, configured to detect wireless device data within a preset range outside the driving experience cabin and send the wireless device data to the data center station.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202021453537.0U CN212341838U (en) | 2020-07-21 | 2020-07-21 | Driving experience system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN212341838U true CN212341838U (en) | 2021-01-12 |
Family
ID=74081963
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202021453537.0U Expired - Fee Related CN212341838U (en) | 2020-07-21 | 2020-07-21 | Driving experience system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN212341838U (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113178112A (en) * | 2021-04-26 | 2021-07-27 | 重庆电子工程职业学院 | Artificial intelligence VR device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112369051B (en) | Shared environment for vehicle occupants and remote users | |
KR101212893B1 (en) | Advertisement system, and method for displaying advertisement | |
US9079749B2 (en) | Simple node transportation system and node controller and vehicle controller therein | |
CN108847214B (en) | Voice processing method, client, device, terminal, server and storage medium | |
CN105306084A (en) | Eyewear type terminal and control method thereof | |
KR20080008528A (en) | Serving robot having function serving customer | |
JP2019197499A (en) | Program, recording medium, augmented reality presentation device, and augmented reality presentation method | |
CN111712870B (en) | Information processing device, mobile device, method, and program | |
CN101674435A (en) | Image display apparatus and detection method | |
WO2021205742A1 (en) | Information processing device, information processing method, and computer program | |
WO2012119371A1 (en) | User interaction system and method | |
WO2012050029A1 (en) | Electronic equipment and method for determining language to be displayed thereon | |
US20200034729A1 (en) | Control Method, Terminal, and System | |
WO2016079470A1 (en) | Mixed reality information and entertainment system and method | |
CN212341838U (en) | Driving experience system | |
CN107817701A (en) | Equipment control method and device, computer readable storage medium and terminal | |
CN107567636A (en) | Display device and its control method and computer readable recording medium storing program for performing | |
KR101912083B1 (en) | Voice recognition artificial intelligence smart mirror TV system | |
WO2020151430A1 (en) | Air imaging system and implementation method therefor | |
JP5220953B1 (en) | Product information providing system, product information providing device, and product information output device | |
US11804154B2 (en) | Smart interactive display device, smart interactive display system, and interactive display method thereof | |
CN106571108A (en) | Advisement player having voice interaction function | |
KR20170055887A (en) | Digital signage device and operating method thereof | |
CN111754274A (en) | Intelligent regional advertisement pushing method and system based on vehicle-mounted glass display | |
CN114760434A (en) | Automobile intelligent cabin capable of realizing multi-person online video conference and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20210112 |