WO2017219883A1 - Method, apparatus and device for pushing data

Method, apparatus and device for pushing data

Info

Publication number
WO2017219883A1
WO2017219883A1 (PCT/CN2017/087874)
Authority
WO
WIPO (PCT)
Prior art keywords
data
scene
sub
vehicle
type
Prior art date
Application number
PCT/CN2017/087874
Other languages
English (en)
Chinese (zh)
Inventor
徐海生
Original Assignee
斑马网络技术有限公司
Priority date
Filing date
Publication date
Application filed by 斑马网络技术有限公司
Publication of WO2017219883A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L65/00Network arrangements, protocols or services for supporting real-time applications in data packet communication
    • H04L65/40Support for services or applications
    • H04L65/4061Push-to services, e.g. push-to-talk or push-to-video

Definitions

  • The present application relates to Internet technologies, and in particular, to a method, apparatus, and device for pushing data applied to a vehicle.
  • the related information is generally displayed to the user through the display device inside the vehicle.
  • the display devices inside the vehicle are mainly the instrument panel and the center console.
  • the navigation information is displayed on the center console, and the vehicle status information (such as the amount of oil, the vehicle speed, etc.) monitored by various vehicle sensors is displayed on the dashboard.
  • the display device inside the vehicle displays very limited information and is not convenient for the user to view.
  • The present application provides a method, apparatus, and device for pushing data, to solve the technical problem that the information displayed by the display devices inside the vehicle is very limited and inconvenient for the user to view.
  • the present invention provides a method for pushing data, including: acquiring scene data;
  • At least part of the scene data is respectively pushed to a plurality of receiving objects carried by the vehicle, and the scene data is used for display.
  • Optionally, before the at least part of the scene data is pushed to the plurality of receiving objects carried by the vehicle, the method further includes: processing the scene data to obtain a plurality of sub-scene data, and acquiring a receiving object corresponding to each of the sub-scene data;
  • Each of the sub-scene data is then pushed to its corresponding receiving object.
  • After acquiring the scene data, this embodiment processes the scene data to obtain at least two sub-scene data and pushes each sub-scene data to its corresponding receiving object according to the correspondence between the sub-scene data and the receiving objects, so that each receiving object obtains the corresponding sub-scene data. This makes full use of the space inside the vehicle, avoids each receiving object receiving redundant scene data, and improves the rationality and simplicity with which each receiving object receives scene data. Moreover, since there are at least two receiving objects, more scene data can be output to the user, so that the user can obtain the required data in a targeted manner.
  • the present invention provides a data pushing device, including: a data acquiring module, configured to acquire scene data;
  • a pushing module configured to push at least part of the scene data to the plurality of receiving objects carried by the vehicle, where the scene data is used for display.
  • the present invention provides a data pushing device, including: an input device, configured to acquire scene data;
  • a processor coupled to the output device and the input device, configured to control the output device to respectively push at least part of the scene data to a plurality of receiving objects carried by the vehicle, where the scene data is used for display.
  • the present invention provides a vehicle control apparatus comprising: an onboard output device, an onboard input device, and an onboard processor coupled to the onboard output device and the onboard input device;
  • the onboard input device is configured to acquire scene data
  • the onboard processor is configured to control the onboard output device to push at least part of the scene data to a plurality of receiving objects carried by the vehicle, where the scene data is used for display.
  • the present invention provides an in-vehicle Internet operating system, including:
  • Input control unit controlling the vehicle input device to acquire scene data
  • an output control unit that controls the vehicle-mounted output device to respectively push at least part of the scene data to the plurality of receiving objects carried by the vehicle, where the scene data is used for display.
  • The data pushing method, apparatus, and device provided by the embodiments acquire scene data and push at least part of the scene data to a plurality of receiving objects carried by the vehicle, where the scene data is used for display.
  • Diversified scene data is thus displayed through a plurality of receiving objects, so that the user can obtain diversified information and can view the information through multiple receiving objects according to individual needs.
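  • As a rough, non-authoritative illustration of this flow, the following Python sketch (all names and data are hypothetical and not taken from the disclosure) shows scene data being acquired and at least part of it being pushed to several receiving objects carried by the vehicle.

```python
# Minimal sketch of the push flow described above; every name here is
# hypothetical and only illustrates pushing parts of the scene data to
# several receiving objects carried by the vehicle.

def acquire_scene_data():
    # Scene data may come from on-board sensors, a cloud server, or
    # smart home devices; here it is simply a list of labelled items.
    return [
        ("vehicle_status", "speed 60 km/h"),
        ("vehicle_status", "engine 5000 rpm"),
        ("traffic_condition", "road construction 500 m ahead"),
        ("navigation", "navigation overview map"),
    ]

def push(receiving_object, items):
    # Stand-in for the real transport (in-vehicle network, wireless link, ...).
    print(f"push to {receiving_object}: {items}")

def push_scene_data(scene_data, receiving_objects):
    # Push at least part of the scene data to each receiving object.
    # Here every object receives the full set; later sections refine
    # this by splitting the data into sub-scene data first.
    for obj in receiving_objects:
        push(obj, scene_data)

push_scene_data(acquire_scene_data(),
                ["dashboard", "center console", "rearview mirror"])
```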
  • FIG. 1 is a schematic diagram of networking of a data pushing apparatus according to an embodiment of the present invention
  • FIG. 2 is a schematic diagram of an application scenario of a data pushing method according to an embodiment of the present invention
  • FIG. 3 is a schematic flowchart of a method for pushing data according to an embodiment of the present invention.
  • FIG. 4 is a schematic flowchart of a method for pushing data according to an embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of a method for acquiring sub-scenario data provided by the present invention
  • FIG. 6 is a schematic diagram of a display interface according to an embodiment of the present invention.
  • FIG. 7 is a schematic diagram of a display interface according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a display interface according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a data pushing apparatus according to an embodiment of the present invention.
  • FIG. 10 is a schematic structural diagram of a data pushing apparatus according to an embodiment of the present invention.
  • FIG. 11 is a schematic structural diagram of hardware of a data pushing device according to an embodiment of the present invention.
  • FIG. 12 is a schematic structural diagram of an in-vehicle Internet operating system according to an embodiment of the present invention.
  • The vehicle involved in the embodiments of the present invention includes, but is not limited to, an automobile or motorcycle, an electric car or electric motorcycle, an electric bicycle, an electric balance vehicle, a remote control vehicle, and the like, as well as small aircraft (for example, unmanned aerial vehicles, small manned aircraft, remotely piloted aircraft, and their variants).
  • The vehicle involved herein may be a fuel-only vehicle, a gas-only vehicle, a combined fuel-gas vehicle, or a power-assisted electric vehicle.
  • The type of the vehicle is not limited in the embodiments of the present invention.
  • FIG. 1 is a schematic diagram of networking of a data pushing apparatus according to an embodiment of the present invention.
  • the network architecture in FIG. 1 may include a vehicle and a cloud server and a smart home device in communication with the vehicle.
  • the vehicle carries a plurality of data collection devices
  • the cloud server can be a plurality of types of servers
  • The smart terminals can be devices such as smart home devices, which can also collect scene data. Each is described separately below.
  • A plurality of data collection devices are disposed in the vehicle, and the data collection devices can also be integrated into the vehicle during the production process.
  • the data collection device may be a camera, a sensor, or the like.
  • For example, the sensor may be a water tank temperature sensor, a speed sensor, or the like, and the camera may be a rear view camera, a side view camera, or the like.
  • the data collection device may also be a hardware module, a software module, or a combination of soft and hard modules inside the vehicle.
  • the data collection device can also be a device within the vehicle that can implement near field communication such that the vehicle can communicate with other vehicles or other devices in near field.
  • the cloud server may also collect scene data, and the vehicle may interact with the cloud server through the wireless network to obtain the scene data.
  • the wireless network may be a 2G network, a 3G network, a 4G network or a 5G network, a Wireless Fidelity (WIFI) network, or the like.
  • the specific type or specific form of the wireless network is not limited in the embodiment of the present invention, as long as it can provide an interface for the vehicle to access the network.
  • each vehicle can be bound to various account numbers, and the information related to the account can be obtained from the corresponding server.
  • the account can receive various information pushed by the e-commerce server, such as logistics information, payment information, and the like.
  • the account can also be an email account, and can receive various emails and other information pushed by the email server.
  • Optionally, a vehicle account may also be set for the vehicle and associated with other accounts, for example an e-commerce account, a social server account, etc.; that is, after logging in to the vehicle account, the user can obtain the other accounts associated with it and obtain the related information under those accounts.
  • the smart home device can communicate with the vehicle through the Internet of Things or the Internet, and feed back information about the smart home to the vehicle.
  • In the embodiments of the present invention, the execution entity may be a data pushing device, where the data pushing device may be a cloud server, a device on the vehicle, or a management device between the cloud server and the device on the vehicle.
  • The management device may be a device that manages communication resources for communication between the cloud server and the device on the vehicle.
  • The type of the data pushing device is not limited in the embodiments of the present invention.
  • The device on the vehicle may be a central control unit on the vehicle, or may be other control equipment on the vehicle with communication functions.
  • When the vehicle is an automobile, the equipment on the vehicle may be an on-board unit, the center console of the vehicle, a driving recorder on the vehicle, a smart rearview mirror, the vehicle dashboard, etc.
  • The automobiles referred to below are only examples of vehicles.
  • the data push device can centrally manage the data and distribute the scene data to the receiving object carried on the vehicle in a targeted manner for a specific driving scenario.
  • For example, the receiving objects may be at least two of a dashboard of the vehicle, a center console of the vehicle, a user equipment, a head up display (HUD), and a smart rearview mirror.
  • The user equipment may be a device inside the vehicle, for example the user's mobile phone, a tablet, a wearable device such as a smart watch or a wristband, a virtual reality (VR) device, an augmented reality (AR) device, or the like.
  • the receiving object is only an example, and the embodiment of the present invention is not limited thereto.
  • As shown in FIG. 2, the application scenario includes the data pushing device 201 and a plurality of receiving objects.
  • The multiple receiving objects may include, for example, a dashboard 202-1 in a vehicle.
  • The data pushing device 201 can acquire the scene data and push the scene data to a plurality of different receiving objects, so that the different receiving objects display the scene data.
  • Diversified scene data is thus displayed through multiple receiving objects, so that the user can obtain diversified information and can view the information through multiple receiving objects according to individual needs.
  • the scene data involved in the embodiment of the present invention includes scene data sent by at least one of a data collection device, a cloud server, and a smart home device in communication with the vehicle.
  • the scene data is different according to the type of the data collection device, the type of the cloud server, and the smart terminal.
  • the data acquisition device can be a sensor, a camera, a center console, etc. carried in the vehicle.
  • For example, the scene data may be vehicle status data such as the vehicle speed, fuel consumption, engine speed, remaining fuel, cabin temperature, odometer, engine status, tire status, seat belt status, etc.
  • the scene data may be information such as photos, videos, and the like around the vehicle.
  • The scene data can also be the playback information controlled by the center console, such as music playback information, video playback information, etc.
  • the cloud server can be a navigation server, a life server, or the like.
  • the scene data may be various navigation data.
  • For example, various data that the navigation server can provide, such as a navigation overview map, a driving indication identifier, a remaining mileage overview map, road congestion information, road clear information, and road accident information.
  • The scene data may also be various life scene data, such as goods logistics information, credit card repayment due information, social information, booking information, travel information, phone bills, order information, etc.
  • the smart home device can communicate with the vehicle via the Internet of Things or the Internet.
  • the scene data may be cooking information of the rice cooker, work information of the sweeping robot, operation information of the air conditioner, etc.
  • For example, the rice cooker starts cooking at 7 o'clock in the evening;
  • the sweeping robot starts to sweep the floor at 10 o'clock in the morning;
  • the air conditioner turns on at 5 o'clock in the afternoon, with the mode set to cooling and the temperature set to 28 °C.
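  • For concreteness, a scene data item of this kind could be represented as a small record that carries its source, its data type, and the content to be displayed; the structure below is purely an illustrative assumption and is not defined by the disclosure.

```python
# Illustrative only: one possible way to represent scene data items
# gathered from the data collection devices, the cloud server, and
# smart home devices (field names are assumptions, not from the patent).
from dataclasses import dataclass

@dataclass
class SceneDataItem:
    source: str      # e.g. "sensor", "cloud_server", "smart_home"
    data_type: str   # e.g. "vehicle_status", "navigation", "home_life"
    content: str     # human-readable payload to be displayed

scene_data = [
    SceneDataItem("sensor", "vehicle_status", "remaining fuel 20 L"),
    SceneDataItem("cloud_server", "navigation", "road congested 700 m ahead"),
    SceneDataItem("smart_home", "home_life", "air conditioner set to cooling, 28 °C"),
]
```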
  • the types of scenes involved in the embodiments of the present invention are mainly classified into a navigation scene, an entertainment scene, a life scene, and the like. That is, the scene data is divided according to the function of the scene data.
  • The navigation scenario may be determined first and the scenario data then obtained, or the scenario may be determined according to the currently obtained scenario data.
  • In the navigation scene, the navigation function is provided to the user, and the scene data pushed to the receiving objects includes the scene data corresponding to the navigation scene, that is, the data acquired from the navigation server.
  • Navigation scenes may also be classified according to road conditions into a road abnormal navigation scene and a road normal navigation scene.
  • The scene data corresponding to the road abnormal navigation scene may include a navigation overview map, traffic condition data, and status data of the vehicle; the scene data corresponding to the road normal navigation scene may include a navigation overview map, a driving indication identifier, a remaining mileage overview map, and status data of the vehicle.
  • In the entertainment scene, the user is provided with an entertainment function, and the scene data pushed to each receiving object includes the scene data corresponding to the entertainment scene, that is, the data acquired through the center console.
  • For example, music playback information, video playback information, radio playback information, and the like.
  • In the life scene, the user is provided with a life service function, and the scene data corresponding to the life scene is mainly the data acquired through the life server and/or the smart home device.
  • For example, goods logistics information, credit card repayment due information, air conditioner operation information, etc.
  • The data types of the sub-scenes involved in the embodiments of the present invention are mainly obtained along two dimensions.
  • One dimension classifies the data according to the type of the scene, and the other dimension classifies the data according to its importance to the user.
  • The data type of a sub-scene corresponds to sub-scene data, and the sub-scene data is obtained by processing the scene data into a plurality of sub-scene data.
  • the scene type can be divided into a scene type related to driving, a scene type related to the user, and a scene type related to the smart home.
  • For example, if the navigation scene is a road abnormal navigation scene, the data type of the sub-scene corresponding to the road abnormal navigation scene includes at least two of the following: a navigation overview map, traffic condition data, and vehicle status data.
  • If the navigation scene is a road normal navigation scene, the data type of the sub-scene corresponding to the road normal navigation scene includes at least two of the following: a navigation overview map, a driving indication identifier, a remaining mileage overview map, and status data of the vehicle.
  • If the scene type is an entertainment scene, the data type of the sub-scene corresponding to the entertainment scene includes at least two of the following: entertainment data, traffic condition data, and status data of the vehicle.
  • For the scene type related to the user: for example, if the scene type is an e-commerce life scene, the data type of the sub-scene corresponding to the e-commerce life scene includes logistics data, payment data, and product evaluation data.
  • For the scene type related to the smart home: for example, if the scene type is a home life scene, the data type of the sub-scene corresponding to the home life scene includes the running information of working smart home devices and the information of faulty smart home devices.
  • the data types of the sub-scenarios related to driving include: driving assistance data, emergency data, full amount of information data, and driving necessary data.
  • the data types of the sub-scenarios related to the user include: necessary data for living, and living auxiliary data.
  • the data types of the sub-scenarios related to the smart home include: necessary home data, auxiliary home data.
  • The sub-scene data involved in the embodiments of the present invention mainly corresponds to the data types of the sub-scenes described above.
  • For example, the traffic condition data corresponds to data such as road congestion and road accidents;
  • the state data of the vehicle corresponds to data such as speed, engine speed, odometer, engine state, etc.; the driving necessary data corresponds to data such as speed, remaining fuel, road indications, etc.; the driving assistance data corresponds to the remaining mileage overview map;
  • the necessary data for life corresponds to data such as the credit card repayment date and mobile phone arrears information, and the life assistance data includes social information, goods logistics information, etc.;
  • the necessary household data corresponds to information on running home equipment, home equipment fault information, etc., and the auxiliary home data corresponds to information on home equipment to be operated after a preset time period, information that home equipment supplies are insufficient, and the like.
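  • The correspondence just described can be pictured as a simple lookup from sub-scene data types to the concrete kinds of data they cover; the dictionary below only paraphrases the text, and its structure is an assumption made for illustration.

```python
# A rough rendering of the correspondence described above between the
# data types of the sub-scenes and the concrete data they cover.
# Keys and values paraphrase the text; the structure is assumed.
SUB_SCENE_DATA_TYPES = {
    "traffic_condition_data": ["road congestion", "road accident"],
    "vehicle_status_data": ["speed", "engine speed", "odometer", "engine state"],
    "driving_necessary_data": ["speed", "remaining fuel", "road indication"],
    "driving_assistance_data": ["remaining mileage overview map"],
    "life_necessary_data": ["credit card repayment date", "phone arrears"],
    "life_assistance_data": ["social information", "goods logistics information"],
    "home_necessary_data": ["running home equipment", "home equipment faults"],
    "home_assistance_data": ["equipment scheduled to run", "supplies running low"],
}
```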
  • FIG. 3 is a schematic flowchart of a method for pushing data according to an embodiment of the present invention.
  • the method may include:
  • the scene data may be a text form, a picture form, a voice form, a video form, or the like.
  • The acquired at least part of the scene data is pushed to the plurality of receiving objects carried by the vehicle; that is, all of the scene data may be pushed to the receiving objects, or only part of the scene data may be pushed.
  • The scene data pushed to each receiving object may be the same, different, or partially identical; that is, there may be differences between the scene data pushed to at least two of the receiving objects. This embodiment is not particularly limited herein.
  • When the scene data pushed to the multiple receiving objects is not the same, the scene data may be randomly split into multiple parts that are then pushed to different receiving objects, or the scene data may be processed according to a preset rule to obtain multiple parts that are then pushed to different receiving objects.
  • Optionally, the receiving objects may include at least two of the following: a dashboard, a center console, a head-up display (HUD), a rearview mirror, a mirror, a projection device, a smart terminal having a communication connection with the vehicle, and the like.
  • The smart terminal may be a terminal device such as a mobile phone or a tablet located in the vehicle.
  • When the receiving object is a display device, the receiving object may display the scene data after receiving it; when the receiving object is a projection device, the receiving object may project the received scene data onto a matching projection screen so that the scene data is displayed on the projection screen.
  • the type of each receiving object can be set according to actual needs, which is not specifically limited in the present invention.
  • In the present invention, scene data is acquired and at least part of the scene data is pushed to a plurality of receiving objects carried by the vehicle, where the scene data is used for display, so that diversified scene data is displayed through multiple receiving objects; the user can thus obtain diversified information and view the information through multiple receiving objects according to individual needs.
  • In order to display as much of the scene data as possible through the receiving objects for the user to view, the acquired scene data is reasonably pushed to different receiving objects, so that the corresponding scene data can be pushed to each receiving object.
  • FIG. 4 is a schematic flowchart of a method for pushing data according to an embodiment of the present invention.
  • the method may include:
  • S402. Process the scene data to obtain multiple sub-scene data.
  • the scene data is processed to obtain a plurality of sub-scene data, and each sub-scene data is part or all of the scene data.
  • Optionally, the scene data may be randomly split into a plurality of sub-scene data.
  • For example, the scene data may be randomly divided into multiple sub-scene data of equal data amount according to the data amount of the scene data,
  • or the scene data may be split into multiple sub-scene data according to the order in which the scene data was acquired. The specific implementation of randomly splitting the scene data into multiple sub-scene data is not described again in this embodiment.
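  • The two simple splitting strategies mentioned above (equal-sized parts by data amount, or parts in acquisition order) might look like the sketch below; the function names and chunking details are assumptions.

```python
# Sketch of the two splitting strategies mentioned above:
# equal-sized chunks by data amount, or chunks in acquisition order.

def split_equally(scene_data, n_parts):
    """Split the scene data into n_parts sub-scene data of roughly equal size."""
    size = -(-len(scene_data) // n_parts)  # ceiling division
    return [scene_data[i:i + size] for i in range(0, len(scene_data), size)]

def split_by_order(scene_data, boundaries):
    """Split the scene data at the given indices, preserving acquisition order."""
    cuts = [0, *boundaries, len(scene_data)]
    return [scene_data[a:b] for a, b in zip(cuts, cuts[1:])]

data = [f"data {i}" for i in range(1, 11)]
print(split_equally(data, 3))        # three sub-scene data of sizes 4, 4, 2
print(split_by_order(data, [3, 6]))  # sizes 3, 3, 4 as in Table 1
```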
  • Optionally, the correspondence between the receiving objects and the data types of the sub-scenes may be preset; the correspondence may be preset by the user, or may be determined according to the type of each receiving object and the data type of each sub-scene.
  • the scene data may be processed according to the correspondence relationship to obtain a plurality of sub-scene data.
  • Optionally, each sub-scene data may be different from the others. For example, suppose that the scene data includes 10 data items, recorded as data 1 to data 10, and that the 10 data items are processed to obtain 3 sub-scene data; the data included in the three sub-scene data can be as shown in Table 1:
  • Sub-scene data 1: data 1, data 2, data 3
  • Sub-scene data 2: data 4, data 5, data 6
  • Sub-scene data 3: data 7, data 8, data 9, data 10
  • As shown in Table 1, the sub-scene data do not include duplicate data. It should be noted that Table 1 merely illustrates by way of example the sub-scene data obtained from the scene data and the data included in each sub-scene data; this is not a limitation.
  • Optionally, only some of the sub-scene data may differ from each other. For example, suppose that the scene data includes 10 data items, recorded as data 1 to data 10, and that the 10 data items are processed to obtain 3 sub-scene data; the data included in the three sub-scene data can be as shown in Table 2:
  • Sub-scene data 1: data 1, data 2, data 3, data 5
  • Sub-scene data 2: data 4, data 5, data 6
  • Sub-scene data 3: data 4, data 5, data 6
  • As shown in Table 2, sub-scene data 2 and sub-scene data 3 include identical data. It should be noted that Table 2 only illustrates by way of example the sub-scene data obtained from the scene data and the data included in each sub-scene data, and is not limited thereto.
  • the receiving object corresponding to each sub-scene data is acquired, and the sub-scene data is pushed to the corresponding receiving object.
  • the receiving object corresponding to the sub-scene data 1 is a dashboard
  • the sub-scene data 1 is pushed to the dashboard, and the sub-scene data 1 is displayed by the dashboard
  • the receiving object corresponding to the sub-scene data 2 is the center console.
  • the sub-scene data 2 is pushed to the center console, and the sub-scene data 2 is displayed by the center console.
  • In this embodiment, the scene data is processed to obtain at least two sub-scene data, and each sub-scene data is pushed to its corresponding receiving object according to the correspondence between the sub-scene data and the receiving objects,
  • so that each receiving object obtains the corresponding sub-scene data. This makes full use of the space inside the vehicle, avoids each receiving object receiving redundant scene data, and improves the rationality and simplicity with which each receiving object receives the scene data. Moreover, since there are at least two receiving objects, more scene data can be output to the user, so that the user can obtain the required data in a targeted manner.
  • scenario data can be processed by the following two feasible implementation manners to obtain multiple sub-scene data (S402 in the embodiment shown in FIG. 4), specifically:
  • a feasible implementation manner is: processing the scene data according to a data type of the sub-scene to obtain a plurality of sub-scene data.
  • the data type of the sub-scene may include: driving assistance data, emergency data, full amount of information data, and driving necessary data.
  • the data type of each sub-scene may include scene data of at least one data type.
  • the driving assistance data may include at least one of traffic condition data and a remaining mileage overview map
  • the driving necessary data may include at least one of status data of the vehicle, a driving indication identifier
  • the full amount information data may include a navigation overview map
  • the emergency data may include abnormal data of the vehicle.
  • Optionally, the data type of the sub-scene to which each data item belongs may be determined according to the correspondence between the type of each data item in the scene data and the data types of the sub-scenes, thereby obtaining multiple sub-scene data.
  • For example, the acquired scene data includes: a remaining mileage overview map, the driving speed (status data of the vehicle), the remaining fuel quantity (status data of the vehicle), and a "turn left in 100 m" sign (driving indication identifier).
  • Optionally, the number of data types of the sub-scenes is greater than or equal to the number of obtained sub-scene data. That is, even if the data types of the sub-scenes include five types, in the actual application process only scene data corresponding to three of the sub-scene data types may be acquired, and the number of obtained sub-scene data is then 3.
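  • A minimal sketch of this first implementation manner, assuming a hypothetical mapping from data item types to sub-scene data types, is given below; it mirrors the example above, where only three sub-scene data types occur in the acquired scene data.

```python
# Sketch of the first implementation manner: group scene data items
# into sub-scene data according to the data type of the sub-scene
# each item belongs to. The mapping and item labels are assumptions
# that merely mirror the example in the text.
from collections import defaultdict

TYPE_TO_SUB_SCENE = {
    "remaining mileage overview map": "driving_assistance_data",
    "driving speed": "vehicle_status_data",
    "remaining fuel": "vehicle_status_data",
    "turn left in 100 m": "driving_indication",
}

def split_by_sub_scene_type(scene_data):
    sub_scenes = defaultdict(list)
    for item in scene_data:
        sub_scenes[TYPE_TO_SUB_SCENE[item]].append(item)
    return dict(sub_scenes)

scene_data = ["remaining mileage overview map", "driving speed",
              "remaining fuel", "turn left in 100 m"]
# Only three sub-scene data types occur here, so three sub-scene data result.
print(split_by_sub_scene_type(scene_data))
```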
  • In another feasible implementation manner, the scene data is processed according to the scene type of the vehicle to obtain at least two sub-scene data. For details, refer to the embodiment shown in FIG. 5.
  • FIG. 5 is a schematic flowchart of a method for obtaining sub-scenario data according to the present invention.
  • the method may include:
  • S501. Determine a scene type according to the acquired scene data;
  • S502. Obtain the data type of the sub-scene corresponding to the scene type according to the correspondence between the scene type and the data type of the sub-scene;
  • S503. Process the scene data according to the data type of the sub-scene to obtain a plurality of sub-scene data.
  • In S501, the scene type is determined according to the acquired scene data. The scene type may include a navigation scene, an entertainment scene, a high-speed scene, an urban scene, a reversing scene, etc.
  • the data type in the scene data can be analyzed to determine the type of the scene.
  • For example, it may be determined whether the scene data includes a navigation line, and if so, the scene type may be determined to be a navigation scene; it may also be determined whether the scene data includes multimedia data (such as music, video, etc.), and if so, the scene type may be determined to be a music scene.
  • If the scene data includes both a navigation line and multimedia data, the scene type may be determined according to the priorities of the navigation line and the multimedia data. For example, if the priority of the navigation line is higher than that of the multimedia data, the scene type may be determined to be a navigation scene.
  • Optionally, the scene type may be determined to be a high-speed scene or an urban scene according to the location information of the vehicle.
  • Optionally, the scene type may also be determined according to the driving speed of the vehicle.
  • In S502, the correspondence between the scene type and the data types of the sub-scenes is obtained; the correspondence may be determined according to empirical values or preset by the user. The data types of the plurality of sub-scenes corresponding to the scene type are then obtained according to the correspondence, and the scene data is processed according to these data types to obtain a plurality of sub-scene data.
  • For example, if it is determined according to the scene data that the scene type is a road abnormal navigation scene, the data types of the sub-scenes corresponding to the road abnormal navigation scene are acquired: a navigation overview map, traffic condition data, and vehicle status data. The data type of the sub-scene to which each data item belongs is then determined according to the correspondence between the type of each data item in the scene data and the data types of the sub-scenes, thereby obtaining a plurality of sub-scene data, as shown in Table 6.
  • In this embodiment, the data types of the sub-scenes corresponding to the scene type are determined according to the scene type, and the scene data is processed according to these data types to obtain a plurality of sub-scene data, so that the obtained sub-scene data correspond to the current scene of the vehicle, thereby improving the accuracy of the determined sub-scene data.
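  • The S501 to S503 flow could be sketched as follows; the scene-type detection rule and the correspondence table are assumptions that only paraphrase the examples in the text.

```python
# Sketch of S501-S503: determine the scene type from the scene data,
# look up the sub-scene data types corresponding to that scene type,
# and group the scene data accordingly. All mappings are assumptions.

SCENE_TYPE_TO_SUB_SCENE_TYPES = {
    "road_abnormal_navigation": ["navigation_overview_map",
                                 "traffic_condition_data",
                                 "vehicle_status_data"],
    "road_normal_navigation": ["navigation_overview_map",
                               "driving_indication",
                               "remaining_mileage_overview_map",
                               "vehicle_status_data"],
}

def determine_scene_type(scene_data):
    # S501: traffic-incident items suggest a road abnormal navigation
    # scene; otherwise assume normal navigation (illustrative rule only).
    has_incident = any(t == "traffic_condition_data" for t, _ in scene_data)
    return "road_abnormal_navigation" if has_incident else "road_normal_navigation"

def split_by_scene_type(scene_data):
    scene_type = determine_scene_type(scene_data)
    wanted = SCENE_TYPE_TO_SUB_SCENE_TYPES[scene_type]        # S502
    return {t: [c for dt, c in scene_data if dt == t]          # S503
            for t in wanted if any(dt == t for dt, _ in scene_data)}

scene_data = [("navigation_overview_map", "overview"),
              ("traffic_condition_data", "road construction 500 m ahead"),
              ("vehicle_status_data", "speed 60 km/h")]
print(split_by_scene_type(scene_data))
```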
  • the receiving object corresponding to each sub-scene data may be determined according to the data type of the sub-scene. Specifically, the receiving object corresponding to each sub-scene data may be determined by the following two feasible implementation manners:
  • a feasible implementation manner is: determining a receiving object corresponding to each sub-scene data according to a pushing priority corresponding to a data type of each sub-scene.
  • the data type of each sub-scene has its corresponding push priority, and different push priorities correspond to different receiving objects.
  • the push priority corresponding to the data type of each sub-scene may be preset by the user, or may be determined according to historical experience values. In the actual application process, the priority of the data type of each sub-scene and the receiving object corresponding to each push priority may be set according to actual needs.
  • For example, suppose the scene data is processed to obtain three sub-scene data, recorded as sub-scene data 1 to sub-scene data 3. The data type of each sub-scene, the push priority of each sub-scene data type, the correspondence between push priorities and receiving objects, and the resulting correspondence between the sub-scene data and the receiving objects are shown in Table 7:
  • Sub-scene data 1: data type = vehicle status data, push priority 3, receiving object = dashboard
  • Sub-scene data 2: data type = entertainment data, navigation overview map, push priority 2, receiving object = center console
  • Sub-scene data 3: data type = traffic condition data, push priority 1, receiving object = rearview mirror
  • As shown in Table 7, the information absolutely necessary for the user to drive (for example, the status data of the vehicle) is pushed to a receiving object that is convenient for the user to view, such as the dashboard or a HUD; correspondingly, the priority of the vehicle status data is set to 3 (highest).
  • Sub-scene data that users frequently view (such as entertainment data and the navigation overview map) is pushed to a receiving object that the user can view conveniently without affecting driving, such as the center console; correspondingly, the priority of the entertainment data and the navigation overview map is set to 2.
  • Urgent information that the user needs to know and other secondary driving assistance information are pushed to a receiving object that the user does not use as often, such as the rearview mirror; correspondingly, the priority of the traffic condition data can be set to 1. In this way, data is pushed to the user in a targeted manner.
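  • A compact sketch of this priority-based manner, with a hypothetical priority table roughly following Table 7, is shown below.

```python
# Sketch of the priority-based manner: each sub-scene data type has a
# push priority, and each priority maps to a receiving object, roughly
# as in Table 7. Both mappings are assumptions for illustration.

PUSH_PRIORITY = {
    "vehicle_status_data": 3,     # must be seen while driving
    "entertainment_data": 2,
    "navigation_overview_map": 2, # frequently viewed
    "traffic_condition_data": 1,  # urgent but secondary assistance
}

PRIORITY_TO_RECEIVER = {
    3: "dashboard",       # or HUD: always in the driver's view
    2: "center console",  # convenient but does not affect driving
    1: "rearview mirror", # less frequently used surface
}

def receiver_for(sub_scene_type):
    return PRIORITY_TO_RECEIVER[PUSH_PRIORITY[sub_scene_type]]

for t in ("vehicle_status_data", "navigation_overview_map", "traffic_condition_data"):
    print(t, "->", receiver_for(t))
```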
  • Another feasible implementation manner is: determining a receiving object corresponding to each sub-scene data according to a preset correspondence between a data type of each sub-scene and each receiving object.
  • the system may preset a correspondence between a data type of each sub-scene and each received object.
  • the user can also preset the correspondence between the data type of each sub-scenario and each receiving object according to actual needs.
  • the data type of each sub-scene and each received object may be output to the user, and the corresponding relationship determined by the user is received.
  • In one feasible implementation manner, the data type of each sub-scene and each receiving object are output to the user in text form, so that the user can perform an association operation on the data types of the sub-scenes and the receiving objects, and the correspondence between the data type of each sub-scene and each receiving object is determined according to the association operation input by the user.
  • The association operation may be a connection operation or a selection operation between the data type of a sub-scene and a receiving object. In another feasible implementation manner, the data type of each sub-scene and each receiving object are output to the user in voice form, so that the user inputs the correspondence between the data type of each sub-scene and each receiving object in voice or text form according to the received data types and receiving objects.
  • the preset correspondence relationship between the data type of each sub-scene and each receiving object may be acquired, and the receiving object corresponding to each sub-scene data is determined according to the corresponding relationship.
  • For example, suppose three sub-scene data are obtained, recorded as sub-scene data 1 to sub-scene data 3; the data type of the sub-scene corresponding to each sub-scene data and the correspondence between the data types of the sub-scenes and the receiving objects are shown in Table 8:
  • Sub-scene data 1: data type = full amount of information data, receiving object = center console
  • Sub-scene data 2: data type = emergency data, receiving object = rearview mirror or mirror
  • Sub-scene data 3: data type = driving necessary data, receiving object = dashboard or HUD
  • In this way, the receiving object corresponding to each sub-scene data can be determined.
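  • A sketch of this second manner, using a preset correspondence roughly following Table 8 that a user-supplied correspondence may override, is shown below; all names are illustrative.

```python
# Sketch of the second manner: a preset correspondence between the
# data type of each sub-scene and a receiving object, which the user
# may override, roughly as in Table 8. Names are illustrative only.

DEFAULT_CORRESPONDENCE = {
    "full_information_data": "center console",
    "emergency_data": "rearview mirror",
    "driving_necessary_data": "dashboard",   # or HUD
}

def receivers_for(sub_scene_types, user_correspondence=None):
    """Map each sub-scene data type to its receiving object.

    user_correspondence is an optional mapping entered by the user
    (e.g. via an on-screen association operation) that overrides the
    preset defaults."""
    table = {**DEFAULT_CORRESPONDENCE, **(user_correspondence or {})}
    return {t: table[t] for t in sub_scene_types}

print(receivers_for(["full_information_data", "driving_necessary_data"],
                    user_correspondence={"driving_necessary_data": "HUD"}))
```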
  • Example 1: assume that the acquired scene data is as shown in Table 9:
  • Navigation overview map; road construction 500 meters ahead; road congested 700 meters ahead; traffic accident 1 km ahead; speed 60 km/h; engine speed 5000 rpm
  • After it is determined that the scenario type is a road abnormal navigation scenario,
  • the data types of the sub-scenes corresponding to the scenario type are obtained;
  • the data types of the sub-scenes corresponding to the road abnormal navigation scenario are: the navigation overview map, the traffic condition data, and the status data of the vehicle.
  • According to the scene data shown in Table 9 and the data types of the sub-scenes, each sub-scene data and its corresponding receiving object can be determined as shown in Table 10:
  • The receiving object corresponding to each sub-scene data may be determined according to the correspondence between the data types of the sub-scenes and the receiving objects. Suppose that when the data type of the sub-scene is the navigation overview map,
  • the corresponding receiving object is the center console; when the data type of the sub-scene is the traffic condition data, the corresponding receiving object is the rearview mirror; and when the data type of the sub-scene is the status data of the vehicle, the corresponding receiving object is the dashboard.
  • FIG. 6 is a schematic diagram of a display interface according to an embodiment of the present invention.
  • Referring to FIG. 6, a center console interface 601, a rearview mirror interface 602, and a dashboard interface 603 are included, wherein:
  • a navigation overview map is displayed in the center console interface 601.
  • the navigation overview map includes the navigation route (the route where the positions A, B, and C are located) and the traffic condition of the navigation route, as shown by the interface 601.
  • the current position of the user is position A
  • the traffic is congested in the section from position B to position C
  • the traffic of other sections is smooth.
  • The navigation overview map also includes the navigation indication (enter the cultural road 200 meters ahead), the remaining distance of 25 km,
  • the estimated arrival time of 18:43, and the remaining travel time of 43 minutes.
  • Interface 601 merely illustrates the data included in the navigation overview map by way of example; of course, in the actual application process, the data included in the navigation overview map can be set according to actual needs.
  • Here, only the basic road condition information is displayed in the navigation overview map, and additional information such as construction and accidents is not superimposed, thereby ensuring the simplicity and readability of the center console interface.
  • The traffic condition data is displayed on the rearview mirror interface 602: "road construction 500 meters ahead, road congested 700 meters ahead, traffic accident 1 km ahead".
  • the status data of the vehicle is displayed in the dashboard interface 603: "speed of 60 km/h, engine speed of 5000 rpm”.
  • In this embodiment, the different sub-scene data in the scene data are pushed in a unified manner to the center console, the rearview mirror, and the dashboard, so that the center console, the rearview mirror, and the dashboard can acquire the scene data synchronously and in real time.
  • Further, a simple navigation overview map is displayed on the center console, the road condition information is displayed on the rearview mirror, and the status data of the vehicle is displayed on the dashboard;
  • the corresponding scene data is thus pushed to each receiving object according to the role of the receiving object in the scenario, so that the scene data is reasonably distributed and the user can easily view it.
  • Example 2: assume that the acquired scene data is as shown in Table 11:
  • Navigation overview map; remaining mileage overview map; enter the cultural road 200 meters ahead; speed 60 km/h; engine speed 5000 rpm
  • After it is determined that the scenario type is a road normal navigation scenario, the data types of the sub-scenes corresponding to the scenario type are obtained; the data types of the sub-scenes corresponding to the road normal navigation scenario are: the navigation overview map,
  • the driving indication identifier, the remaining mileage overview map, and the status data of the vehicle. According to the scene data shown in Table 11 and the data types of the sub-scenes, each sub-scene data and its corresponding receiving object can be determined as shown in Table 12:
  • The receiving object corresponding to each sub-scene data may be determined according to the correspondence between the data types of the sub-scenes and the receiving objects. When the data type of the sub-scene is the navigation overview map, the corresponding receiving object is the center console; when the data type of the sub-scene is the remaining mileage overview map, the corresponding receiving object is the rearview mirror; and when the data type of the sub-scene is the driving indication identifier or the status data of the vehicle, the corresponding receiving object is the dashboard.
  • Specifically, the "navigation overview map" is pushed to the center console, the "remaining mileage overview map" is pushed to the rearview mirror, and "enter the cultural road 200 meters ahead, speed 60 km/h, engine speed 5000 rpm" is pushed to the dashboard.
  • FIG. 7 is a schematic diagram of a display interface according to an embodiment of the present invention. Referring to FIG. 7, a center console interface 701, a rearview mirror interface 702, and a dashboard interface 703 are included, wherein:
  • a navigation overview map is displayed in the center console interface 701.
  • the navigation overview map includes navigation routes (the lines where the positions A, B, C, and D are located) and traffic conditions of the navigation route, such as interface 701.
  • the starting position of the navigation route is position A
  • the ending position of the navigation route is position D
  • the current position of the user is position B
  • the road map is also included in the navigation overview map, for example, the speed limit at position C is 80 km/h.
  • The driving indication (enter the cultural road 200 meters ahead),
  • the remaining distance of 25 km, the estimated arrival time of 18:43, and the remaining travel time of 43 minutes are also included.
  • Interface 701 merely illustrates the data included in the navigation overview map by way of example;
  • of course, in the actual application process, the data included in the navigation overview map can be set according to actual needs.
  • a remaining mileage overview map is displayed in the rearview mirror interface 702.
  • The driving indication "enter the cultural road 200 meters ahead" and the status data of the vehicle ("speed 60 km/h, engine speed 5000 rpm") are displayed on the dashboard interface 703.
  • In this embodiment, the different sub-scene data in the scene data are pushed in a unified manner to the center console, the rearview mirror, and the dashboard, so that the center console, the rearview mirror, and the dashboard can acquire the scene data synchronously and in real time. Further, the map and navigation information (the navigation overview map) are displayed on the center console, the remaining mileage overview map is displayed on the rearview mirror, and the driving indication and the status data of the vehicle are displayed on the dashboard, so that the scene data is reasonably allocated and the user can conveniently view it.
  • FIG. 8 is a schematic diagram of a display interface according to an embodiment of the present invention.
  • the interface 801 is a rear view mirror interface, including a line overview map, a starting point of the line (Xixi Butterfly Garden), and an end point of the line (Wuzhen East Gate Parking).
  • FIG. 8 only shows an example of what is displayed on the rearview mirror;
  • the scene data displayed on the rearview mirror can be set according to actual needs in the actual application process, which is not specifically limited in the present invention.
  • a push device for data will be described in detail below.
  • The data pushing device can be implemented in the infrastructure of a vehicle or a terminal device, or in an interactive system between a server and a client.
  • These data pushing devices can be constructed using commercially available hardware components configured through the steps taught by the present scheme,
  • for example a processor component (or processing module, processing unit);
  • the processor component can use components such as a single-chip microcomputer, a microcontroller, a microprocessor, etc. from Texas Instruments, Intel Corporation, ARM, and the like.
  • FIG. 9 is a schematic structural diagram of a data pushing apparatus according to an embodiment of the present invention.
  • The data pushing apparatus provided in this embodiment includes:
  • the data acquisition module 10 is configured to acquire scene data.
  • the pushing module 11 is configured to respectively push at least part of the scene data to a plurality of receiving objects carried by the vehicle, where the scene data is used for display.
  • the pushing device of the data provided in this embodiment may be used to perform the foregoing method embodiments, and the implementation principle and technical effects are similar, and details are not described herein again.
  • FIG. 10 is a schematic structural diagram of a data pushing apparatus according to an embodiment of the present invention, implemented on the basis of the embodiment of FIG. 9.
  • A processing module 12 and an object obtaining module 13 are further included.
  • the processing module 12 is configured to process the scene data to obtain a plurality of sub-scene data
  • the object obtaining module 13 is configured to acquire a receiving object corresponding to each of the sub-scene data.
  • the pushing module 11 is specifically configured to push each of the sub-scene data to a corresponding receiving object.
  • the processing module 12 is specifically configured to process the scene data according to a data type of the sub-scene to obtain a plurality of sub-scene data.
  • the number of data types of the sub-scene is greater than or equal to the number of obtained sub-scene data.
  • Optionally, the apparatus further includes a type obtaining module 14, configured to determine a scene type according to the scene data;
  • the object obtaining module 13 is specifically configured to determine, according to a data type of the sub-scene, a receiving object corresponding to each of the sub-scene data.
  • the object obtaining module 13 is specifically configured to determine, according to a pushing priority corresponding to a data type of each of the sub-scene, a receiving object corresponding to the sub-scene data.
  • the object obtaining module 13 is specifically configured to acquire a preset correspondence between a data type of each of the sub-scene and each received object;
  • Optionally, the apparatus further includes a relationship obtaining module 15, configured to output the data type of each of the sub-scenes and each of the receiving objects to the user, and to receive the correspondence determined by the user.
  • Optionally, the scene type includes a road abnormal navigation scenario;
  • the data type of the sub-scene corresponding to the road abnormal navigation scenario includes at least two of the following:
  • a navigation overview map, traffic condition data, and vehicle status data.
  • If the data type of the sub-scene is the navigation overview map, the corresponding receiving object is the center console;
  • if the data type of the sub-scene is the traffic condition data, the corresponding receiving object is the rearview mirror;
  • if the data type of the sub-scene is the vehicle status data, the corresponding receiving object is the dashboard.
  • Optionally, if the scene type is a road normal navigation scenario, the data type of the sub-scene corresponding to the road normal navigation scenario includes at least two of the following:
  • a navigation overview map, a driving indication identifier, a remaining mileage overview map, and vehicle status data.
  • If the data type of the sub-scene is the navigation overview map, the corresponding receiving object is the center console;
  • if the data type of the sub-scene is the remaining mileage overview map, the corresponding receiving object is the rearview mirror;
  • if the data type of the sub-scene is the driving indication identifier or the vehicle status data, the corresponding receiving object is the dashboard.
  • Optionally, if the scene type is an entertainment scene, the data type of the sub-scene corresponding to the entertainment scene includes at least two of the following: entertainment data, traffic condition data, and vehicle status data.
  • If the data type of the sub-scene is the entertainment data, the corresponding receiving object is the center console;
  • if the data type of the sub-scene is the traffic condition data, the corresponding receiving object is the rearview mirror;
  • if the data type of the sub-scene is the vehicle status data, the corresponding receiving object is the dashboard.
  • the data type of the sub-scene includes at least two of the following:
  • driving assistance data, emergency data, full amount of information data, and driving necessary data.
  • the driving assistance data includes at least one of traffic condition data and a remaining mileage overview map
  • the driving necessary data includes at least one of status data of the vehicle and a driving indication identifier
  • the full amount of information data includes at least one of a navigation overview map and entertainment data
  • the emergency data includes abnormal data of the vehicle.
  • If the data type of the sub-scene is the emergency data, the corresponding receiving object is the rearview mirror or the mirror;
  • if the data type of the sub-scene is the full amount of information data, the corresponding receiving object is the center console;
  • if the data type of the sub-scene is the driving necessary data, the corresponding receiving object is the dashboard or a HUD.
  • the data acquiring module 10 is specifically configured to acquire the scenario data sent by at least one of a data collection device, a cloud server, and a smart home device that is in communication connection with the vehicle. .
  • the receiving object includes at least two of the following:
  • a dashboard in communication with the vehicle.
  • a center console in communication with the vehicle.
  • a head-up display (HUD) in communication with the vehicle.
  • a rearview mirror in communication with the vehicle.
  • a smart terminal in communication with the vehicle.
  • the pushing device of the data provided in this embodiment may be used to perform the foregoing method embodiments, and the implementation principle and technical effects are similar, and details are not described herein again.
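  • As a rough structural sketch only, the modules of FIG. 9 and FIG. 10 could be arranged in code as follows; the class and method names echo the module names in the text, while the implementation details are assumptions.

```python
# Rough sketch of how the modules of FIG. 9 and FIG. 10 could fit
# together; the method names follow the module names in the text,
# but the implementation is only an illustrative assumption.

class DataPushingApparatus:
    def __init__(self, correspondence):
        # correspondence: sub-scene data type -> receiving object
        self.correspondence = correspondence

    def acquire_scene_data(self):            # data acquisition module 10
        return [("vehicle_status_data", "speed 60 km/h"),
                ("traffic_condition_data", "accident 1 km ahead")]

    def process(self, scene_data):           # processing module 12
        sub_scenes = {}
        for data_type, content in scene_data:
            sub_scenes.setdefault(data_type, []).append(content)
        return sub_scenes

    def receiving_object_for(self, data_type):  # object obtaining module 13
        return self.correspondence[data_type]

    def push(self, sub_scenes):               # pushing module 11
        for data_type, items in sub_scenes.items():
            print(f"push {items} to {self.receiving_object_for(data_type)}")

apparatus = DataPushingApparatus({"vehicle_status_data": "dashboard",
                                  "traffic_condition_data": "rearview mirror"})
apparatus.push(apparatus.process(apparatus.acquire_scene_data()))
```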
  • FIG. 11 is a schematic structural diagram of hardware of a data pushing device according to an embodiment of the present invention. As shown in FIG. 11, the push device of the data includes:
  • the memory 23 may include a high speed RAM memory, and may also include a non-volatile memory NVM, such as at least one disk memory, in which various programs may be stored for performing various processing functions and implementing the method steps of the present embodiment.
  • The processor 21 can be implemented by, for example, a central processing unit (CPU), an application specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic component, and is coupled to the input device 20 and the output device 22 by a wired or wireless connection.
  • the input device 20 may include multiple input devices, for example, at least one of a device-oriented device interface, a software programmable interface, and a transceiver.
  • The output device 22 described above may include a variety of output devices, for example, at least one of a software-programmable interface, a transceiver, and a device-oriented in-vehicle device interface.
  • The device-oriented device interface may be a wired interface used for data transmission between devices, or may be a hardware insertion interface (for example, a USB interface, a serial port, etc.) used for data transmission between devices.
  • The software-programmable interface may be, for example, an entry for the user to edit or modify a program, such as an input pin interface or an input interface of a chip; optionally, the transceiver may be a radio frequency transceiver chip with a communication function, a baseband processing chip, a transceiver antenna, and the like.
  • the push device for data in the embodiment of the present invention is a general data push device. It can be applied to any control system or push device or other type of device.
  • In a possible design, the data pushing device may be a data pushing device for a vehicle, for example, a data pushing device for an automobile, a data pushing device for an aircraft, a data pushing device for a waterborne vehicle, etc.
  • the present invention provides another embodiment for introduction. Please refer to the following embodiments, which will not be described in detail herein.
  • the input device 20 is configured to acquire scene data.
  • the processor 21 is coupled to the output device 22 and the input device 20 for controlling the output device 22 to respectively push at least part of the scene data to a plurality of receiving objects carried by the vehicle, the scene data being used for display.
  • the processor 21 is further configured to: process the scene data, obtain a plurality of sub-scene data, and acquire a receiving object corresponding to each of the sub-scene data;
  • the processor 21 is specifically configured to control the output device 22 to push each of the sub-scene data to a corresponding receiving object.
  • the processor 21 is configured to process the scene data according to a data type of the sub-scene to obtain a plurality of sub-scene data.
  • the processor 21 is further configured to determine, according to a data type of the sub-scene, a receiving object corresponding to each of the sub-scene data.
  • the processor 21 is specifically configured to determine, according to a push priority corresponding to a data type of each of the sub-scene, a receiving object corresponding to each of the sub-scene data.
  • the processor 21 is configured to: obtain a preset correspondence between a data type of each of the sub-scene and each received object, and determine, according to the correspondence, a receiving object corresponding to each of the sub-scene data. .
  • the input device 20 is specifically configured to acquire the scenario data sent by at least one of a data collection device, a cloud server, and a smart home device in a communication connection with the vehicle.
  • the pushing device of the data provided in this embodiment may be used to perform the foregoing method embodiments, and the implementation principle and technical effects are similar, and details are not described herein again.
  • The present invention further provides a vehicle control device, which is a specific implementation of the data pushing device for a vehicle.
  • The vehicle control device may be an original vehicle device, a device added after the vehicle leaves the factory, and the like.
  • the vehicle control device can include an onboard output device, an onboard input device, and an onboard processor coupled to the onboard output device and the onboard input device.
  • the "airborne" in the "airborne input device", "airborne output device", and "airborne processor" means carried on a vehicle: they may be a "vehicle input device", "vehicle output device", and "vehicle processor" carried on an automobile, may be an "onboard input device", "onboard output device", and "onboard processor" carried on an aircraft, or may be carried on other types of vehicles; the above examples do not limit the meaning of "airborne" in the embodiments of the present invention.
  • for example, when the vehicle is an automobile, the onboard input device may be an in-vehicle input device, the onboard processor may be an in-vehicle processor, and the onboard output device may be an in-vehicle output device.
  • the above-described in-vehicle input device may include a variety of input devices, such as at least one of a device-oriented in-vehicle device interface, an in-vehicle programmable interface of software, and a transceiver.
  • the device-oriented in-vehicle device interface may be a wired interface for data transmission between devices (for example, a connection interface with a driving recorder on a center console of the vehicle), or may be a hardware insertion interface for data transmission between devices (for example, a USB interface or a serial port).
  • the in-vehicle programmable interface of the above software may be, for example, an entry in the vehicle control system through which the user can edit or modify a program, such as an input pin interface or an input interface of a chip involved in the vehicle; optionally, the transceiver may be a radio frequency transceiver chip with a communication function in the vehicle, a baseband processing chip, a transceiver antenna, and the like, so that the vehicle can connect with the cloud server and perform data interaction.
  • the onboard input device is configured to acquire scene data.
  • the onboard input device may be a transceiver that establishes communication with the cloud server, or may be an interface to a data collection device inside the vehicle.
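Purely as an illustration of the acquisition step, the sketch below aggregates scene data from several such sources (an in-vehicle data collection device, a cloud-server transceiver, a smart home gateway); the source objects and their read() method are assumptions made for this example, not interfaces defined by this application.

```python
# Hypothetical onboard input device that aggregates scene data from several
# sources. Each source is assumed to expose a read() method returning a list
# of scene-data items such as {"type": "vehicle_speed", "payload": 42}.

class OnboardInputDevice:
    def __init__(self, sources):
        # sources: e.g. a sensor-bus reader, a cloud-server transceiver,
        # or a smart-home gateway connection (all illustrative).
        self.sources = sources

    def acquire_scene_data(self):
        scene_data = []
        for source in self.sources:
            try:
                scene_data.extend(source.read())  # collect items from this source
            except OSError:
                continue  # skip sources that are temporarily unreachable
        return scene_data
```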
  • depending on the type of the installed vehicle, the onboard processor may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), central processing units (CPUs), controllers, microcontrollers, microprocessors, or other electronic components, and is configured to perform the above methods.
  • the above onboard processor is coupled to the above onboard input device and onboard output device through an in-vehicle line or a wireless connection.
  • the airborne processor can perform the method in the embodiment corresponding to the foregoing FIG. 1 to FIG. 8.
  • the onboard processor can process the scene data to obtain a plurality of sub-scene data, and control the onboard output device to push each of the sub-scene data to a corresponding receiving object.
  • depending on the type of the installed vehicle, the above-mentioned onboard output device may be a device interface capable of data transmission with a receiving object (for example, a center console or a dashboard), or may be a transceiver that wirelessly establishes a connection with a user device or the like, and can be coupled to the onboard input device and the onboard processor via an in-vehicle line or wirelessly.
  • the airborne output device of this embodiment can perform the method in the embodiment corresponding to FIG. 1 to FIG. 8 described above.
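To make the dispatching side of this description concrete, here is a hedged sketch of an onboard output device that keeps a registry of receiving objects (for example a dashboard, a center console, or a wirelessly connected user device) and forwards pushed data over the matching channel; the channel objects and their send() method are assumptions for illustration.

```python
# Hypothetical onboard output device: maps receiving-object names to channel
# objects (wired device interfaces or wireless transceivers) and forwards
# pushed data. Channel objects and their send() method are illustrative.

class OnboardOutputDevice:
    def __init__(self):
        self.channels = {}  # receiving-object name -> channel object

    def register(self, receiver, channel):
        """Register a receiving object (e.g. "dashboard") with its channel."""
        self.channels[receiver] = channel

    def push(self, receiver, data):
        """Forward data to the named receiving object."""
        channel = self.channels.get(receiver)
        if channel is None:
            raise KeyError(f"no channel registered for receiver {receiver!r}")
        channel.send(data)  # wired device interface or wireless transceiver
```

This push(receiver, data) method is the same call assumed in the earlier routing sketch.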
  • the invention also provides an in-vehicle internet operating system. It can be understood by those skilled in the art that the in-vehicle internet operating system of the present invention is system software that runs directly on the above data push device or vehicle control device and that manages and controls the hardware of the data push device or vehicle control device and the computer programs of the software resources involved in the present invention.
  • the operating system is the interface of the data push device or the vehicle control device, and is also the interface between the hardware and other software.
  • the in-vehicle Internet operating system provided by the present invention can interact with other modules or functional devices on the vehicle to control the functions of the corresponding modules or functional devices.
  • when the vehicle in the above embodiments is an automobile, the pushing device of the data is an in-vehicle terminal device.
  • the vehicle is no longer independent of the communication network.
  • the vehicle can be connected to a server to form a network, that is, an in-vehicle Internet.
  • the in-vehicle Internet system can provide voice communication services, location services, navigation services, mobile internet access, vehicle emergency rescue, vehicle data and management services, in-vehicle entertainment services, and the like.
  • FIG. 12 is a schematic structural diagram of an in-vehicle Internet operating system according to an embodiment of the present invention.
  • the operating system provided by the present invention includes an input control unit 30 and an output control unit 31.
  • the input control unit 30 controls the in-vehicle input device to acquire scene data
  • the output control unit 31 controls the vehicle-mounted output device to respectively push at least part of the scene data to the plurality of receiving objects carried by the vehicle, and the scene data is used for display.
  • the in-vehicle input device in this embodiment may include the input device in the above embodiment, and the input control unit 30 may control the in-vehicle input device to perform the step 301 in the embodiment of FIG. 3 to acquire the scene data.
  • the output control unit 31 can control the vehicle-mounted output device to perform step 302 in the above embodiment of FIG. 3 to push at least part of the scene data to the plurality of receiving objects carried by the vehicle.
  • a push control unit is further included;
  • the push control unit controls the data pushing system to perform step 402 in the embodiment of FIG. 4 to process the scene data to obtain a plurality of sub-scene data;
  • the push control unit further controls the data pushing system to perform step 403 in the embodiment of FIG. 4 to acquire a receiving object corresponding to each of the sub-scene data;
  • the output control unit further controls the in-vehicle output device to perform step 404 in the embodiment of FIG. 4 to push each of the sub-scene data to a corresponding receiving object.
  • the push control unit further controls the data pushing system to process the scene data according to the data type of the sub-scene to obtain a plurality of sub-scene data.
  • the push control unit further controls the data pushing system to perform step 501 and step 502 in the embodiment of FIG. 5: a scene type is determined according to the scene data, and the data type of the sub-scene corresponding to the scene type is obtained according to the scene type.
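As a rough, purely illustrative companion to these two steps, the sketch below first classifies the scene data into a scene type and then looks up the sub-scene data types associated with that scene type; the scene types, the classification rule, and the lookup table are invented for this example.

```python
# Illustrative sketch of steps 501-502: determine a scene type from the scene
# data, then look up the sub-scene data types for that scene type. The scene
# types, rule, and table below are assumptions, not values from this application.

SCENE_TYPE_TO_DATA_TYPES = {
    "driving": ["navigation", "vehicle_status"],
    "parked": ["vehicle_status", "entertainment"],
}


def determine_scene_type(scene_data):
    """Very rough classifier: a moving vehicle is treated as 'driving'."""
    speed = next((item["payload"] for item in scene_data
                  if item["type"] == "vehicle_speed"), 0)
    return "driving" if speed > 0 else "parked"


def sub_scene_data_types(scene_data):
    """Return the sub-scene data types corresponding to the current scene type."""
    scene_type = determine_scene_type(scene_data)
    return SCENE_TYPE_TO_DATA_TYPES.get(scene_type, [])
```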
  • the pushing control unit further controls the data pushing system to determine a receiving object corresponding to each of the sub-scene data according to a data type of the sub-scene.
  • the pushing control unit further controls the data pushing system to determine a receiving object corresponding to each of the sub-scene data according to a pushing priority corresponding to a data type of each of the sub-scene.
  • the pushing control unit further controls the data pushing system to obtain a preset correspondence between the data type of each of the sub-scene data and each receiving object, and determine, according to the correspondence, the receiving object corresponding to each of the sub-scene data.
  • the input control unit further controls the onboard input device to acquire the scene sent by at least one of a data collection device, a cloud server, and a smart home device having a communication connection with the vehicle. data.
  • the data pushing system may be a function implemented by an operating system, or the data pushing system may be a function implemented by a processor in the foregoing embodiment.
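For orientation only, the following sketch wires the units described above into one loop: an input control step acquires scene data and a push/output step splits and dispatches it; the class and method names are assumptions and reuse the hypothetical helpers from the earlier sketches (OnboardInputDevice, OnboardOutputDevice, push_sub_scenes).

```python
# Hypothetical wiring of the operating-system units: the input control unit
# drives acquisition, and the push/output control units drive splitting and
# dispatch. Reuses the illustrative helpers defined in the earlier sketches.

class InVehicleInternetOS:
    def __init__(self, input_device, output_device):
        self.input_device = input_device    # managed by the input control unit
        self.output_device = output_device  # managed by the output control unit

    def run_once(self):
        # Acquire scene data (cf. step 301), then process and push it to the
        # corresponding receiving objects (cf. steps 402-404).
        scene_data = self.input_device.acquire_scene_data()
        push_sub_scenes(scene_data, self.output_device)
```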
  • the in-vehicle Internet operating system may, through the above-mentioned input control unit 30, output control unit 31, and push control unit, or on the basis of these units in combination with other units, control the corresponding components to perform all or part of the steps of the above-mentioned FIG. 1 to FIG. 8, with beneficial effects similar to those described above, which are not described herein again.
  • the present invention also provides a processor readable storage medium having program instructions stored therein for causing a processor to perform the methods described in FIGS. 1-8 of the above-described embodiments.
  • the above readable storage medium may be implemented by any type of volatile or non-volatile storage device or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read only memory (EEPROM), an erasable programmable read only memory (EPROM), a programmable read only memory (PROM), a read only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
  • it should be understood that although the terms first, second, third, etc. may be used to describe XXX in the embodiments of the present invention, these XXX should not be limited by these terms; these terms are only used to distinguish XXX from each other.
  • for example, without departing from the scope of the embodiments of the present invention, a first XXX may also be referred to as a second XXX, and similarly, a second XXX may also be referred to as a first XXX.
  • depending on the context, the word "if" as used herein may be interpreted as "when", "upon", "in response to determining", or "in response to detecting".
  • similarly, the phrase "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (the stated condition or event) is detected", or "in response to detecting (the stated condition or event)".

Abstract

The present invention relates to a method, apparatus, and device for pushing data. The method comprises: acquiring scene data; and respectively pushing at least part of the scene data to a plurality of receiving objects carried by a vehicle, the scene data being used for display. In the embodiments, diversified scene data is displayed by means of a plurality of receiving objects, so that it is convenient for a user to view information, and the user can also acquire diversified information.
PCT/CN2017/087874 2016-06-23 2017-06-12 Procédé, appareil et dispositif de poussée de données WO2017219883A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610461001.5 2016-06-23
CN201610461001.5A CN107547578A (zh) 2016-06-23 2016-06-23 数据的推送方法、装置和设备

Publications (1)

Publication Number Publication Date
WO2017219883A1 true WO2017219883A1 (fr) 2017-12-28

Family

ID=60783765

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/087874 WO2017219883A1 (fr) 2016-06-23 2017-06-12 Procédé, appareil et dispositif de poussée de données

Country Status (3)

Country Link
CN (1) CN107547578A (fr)
TW (1) TW201800287A (fr)
WO (1) WO2017219883A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111107147A (zh) * 2019-12-17 2020-05-05 苏州思必驰信息科技有限公司 一种消息推送方法及装置
WO2020228041A1 (fr) * 2019-05-16 2020-11-19 深圳市欢太科技有限公司 Procédé et appareil d'exploitation de scénario, dispositif électronique et support lisible par ordinateur
CN112507211A (zh) * 2020-11-18 2021-03-16 青岛海尔科技有限公司 消息推送方法、装置、存储介质及电子装置

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110385991B (zh) * 2018-04-16 2021-04-20 比亚迪股份有限公司 车辆的多媒体娱乐系统、控制方法及车辆
CN110871684A (zh) * 2018-09-04 2020-03-10 比亚迪股份有限公司 车内投影方法、装置、设备和存储介质
CN110147492B (zh) * 2019-04-12 2022-04-12 北京梧桐车联科技有限责任公司 一种信息处理方法、交通工具及存储介质
CN110162719A (zh) * 2019-05-27 2019-08-23 广州小鹏汽车科技有限公司 内容推送方法、装置、存储介质及计算机设备、车辆
CN110641478B (zh) * 2019-10-15 2021-02-19 英博超算(南京)科技有限公司 汽车域控制器显示方法、装置、汽车以及可读存储介质
EP4119399A4 (fr) * 2020-03-31 2023-05-10 Huawei Technologies Co., Ltd. Procédé et appareil de collecte de données de conduite
CN114475479A (zh) * 2022-01-20 2022-05-13 奇瑞汽车股份有限公司 汽车的控制方法、装置及计算机存储介质
CN115422228B (zh) * 2022-11-03 2023-01-03 四川蜀天信息技术有限公司 一种账户套餐管理系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10145701A (ja) * 1996-11-13 1998-05-29 Calsonic Corp 車載用マルチメディアモニタ装置
CN104504775A (zh) * 2014-12-04 2015-04-08 深圳市华宝电子科技有限公司 一种多功能车载多媒体导航系统和方法
CN105292131A (zh) * 2015-09-06 2016-02-03 上海修源网络科技有限公司 用于通知车辆信息的装置及其方法
CN105516243A (zh) * 2015-11-24 2016-04-20 上海汽车集团股份有限公司 远程数据的传输方法、云端数据网关和车载终端
CN105577725A (zh) * 2014-10-17 2016-05-11 中国电信股份有限公司 基于手机进行车载应用显示的方法、车载终端以及系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10145701A (ja) * 1996-11-13 1998-05-29 Calsonic Corp 車載用マルチメディアモニタ装置
CN105577725A (zh) * 2014-10-17 2016-05-11 中国电信股份有限公司 基于手机进行车载应用显示的方法、车载终端以及系统
CN104504775A (zh) * 2014-12-04 2015-04-08 深圳市华宝电子科技有限公司 一种多功能车载多媒体导航系统和方法
CN105292131A (zh) * 2015-09-06 2016-02-03 上海修源网络科技有限公司 用于通知车辆信息的装置及其方法
CN105516243A (zh) * 2015-11-24 2016-04-20 上海汽车集团股份有限公司 远程数据的传输方法、云端数据网关和车载终端

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020228041A1 (fr) * 2019-05-16 2020-11-19 深圳市欢太科技有限公司 Procédé et appareil d'exploitation de scénario, dispositif électronique et support lisible par ordinateur
CN113545009A (zh) * 2019-05-16 2021-10-22 深圳市欢太科技有限公司 场景操作方法、装置、电子设备及计算机可读介质
US11782590B2 (en) 2019-05-16 2023-10-10 Shenzhen Heytap Technology Corp., Ltd. Scene-operation method, electronic device, and non-transitory computer readable medium
CN111107147A (zh) * 2019-12-17 2020-05-05 苏州思必驰信息科技有限公司 一种消息推送方法及装置
CN111107147B (zh) * 2019-12-17 2022-08-16 思必驰科技股份有限公司 一种消息推送方法及装置
CN112507211A (zh) * 2020-11-18 2021-03-16 青岛海尔科技有限公司 消息推送方法、装置、存储介质及电子装置

Also Published As

Publication number Publication date
CN107547578A (zh) 2018-01-05
TW201800287A (zh) 2018-01-01

Similar Documents

Publication Publication Date Title
WO2017219883A1 (fr) Procédé, appareil et dispositif de poussée de données
US20220092719A1 (en) Onboard vehicle sharing service
TWI696976B (zh) 用於監控隨選服務的系統、方法及非暫態電腦可讀取媒體
US20190126867A1 (en) User profile synchronization for a vehicle
US10410427B2 (en) Three dimensional graphical overlays for a three dimensional heads-up display unit of a vehicle
CA3035259C (fr) Identification de demandeurs et de fournisseurs mis en correspondance
US10140417B1 (en) Creating a virtual model of a vehicle event
US20190176731A1 (en) Systems and methods for vehicle management
US8954226B1 (en) Systems and methods for visualizing an accident involving a vehicle
US9381813B2 (en) Selective message presentation by in-vehicle computing system
US9883353B2 (en) Method to transmit real-time in-vehicle information to an internet service
JP2019505026A (ja) 車両の電子インタフェースを介した通信を容易にするためのシステム及び方法
WO2017181908A1 (fr) Procédé de traitement de navigation, dispositif de navigation, dispositif de commande de véhicule, et système d'exploitation
TW201818342A (zh) 確定與車輛相關的參考方向的系統和方法
CN110954117B (zh) 车辆及其导航行程服务推送方法、云服务器
CN110800030B (zh) 用于拼车服务的方法和系统
CN111284325B (zh) 车辆、车机设备及其车辆沿途对象详尽信息显示方法
CN108983773A (zh) 车辆、车机设备、用户通讯终端及其行程自动规划方法
CN111824006A (zh) 基于导航的车辆大灯自动调节方法、系统及车辆
CN111289009A (zh) 车辆、车机设备及其车机设备兴趣点输入搜索方法
CN113791843A (zh) 一种执行方法、装置、设备及存储介质
CN114882579A (zh) 车载屏幕的控制方法、装置及车辆
CN206327303U (zh) 一种车载导航系统
CN112926986A (zh) 车载支付服务设备和车载支付服务方法
CN110986995A (zh) 车辆、车机设备及其基于车机导航的天气预报方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17814621

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04/04/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17814621

Country of ref document: EP

Kind code of ref document: A1