WO2020004767A1 - Telematics system provided in a vehicle and method for controlling the same - Google Patents

Telematics system provided in a vehicle and method for controlling the same

Info

Publication number
WO2020004767A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
telematics system
external server
protocol
Prior art date
Application number
PCT/KR2019/001975
Other languages
English (en)
Korean (ko)
Inventor
신정은
조장형
최성하
Original Assignee
엘지전자 주식회사 (LG Electronics Inc.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc. (엘지전자 주식회사)
Publication of WO2020004767A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/40: Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Definitions

  • the present invention relates to a telematics system provided in a vehicle and a method for controlling the same, and more particularly, to a method for transmitting and receiving a signal to and from an external server through a base station.
  • a vehicle traditionally functions as a means of transportation, but for the user's convenience it is increasingly provided with various sensors, electronic devices, and the like.
  • representative examples include the Advanced Driver Assistance System (ADAS), autonomous vehicles, in-car entertainment (ICE), and in-vehicle infotainment (IVI).
  • 5G use cases include Intersection Movement Assistant, Real-time Situational Awareness & HD Maps, Cooperative Lane Change of Automated Vehicles, See-Through, Vulnerable Road User Discovery, Shared Vision, Collective Sensory, Hybrid Intelligence, Journey Studio, Media Center on Wheels, and the like.
  • a vehicle may be provided with a plurality of display devices.
  • content may be provided to such display devices through a telematics control unit (TCU).
  • an object of the present invention is to provide a telematics system provided in a vehicle and a method of controlling the same.
  • to this end, a method of controlling a telematics system provided in a vehicle is proposed, the method comprising: receiving a request for content output; requesting information related to the content from an external server through a base station connected to the telematics system; and receiving the information from the external server through the base station.
  • the received information may be delivered to a plurality of target devices connected to the telematics system, respectively.
  • the information is received from the external server via a predetermined protocol, and the predetermined protocol may include a transmission protocol, a streaming control protocol, and a quality of service (QoS) protocol.
  • if the received information is media content provided by the external server, the telematics system may generate a URL (Uniform Resource Locator) through which each of the plurality of target devices can access the content, and deliver each generated URL to the corresponding target device.
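  • As an illustration of the URL-delivery step above, the following minimal Python sketch shows how a TCU might mint one access URL per target device for a content item buffered on the TCU. The host address, port, path scheme, and device names are assumptions for the example, not details taken from the patent.

```python
# Hypothetical sketch: mint one access URL per in-vehicle target device for a
# content item buffered on the TCU. Host, port, and path layout are assumed.
import uuid

TCU_HOST = "192.168.7.1"  # assumed in-vehicle Ethernet address of the TCU
TCU_PORT = 8554           # assumed media port

def make_device_urls(content_id, target_devices):
    """Return a {device: url} map; each URL carries a per-device session token."""
    urls = {}
    for device in target_devices:
        token = uuid.uuid4().hex  # lets the TCU track each device's session
        urls[device] = (
            f"http://{TCU_HOST}:{TCU_PORT}/media/{content_id}"
            f"?dev={device}&session={token}"
        )
    return urls

if __name__ == "__main__":
    for dev, url in make_device_urls("clip-001", ["CID", "AVN", "RSE"]).items():
        print(dev, "->", url)
```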
  • the delivered information may be output in synchronization by each of the plurality of target devices, based on reference time information included in the QoS protocol and time information included in the packets constituting the information received from the external server.
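  • The synchronization rule above reduces to: presentation time = shared reference time (from the QoS protocol) + packet timestamp + a playout delay. The sketch below, with an assumed fixed jitter buffer, shows each device waiting for the common presentation instant before rendering; it is an illustration, not the patent's algorithm.

```python
# Hedged sketch of synchronized playout: every device maps the packet
# timestamp onto the shared reference clock and sleeps until the common
# presentation instant. PLAYOUT_DELAY is an assumed jitter buffer.
import time

PLAYOUT_DELAY = 0.200  # seconds; absorbs per-device network jitter

def presentation_time(reference_epoch, packet_ts):
    """Map a packet's media timestamp onto the shared wall clock."""
    return reference_epoch + packet_ts + PLAYOUT_DELAY

def render_synchronized(reference_epoch, packets, show):
    """Render (packet_ts, frame) pairs at their common presentation instants."""
    for packet_ts, frame in packets:
        wait = presentation_time(reference_epoch, packet_ts) - time.time()
        if wait > 0:
            time.sleep(wait)  # all devices wake at the same instant
        show(frame)

if __name__ == "__main__":
    epoch = time.time()  # in practice distributed via the QoS protocol
    render_synchronized(epoch, [(0.0, "frame-0"), (0.1, "frame-1")], print)
```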
  • the plurality of target devices may include a center information display (CID), an audio video navigation (AVN) device, rear seat entertainment (RSE), and a consumer electronics (CE) device provided inside or outside the vehicle.
  • the content received from the external server may correspond to media content or a camera image of another vehicle.
  • the above-described problem of the related art, that is, the problem of failing to efficiently provide content to a plurality of display devices in a vehicle through a TCU (Telematics Control Unit), can be solved.
  • the telematics system may provide audio/video content synchronized across a plurality of display devices in a vehicle.
  • FIG. 1 is a view showing the appearance of a vehicle according to an aspect of the present invention.
  • FIG. 2 is a view of the vehicle according to an aspect of the present invention, seen from various angles from the outside.
  • FIGS. 3 and 4 are views illustrating the interior of a vehicle according to an aspect of the present invention.
  • FIGS. 5 and 6 are views referred to for explaining objects according to an aspect of the present invention.
  • FIG. 7 is a block diagram referred to for describing a vehicle according to an aspect of the present invention.
  • FIGS. 8 to 10 are diagrams for explaining the structure of a telematics system according to an aspect of the present invention.
  • FIGS. 11 to 14 are diagrams for explaining implementation examples of a telematics system according to an aspect of the present invention.
  • FIG. 15 is a view for explaining the characteristics of the telematics system architecture according to an aspect of the present invention.
  • FIGS. 16 and 17 are diagrams for explaining the SW platform of the telematics system according to an aspect of the present invention.
  • FIG. 18 is a view for explaining a telematics system according to an aspect of the present invention.
  • FIG. 19 illustrates a relationship between a telematics system and a cloud according to an aspect of the present invention.
  • FIG. 20 is a view for explaining the relationship between the framework of the telematics system and the cloud according to an aspect of the present invention.
  • FIG. 21 is a diagram for describing a media cast center of a telematics system according to an aspect of the present invention.
  • FIG. 22 illustrates a positioning service provided by a telematics system according to an aspect of the present invention.
  • FIG. 23 is a diagram for describing an electronic horizon provided by a telematics system according to an aspect of the present disclosure.
  • FIGS. 24 to 31 are views for explaining implementation examples of a telematics system according to an aspect of the present invention.
  • FIGS. 32 to 34 are views for explaining technical effects of a telematics system according to an aspect of the present invention.
  • FIGS. 35 and 36 are diagrams for explaining a signal transmission and reception sequence between a TCU and a cloud server in a telematics system according to an aspect of the present invention.
  • the vehicle 100 may include a wheel that rotates by a power source and a steering input device 510 for adjusting a traveling direction of the vehicle 100.
  • the vehicle 100 may be an autonomous vehicle.
  • the vehicle 100 may be switched to an autonomous driving mode or a manual mode based on a user input.
  • the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the received user input through the user interface device 200.
  • the vehicle 100 may be switched to the autonomous driving mode or the manual mode based on the driving situation information.
  • the driving situation information may include at least one of information on objects outside the vehicle, navigation information, and vehicle state information.
  • the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the driving situation information generated by the object detecting apparatus 300.
  • the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the driving situation information received through the communication device 400.
  • the vehicle 100 may switch from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on information, data, and signals provided from an external device.
  • the autonomous vehicle 100 may be driven based on the travel system 700.
  • for example, the autonomous vehicle 100 may be driven based on information, data, or signals generated by the driving system 710, the taking-out system 740, and the parking system 750.
  • the autonomous vehicle 100 may receive a user input for driving through the driving manipulation apparatus 500. Based on a user input received through the driving manipulation apparatus 500, the vehicle 100 may be driven.
  • the overall length is the length from the front end to the rear end of the vehicle 100, the overall width is the width of the vehicle 100, and the overall height is the length from the bottom of the wheels to the roof.
  • the overall-length direction L is the reference direction for measuring the overall length of the vehicle 100, the overall-width direction W is the reference direction for measuring the overall width of the vehicle 100, and the overall-height direction H is the reference direction for measuring the overall height of the vehicle 100.
  • the vehicle 100 may include a user interface device 200, an object detecting device 300, a communication device 400, a driving manipulation device 500, a vehicle driving device 600, a travel system 700, a navigation system 770, a sensing unit 120, an interface unit 130, a memory 140, a controller 170, and a power supply unit 190.
  • the vehicle 100 may further include other components in addition to the components described herein, or may not include some of the described components.
  • the sensing unit 120 may sense a state of the vehicle.
  • the sensing unit 120 may include an attitude sensor (for example, a yaw sensor, a roll sensor, a pitch sensor), a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, and the like.
  • the sensing unit 120 may acquire sensing signals for vehicle attitude information, vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle internal temperature information, vehicle internal humidity information, steering wheel rotation angle, vehicle external illumination, pressure applied to the accelerator pedal, pressure applied to the brake pedal, and the like.
  • the sensing unit 120 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
  • the sensing unit 120 may generate vehicle state information based on the sensing data.
  • the vehicle state information may be information generated based on data sensed by various sensors provided in the vehicle.
  • the vehicle state information may include vehicle attitude information, vehicle speed information, vehicle tilt information, vehicle weight information, vehicle direction information, vehicle battery information, vehicle fuel information, vehicle tire pressure information, vehicle steering information, vehicle indoor temperature information, vehicle indoor humidity information, pedal position information, vehicle engine temperature information, and the like.
  • the interface unit 130 may serve as a path to various types of external devices connected to the vehicle 100.
  • the interface unit 130 may include a port connectable with the mobile terminal, and may connect with the mobile terminal through the port. In this case, the interface unit 130 may exchange data with the mobile terminal.
  • the interface unit 130 may serve as a path for supplying electrical energy to the connected mobile terminal.
  • the interface unit 130 may provide the mobile terminal with electrical energy supplied from the power supply unit 190.
  • the memory 140 is electrically connected to the controller 170.
  • the memory 140 may store basic data for the unit, control data for controlling the operation of the unit, and input / output data.
  • in hardware, the memory 140 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive.
  • the memory 140 may store various data for overall operation of the vehicle 100, such as a program for processing or controlling the controller 170.
  • the memory 140 may be integrally formed with the controller 170 or may be implemented as a subcomponent of the controller 170.
  • the controller 170 may control the overall operation of each unit in the vehicle 100.
  • the controller 170 may be referred to as an electronic control unit (ECU).
  • the power supply unit 190 may supply power required for the operation of each component under the control of the controller 170.
  • the power supply unit 190 may receive power from a battery inside the vehicle.
  • the processors and the controller 170 included in the vehicle 100 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
  • the vehicle driving apparatus 600, the driving system 700, and the navigation system 770 may have separate processors or may be integrated into the controller 170.
  • the user interface device 200 is a device for communicating with the vehicle 100 and a user.
  • the user interface device 200 may receive a user input and provide the user with information generated in the vehicle 100.
  • the vehicle 100 may implement user interfaces (UI) or user experience (UX) through the user interface device 200.
  • the user interface device 200 may include an input unit 210, an internal camera 220, a biometric detector 230, an output unit 250, and a processor 270. Each component of the user interface device 200 may be structurally and functionally separated from or integrated with the interface unit 130 described above.
  • the user interface device 200 may further include other components in addition to the described components, or may not include some of the described components.
  • the input unit 210 is for receiving information from a user, and the data collected by the input unit 210 may be analyzed by the processor 270 and processed as a user's control command.
  • the input unit 210 may be disposed in the vehicle.
  • the input unit 210 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of a seat, one area of each pillar, one area of a door, one area of the center console, one area of the head lining, one area of the sun visor, one area of the windshield, one area of a window, or the like.
  • the input unit 210 may include a voice input unit 211, a gesture input unit 212, a touch input unit 213, and a mechanical input unit 214.
  • the voice input unit 211 may convert a user's voice input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the controller 170.
  • the voice input unit 211 may include one or more microphones.
  • the gesture input unit 212 may convert a user's gesture input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the controller 170.
  • the gesture input unit 212 may include at least one of an infrared sensor and an image sensor for detecting a user's gesture input.
  • the gesture input unit 212 may detect a 3D gesture input of the user.
  • the gesture input unit 212 may include a light output unit for outputting a plurality of infrared rays, or a plurality of image sensors.
  • the gesture input unit 212 may detect a user's 3D gesture input through a time of flight (TOF) method, a structured light method, or a disparity method.
  • the touch input unit 213 may convert a user's touch input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the controller 170.
  • the touch input unit 213 may include a touch sensor for detecting a user's touch input.
  • the touch input unit 213 may be integrally formed with the display unit 251 to implement a touch screen. Such a touch screen may provide an input interface and an output interface between the vehicle 100 and the user.
  • the mechanical input unit 214 may include at least one of a button, a dome switch, a jog wheel, and a jog switch.
  • the electrical signal generated by the mechanical input unit 214 may be provided to the processor 270 or the controller 170.
  • the mechanical input unit 214 may be disposed on a steering wheel, a center fascia, a center console, a cockpit module, a door, or the like.
  • the processor 270 may start a learning mode of the vehicle 100 in response to a user input to at least one of the voice input unit 211, the gesture input unit 212, the touch input unit 213, and the mechanical input unit 214 described above.
  • the vehicle 100 may perform driving path learning and surrounding environment learning of the vehicle 100. The learning mode will be described in detail later with reference to the object detecting apparatus 300 and the driving system 700.
  • the internal camera 220 may acquire a vehicle interior image.
  • the processor 270 may detect a state of the user based on the vehicle interior image.
  • the processor 270 may acquire the gaze information of the user from the vehicle interior image.
  • the processor 270 may detect a gesture of the user in the vehicle interior image.
  • the biometric detector 230 may acquire biometric information of the user.
  • the biometric detector 230 may include a sensor for acquiring biometric information of the user, and may acquire fingerprint information, heartbeat information, etc. of the user using the sensor. Biometric information may be used for user authentication.
  • the output unit 250 is for generating an output related to the visual, auditory, or tactile senses.
  • the output unit 250 may include at least one of the display unit 251, the audio output unit 252, and the haptic output unit 253.
  • the display unit 251 may display graphic objects corresponding to various pieces of information.
  • the display unit 251 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • the display unit 251 forms a layer structure or is integrally formed with the touch input unit 213 to implement a touch screen.
  • the display unit 251 may be implemented as a head up display (HUD).
  • the display unit 251 may include a projection module to output information through an image projected on a wind shield or a window.
  • the display unit 251 may include a transparent display. The transparent display can be attached to the wind shield or window.
  • the transparent display may display a predetermined screen while having a predetermined transparency.
  • the transparent display may include at least one of a transparent thin film electroluminescent (TFEL) display, a transparent organic light-emitting diode (OLED) display, a transparent liquid crystal display (LCD), a transmissive transparent display, and a transparent light emitting diode (LED) display.
  • the transparency of the transparent display can be adjusted.
  • the user interface device 200 may include a plurality of display units 251a to 251g.
  • the display unit 251 may be disposed in one area of the steering wheel, one area 251a, 251b, 251e of the instrument panel, one area 251d of the seat, one area 251f of each pillar, one area 251g of the door, one area of the center console, one area of the head lining, or one area of the sun visor, or may be implemented in one area 251c of the windshield or one area 251h of the window.
  • the sound output unit 252 converts an electrical signal provided from the processor 270 or the controller 170 into an audio signal and outputs the audio signal. To this end, the sound output unit 252 may include one or more speakers.
  • the haptic output unit 253 generates a tactile output.
  • the haptic output unit 253 may vibrate the steering wheel, the seat belt, and the seats 110FL, 110FR, 110RL, and 110RR so that the user may recognize the output.
  • the processor 270 may control the overall operation of each unit of the user interface device 200.
  • the user interface device 200 may include a plurality of processors 270 or may not include the processor 270.
  • the user interface device 200 may be operated under the control of the processor or the controller 170 of another device in the vehicle 100.
  • the user interface device 200 may be referred to as a vehicle display device.
  • the user interface device 200 may be operated under the control of the controller 170.
  • the object detecting apparatus 300 is a device for detecting an object located outside the vehicle 100.
  • the object detecting apparatus 300 may generate object information based on the sensing data.
  • the object information may include information about the presence or absence of the object, location information of the object, distance information between the vehicle 100 and the object, and relative speed information between the vehicle 100 and the object.
  • the object may be various objects related to the driving of the vehicle 100.
  • the object O may include a lane OB10, another vehicle OB11, a pedestrian OB12, a two-wheeled vehicle OB13, traffic signals OB14 and OB15, light, a road, a structure, a speed bump, a feature, an animal, and the like.
  • the lane OB10 may be a driving lane, a lane next to the driving lane, and a lane in which an opposite vehicle travels.
  • the lane OB10 may be a concept including left and right lines forming a lane.
  • the other vehicle OB11 may be a vehicle that is driving around the vehicle 100.
  • the other vehicle may be a vehicle located within a predetermined distance from the vehicle 100.
  • the other vehicle OB11 may be a vehicle that precedes or follows the vehicle 100.
  • the pedestrian OB12 may be a person located near the vehicle 100.
  • the pedestrian OB12 may be a person located within a predetermined distance from the vehicle 100.
  • the pedestrian OB12 may be a person located on a sidewalk or a roadway.
  • the two-wheeled vehicle OB13 may be a vehicle that is positioned around the vehicle 100 and moves using two wheels.
  • the motorcycle OB13 may be a vehicle having two wheels located within a predetermined distance from the vehicle 100.
  • the motorcycle OB13 may be a motorcycle or a bicycle located on sidewalks or roadways.
  • the traffic signal may include a traffic light OB15, a traffic sign OB14, and a pattern or text drawn on a road surface.
  • the light may be light generated by a lamp provided in another vehicle.
  • the light can be light generated from the street light.
  • the light may be sunlight.
  • the road may include a road surface, a curve, an uphill slope, a downhill slope, and the like.
  • the structure may be an object located around a road and fixed to the ground.
  • the structure may include a street lamp, a roadside tree, a building, a power pole, a traffic light, a bridge.
  • the features may include mountains, hills, and the like.
  • the object may be classified into a moving object and a fixed object.
  • the moving object may be a concept including another vehicle and a pedestrian.
  • the fixed object may be a concept including a traffic signal, a road, and a structure.
  • the object detecting apparatus 300 may include a camera 310, a radar 320, a lidar 330, an ultrasonic sensor 340, an infrared sensor 350, and a processor 370. Each component of the object detecting apparatus 300 may be structurally and functionally separated or integrated with the sensing unit 120 described above.
  • the object detecting apparatus 300 may further include other components in addition to the described components, or may not include some of the described components.
  • the camera 310 may be located at a suitable place outside the vehicle to acquire an image outside the vehicle.
  • the camera 310 may be a mono camera, a stereo camera 310a, an around view monitoring (AVM) camera 310b, or a 360 degree camera.
  • the camera 310 may acquire location information of the object, distance information with respect to the object, or relative speed information with the object by using various image processing algorithms.
  • the camera 310 may obtain distance information and relative speed information with respect to the object based on the change in the object size over time in the acquired image.
  • the camera 310 may acquire distance information and relative velocity information with respect to an object through a pin hole model, road surface profiling, or the like.
  • the camera 310 may obtain distance information and relative speed information with respect to the object based on the disparity information in the stereo image acquired by the stereo camera 310a.
  • the camera 310 may be disposed in close proximity to the front windshield in the interior of the vehicle in order to acquire an image in front of the vehicle.
  • the camera 310 may be disposed around the front bumper or the radiator grille.
  • the camera 310 may be disposed in close proximity to the rear glass in the interior of the vehicle to acquire an image of the rear of the vehicle.
  • the camera 310 may be disposed around the rear bumper, the trunk, or the tail gate.
  • the camera 310 may be disposed in close proximity to at least one of the side windows in the interior of the vehicle to acquire an image of the vehicle side.
  • the camera 310 may be arranged around the side mirror, fender or door.
  • the camera 310 may provide the obtained image to the processor 370.
  • the radar 320 may include an electromagnetic wave transmitter and a receiver.
  • the radar 320 may be implemented in a pulse radar method or a continuous wave radar method in terms of the radio wave emission principle.
  • among continuous wave radar methods, the radar 320 may be implemented by a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method according to the signal waveform.
  • the radar 320 may detect an object based on a time of flight (TOF) method or a phase-shift method using electromagnetic waves, and may detect the position of the detected object, the distance to the detected object, and the relative velocity.
  • the radar 320 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the lidar 330 may include a laser transmitter and a receiver.
  • the lidar 330 may be implemented in a time of flight (TOF) method or a phase-shift method.
  • the lidar 330 may be implemented as driven or non-driven. When implemented in a driving manner, the lidar 330 may be rotated by a motor and detect an object around the vehicle 100. When implemented in a non-driven manner, the lidar 330 may detect an object located within a predetermined range with respect to the vehicle 100 by optical steering.
  • the vehicle 100 may include a plurality of non-driven lidars 330.
  • the lidar 330 may detect an object based on a time of flight (TOF) method or a phase-shift method using laser light, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar 330 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the ultrasonic sensor 340 may include an ultrasonic transmitter and a receiver.
  • the ultrasonic sensor 340 may detect an object based on the ultrasonic wave, and detect a position of the detected object, a distance to the detected object, and a relative speed.
  • the ultrasonic sensor 340 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the infrared sensor 350 may include an infrared transmitter and a receiver.
  • the infrared sensor 350 may detect an object based on infrared light, and detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the infrared sensor 350 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the processor 370 may control overall operations of each unit of the object detecting apparatus 300.
  • the processor 370 may detect or classify an object by comparing data sensed by the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 with previously stored data.
  • the processor 370 may detect and track the object based on the obtained image.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed with the object through an image processing algorithm.
  • the processor 370 may acquire distance information and relative speed information with respect to the object based on the change in the object size over time in the obtained image.
  • the processor 370 may acquire distance information and relative velocity information with respect to an object through a pin hole model, road surface profiling, or the like.
  • the processor 370 may obtain distance information and relative speed information with the object based on the disparity information in the stereo image acquired by the stereo camera 310a.
  • the processor 370 may detect and track the object based on the reflected electromagnetic wave reflected by the transmitted electromagnetic wave to the object.
  • the processor 370 may perform an operation such as calculating a distance from the object, calculating a relative speed with the object, and the like based on the electromagnetic waves.
  • the processor 370 may detect and track the object based on the reflected laser light reflected by the transmitted laser back to the object.
  • the processor 370 may perform an operation such as calculating a distance from the object, calculating a relative speed with the object, and the like based on the laser light.
  • the processor 370 may detect and track the object based on the reflected ultrasound, in which the transmitted ultrasound is reflected by the object and returned.
  • the processor 370 may perform an operation such as calculating a distance from the object, calculating a relative speed with the object, and the like based on the ultrasound.
  • the processor 370 may detect and track the object based on the reflected infrared light from which the transmitted infrared light is reflected back to the object.
  • the processor 370 may perform an operation such as calculating a distance to the object, calculating a relative speed with the object, and the like based on the infrared light.
  • the processor 370 may store, in the memory 140, data sensed by the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350.
  • the object detecting apparatus 300 may include a plurality of processors 370 or may not include the processor 370.
  • each of the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 may individually include a processor.
  • in that case, the object detecting apparatus 300 may be operated under the control of a processor of another apparatus in the vehicle 100 or under the control of the controller 170.
  • the object detecting apparatus 300 may be operated under the control of the controller 170.
  • the communication device 400 is a device for performing communication with an external device.
  • the external device may be another vehicle, a mobile terminal or a server.
  • the communication device 400 may include at least one of a transmit antenna, a receive antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element to perform communication.
  • the communication device 400 may include a short range communication unit 410, a location information unit 420, a V2X communication unit 430, an optical communication unit 440, a broadcast transmission / reception unit 450, an Intelligent Transport Systems (ITS) communication unit 460, and a processor 470. According to an embodiment, the communication device 400 may further include other components in addition to the described components, or may not include some of the described components.
  • the short range communication unit 410 is a unit for short range communication.
  • the short range communication unit 410 may support short range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • the short range communication unit 410 may form short range wireless networks to perform short range communication between the vehicle 100 and at least one external device.
  • the location information unit 420 is a unit for obtaining location information of the vehicle 100.
  • the location information unit 420 may include a global positioning system (GPS) module or a differential global positioning system (DGPS) module.
  • the V2X communication unit 430 is a unit for performing wireless communication with a server (V2I: Vehicle to Infra), another vehicle (V2V: Vehicle to Vehicle), or a pedestrian (V2P: Vehicle to Pedestrian).
  • the V2X communication unit 430 may include an RF circuit capable of implementing protocols for communication with infrastructure (V2I), inter-vehicle communication (V2V), and communication with pedestrians (V2P).
  • the optical communication unit 440 is a unit for performing communication with an external device via light.
  • the optical communication unit 440 may include an optical transmitter that converts an electrical signal into an optical signal and transmits it to the outside, and an optical receiver that converts a received optical signal into an electrical signal.
  • the light emitting unit may be formed to be integrated with the lamp included in the vehicle 100.
  • the broadcast transceiver 450 is a unit for receiving a broadcast signal from an external broadcast management server or transmitting a broadcast signal to a broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, and a data broadcast signal.
  • the ITS communication unit 460 may exchange information, data, or signals with the traffic system.
  • the ITS communication unit 460 may provide the obtained information and data to the transportation system.
  • the ITS communication unit 460 may receive information, data, or a signal from a traffic system.
  • the ITS communication unit 460 may receive road traffic information from the traffic system and provide the road traffic information to the control unit 170.
  • the ITS communication unit 460 may receive a control signal from a traffic system and provide the control signal to a processor provided in the controller 170 or the vehicle 100.
  • the processor 470 may control the overall operation of each unit of the communication device 400.
  • the communication device 400 may include a plurality of processors 470 or may not include the processor 470.
  • the communication device 400 may be operated under the control of the processor or the controller 170 of another device in the vehicle 100.
  • the communication device 400 may implement a vehicle display device together with the user interface device 200.
  • the vehicle display device may be called a telematics device or an audio video navigation (AVN) device.
  • the communication device 400 may be operated under the control of the controller 170.
  • the driving operation apparatus 500 is a device that receives a user input for driving. In the manual mode, the vehicle 100 may be driven based on a signal provided by the driving manipulation apparatus 500.
  • the driving manipulation apparatus 500 may include a steering input apparatus 510, an acceleration input apparatus 530, and a brake input apparatus 570.
  • the steering input device 510 may receive a driving direction input of the vehicle 100 from the user.
  • the steering input device 510 is preferably formed in a wheel shape to enable steering input by rotation.
  • the steering input device may be formed in the form of a touch screen, a touch pad, or a button.
  • the acceleration input device 530 may receive an input for accelerating the vehicle 100 from a user.
  • the brake input device 570 may receive an input for deceleration of the vehicle 100 from a user.
  • the acceleration input device 530 and the brake input device 570 are preferably formed in the form of a pedal. According to an embodiment, the acceleration input device or the brake input device may be formed in the form of a touch screen, a touch pad, or a button.
  • the driving manipulation apparatus 500 may be operated under the control of the controller 170.
  • the vehicle drive device 600 is a device that electrically controls the driving of various devices in the vehicle 100.
  • the vehicle driving apparatus 600 may include a power train driver 610, a chassis driver 620, a door / window driver 630, a safety device driver 640, a lamp driver 650, and an air conditioning driver 660.
  • the vehicle driving apparatus 600 may further include other components in addition to the described components, or may not include some of the described components.
  • the vehicle driving device 600 may include a processor. Each unit of the vehicle drive apparatus 600 may each include a processor individually.
  • the power train driver 610 may control the operation of the power train device.
  • the power train driver 610 may include a power source driver 611 and a transmission driver 612.
  • the power source driver 611 may control the power source of the vehicle 100.
  • for example, when the power source is an engine, the power source driver 611 may perform electronic control of the engine, thereby controlling the output torque of the engine and the like.
  • the power source driver 611 can adjust the engine output torque under the control of the controller 170.
  • for example, when the power source is an electric motor, the power source driver 611 may control the motor.
  • the power source driver 611 may adjust the rotational speed, torque, and the like of the motor under the control of the controller 170.
  • the transmission driver 612 may control the transmission.
  • the transmission driver 612 can adjust the state of the transmission.
  • the transmission drive part 612 can adjust the state of a transmission to forward D, backward R, neutral N, or parking P.
  • the transmission driver 612 can adjust the engagement state of the gears in the forward (D) state.
  • the chassis driver 620 may control the operation of the chassis device.
  • the chassis driver 620 may include a steering driver 621, a brake driver 622, and a suspension driver 623.
  • the steering driver 621 may perform electronic control of a steering apparatus in the vehicle 100.
  • the steering driver 621 may change the traveling direction of the vehicle.
  • the brake driver 622 may perform electronic control of a brake apparatus in the vehicle 100. For example, the speed of the vehicle 100 may be reduced by controlling the operation of the brake disposed on the wheel.
  • the brake drive unit 622 can individually control each of the plurality of brakes.
  • the brake driver 622 may control the braking force applied to the plurality of wheels differently.
  • the suspension driver 623 may perform electronic control of a suspension apparatus in the vehicle 100. For example, when there is a curvature on the road surface, the suspension driver 623 may control the suspension device to control the vibration of the vehicle 100 to be reduced. Meanwhile, the suspension driver 623 may individually control each of the plurality of suspensions.
  • the door / window driver 630 may perform electronic control of a door apparatus or a window apparatus in the vehicle 100.
  • the door / window driver 630 may include a door driver 631 and a window driver 632.
  • the door driver 631 may control the door apparatus.
  • the door driver 631 may control opening and closing of the plurality of doors included in the vehicle 100.
  • the door driver 631 may control the opening or closing of a trunk or a tail gate.
  • the door driver 631 may control the opening or closing of the sunroof.
  • the window driver 632 may perform electronic control of the window apparatus.
  • the window driver 632 may control the opening or closing of the plurality of windows included in the vehicle 100.
  • the safety device driver 640 may perform electronic control of various safety apparatuses in the vehicle 100.
  • the safety device driver 640 may include an airbag driver 641, a seat belt driver 642, and a pedestrian protection device driver 643.
  • the airbag driver 641 may perform electronic control of an airbag apparatus in the vehicle 100.
  • the airbag driver 641 may control the airbag to be deployed when the danger is detected.
  • the seat belt driver 642 may perform electronic control of a seatbelt apparatus in the vehicle 100.
  • the seat belt driver 642 may control the passenger to be fixed to the seats 110FL, 110FR, 110RL, and 110RR by using the seat belt when detecting a danger.
  • the pedestrian protection device driver 643 may perform electronic control of the hood lift and the pedestrian airbag. For example, the pedestrian protection device driver 643 may control the hood lift up and the pedestrian air bag to be deployed when the collision with the pedestrian is detected.
  • the lamp driver 650 may perform electronic control of various lamp apparatuses in the vehicle 100.
  • the air conditioning driver 660 may perform electronic control of an air conditioner in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioning driver 660 may control the air conditioning apparatus to operate to supply cool air to the inside of the vehicle.
  • the vehicle driving apparatus 600 may include a processor. Each unit of the vehicle drive apparatus 600 may each include a processor individually. The vehicle driving apparatus 600 may be operated under the control of the controller 170.
  • the travel system 700 is a system for controlling various travels of the vehicle 100.
  • the travel system 700 can be operated in the autonomous driving mode.
  • the travel system 700 can include the driving system 710, the taking-out system 740, and the parking system 750.
  • according to an embodiment, the travel system 700 may further include other components in addition to the described components, or may not include some of the described components.
  • the travel system 700 may include a processor.
  • each unit of the travel system 700 may individually include a processor.
  • the travel system 700 may control driving in the autonomous driving mode based on learning.
  • a learning mode, and an operation mode premised on completion of the learning, may be performed.
  • a method by which the processor of the travel system 700 performs the learning mode and the operation mode is described below.
  • the learning mode may be performed in the manual mode described above.
  • in the learning mode, the processor of the travel system 700 may perform driving path learning and surrounding environment learning of the vehicle 100.
  • the driving path learning may include generating map data on a path on which the vehicle 100 travels.
  • the processor of the travel system 700 may generate the map data based on information detected by the object detecting apparatus 300 while the vehicle 100 travels from a starting point to a destination.
  • the surrounding environment learning may include storing and analyzing information about the surrounding environment of the vehicle 100 in the driving process and the parking process of the vehicle 100.
  • the processor of the travel system 700 may store and analyze information about the surrounding environment of the vehicle 100 based on information detected by the object detecting apparatus 300 during the parking process of the vehicle 100, for example, location information of the parking space, size information, and information on fixed (or non-fixed) obstacles.
  • the operation mode may be performed in the autonomous driving mode described above.
  • the operation mode will be described on the premise that the driving route learning or the surrounding environment learning is completed through the learning mode.
  • the operation mode may be performed in response to a user input through the input unit 210, or may be automatically performed when the vehicle 100 reaches a driving path and a parking space where learning is completed.
  • the operation mode may include a semi-autonomous operation mode that requires some user manipulation of the driving manipulation apparatus 500, and a fully autonomous operation mode that requires no user manipulation of the driving manipulation apparatus 500.
  • the processor of the travel system 700 may control the driving system 710 in the operation mode so that the vehicle 100 travels along the driving path for which learning has been completed.
  • the processor of the travel system 700 may control the taking-out system 740 in the operation mode to take the parked vehicle 100 out of the parking space for which learning has been completed.
  • the processor of the travel system 700 may control the parking system 750 in the operation mode to park the vehicle 100 from the current position into the parking space for which learning has been completed.
  • the travel system 700 may be a sub-concept of the controller 170.
  • the travel system 700 may be a concept including at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170.
  • the traveling system 710 may perform driving of the vehicle 100.
  • the driving system 710 may receive navigation information from the navigation system 770, provide a control signal to the vehicle driving apparatus 600, and perform driving of the vehicle 100.
  • the driving system 710 may receive object information from the object detecting apparatus 300 and provide a control signal to the vehicle driving apparatus 600 to perform driving of the vehicle 100.
  • the driving system 710 may receive a signal from an external device through the communication device 400, provide a control signal to the vehicle driving device 600, and perform driving of the vehicle 100.
  • the driving system 710 may be a system concept that includes at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170, and that performs driving of the vehicle 100.
  • the driving system 710 may be referred to as a vehicle driving control device.
  • the taking-out system 740 may perform taking out of the vehicle 100.
  • the taking-out system 740 may receive navigation information from the navigation system 770, provide a control signal to the vehicle driving apparatus 600, and perform take-out of the vehicle 100.
  • the taking-out system 740 may receive the object information from the object detecting apparatus 300, provide a control signal to the vehicle driving apparatus 600, and perform take-out of the vehicle 100.
  • the taking-out system 740 may receive a signal from an external device through the communication device 400, provide a control signal to the vehicle driving apparatus 600, and perform taking-out of the vehicle 100.
  • the taking-out system 740 may be a system concept that includes at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170, and that performs taking-out of the vehicle 100.
  • such a taking-out system 740 may be referred to as a vehicle taking-out control device.
  • the parking system 750 may perform parking of the vehicle 100.
  • the parking system 750 may receive navigation information from the navigation system 770, provide a control signal to the vehicle driving apparatus 600, and perform parking of the vehicle 100.
  • the parking system 750 may receive the object information from the object detecting apparatus 300, provide a control signal to the vehicle driving apparatus 600, and perform parking of the vehicle 100.
  • the parking system 750 may receive a signal from an external device through the communication device 400, provide a control signal to the vehicle driving device 600, and perform parking of the vehicle 100.
  • the parking system 750 may be a system concept that includes at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170, and that performs parking of the vehicle 100.
  • Such a parking system 750 may be referred to as a vehicle parking control device.
  • the navigation system 770 can provide navigation information.
  • the navigation information may include at least one of map information, set destination information, route information according to the destination setting, information on various objects on the route, lane information, and current location information of the vehicle.
  • the navigation system 770 may include a memory and a processor.
  • the memory may store navigation information.
  • the processor may control the operation of the navigation system 770.
  • the navigation system 770 may receive information from an external device through the communication device 400 and update the pre-stored information. According to an embodiment, the navigation system 770 may be classified as a subcomponent of the user interface device 200.
  • the telematics system according to an aspect of the present invention may provide communication with an external device through a telematics application, an ECU (Electronic Control Unit), a 5G modem, or the like.
  • the telematics system (or 5G telematics system) according to an aspect of the present invention may be provided in a vehicle together with an AVN (Audio / Video / Navigation) and an ECU.
  • the telematics system can be connected to the AVN and ECU via Ethernet.
  • telematics systems can be connected to CE (Consumer Electronics) devices through the Open Connectivity Foundation (OCF) protocol, which is known as one of the IoT protocols.
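  • As a rough illustration of that OCF-style connection (not the actual OCF stack, whose APIs the patent does not describe), a CE device could be exposed to the telematics system as a discoverable resource with a typed, JSON-serializable representation:

```python
# Illustrative only: an OCF-like resource model for a CE device, sketched as
# a plain dataclass. The real OCF implementation (e.g., IoTivity) is not shown.
from dataclasses import dataclass, field
import json

@dataclass
class Resource:
    href: str                      # resource path, e.g. "/media/renderer"
    rt: str                        # resource type; OCF-style naming assumed
    properties: dict = field(default_factory=dict)

    def representation(self):
        return json.dumps({"href": self.href, "rt": self.rt, **self.properties})

tablet = Resource("/media/renderer", "x.vehicle.mediarenderer",
                  {"state": "idle", "supports": ["h264", "aac"]})
print(tablet.representation())  # what the TCU would discover and read
```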
  • an internal configuration of the telematics system according to an aspect of the present invention, for example, a service framework, a framework, a platform service, an OS layer, and HW, will be described in detail below.
  • Telematics system architecture should be considered in terms of E / E (Electrical / Electronic) architecture.
  • the software in the telematics system must be designed to be compatible or applicable to heterogeneous architectures and all ECUs.
  • a telematic control unit (TCU) of a telematics system may be implemented in a structure separate from a modem (case 1).
  • case 1 is a method in which the modem and telematics are separated in the TCU to communicate using only the interface.
  • case 2 is a method in which the modem and the AP are implemented in one SoC, with the entire TCU built on top of it.
  • a method of separately implementing a modem and a telematics system may be considered.
  • the Telematics Control Unit can be considered in two types. First, the modem and the application processor are implemented as physically different chipsets, so a TCU composed of two chipsets can be considered. Second, the modem and the application processor are physically composed of one chipset, a form that requires no special I/F configuration. Besides these two types, there is also a form in which the modem and the application processor are composed of two chipsets but do not share one board; this is similar to the first type in terms of structure and S/W configuration.
  • the modem only plays a role as a communication module and may be implemented to provide all data to the AP (Application Processor) over PCIe.
  • the AP can pass data to other connected ECUs.
  • a service having a special function may be configured according to its purpose. For example, there may be a device driver configuration for high-speed data processing related to data routing and a configuration having priority in S / W according to characteristics of ECUs to be linked.
  • alternatively, the modem and the AP may be configured as one module. Data can then be transferred through internal memory without a PCIe I/F. On the other hand, due to the priority and performance allocation of the modem for data processing, the role of the application processor can be reduced.
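  • The per-ECU software priority mentioned above can be pictured with a small routing sketch: data arriving from the modem is queued by the priority of its destination ECU and forwarded highest-priority-first. The priority table and ECU names are illustrative assumptions, not values from the patent.

```python
# Sketch with assumed priorities and illustrative ECU names: route modem data
# to connected ECUs highest-priority-first using a priority queue.
import heapq
import itertools

ECU_PRIORITY = {"ADAS": 0, "AVN": 1, "RSE": 2}  # lower number = more urgent

class TcuRouter:
    def __init__(self):
        self._queue = []
        self._seq = itertools.count()  # tie-breaker keeps FIFO within a class

    def enqueue(self, dest_ecu, payload):
        prio = ECU_PRIORITY.get(dest_ecu, 99)
        heapq.heappush(self._queue, (prio, next(self._seq), dest_ecu, payload))

    def forward_all(self, send):
        while self._queue:
            _, _, dest, payload = heapq.heappop(self._queue)
            send(dest, payload)  # e.g., write to the ECU's Ethernet socket

router = TcuRouter()
router.enqueue("RSE", b"video-chunk")
router.enqueue("ADAS", b"object-list")
router.forward_all(lambda dest, data: print(dest, len(data), "bytes"))
```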
  • SOME/IP and DDS can be provided as the framework, and the framework must be designed to run on various E/E architectures.
  • hereinafter, the function of each component of the telematics system architecture will be described.
  • a Service Oriented Architecture offers i) a fully integrated system with complete control, ii) rapid response to rapidly changing business conditions or vendor-specific platforms, and iii) a price advantage.
  • both SOME/IP (Scalable service-Oriented MiddlewarE over IP) and DDS (Data Distribution Service) are SOA-based middleware solutions.
  • service discovery is used to find service instances and to find out if a service instance is running.
  • SOME / IP was implemented in vehicles before DDS.
  • DDS was first implemented in the aviation and military sectors and could benefit V2X.
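  • At its core, the service discovery used by SOME/IP and DDS is an offer/find exchange. The toy registry below mimics that pattern only; the real protocols are multicast, wire-level mechanisms (e.g., SOME/IP-SD), and the service ID and endpoint here are hypothetical.

```python
# Toy offer/find registry mimicking the discovery pattern shared by
# SOME/IP-SD and DDS; not the actual wire protocols.
class ServiceRegistry:
    def __init__(self):
        self._instances = {}  # (service_id, instance_id) -> endpoint

    def offer(self, service_id, instance_id, endpoint):
        """A provider announces that a service instance is running."""
        self._instances[(service_id, instance_id)] = endpoint

    def stop_offer(self, service_id, instance_id):
        self._instances.pop((service_id, instance_id), None)

    def find(self, service_id):
        """A consumer asks which instances of a service are available."""
        return {k[1]: v for k, v in self._instances.items() if k[0] == service_id}

registry = ServiceRegistry()
registry.offer(0x1234, 1, ("192.168.7.10", 30509))  # hypothetical ID/endpoint
print(registry.find(0x1234))  # -> {1: ('192.168.7.10', 30509)}
```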
  • the SW platform may be based on Automotive Grade Linux (AGL) or GENIVI, and the in-vehicle network may support Time Sensitive Networking (TSN) and enhanced Audio Video Bridging (eAVB).
  • MultiLink Controller is a module for data transmission management for various networks.
  • the MultiLink Controller can support management of connected radio channels such as modem communication (LTE/5G) and Wi-Fi.
  • Modem communication can be used for data that needs to be transmitted in real time, and monitoring and backup data can be controlled to be transmitted only via Wi-Fi depending on the memory setting.
  • Each communication must be considered in terms of mobility and charging, and can be flexibly modified according to the service provided in the vehicle.
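  • As a sketch only (the class names and policy rules below are assumptions, not part of the original disclosure), a MultiLink-style controller can be reduced to a routing decision that maps each traffic class to a radio link:

```python
from dataclasses import dataclass
from enum import Enum

class Link(Enum):
    MODEM_5G = "modem-5g"   # cellular link: wide coverage, usage-billed
    WIFI = "wifi"           # local link: free but intermittent

@dataclass
class Traffic:
    name: str
    realtime: bool          # must be delivered with low latency
    bulk: bool              # large monitoring/backup payloads

def select_link(t: Traffic, wifi_available: bool) -> Link:
    """Toy MultiLink policy: real-time data rides the modem; bulk
    monitoring/backup data waits for Wi-Fi to avoid cellular charges."""
    if t.realtime:
        return Link.MODEM_5G
    if t.bulk and not wifi_available:
        # Defer rather than pay for cellular transfer of bulk data.
        raise RuntimeError(f"{t.name}: buffered until Wi-Fi is available")
    return Link.WIFI

print(select_link(Traffic("see-through video", True, False), wifi_available=False))
```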
  • The media cast center handles the multimedia services received or transmitted through the TCU.
  • The media cast center can perform processing on various multimedia sources as well as handle the basic streaming protocol and data compression/encoding/decoding. If a 5G network is used, multimedia services are expected to multiply, so this is expected to be an important service not only in the TCU but also in the AVN specification.
  • IoT2V is an IoT service gateway that can interoperate with various IoT services; it corresponds to a gateway for interworking with the AI services of service providers (e.g., Amazon, Google, Naver).
  • The Cloud Service Manager is a service manager that exposes the services of interworking network servers as if they were in-vehicle services.
  • A service supporting SOA may be considered as a proxy service so that the same functions are provided inside the vehicle; a proxy sketch is given below.
  • Service interworking between SOA frameworks may be affected by the protocol transmission method of the framework rather than by the physical location of the service.
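  • The following Python sketch (illustrative only; the registry and service names are assumptions) shows the proxy idea: a cloud-hosted service is registered under the same interface as a local service, so a caller cannot tell the physical locations apart.

```python
from typing import Callable, Dict

class ServiceRegistry:
    """Toy SOA registry: every service, local or cloud-backed, is a callable."""
    def __init__(self) -> None:
        self._services: Dict[str, Callable[[str], str]] = {}

    def register(self, name: str, handler: Callable[[str], str]) -> None:
        self._services[name] = handler

    def call(self, name: str, request: str) -> str:
        return self._services[name](request)

def local_climate_service(request: str) -> str:
    return f"climate: handled '{request}' on the ECU"

def cloud_proxy(request: str) -> str:
    # A real proxy would forward over OCF/HTTP to the network server;
    # here the remote hop is only simulated.
    return f"cloud: handled '{request}' on the network server"

registry = ServiceRegistry()
registry.register("climate", local_climate_service)
registry.register("route-forecast", cloud_proxy)  # cloud service, same interface

# The caller uses the same API regardless of where the service runs.
print(registry.call("climate", "set 21C"))
print(registry.call("route-forecast", "road ahead?"))
```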
  • FIG. 11 is a view for explaining an implementation example of a telematics system according to an aspect of the present invention.
  • According to the telematics system of an aspect of the present invention, cloud computing usability (feasibility) can be improved as cloud computing performance is enhanced.
  • A low-cost SoC design becomes possible, since the TCU's CPU/GPU share can be reduced in comparison with pure embedded computing.
  • With the telematics system, it is possible to dynamically switch from embedded computing to cloud computing through a dynamic configuration based on SOA (Service Oriented Architecture).
  • FIG. 12 is a view for explaining an implementation example of a telematics system according to an aspect of the present invention. According to one aspect of the invention, it is possible to recognize the road situation in real time by retrieving images from vehicles running ahead of the vehicle equipped with the telematics system.
  • FIG. 13 is a diagram for describing an implementation example of a telematics system according to an aspect of the present disclosure.
  • Synchronized playback of content on multiple display devices (e.g., mobile, CID (Center Information Display), RSE (Rear Seat Entertainment)) connected over in-vehicle Ethernet is possible. For example, i) a high-resolution image can be down-streamed once and re-transmitted to the multiple displays in the vehicle, enabling synchronized playback, and ii) CE devices can be connected to the telematics system using an IoT protocol.
  • VIO can be implemented with a monocular camera system and inertial sensors (e.g., accelerometer, gyroscope).
  • The combination of VIO and smartphone-grade GPS can provide centimeter-level positioning on the map at a lower cost than RTK GPS systems.
  • The TCU can perform VIO-based HD positioning by combining external camera images with the inertial sensors and GPS.
  • The TCU transmits the positioning information to the AVN to compare its accuracy against general GPS information.
  • The AVN can compare the mapping accuracy of the AR overlay under HD positioning versus general positioning.
  • The telematics system may provide distributed processing using i) the cloud, ii) the computing power of a mobile device, or iii) embedded computing, through a service-oriented communication framework.
  • The telematics AP of the telematics system can translate between the different service-oriented communication methods.
  • The services provided by the cloud and the mobile device may be made available in the TCU through IoTivity.
  • Services found in the TCU may be provided via SOME/IP or DDS on the in-vehicle network.
  • A camera-based application service can be verified by comparing cloud, mobile, and embedded computing.
  • The telematics system can also serve the protocol of a media multicast service through OCF and an external interface.
  • The SW platform of the telematics system according to an aspect of the present invention can be designed from the SOA perspective, and its main features may be as follows.
  • The E/E architecture is based on the manufacturer's requirements and the TCU can be integrated or separated; the SW components of the telematics system according to the present invention can operate flexibly in any ECU regardless of the manufacturer's E/E architecture.
  • FIG. 18 is a view for explaining a telematics system according to an aspect of the present invention. Referring to FIG. 18, it can be seen that both the cloud and the in-vehicle network must be supported to implement a time sensitive network (TSN) over 5G.
  • TSN is a deterministic network that targets time-sensitive applications. TSN was developed to provide a way to deliver information from source to destination in a i) fixed and ii) predictable time, with a focus on control data. TSN features may include i) time synchronization (e.g., better than 1 us) and guaranteed end-to-end latency, ii) resource reservation, iii) extraordinarily low packet loss ratios (10^-6 to 10^-9), and iv) convergence of all data streams. A time-synchronization sketch follows.
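  • As a sketch of the time-synchronization ingredient only (a simplified two-way exchange in the spirit of gPTP/IEEE 802.1AS; all timestamps below are fabricated for illustration), the clock offset between two nodes can be estimated so that end-to-end latency budgets become measurable:

```python
def estimate_offset_and_delay(t1: float, t2: float, t3: float, t4: float):
    """Classic two-way time transfer:
    t1 = master send, t2 = slave receive, t3 = slave send, t4 = master receive.
    Assumes a symmetric path, as PTP-style protocols do."""
    delay = ((t4 - t1) - (t3 - t2)) / 2.0      # one-way path delay
    offset = ((t2 - t1) + (t3 - t4)) / 2.0     # slave clock minus master clock
    return offset, delay

# Fabricated timestamps (seconds) for illustration.
offset, delay = estimate_offset_and_delay(t1=100.000000, t2=100.000150,
                                          t3=100.000900, t4=100.001000)
print(f"offset={offset*1e6:.1f} us, path delay={delay*1e6:.1f} us")
# A TSN stream would then be scheduled against the corrected clock and
# checked against its guaranteed end-to-end latency bound.
```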
  • FIG. 19 illustrates a relationship between a telematics system and a cloud according to an aspect of the present invention.
  • The cloud service may be provided like an in-vehicle service based on the SOA framework.
  • 5G-based cloud computing is thereby provided. That is, according to the present invention, a service that is difficult to process in the embedded system can be processed through the cloud. Fundamentally, the framework of the cloud must be the same as the embedded framework in the vehicle to enable communication.
  • Services and applications that can be provided according to the present invention may include a vehicle location determination service, a camera video relay service, an IoT service, and the latest DSM engine.
  • All services that the vehicle or the cloud can provide may be registered in advance.
  • The API Gateway can provide interoperability between vehicles and devices by means of RESTful APIs; a minimal gateway sketch follows.
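  • A minimal sketch of such a gateway, using only the Python standard library (the route name and payload shape are assumptions for illustration, not a defined interface of the system):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical pre-registered services (see "registered in advance" above).
SERVICES = {"vehicle-position": lambda: {"lat": 37.5665, "lon": 126.9780}}

class ApiGateway(BaseHTTPRequestHandler):
    """Exposes registered vehicle/cloud services as RESTful resources."""
    def do_GET(self):
        name = self.path.strip("/")            # e.g. GET /vehicle-position
        if name not in SERVICES:
            self.send_error(404, "unknown service")
            return
        body = json.dumps(SERVICES[name]()).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), ApiGateway).serve_forever()
```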
  • FIG. 20 is a view for explaining the relationship between the framework and the cloud of the telematics system according to an aspect of the present invention.
  • The cloud and the telematics AP may be connected through the Open Connectivity Foundation (OCF) for IoT communication.
  • The telematics unit of the vehicle must therefore have i) an OCF framework for IoT communication and ii) an SOA protocol for embedded communication (e.g., SOME/IP or DDS).
  • FIG. 21 is a diagram for describing a media cast center of a telematics system according to an aspect of the present invention. Specifically, efficient sharing of media data via broadband Ethernet will be described.
  • The basic functions of media processing performed by the Media Cast Center are as follows: i) relay (e.g., unicast relay or multicast relay), ii) de-multiplexing (separating one stream into two media streams, one for video and one for audio), iii) media data transport, iv) media streaming control, v) interfaces between services, including the out-vehicle OCF interface and the SOME/IP interface, and vi) encoder/decoder. A relay sketch is shown below.
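  • To make the relay function concrete, here is a toy UDP re-transmitter (illustrative only; the addresses and ports are fabricated, and a real relay would also handle RTP headers and flow control). It receives a unicast media stream and re-sends each packet to an in-vehicle multicast group:

```python
import socket

IN_ADDR = ("0.0.0.0", 5004)            # where the unicast stream arrives
OUT_GROUP = ("239.1.1.1", 5006)        # in-vehicle multicast group (fabricated)

def relay_forever() -> None:
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(IN_ADDR)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    tx.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    while True:
        packet, _ = rx.recvfrom(2048)  # one media datagram (e.g., an RTP packet)
        tx.sendto(packet, OUT_GROUP)   # fan out to every display on the group

if __name__ == "__main__":
    relay_forever()
```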
  • Use cases that may be provided through the Media Cast Center include i) see-through and ii) synchronized playing at multiple CE devices. With see-through, it is possible to share in-vehicle media data (e.g., front camera video) with external vehicles, or with A/V devices connected to the vehicle, via the cloud.
  • The media source may include i) a media file in the cloud or on the web, or ii) a media stream URL (Uniform Resource Locator) provided by a streaming server or by a device connected to the vehicle.
  • FIG. 22 illustrates a positioning service provided by a telematics system according to an aspect of the present invention.
  • High-precision positioning of the vehicle can be provided through the integration of GPS/GNSS and Visual-Inertial Odometry (VIO).
  • The main features of the positioning service provided by the telematics system according to an aspect of the present invention may be as follows (a fusion sketch follows this list).
  • Sensor collection: sensor data is collected from the GPS, accelerometer, gyroscope, and camera, and this data should be time-synchronized.
  • VIO algorithm: estimates the relative position and orientation of the vehicle using the camera and the inertial sensors.
  • The camera processing block detects features and tracks them.
  • The inertial data processing block samples the sensors at a very high frequency (100 Hz or higher).
  • GPS + VIO fusion: tightly couples GPS/GNSS measurements with the local/relative coordinate measurements from VIO to achieve a highly accurate global position.
  • Position handler: the SOA framework-based protocol is used to transfer the global position to the Vehicle Position Manager in the cloud.
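  • The sketch below illustrates the fusion idea in a deliberately simplified, loosely coupled form (a complementary blend rather than the tightly coupled estimator described above; all gains and measurements are fabricated): VIO supplies smooth relative motion, and GPS anchors it in global coordinates.

```python
from dataclasses import dataclass

@dataclass
class Pose2D:
    x: float  # east, meters
    y: float  # north, meters

class GpsVioFusion:
    """Toy loosely coupled fusion: integrate VIO deltas, then pull the
    estimate toward each (noisy, low-rate) GPS fix with a fixed gain."""
    def __init__(self, start: Pose2D, gps_gain: float = 0.2):
        self.pose = start
        self.gain = gps_gain

    def on_vio_delta(self, dx: float, dy: float) -> None:
        self.pose = Pose2D(self.pose.x + dx, self.pose.y + dy)

    def on_gps_fix(self, gx: float, gy: float) -> None:
        self.pose = Pose2D(self.pose.x + self.gain * (gx - self.pose.x),
                           self.pose.y + self.gain * (gy - self.pose.y))

fusion = GpsVioFusion(Pose2D(0.0, 0.0))
for _ in range(10):                    # 10 VIO steps of 0.5 m east each
    fusion.on_vio_delta(0.5, 0.0)
fusion.on_gps_fix(5.8, 0.3)            # one noisy GPS fix
print(f"fused position: ({fusion.pose.x:.2f}, {fusion.pose.y:.2f}) m")
```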
  • In a DSDA (Dual SIM Dual Active) configuration, SIM #1 is used by the vehicle manufacturer for OTA, V2X, big data, and maintenance, while SIM #2 allows the user to select a carrier for the Wi-Fi hotspot, web access, and video streaming.
  • From a 5G telematics point of view, DSDA has a low dependency on 5G on the technical side, and the interests of carriers and vehicle manufacturers are key to commercializing the technology. To date, no 5G modem vendor prototype has been published.
  • FIG. 23 is a diagram for describing an electronic horizon provided by a telematics system according to an aspect of the present disclosure.
  • The telematics system can provide geometry information about the road ahead of the vehicle. This can be called the Electronic Horizon and can be provided through the ADASIS protocol.
  • The telematics system can act as a horizon provider. Details are as follows.
  • The Electronic Horizon provides geometry information to other ECUs.
  • The Electronic Horizon requires the vehicle's position (GPS) and a digital map (from a cloud service).
  • The horizon information comprises position, path, lane, road attributes (tunnel, bridge), speed limit, traffic signs, and warnings (e.g., under construction).
  • The ADASIS Forum defines the ADASIS protocol to standardize the interface between a digital map and ADAS applications.
  • ADASIS v2 is designed for CAN bus communication and is restricted to 8-byte messages (a packing sketch follows).
  • ADASIS v2 consists of a Horizon Provider, a Horizon Reconstructor, the ADASIS protocol, and ADAS applications.
  • ADASIS v3 is designed for higher-bandwidth communication, provides additional and more detailed content (lane-level information), and supports detailed inputs (e.g., HD-GPS, HD maps, sensors, and V2X).
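  • To make the 8-byte constraint of ADASIS v2 tangible, the sketch below packs a hypothetical horizon message into one classic CAN frame. The field layout is invented for illustration and is not the standardized ADASIS v2 format:

```python
import struct

def pack_horizon_stub(offset_m: int, lane_count: int,
                      speed_limit_kph: int, road_type: int) -> bytes:
    """Pack a toy 'horizon profile' into exactly 8 bytes (one CAN frame):
    4-byte path offset, then lane count, speed limit, road type, reserved."""
    frame = struct.pack("!IBBBB", offset_m, lane_count,
                        speed_limit_kph, road_type, 0)
    assert len(frame) == 8, "ADASIS v2 payloads must fit a classic CAN frame"
    return frame

frame = pack_horizon_stub(offset_m=1250, lane_count=3,
                          speed_limit_kph=100, road_type=2)  # 2 = bridge (toy code)
print(frame.hex())  # prints 000004e203640200
```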
  • FIGS. 24 to 31 are views for explaining implementation examples of a telematics system according to an aspect of the present invention. More specifically, FIGS. 24 and 25 are diagrams for explaining route forecasting, FIGS. 26 and 27 for explaining crowd-eye sourcing, FIGS. 28 and 29 for explaining take-out cinema, and FIGS. 30 and 31 for explaining adaptive guidance.
  • Route forecasting refers to a 5G navigation service capable of predicting road conditions in real time by retrieving images from the vehicles ahead.
  • Key values for implementing route forecasting may be real-time video of a desired road, linkage with a route, and video information retrieval. Through route forecasting, real-time video can be shared between vehicles and the shared video can be analyzed. Route forecasting can make use of cloud computing, HD positioning, and AR. As specific implementations, i) it may be possible to predict road conditions by retrieving the images of the vehicles ahead when the road is congested, ii) it may be used for briefing the driver on a difficult road section before the vehicle starts, and iii) it can be used for route guidance using images of surrounding landmarks. Referring to FIG. 25, the technical elements required for implementing route forecasting may include the aforementioned TSN, DPDK, SOA framework, advanced positioning, cloud server, media streaming, and AR engine.
  • Crowd-eye sourcing refers to a service in which various vehicles are used to secure images of one's own vehicle taken from various angles.
  • Crowd-eye sourcing enables real-time video sharing and editing between vehicles, and can make use of cloud computing and the IoT manager. As specific implementation examples, i) a system that obtains black-box images of surrounding vehicles from various angles and immediately combines them to provide them to the driver and an insurance company in case of an accident, and ii) receiving and utilizing images of one's own vehicle taken by surrounding vehicles while traveling, are possible. Referring to FIG. 27, the technical elements required to implement crowd-eye sourcing may include the aforementioned TSN, DPDK, advanced positioning, cloud server, and media streaming.
  • Take-out cinema refers to a solution for viewing ultra-high-definition streaming content outside the vehicle by connecting the 5G infotainment system to peripheral devices.
  • Key values for implementing take-out cinema are extending the role of the car HMI to that of an outdoor media center, utilizing the IVI interface on tethered devices, and easier and more efficient tethering compared to smartphones.
  • Take-out cinema can be used to build an integrated connectivity system that utilizes peripheral multimedia devices.
  • Take-out cinema can make use of the Media Multicast Manager and AR.
  • For example, i) a media device such as a projector or a speaker can be connected to enjoy high-definition content outdoors, ii) a high-quality, high-definition device can be used in the vehicle as a media room, and iii) a VR device can be connected to provide a surround-video viewing service while on board.
  • The technical elements required to implement take-out cinema may include the above-described media streaming and cloud server.
  • Adaptive guidance refers to an autonomous driving system that receives the road situation from the vehicle ahead, sets a driving pattern matched to the occupants' activity, and provides attention notifications.
  • Key values of adaptive guidance may be path/speed control based on occupant activity, advance notice and warning of precautions, and DMS utilization with major-activity analysis.
  • Adaptive guidance enables the sharing of real-time road situation information and the analysis of occupant activity.
  • Adaptive guidance can make use of cloud computing, the data manager, and E-horizon.
  • For example, i) in manual driving, the driver's forward attention can be monitored to provide traffic-light change notifications; ii) in automated driving, objects related to the occupant's personal work inside the vehicle can be monitored to warn of starting, stopping, or lane changing; and iii) if the user is sleeping during automated driving, the route may be changed to avoid uneven sections or sudden starts/stops.
  • The technical elements for implementing adaptive guidance may include TSN, DPDK, advanced positioning, cloud server, DMS, road forecasting, and IoTivity.
  • FIGS. 32 to 34 are views for explaining the technical effects of a telematics system according to an aspect of the present invention.
  • The Media Cast Center and the cloud server of the telematics system shown in FIG. 32 are characterized by high throughput.
  • The Media Cast Center features relay, media data transport, media streaming control, and encoder/decoder functions.
  • Restrictions of the high-throughput design may include i) no consideration of a secure channel to the content server, and ii) no consideration of content copyright.
  • The TCU and cloud server of the telematics system shown in FIG. 33 are characterized by low latency.
  • 360-degree media processing is possible through the SOA framework (SOME/IP or IoTivity) of the TCU and the cloud server.
  • The low-latency constraints may include i) the need for edge computing by the MNO, ii) 360-degree or VR camera performance, iii) the 360-degree or VR player (software/required HW), and iv) the player interface.
  • The telematics system shown in FIG. 34 is characterized by vehicle-to-network (V2N) see-through.
  • The telematics system according to an aspect of the present invention may provide the see-through function through the SOA framework, a cloud server, and the media cast center.
  • FIGS. 35 and 36 illustrate a signal transmission and reception sequence between a TCU and a cloud server (or network server) in a telematics system according to an aspect of the present invention.
  • The signal transmission and reception sequence between the network server and the TCU will be described with reference to FIG. 35 (a condensed sketch of the flow follows the steps).
  • A. A remote monitoring start event occurs on the server.
  • B. The Camera Cast Service of the server checks and opens the port number on which it will receive video from the Media Cast Service (hereinafter referred to as MCS) of the 5G TS (initialization of the server).
  • D. The Vehicle Event Manager of the 5G TS forwards the server's start request (step C) to the MCS.
  • E. The MCS prepares to turn on the camera and deliver the video to the server, based on the information delivered by the Vehicle Event Manager of the 5G TS.
  • G. If necessary, the camera video is re-packetized and sent to the server.
  • H. A remote monitoring stop event occurs on the server.
  • I. The MQTT Broker on the server asks the Vehicle Event Manager of the 5G TS to stop sending the camera video.
  • J. The Vehicle Event Manager of the 5G TS sends a camera stop request to the MCS.
  • K. The MCS stops the camera video transmission requested in step J.
  • L. The MCS turns off the camera.
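  • The following condensed sketch mirrors the start/stop flow above as a deliberately simplified, broker-less simulation (the class names and message strings are assumptions; the original does not specify an implementation):

```python
class MediaCastService:
    """5G TS side: controls the camera and the video stream (steps E, G, K, L)."""
    def __init__(self) -> None:
        self.camera_on = False

    def start(self, server_port: int) -> None:
        self.camera_on = True
        print(f"MCS: camera on, streaming to server port {server_port}")  # E, G

    def stop(self) -> None:
        print("MCS: stream stopped")       # step K
        self.camera_on = False
        print("MCS: camera off")           # step L

class VehicleEventManager:
    """5G TS side: relays the server's start/stop requests to the MCS (steps D, J)."""
    def __init__(self, mcs: MediaCastService) -> None:
        self.mcs = mcs

    def on_server_message(self, command: str, server_port: int = 0) -> None:
        if command == "start-camera":
            self.mcs.start(server_port)    # step D -> E
        elif command == "stop-camera":
            self.mcs.stop()                # step J -> K, L

# Server side: a remote monitoring start event (step A) after opening a port
# (step B), then a stop event (step H) delivered through the broker (step I).
mcs = MediaCastService()
vem = VehicleEventManager(mcs)
vem.on_server_message("start-camera", server_port=5004)
vem.on_server_message("stop-camera")
```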
  • A/V is output in synchronized form by using the reference time information (in RTCP) transmitted periodically together with the timestamp in the RTP packet header; a timestamp-mapping sketch follows.
  • RTP (Real-time Transport Protocol) over UDP (User Datagram Protocol) may be used as the transport protocol.
  • RTSP (Real Time Streaming Protocol) may be used as the streaming control protocol.
  • RTCP (Real-time Transport Control Protocol) may be used as the QoS protocol.
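  • As a sketch of the synchronization step only (the numeric values are fabricated; real RTCP sender reports carry NTP timestamps in 64-bit fixed point), an RTCP sender report lets a receiver map an RTP timestamp to wall-clock time so that the audio and video streams can be aligned:

```python
def rtp_to_wallclock(rtp_ts: int, sr_rtp_ts: int, sr_wallclock: float,
                     clock_rate: int) -> float:
    """Map an RTP timestamp to wall-clock time using the most recent
    RTCP sender report (which pairs one RTP timestamp with one NTP time)."""
    return sr_wallclock + (rtp_ts - sr_rtp_ts) / clock_rate

# Fabricated sender-report data for the two streams of one session.
video_time = rtp_to_wallclock(rtp_ts=94500, sr_rtp_ts=90000,
                              sr_wallclock=1000.0, clock_rate=90000)  # 90 kHz video
audio_time = rtp_to_wallclock(rtp_ts=50400, sr_rtp_ts=48000,
                              sr_wallclock=1000.0, clock_rate=48000)  # 48 kHz audio
print(f"video frame at t={video_time:.3f}s, audio frame at t={audio_time:.3f}s")
# The player schedules both streams on this common clock, yielding lip sync.
```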
  • The definition and role of each component shown in FIG. 35 may be as shown in Table 1 below.
  • FIG. 36 is a flowchart illustrating an interworking scenario for using the media cast center between a telematics system and a cloud server according to an aspect of the present invention.
  • The 5G TS of the vehicle prepares to transmit a request to the cloud server.
  • The vehicle controls the frequency output through an antenna unit connected to the 5G TS by a coaxial cable.
  • The vehicle transmits signals to the cloud server through a gNB connected by an Ethernet cable.
  • The vehicle receives the content of the media cast center of the cloud server from the cloud server.
  • The vehicle delivers the content received through the antenna to the 5G TS through the cable.
  • The information is then transmitted to the AVN, which is connected to the 5G TS by an Ethernet cable.
  • The embodiments of the present invention described above may be implemented through various means. For example, the embodiments of the present invention may be implemented by hardware, firmware, software, or a combination thereof.
  • In the case of implementation by hardware, a method according to embodiments of the present invention may be implemented by one or more Application-Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field-Programmable Gate Arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and the like.
  • In the case of implementation by firmware or software, a method according to the embodiments of the present invention may be implemented in the form of a module, a procedure, or a function that performs the functions or operations described above.
  • The software code may be stored in a memory unit and driven by a processor.
  • The memory unit may be located inside or outside the processor and may exchange data with the processor by various known means.
  • The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded.
  • The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROM, RAM, CD-ROMs, magnetic tape, floppy disks, and optical data storage devices, and also include implementations in the form of carrier waves (e.g., transmission over the Internet).
  • The computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all respects and should be considered illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method of controlling a telematics system installed in a vehicle is disclosed. Specifically, the method comprises the steps of: receiving a request for content output; requesting, from an external server, information related to the content via a base station connected to the telematics system; and receiving the information from the external server via the base station.
PCT/KR2019/001975 2018-06-25 2019-02-19 Système télématique installé dans un véhicule, et procédé de commande associé WO2020004767A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862689254P 2018-06-25 2018-06-25
US62/689,254 2018-06-25

Publications (1)

Publication Number Publication Date
WO2020004767A1 true WO2020004767A1 (fr) 2020-01-02

Family

ID=68985747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/001975 WO2020004767A1 (fr) 2018-06-25 2019-02-19 Système télématique installé dans un véhicule, et procédé de commande associé

Country Status (1)

Country Link
WO (1) WO2020004767A1 (fr)



Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100750583B1 * 2006-08-30 2007-08-20 엠큐브웍스(주) Network system providing a streaming service
KR20120040496A * 2010-10-19 2012-04-27 주식회사 칼리 Telematics middleware system and service providing method using the same
KR20130113283A * 2012-04-05 2013-10-15 엘지전자 주식회사 Method for acquiring content for a vehicle, method for displaying content for a vehicle, content display system for a vehicle, and electronic device for a vehicle
KR20150071807A * 2013-12-18 2015-06-29 현대자동차주식회사 Cloud system for a vehicle
KR20170114051A * 2016-04-01 2017-10-13 현대엠엔소프트 주식회사 Multimedia apparatus for a vehicle

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111541757A (zh) * 2020-04-17 2020-08-14 一汽解放汽车有限公司 车载交互方法、装置、设备和存储介质
WO2021235567A1 (fr) * 2020-05-19 2021-11-25 엘지전자 주식회사 Procédé pour un service v2x, et serveur utilisant ce procédé
US20220094457A1 (en) * 2020-09-19 2022-03-24 Ibiquity Digital Corporation Content Linking Multicast Streaming for Broadcast Radio
US12009909B2 (en) * 2020-09-19 2024-06-11 Ibiquity Digital Corporation Content linking multicast streaming for broadcast radio
CN113335018A (zh) * 2021-06-11 2021-09-03 华东理工大学 一种基于some/ip的车载空调服务调用系统
CN113335018B (zh) * 2021-06-11 2022-12-06 华东理工大学 一种基于some/ip的车载空调服务调用系统
CN114268666B (zh) * 2021-12-08 2024-05-03 东软睿驰汽车技术(沈阳)有限公司 支持面向服务架构soa的通用域控制器、车辆及交互系统
CN114268666A (zh) * 2021-12-08 2022-04-01 东软睿驰汽车技术(沈阳)有限公司 支持面向服务架构soa的通用域控制器、车辆及交互系统
CN114553873A (zh) * 2022-02-27 2022-05-27 重庆长安汽车股份有限公司 基于soa的车云协同控制系统、方法及可读存储介质
CN114670902A (zh) * 2022-04-28 2022-06-28 中车青岛四方车辆研究所有限公司 远程复位和紧急制动远程缓解的处理方法及系统
CN114979231B (zh) * 2022-05-30 2023-05-26 重庆长安汽车股份有限公司 基于整车dds协议的车辆控制方法、系统及汽车
CN114979231A (zh) * 2022-05-30 2022-08-30 重庆长安汽车股份有限公司 一种基于整车dds协议的移动终端实时车辆控制方法、系统及汽车
CN115731710A (zh) * 2022-11-21 2023-03-03 奥特酷智能科技(南京)有限公司 基于DDS协议与LTE-V-Direct技术的自动驾驶车辆自组网系统及方法
CN116828000A (zh) * 2023-08-28 2023-09-29 山东未来互联科技有限公司 基于确定性网络与sdn网络的乘车订单处理系统及方法
CN116828000B (zh) * 2023-08-28 2023-11-17 山东未来互联科技有限公司 基于确定性网络与sdn网络的乘车订单处理系统及方法


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19824446

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19824446

Country of ref document: EP

Kind code of ref document: A1