WO2020004767A1 - Telematics system provided in vehicle and method for controlling same - Google Patents

Telematics system provided in vehicle and method for controlling same

Info

Publication number
WO2020004767A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
telematics system
external server
protocol
Prior art date
Application number
PCT/KR2019/001975
Other languages
French (fr)
Korean (ko)
Inventor
Jungeun SHIN
Janghyung CHO
Sungha CHOI
Original Assignee
LG Electronics Inc.
Priority date
Filing date
Publication date
Application filed by LG Electronics Inc.
Publication of WO2020004767A1 publication Critical patent/WO2020004767A1/en

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04W: WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00: Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30: Services specially adapted for particular environments, situations or purposes
    • H04W 4/40: Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]

Definitions

  • the present invention relates to a telematics system provided in a vehicle and a method for controlling the same, and more particularly, to a method for transmitting and receiving a signal to and from an external server through a base station.
  • the vehicle traditionally functions as a user's means of transportation, but is now provided with various sensors, electronic devices, and the like to improve driving convenience for the user.
  • ADAS Advanced Driver Assistance System
  • ICE in-car entertainment
  • IVI in-vehicle infotainment
  • 5G use cases include Intersection Movement Assistant, Real-time Situational Awareness & HD Maps, Cooperative Lane Change of Automated Vehicles, See-Through, Vulnerable Road User Discovery, Shared Vision, Collective Sensory, Hybrid Intelligence, Journey Studio, Media Center on Wheels, and so on.
  • a vehicle may be provided with a plurality of display devices.
  • a telematics control unit (TCU)
  • the present invention is to provide a telematics system provided in a vehicle and a method of controlling the same.
  • A method of controlling a telematics system provided in a vehicle is proposed, the method comprising receiving a request for content output, requesting information related to the content from an external server through a base station connected to the telematics system, and receiving the information from the external server through the base station.
  • the received information may be delivered to a plurality of target devices connected to the telematics system, respectively.
  • the information is received from the external server via a predetermined protocol, and the predetermined protocol may include a transmission protocol, a streaming control protocol, and a quality of service (QoS) protocol.
  • QoS quality of service
  • if the received information is media content provided by the external server, the telematics system may generate a URL (Universal Resource Locator) that each of the plurality of target devices can access, and deliver each generated URL to the corresponding target device.
  • a URL (Universal Resource Locator)
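The per-device URL delivery described above can be made concrete with a short sketch. In the following Python fragment, the TCU address, port, class names, and token scheme are all illustrative assumptions, not details from the patent: the TCU caches the media content, exposes it at a local endpoint, and hands each connected target device its own URL.

```python
# Hypothetical sketch of per-device URL generation on the TCU.
# TCU_HOST, TCU_PORT, and the token scheme are assumptions for illustration.
from dataclasses import dataclass

TCU_HOST = "192.168.7.1"   # assumed in-vehicle address of the TCU
TCU_PORT = 8554            # assumed local media endpoint

@dataclass
class TargetDevice:
    name: str           # e.g. "CID", "AVN", "RSE-left"
    session_token: str  # per-device token so each URL is individually accessible

def generate_urls(content_id: str, devices: list[TargetDevice]) -> dict[str, str]:
    """Build one URL per target device for media content cached on the TCU."""
    return {
        d.name: f"http://{TCU_HOST}:{TCU_PORT}/media/{content_id}?dev={d.session_token}"
        for d in devices
    }

urls = generate_urls("movie-001", [TargetDevice("CID", "a1"), TargetDevice("RSE", "b2")])
for name, url in urls.items():
    print(name, "->", url)  # each URL would then be delivered to that device
```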
  • the delivered information may be output in synchronization by each of the plurality of target devices, based on reference time information included in the QoS protocol and time information included in the packets constituting the information received from the external server.
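To make the synchronization idea tangible: if every device shares a reference wall-clock time (carried, per the bullet above, in the QoS protocol) and every packet carries a media timestamp, each device can compute the same presentation instant independently. A minimal sketch follows, assuming a 90 kHz media clock; the field names and clock rate are illustrative, not the patent's packet format.

```python
import time

CLOCK_RATE = 90_000  # assumed 90 kHz media clock, common for video transport

def presentation_time(reference_time: float, first_ts: int, packet_ts: int) -> float:
    """Wall-clock instant at which every device should render this packet."""
    return reference_time + (packet_ts - first_ts) / CLOCK_RATE

def play_packet(payload: bytes, packet_ts: int,
                reference_time: float, first_ts: int) -> None:
    target = presentation_time(reference_time, first_ts, packet_ts)
    delay = target - time.time()
    if delay > 0:
        time.sleep(delay)  # wait until the shared presentation instant
    # ...hand the payload to the local decoder here...
```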
  • the plurality of target devices may include a center information display (CID), an audio video navigation (AVN) device, a rear seat entertainment (RSE) device, and a consumer electronics (CE) device provided inside or outside the vehicle.
  • CID center information display
  • AVN audio video navigation
  • RSE rear seat entertainment
  • CE consumer electronics
  • the content received from the external server may correspond to media content or a camera image of another vehicle.
  • the above-described problem of the related art, that is, the failure to efficiently provide content to a plurality of display devices in a vehicle through a TCU (Telematics Control Unit), can be solved.
  • the telematics system may provide audio/video-synchronized content on a plurality of display devices in a vehicle.
  • FIG. 1 is a view showing the appearance of a vehicle according to an aspect of the present invention.
  • FIG. 2 is a view of the vehicle according to an aspect of the present invention, seen from various angles from the outside.
  • FIGS. 3 to 4 are views illustrating the interior of a vehicle according to an aspect of the present invention.
  • FIGS. 5 to 6 are views referred to for explaining an object according to an aspect of the present invention.
  • FIG. 7 is a block diagram referred to for describing a vehicle according to an aspect of the present invention.
  • FIGS. 8 to 10 are diagrams for explaining the structure of a telematics system according to an aspect of the present invention.
  • FIGS. 11 to 14 are diagrams for explaining implementation examples of a telematics system according to an aspect of the present invention.
  • FIG. 15 is a view for explaining the characteristics of the telematics system architecture according to an aspect of the present invention.
  • FIGS. 16 to 17 are diagrams for explaining the SW platform of the telematics system according to an aspect of the present invention.
  • FIG. 18 is a view for explaining a telematics system according to an aspect of the present invention.
  • FIG. 19 illustrates a relationship between a telematics system and a cloud according to an aspect of the present invention.
  • FIG. 20 is a view for explaining the relationship between the framework of the telematics system and the cloud according to an aspect of the present invention.
  • FIG. 21 is a diagram for describing a media cast center of a telematics system according to an aspect of the present invention.
  • FIG. 22 illustrates a positioning service provided by a telematics system according to an aspect of the present invention.
  • FIG. 23 is a diagram for describing an electronic horizon provided by a telematics system according to an aspect of the present invention.
  • FIGS. 24 to 31 are views for explaining implementation examples of a telematics system according to an aspect of the present invention.
  • FIGS. 32 to 34 are views for explaining technical effects of a telematics system according to an aspect of the present invention.
  • FIGS. 35 to 36 are diagrams for explaining a signal transmission and reception sequence between a TCU and a cloud server in a telematics system according to an aspect of the present invention.
  • the vehicle 100 may include a wheel that rotates by a power source and a steering input device 510 for adjusting a traveling direction of the vehicle 100.
  • the vehicle 100 may be an autonomous vehicle.
  • the vehicle 100 may be switched to an autonomous driving mode or a manual mode based on a user input.
  • the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the received user input through the user interface device 200.
  • the vehicle 100 may be switched to the autonomous driving mode or the manual mode based on the driving situation information.
  • the driving situation information may include at least one of information on objects outside the vehicle, navigation information, and vehicle state information.
  • the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the driving situation information generated by the object detecting apparatus 300.
  • the vehicle 100 may be switched from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on the driving situation information received through the communication device 400.
  • the vehicle 100 may switch from the manual mode to the autonomous driving mode or from the autonomous driving mode to the manual mode based on information, data, and signals provided from an external device.
  • the autonomous vehicle 100 may be driven based on the driving system 700.
  • the autonomous vehicle 100 may be driven based on information, data, or signals generated by the traveling system 710, the take-out system 740, and the parking system 750.
  • the autonomous vehicle 100 may receive a user input for driving through the driving manipulation apparatus 500. Based on a user input received through the driving manipulation apparatus 500, the vehicle 100 may be driven.
  • the overall length is the length from the front to the rear of the vehicle 100.
  • the overall width is the width of the vehicle 100.
  • the overall height is the length from the bottom of the wheels to the roof.
  • the overall length direction L is the reference direction for measuring the overall length of the vehicle 100.
  • the overall width direction W is the reference direction for measuring the overall width of the vehicle 100.
  • the overall height direction H is the reference direction for measuring the overall height of the vehicle 100.
  • the vehicle 100 may include a user interface device 200, an object detecting device 300, a communication device 400, a driving manipulation device 500, a vehicle driving device 600, a driving system 700, a navigation system 770, a sensing unit 120, an interface unit 130, a memory 140, a controller 170, and a power supply unit 190.
  • the vehicle 100 may further include other components in addition to the components described herein, or may not include some of the described components.
  • the sensing unit 120 may sense a state of the vehicle.
  • the sensing unit 120 may include an attitude sensor (for example, a yaw sensor, a roll sensor, or a pitch sensor), a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, and the like.
  • the sensing unit 120 may acquire sensing signals for vehicle attitude information, vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle internal temperature information, vehicle internal humidity information, steering wheel rotation angle, vehicle external illumination, pressure applied to the accelerator pedal, pressure applied to the brake pedal, and the like.
  • the sensing unit 120 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
  • AFS air flow sensor
  • ATS intake air temperature sensor
  • WTS water temperature sensor
  • TPS throttle position sensor
  • TDC top dead center
  • CAS crank angle sensor
  • the sensing unit 120 may generate vehicle state information based on the sensing data.
  • the vehicle state information may be information generated based on data sensed by various sensors provided in the vehicle.
  • the vehicle state information may include vehicle attitude information, vehicle speed information, vehicle tilt information, vehicle weight information, vehicle direction information, vehicle battery information, vehicle fuel information, vehicle tire pressure information, vehicle steering information, vehicle indoor temperature information, vehicle indoor humidity information, pedal position information, vehicle engine temperature information, and the like.
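As a small illustration of how raw sensing signals might be aggregated into the vehicle state information listed above, here is a hedged sketch; the field names and raw-signal keys are assumptions for illustration, not the patent's data model.

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_kmh: float
    tilt_deg: float
    battery_pct: float
    fuel_pct: float
    tire_pressure_kpa: dict[str, float]  # per wheel, e.g. {"FL": 230.0}

def build_vehicle_state(raw: dict) -> VehicleState:
    """Aggregate raw sensing signals (illustrative keys) into state info."""
    return VehicleState(
        speed_kmh=raw["wheel_speed_kmh"],
        tilt_deg=raw["inclination_deg"],
        battery_pct=raw["battery_pct"],
        fuel_pct=raw["fuel_pct"],
        tire_pressure_kpa=raw["tire_pressure_kpa"],
    )
```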
  • the interface unit 130 may serve as a path to various types of external devices connected to the vehicle 100.
  • the interface unit 130 may include a port connectable with the mobile terminal, and may connect with the mobile terminal through the port. In this case, the interface unit 130 may exchange data with the mobile terminal.
  • the interface unit 130 may serve as a path for supplying electrical energy to the connected mobile terminal.
  • the interface unit 130 may provide the mobile terminal with electrical energy supplied from the power supply unit 190.
  • the memory 140 is electrically connected to the controller 170.
  • the memory 140 may store basic data for the unit, control data for controlling the operation of the unit, and input / output data.
  • in terms of hardware, the memory 140 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive.
  • the memory 140 may store various data for overall operation of the vehicle 100, such as a program for processing or controlling the controller 170.
  • the memory 140 may be integrally formed with the controller 170 or may be implemented as a subcomponent of the controller 170.
  • the controller 170 may control the overall operation of each unit in the vehicle 100.
  • the controller 170 may be referred to as an electronic control unit (ECU).
  • the power supply unit 190 may supply power required for the operation of each component under the control of the controller 170.
  • the power supply unit 190 may receive power from a battery inside the vehicle.
  • the processors and the controller 170 included in the vehicle 100 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing other functions.
  • ASICs application specific integrated circuits
  • DSPs digital signal processors
  • DSPDs digital signal processing devices
  • PLDs programmable logic devices
  • the vehicle driving apparatus 600, the driving system 700, and the navigation system 770 may have separate processors or may be integrated into the controller 170.
  • the user interface device 200 is a device for communicating with the vehicle 100 and a user.
  • the user interface device 200 may receive a user input and provide the user with information generated in the vehicle 100.
  • the vehicle 100 may implement user interfaces (UI) or user experience (UX) through the user interface device 200.
  • UI user interfaces
  • UX user experience
  • the user interface device 200 may include an input unit 210, an internal camera 220, a biometric detector 230, an output unit 250, and a processor 270. Each component of the user interface device 200 may be structurally and functionally separated from or integrated with the interface unit 130 described above.
  • the user interface device 200 may further include other components in addition to the described components, or may not include some of the described components.
  • the input unit 210 is for receiving information from a user, and the data collected by the input unit 210 may be analyzed by the processor 270 and processed as a user's control command.
  • the input unit 210 may be disposed in the vehicle.
  • the input unit 210 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of a seat, one area of each pillar, one area of a door, one area of the center console, one area of the headlining, one area of the sun visor, one area of the windshield, one area of a window, or the like.
  • the input unit 210 may include a voice input unit 211, a gesture input unit 212, a touch input unit 213, and a mechanical input unit 214.
  • the voice input unit 211 may convert a user's voice input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the controller 170.
  • the voice input unit 211 may include one or more microphones.
  • the gesture input unit 212 may convert a user's gesture input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the controller 170.
  • the gesture input unit 212 may include at least one of an infrared sensor and an image sensor for detecting a user's gesture input.
  • the gesture input unit 212 may detect a 3D gesture input of the user.
  • the gesture input unit 212 may include a light output unit for outputting a plurality of infrared lights, or a plurality of image sensors.
  • the gesture input unit 212 may detect a user's 3D gesture input through a time of flight (TOF) method, a structured light method, or a disparity method.
  • TOF time of flight
  • the touch input unit 213 may convert a user's touch input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the controller 170.
  • the touch input unit 213 may include a touch sensor for detecting a user's touch input.
  • the touch input unit 213 may be integrally formed with the display unit 251 to implement a touch screen. Such a touch screen may provide an input interface and an output interface between the vehicle 100 and the user.
  • the mechanical input unit 214 may include at least one of a button, a dome switch, a jog wheel, and a jog switch.
  • the electrical signal generated by the mechanical input unit 214 may be provided to the processor 270 or the controller 170.
  • the mechanical input unit 214 may be disposed on a steering wheel, a center fascia, a center console, a cockpit module, a door, or the like.
  • the processor 270 may start a learning mode of the vehicle 100 in response to a user input to at least one of the voice input unit 211, the gesture input unit 212, the touch input unit 213, and the mechanical input unit 214 described above.
  • the vehicle 100 may perform driving path learning and surrounding environment learning of the vehicle 100. The learning mode will be described in detail later with reference to the object detecting apparatus 300 and the driving system 700.
  • the internal camera 220 may acquire a vehicle interior image.
  • the processor 270 may detect a state of the user based on the vehicle interior image.
  • the processor 270 may acquire the gaze information of the user from the vehicle interior image.
  • the processor 270 may detect a gesture of the user in the vehicle interior image.
  • the biometric detector 230 may acquire biometric information of the user.
  • the biometric detector 230 may include a sensor for acquiring biometric information of the user, and may acquire fingerprint information, heartbeat information, etc. of the user using the sensor. Biometric information may be used for user authentication.
  • the output unit 250 is for generating output related to vision, hearing, or touch.
  • the output unit 250 may include at least one of the display unit 251, the audio output unit 252, and the haptic output unit 253.
  • the display unit 251 may display graphic objects corresponding to various pieces of information.
  • the display unit 251 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • the display unit 251 may form a layer structure with the touch input unit 213, or be integrally formed with it, to implement a touch screen.
  • the display unit 251 may be implemented as a head up display (HUD).
  • the display unit 251 may include a projection module to output information through an image projected onto the windshield or a window.
  • the display unit 251 may include a transparent display. The transparent display may be attached to the windshield or a window.
  • the transparent display may display a predetermined screen while having a predetermined transparency.
  • the transparent display may include at least one of a transparent thin film electroluminescent (TFEL) display, a transparent organic light-emitting diode (OLED) display, a transparent liquid crystal display (LCD), a transmissive transparent display, and a transparent light emitting diode (LED) display.
  • the transparency of the transparent display can be adjusted.
  • the user interface device 200 may include a plurality of display units 251a to 251g.
  • the display unit 251 may be implemented in one area of the steering wheel, areas 251a, 251b, and 251e of the instrument panel, area 251d of a seat, area 251f of each pillar, area 251g of a door, one area of the center console, one area of the headlining, one area of the sun visor, area 251c of the windshield, or area 251h of a window.
  • the sound output unit 252 converts an electrical signal provided from the processor 270 or the controller 170 into an audio signal and outputs the audio signal. To this end, the sound output unit 252 may include one or more speakers.
  • the haptic output unit 253 generates a tactile output.
  • the haptic output unit 253 may vibrate the steering wheel, the seat belt, and the seats 110FL, 110FR, 110RL, and 110RR so that the user may recognize the output.
  • the processor 270 may control the overall operation of each unit of the user interface device 200.
  • the user interface device 200 may include a plurality of processors 270 or may not include the processor 270.
  • the user interface device 200 may be operated under the control of the processor or the controller 170 of another device in the vehicle 100.
  • the user interface device 200 may be referred to as a vehicle display device.
  • the user interface device 200 may be operated under the control of the controller 170.
  • the object detecting apparatus 300 is a device for detecting an object located outside the vehicle 100.
  • the object detecting apparatus 300 may generate object information based on the sensing data.
  • the object information may include information about the presence or absence of the object, location information of the object, distance information between the vehicle 100 and the object, and relative speed information between the vehicle 100 and the object.
  • the object may be various objects related to the driving of the vehicle 100.
  • the object O may include a lane OB10, another vehicle OB11, a pedestrian OB12, a two-wheeled vehicle OB13, traffic signals OB14 and OB15, light, a road, a structure, a speed bump, a geographical feature, an animal, and the like.
  • the lane OB10 may be a driving lane, a lane next to the driving lane, and a lane in which an opposite vehicle travels.
  • the lane OB10 may be a concept including left and right lines forming a lane.
  • the other vehicle OB11 may be a vehicle that is driving around the vehicle 100.
  • the other vehicle may be a vehicle located within a predetermined distance from the vehicle 100.
  • the other vehicle OB11 may be a vehicle that precedes or follows the vehicle 100.
  • the pedestrian OB12 may be a person located near the vehicle 100.
  • the pedestrian OB12 may be a person located within a predetermined distance from the vehicle 100.
  • the pedestrian OB12 may be a person located on a sidewalk or a roadway.
  • the two-wheeled vehicle OB13 may be a vehicle that is positioned around the vehicle 100 and moves using two wheels.
  • the motorcycle OB13 may be a vehicle having two wheels located within a predetermined distance from the vehicle 100.
  • the motorcycle OB13 may be a motorcycle or a bicycle located on sidewalks or roadways.
  • the traffic signal may include a traffic light OB15, a traffic sign OB14, and a pattern or text drawn on a road surface.
  • the light may be light generated by a lamp provided in another vehicle.
  • the light can be light generated from the street light.
  • the light may be sunlight.
  • the road may include a road surface, a curve, an uphill slope, a downhill slope, and the like.
  • the structure may be an object located around a road and fixed to the ground.
  • the structure may include a street lamp, a roadside tree, a building, a power pole, a traffic light, and a bridge.
  • the features may include mountains, hills, and the like.
  • the object may be classified into a moving object and a fixed object.
  • the moving object may be a concept including another vehicle and a pedestrian.
  • the fixed object may be a concept including a traffic signal, a road, and a structure.
  • the object detecting apparatus 300 may include a camera 310, a radar 320, a lidar 330, an ultrasonic sensor 340, an infrared sensor 350, and a processor 370. Each component of the object detecting apparatus 300 may be structurally and functionally separated or integrated with the sensing unit 120 described above.
  • the object detecting apparatus 300 may further include other components in addition to the described components, or may not include some of the described components.
  • the camera 310 may be located at a suitable place outside the vehicle to acquire an image outside the vehicle.
  • the camera 310 may be a mono camera, a stereo camera 310a, an around view monitoring (AVM) camera 310b, or a 360 degree camera.
  • AVM around view monitoring
  • the camera 310 may acquire location information of the object, distance information with respect to the object, or relative speed information with the object by using various image processing algorithms.
  • the camera 310 may obtain distance information and relative speed information with respect to the object based on the change in the object size over time in the acquired image.
  • the camera 310 may acquire distance information and relative velocity information with respect to an object through a pinhole model, road surface profiling, or the like.
  • the camera 310 may obtain distance information and relative speed information with respect to the object based on the disparity information in the stereo image acquired by the stereo camera 310a.
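The pinhole-model estimate mentioned above reduces to simple arithmetic: distance = focal_length_px * real_object_width / object_width_px, and relative speed follows from how that distance changes between frames. A sketch with assumed calibration values (the focal length and car width below are illustrative, not from the patent):

```python
# Illustrative pinhole-model sketch; FOCAL_PX and CAR_WIDTH_M are assumed
# calibration values.
FOCAL_PX = 1000.0     # focal length expressed in pixels
CAR_WIDTH_M = 1.8     # assumed real width of a typical car

def distance_m(pixel_width: float) -> float:
    """Pinhole model: distance = focal_px * real_width / width_in_pixels."""
    return FOCAL_PX * CAR_WIDTH_M / pixel_width

def relative_speed_mps(pixel_w_t0: float, pixel_w_t1: float, dt_s: float) -> float:
    """Positive result means the object is closing in."""
    return (distance_m(pixel_w_t0) - distance_m(pixel_w_t1)) / dt_s

# Example: the preceding car grows from 120 px to 130 px in 0.5 s.
print(f"{distance_m(130):.1f} m away, "
      f"closing at {relative_speed_mps(120, 130, 0.5):.2f} m/s")
```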
  • the camera 310 may be disposed in close proximity to the front windshield in the interior of the vehicle in order to acquire an image in front of the vehicle.
  • the camera 310 may be disposed around the front bumper or the radiator grille.
  • the camera 310 may be disposed in close proximity to the rear glass in the interior of the vehicle to acquire an image of the rear of the vehicle.
  • the camera 310 may be disposed around the rear bumper, the trunk, or the tail gate.
  • the camera 310 may be disposed in close proximity to at least one of the side windows in the interior of the vehicle to acquire an image of the vehicle side.
  • the camera 310 may be arranged around the side mirror, fender or door.
  • the camera 310 may provide the obtained image to the processor 370.
  • the radar 320 may include an electromagnetic wave transmitter and a receiver.
  • the radar 320 may be implemented in a pulse radar method or a continuous wave radar method in terms of the radio wave emission principle.
  • the radar 320 may be implemented by a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method, according to the signal waveform, among continuous wave radar methods.
  • FMCW frequency modulated continuous wave
  • FSK frequency shift keying
  • the radar 320 may detect an object based on a time of flight (TOF) method or a phase-shift method using electromagnetic waves, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the radar 320 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
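The TOF relation above is simple enough to state as a worked example: the wave travels to the object and back, so range = c * round_trip_time / 2, and relative speed can be derived from the Doppler shift. The 77 GHz carrier below is an assumption (a common automotive radar band), not a value from the patent.

```python
# Worked sketch of the pulse-radar TOF relation and a Doppler speed estimate.
C = 299_792_458.0          # speed of light, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Range = c * round_trip_time / 2 (the wave travels out and back)."""
    return C * round_trip_s / 2

def doppler_speed_mps(doppler_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative speed from Doppler shift: v = f_d * wavelength / 2."""
    wavelength = C / carrier_hz
    return doppler_hz * wavelength / 2

print(tof_range_m(400e-9))        # 400 ns round trip -> about 60 m
print(doppler_speed_mps(5130.0))  # ~5.1 kHz Doppler shift -> about 10 m/s
```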
  • the lidar 330 may include a laser transmitter and a receiver.
  • the lidar 330 may be implemented in a time of flight (TOF) method or a phase-shift method.
  • the lidar 330 may be implemented as driven or non-driven. When implemented in a driving manner, the lidar 330 may be rotated by a motor and detect an object around the vehicle 100. When implemented in a non-driven manner, the lidar 330 may detect an object located within a predetermined range with respect to the vehicle 100 by optical steering.
  • the vehicle 100 may include a plurality of non-driven lidars 330.
  • the lidar 330 may detect an object based on a time of flight (TOF) method or a phase-shift method using laser light, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar 330 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the ultrasonic sensor 340 may include an ultrasonic transmitter and a receiver.
  • the ultrasonic sensor 340 may detect an object based on the ultrasonic wave, and detect a position of the detected object, a distance to the detected object, and a relative speed.
  • the ultrasonic sensor 340 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the infrared sensor 350 may include an infrared transmitter and a receiver.
  • the infrared sensor 350 may detect an object based on infrared light, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the infrared sensor 350 may be disposed at an appropriate position outside the vehicle to detect an object located in front, rear, or side of the vehicle.
  • the processor 370 may control overall operations of each unit of the object detecting apparatus 300.
  • the processor 370 may compare the data sensed by the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 with previously stored data to detect or classify an object.
  • the processor 370 may detect and track the object based on the obtained image.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed with the object through an image processing algorithm.
  • the processor 370 may acquire distance information and relative speed information with respect to the object based on the change in the object size over time in the obtained image.
  • the processor 370 may acquire distance information and relative velocity information with respect to an object through a pinhole model, road surface profiling, or the like.
  • the processor 370 may obtain distance information and relative speed information with the object based on the disparity information in the stereo image acquired by the stereo camera 310a.
  • the processor 370 may detect and track an object based on a reflected electromagnetic wave, which is the transmitted electromagnetic wave reflected back from the object.
  • the processor 370 may perform operations such as calculating the distance to the object and the relative speed with respect to the object based on the electromagnetic wave.
  • the processor 370 may detect and track an object based on reflected laser light, which is the transmitted laser light reflected back from the object.
  • the processor 370 may perform operations such as calculating the distance to the object and the relative speed with respect to the object based on the laser light.
  • the processor 370 may detect and track an object based on reflected ultrasound, which is the transmitted ultrasound reflected back from the object.
  • the processor 370 may perform operations such as calculating the distance to the object and the relative speed with respect to the object based on the ultrasound.
  • the processor 370 may detect and track an object based on reflected infrared light, which is the transmitted infrared light reflected back from the object.
  • the processor 370 may perform operations such as calculating the distance to the object and the relative speed with respect to the object based on the infrared light.
  • the processor 370 may store, in the memory 140, the data sensed by the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350.
  • the object detecting apparatus 300 may include a plurality of processors 370, or may not include the processor 370.
  • each of the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 may individually include a processor.
  • the object detecting apparatus 300 may be operated under the control of a processor of another apparatus in the vehicle 100 or under the control of the controller 170.
  • the object detecting apparatus 300 may be operated under the control of the controller 170.
  • the communication device 400 is a device for performing communication with an external device.
  • the external device may be another vehicle, a mobile terminal or a server.
  • the communication device 400 may include at least one of a transmit antenna, a receive antenna, a radio frequency (RF) circuit capable of implementing various communication protocols, and an RF element to perform communication.
  • RF radio frequency
  • the communication device 400 may include a short range communication unit 410, a location information unit 420, a V2X communication unit 430, an optical communication unit 440, a broadcast transmission/reception unit 450, an ITS (Intelligent Transport Systems) communication unit 460, and a processor 470. According to an embodiment, the communication device 400 may further include other components in addition to the described components, or may not include some of the described components.
  • the short range communication unit 410 is a unit for short range communication.
  • the short range communication unit 410 may support short range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies.
  • the short range communication unit 410 may form short range wireless networks to perform short range communication between the vehicle 100 and at least one external device.
  • the location information unit 420 is a unit for obtaining location information of the vehicle 100.
  • the location information unit 420 may include a global positioning system (GPS) module or a differential global positioning system (DGPS) module.
  • GPS global positioning system
  • DGPS differential global positioning system
  • the V2X communication unit 430 is a unit for performing wireless communication with a server (V2I: vehicle-to-infrastructure), another vehicle (V2V: vehicle-to-vehicle), or a pedestrian (V2P: vehicle-to-pedestrian).
  • the V2X communication unit 430 may include an RF circuit capable of implementing V2I, V2V, and V2P communication protocols.
  • the optical communication unit 440 is a unit for performing communication with an external device via light.
  • the optical communication unit 440 may include an optical transmitter that converts an electrical signal into an optical signal and transmits it to the outside, and an optical receiver that converts a received optical signal back into an electrical signal.
  • the optical transmitter may be formed to be integrated with a lamp included in the vehicle 100.
  • the broadcast transceiver 450 is a unit for receiving a broadcast signal from an external broadcast management server or transmitting a broadcast signal to a broadcast management server through a broadcast channel.
  • the broadcast channel may include a satellite channel and a terrestrial channel.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, and a data broadcast signal.
  • the ITS communication unit 460 may exchange information, data, or signals with the traffic system.
  • the ITS communication unit 460 may provide acquired information and data to the traffic system.
  • the ITS communication unit 460 may receive information, data, or a signal from a traffic system.
  • the ITS communication unit 460 may receive road traffic information from the traffic system and provide the road traffic information to the control unit 170.
  • the ITS communication unit 460 may receive a control signal from a traffic system and provide the control signal to a processor provided in the controller 170 or the vehicle 100.
  • the processor 470 may control the overall operation of each unit of the communication device 400.
  • the communication device 400 may include a plurality of processors 470 or may not include the processor 470.
  • the communication device 400 may be operated under the control of the processor or the controller 170 of another device in the vehicle 100.
  • the communication device 400 may implement a vehicle display device together with the user interface device 200.
  • the vehicle display device may be called a telematics device or an audio video navigation (AVN) device.
  • the communication device 400 may be operated under the control of the controller 170.
  • the driving operation apparatus 500 is a device that receives a user input for driving. In the manual mode, the vehicle 100 may be driven based on a signal provided by the driving manipulation apparatus 500.
  • the driving manipulation apparatus 500 may include a steering input apparatus 510, an acceleration input apparatus 530, and a brake input apparatus 570.
  • the steering input device 510 may receive a driving direction input of the vehicle 100 from the user.
  • the steering input device 510 is preferably formed in a wheel shape to enable steering input by rotation.
  • the steering input device may be formed in the form of a touch screen, a touch pad, or a button.
  • the acceleration input device 530 may receive an input for accelerating the vehicle 100 from a user.
  • the brake input device 570 may receive an input for deceleration of the vehicle 100 from a user.
  • the acceleration input device 530 and the brake input device 570 are preferably formed in the form of a pedal. According to an embodiment, the acceleration input device or the brake input device may be formed in the form of a touch screen, a touch pad, or a button.
  • the driving manipulation apparatus 500 may be operated under the control of the controller 170.
  • the vehicle drive device 600 is a device that electrically controls the driving of various devices in the vehicle 100.
  • the vehicle driving apparatus 600 may include a power train driver 610, a chassis driver 620, a door/window driver 630, a safety device driver 640, a lamp driver 650, and an air conditioning driver 660.
  • the vehicle driving apparatus 600 may further include other components in addition to the described components, or may not include some of the described components.
  • the vehicle driving apparatus 600 may include a processor. Each unit of the vehicle driving apparatus 600 may individually include a processor.
  • the power train driver 610 may control the operation of the power train device.
  • the power train driver 610 may include a power source driver 611 and a transmission driver 612.
  • the power source driver 611 may control the power source of the vehicle 100.
  • the power source driver 611 may perform electronic control of the engine, thereby controlling the output torque of the engine and the like.
  • the power source driver 611 can adjust the engine output torque under the control of the controller 170.
  • the power source driver 611 may control the motor.
  • the power source driver 611 may adjust the rotational speed, torque, and the like of the motor under the control of the controller 170.
  • the transmission driver 612 may control the transmission.
  • the transmission driver 612 can adjust the state of the transmission.
  • the transmission driver 612 can adjust the state of the transmission to drive (D), reverse (R), neutral (N), or park (P).
  • the transmission driver 612 can adjust the gear engagement state in the drive (D) state.
  • the chassis driver 620 may control the operation of the chassis device.
  • the chassis driver 620 may include a steering driver 621, a brake driver 622, and a suspension driver 623.
  • the steering driver 621 may perform electronic control of a steering apparatus in the vehicle 100.
  • the steering driver 621 may change the traveling direction of the vehicle.
  • the brake driver 622 may perform electronic control of a brake apparatus in the vehicle 100. For example, the speed of the vehicle 100 may be reduced by controlling the operation of the brakes disposed on the wheels.
  • the brake drive unit 622 can individually control each of the plurality of brakes.
  • the brake driver 622 may control the braking force applied to the plurality of wheels differently.
  • the suspension driver 623 may perform electronic control of a suspension apparatus in the vehicle 100. For example, when there is a curvature on the road surface, the suspension driver 623 may control the suspension device to control the vibration of the vehicle 100 to be reduced. Meanwhile, the suspension driver 623 may individually control each of the plurality of suspensions.
  • the door / window driver 630 may perform electronic control of a door apparatus or a window apparatus in the vehicle 100.
  • the door / window driver 630 may include a door driver 631 and a window driver 632.
  • the door driver 631 may control the door apparatus.
  • the door driver 631 may control opening and closing of the plurality of doors included in the vehicle 100.
  • the door driver 631 may control the opening or closing of a trunk or a tail gate.
  • the door driver 631 may control the opening or closing of the sunroof.
  • the window driver 632 may perform electronic control of the window apparatus.
  • the window driver 632 may control opening or closing of a plurality of windows included in the vehicle 100.
  • the safety device driver 640 may perform electronic control of various safety apparatuses in the vehicle 100.
  • the safety device driver 640 may include an airbag driver 641, a seat belt driver 642, and a pedestrian protection device driver 643.
  • the airbag driver 641 may perform electronic control of an airbag apparatus in the vehicle 100.
  • the airbag driver 641 may control the airbag to be deployed when the danger is detected.
  • the seat belt driver 642 may perform electronic control of a seatbelt apparatus in the vehicle 100.
  • the seat belt driver 642 may control passengers to be secured to the seats 110FL, 110FR, 110RL, and 110RR using seat belts when a danger is detected.
  • the pedestrian protection device driver 643 may perform electronic control of the hood lift and the pedestrian airbag. For example, the pedestrian protection device driver 643 may control the hood to lift up and the pedestrian airbag to be deployed when a collision with a pedestrian is detected.
  • the lamp driver 650 may perform electronic control of various lamp apparatuses in the vehicle 100.
  • the air conditioning driver 660 may perform electronic control of an air conditioner in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioning driver 660 may control the air conditioning apparatus to operate to supply cool air to the inside of the vehicle.
  • the vehicle driving apparatus 600 may be operated under the control of the controller 170.
  • the driving system 700 is a system for controlling various driving operations of the vehicle 100.
  • the driving system 700 may be operated in the autonomous driving mode.
  • the driving system 700 may include a traveling system 710, a take-out system 740, and a parking system 750.
  • according to an embodiment, the driving system 700 may further include other components in addition to the described components, or may not include some of the described components.
  • the driving system 700 may include a processor.
  • each unit of the driving system 700 may individually include a processor.
  • the driving system 700 may control driving in the autonomous driving mode based on learning.
  • a learning mode, and an operation mode premised on the completion of learning, may be performed.
  • a method by which the processor of the driving system 700 performs the learning mode and the operation mode will be described below.
  • the learning mode may be performed in the manual mode described above.
  • the processor of the driving system 700 may perform driving path learning and surrounding environment learning of the vehicle 100.
  • the driving route learning may include generating map data on a route on which the vehicle 100 travels.
  • the processor of the driving system 700 may generate map data based on information detected by the object detecting apparatus 300 while the vehicle 100 travels from the starting point to the destination.
  • the surrounding environment learning may include storing and analyzing information about the surrounding environment of the vehicle 100 in the driving process and the parking process of the vehicle 100.
  • the processor of the driving system 700 may store and analyze information about the surrounding environment of the vehicle 100 based on information detected by the object detecting apparatus 300 during the parking process of the vehicle 100, for example, location information and size information of the parking space, and information about fixed (or movable) obstacles.
  • the operation mode may be performed in the autonomous driving mode described above.
  • the operation mode will be described on the premise that the driving route learning or the surrounding environment learning is completed through the learning mode.
  • the operation mode may be performed in response to a user input through the input unit 210, or may be automatically performed when the vehicle 100 reaches a driving path and a parking space where learning is completed.
  • the operation mode may include a semi-autonomous operation mode that requires some user manipulation of the driving manipulation apparatus 500, and a fully autonomous operation mode that requires no user manipulation of the driving manipulation apparatus 500.
  • the processor of the driving system 700 may control the driving system 710 in the operation mode to drive the vehicle 100 along the driving path where learning is completed.
  • the processor of the driving system 700 may control the parking system 740 in the operation mode to release the parked vehicle 100 from the parking space where the learning is completed.
  • the processor of the driving system 700 may control the parking system 750 in the operation mode to park the vehicle 100 from the current position to the parking space where the learning is completed.
  • the driving system 700 may be a subordinate concept of the controller 170.
  • the driving system 700 may be a concept including at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170.
  • the traveling system 710 may perform driving of the vehicle 100.
  • the traveling system 710 may receive navigation information from the navigation system 770, provide a control signal to the vehicle driving apparatus 600, and perform driving of the vehicle 100.
  • the traveling system 710 may receive object information from the object detecting apparatus 300, provide a control signal to the vehicle driving apparatus 600, and perform driving of the vehicle 100.
  • the traveling system 710 may receive a signal from an external device through the communication device 400, provide a control signal to the vehicle driving apparatus 600, and perform driving of the vehicle 100.
  • the traveling system 710 may be a system concept that includes at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170, and performs driving of the vehicle 100.
  • the traveling system 710 may be referred to as a vehicle driving control device.
  • the take-out system 740 may perform take-out (unparking) of the vehicle 100.
  • the take-out system 740 may receive navigation information from the navigation system 770, provide a control signal to the vehicle driving apparatus 600, and perform take-out of the vehicle 100.
  • the take-out system 740 may receive object information from the object detecting apparatus 300, provide a control signal to the vehicle driving apparatus 600, and perform take-out of the vehicle 100.
  • the take-out system 740 may receive a signal from an external device through the communication device 400, provide a control signal to the vehicle driving apparatus 600, and perform take-out of the vehicle 100.
  • the take-out system 740 may be a system concept that includes at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170, and performs take-out of the vehicle 100.
  • such a take-out system 740 may be referred to as a vehicle take-out control device.
  • the parking system 750 may perform parking of the vehicle 100.
  • the parking system 750 may receive navigation information from the navigation system 770, provide a control signal to the vehicle driving apparatus 600, and perform parking of the vehicle 100.
  • the parking system 750 may receive the object information from the object detecting apparatus 300, provide a control signal to the vehicle driving apparatus 600, and perform parking of the vehicle 100.
  • the parking system 750 may receive a signal from an external device through the communication device 400, provide a control signal to the vehicle driving device 600, and perform parking of the vehicle 100.
  • the parking system 750 may be a system concept that includes at least one of the user interface device 200, the object detecting device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the navigation system 770, the sensing unit 120, and the controller 170, and performs parking of the vehicle 100.
  • Such a parking system 750 may be referred to as a vehicle parking control device.
  • the navigation system 770 can provide navigation information.
  • the navigation information may include at least one of map information, set destination information, route information according to the destination setting, information on various objects on the route, lane information, and current location information of the vehicle.
  • the navigation system 770 may include a memory and a processor.
  • the memory may store navigation information.
  • the processor may control the operation of the navigation system 770.
  • the navigation system 770 may receive information from an external device through the communication device 400 and update the pre-stored information. According to an embodiment, the navigation system 770 may be classified as a subcomponent of the user interface device 200.
  • the telematics system according to an aspect of the present invention may provide communication with an external device through a telematics application, an ECU (Electronic Control Unit), a 5G modem, or the like.
  • ECU Electronic Control Unit
  • the telematics system (or 5G telematics system) according to an aspect of the present invention may be provided in a vehicle together with an AVN (Audio / Video / Navigation) and an ECU.
  • the CE device Consumer Electronics Device
  • the telematics system can be connected to the AVN and ECU via Ethernet.
  • telematics systems can be connected to CE devices through the Open Connectivity Foundation (OCF), which is known as one of the IoT protocols.
  • OCF Open Connectivity Foundation
  • the internal configuration of the telematics system according to an aspect of the present invention, for example, a service framework, a framework, a platform service, an OS layer, and HW, will be described in detail below.
  • the telematics system architecture should be considered in terms of the E/E (Electrical/Electronic) architecture.
  • the software in the telematics system must be designed to be compatible or applicable to heterogeneous architectures and all ECUs.
  • a telematic control unit (TCU) of a telematics system may be implemented in a structure separate from a modem (case 1).
  • case 1 is a method in which the modem and telematics are separated in the TCU to communicate using only the interface.
  • in case 2, the modem and the AP are implemented in one SoC, and the entire TCU is implemented on top of it.
  • a method of separately implementing a modem and a telematics system may be considered.
  • the Telematics Control Unit can be considered in two types. First, the modem and the application processor may be implemented as physically different chipsets, so that the TCU takes the form of two chipsets. Second, the modem and the application processor may be physically composed of one chipset, a form that requires no special I/F configuration. Besides these two types, there is also a form in which the modem and the application processor are composed of two chipsets that are not on one board, but this is similar to the first type in terms of configuration and S/W.
  • the modem plays a role only as a communication module and may be implemented to provide all data to the AP (Application Processor) over PCIe.
  • the AP can pass data to other connected ECUs.
  • a service having a special function may be configured according to its purpose. For example, there may be a device driver configuration for high-speed data processing related to data routing and a configuration having priority in S / W according to characteristics of ECUs to be linked.
  • the modem and the AP may be configured as one module, and data can be transferred through internal memory without a PCIe I/F. On the other hand, owing to the priority and performance allocation given to the modem for data processing, the role of the application processor can be reduced.
  • SOA Service Oriented Architecture
  • SOME/IP and DDS can be provided as the framework, and they must be designed to run on various E/E architectures.
  • the function of each component of the telematics system architecture will now be described.
  • a Service Oriented Architecture offers i) a fully integrated system with complete control, ii) rapid response to rapidly changing business conditions or vendor-specific platforms, and iii) a price advantage.
  • SOME / IP Scalable Service-Oriented Middleware over IP
  • DDS Data Distribution Service
  • Both SOME / IP and DDS are SOA-based middleware solutions.
  • service discovery is used to find service instances and to find out if a service instance is running.
  • SOME / IP was implemented in vehicles before DDS.
  • DDS was first implemented in the aviation and military sectors and could benefit V2X.
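To illustrate the find/offer style of service discovery used by SOA middleware such as SOME/IP-SD, here is a deliberately simplified sketch over UDP multicast. It is not the real SOME/IP wire format; the group address, port, and message strings are invented for illustration.

```python
# Simplified find/offer discovery sketch, inspired by SOME/IP-SD but NOT
# its real wire format.
import socket

GROUP, PORT = "224.0.0.251", 53533  # illustrative multicast endpoint

def find_service(name: str, timeout: float = 1.0) -> str | None:
    """Multicast FIND <name>; return 'host:port' from the first OFFER reply."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(f"FIND {name}".encode(), (GROUP, PORT))
    try:
        data, _ = sock.recvfrom(1024)       # e.g. b"OFFER media 10.0.0.2:8554"
        _, svc, endpoint = data.decode().split()
        return endpoint if svc == name else None
    except socket.timeout:
        return None                         # no running service instance answered
```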
  • AGL Automotive Grade Linux
  • GENIVI
  • TSN Time-Sensitive Networking
  • eAVB Ethernet Audio Video Bridging
  • MultiLink Controller is a module for data transmission management for various networks.
  • the MultiLink Controller can support management of connected radio channels such as modem communication (LTE/5G) and Wi-Fi.
  • modem communication can be used for data that needs to be transmitted in real time, while monitoring and backup data can be controlled to be transmitted only via Wi-Fi, depending on the settings.
  • Each communication must be considered in terms of mobility and charging, and can be flexibly modified according to the service provided in the vehicle.
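  • A minimal sketch of such a link-selection policy (the data classes and return values are assumptions for illustration):

        # Sketch: route real-time data over the modem, defer bulk data to Wi-Fi.
        REALTIME = {"v2x", "remote_monitoring", "navigation"}
        DEFERRABLE = {"monitoring_log", "backup_upload"}

        def select_link(data_class, wifi_connected, buffer_full):
            if data_class in REALTIME:
                return "modem"          # LTE/5G: mobility-safe, real-time delivery
            if data_class in DEFERRABLE:
                if wifi_connected:
                    return "wifi"       # cheaper link when available
                # flush over the modem only when local memory is exhausted
                return "modem" if buffer_full else "hold"
            return "modem"

        assert select_link("v2x", wifi_connected=False, buffer_full=False) == "modem"
        assert select_link("backup_upload", wifi_connected=False, buffer_full=False) == "hold"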
  • The media cast center handles the multimedia services received or transmitted through the TCU.
  • The media cast center can process various multimedia sources and also provides basic streaming protocols and data compression/encoding/decoding. If a 5G network is used, multimedia services are expected to multiply, so the media cast center is expected to be an important service not only in the TCU specification but also in the AVN specification.
  • IoT2V is an IoT service gateway that can interoperate with various IoT services and corresponds to a gateway for interworking with the AI services of service providers (e.g., Amazon, Google, Naver).
  • The Cloud Service Manager is a service manager that exposes services provided by interworking network servers as if they were in-vehicle services.
  • A service supporting SOA may be treated as a proxy service that provides the same functions inside the vehicle.
  • Service interworking between SOA frameworks may be affected by the protocol transmission method of the framework rather than by the physical location of the service.
  • FIG. 11 is a view for explaining an implementation example of a telematics system according to an aspect of the present invention.
  • According to the telematics system of an aspect of the present invention, cloud computing usability (feasibility) can be improved as cloud computing performance is enhanced.
  • A low-cost SoC design becomes possible because part of the TCU's CPU/GPU load can be shifted to the cloud compared to pure embedded computing.
  • According to the telematics system, it is possible to dynamically switch from embedded computing to cloud computing through a dynamic configuration based on SOA (Service Oriented Architecture).
  • FIG. 12 is a view for explaining an implementation example of a telematics system according to an aspect of the present invention. According to one aspect of the invention, it is possible to recognize the road situation in real time by retrieving images from vehicles running ahead of the vehicle equipped with the telematics system.
  • FIG. 13 is a diagram for describing an implementation example of a telematics system according to an aspect of the present disclosure.
  • Referring to FIG. 13, synchronized playback of content on multiple display devices (e.g., mobile, CID (Center Information Display), RSE (Rear Seat Entertainment)) over in-vehicle Ethernet is possible.
  • For example, i) a high-resolution image can be down-streamed once and re-transmitted to the multiple displays in the vehicle, thereby enabling synchronized playback, and ii) CE devices can be connected to the telematics system using an IoT protocol.
  • VIO (Visual-Inertial Odometry) can be implemented with a monocular camera system and inertial sensors (e.g., accelerometer, gyroscope).
  • The combination of VIO and smartphone-grade GPS can provide centimeter-level positioning on the map at a lower cost than RTK GPS systems.
  • The TCU can perform VIO-based HD positioning by combining external camera images, inertial sensors, and GPS.
  • The TCU transmits the positioning information to the AVN so that its accuracy can be compared with general GPS information.
  • The AVN can compare the AR mapping accuracy obtained through HD positioning with that obtained through general positioning.
  • The telematics system may provide distributed processing using i) the cloud, ii) the computing power of a mobile device, or iii) embedded computing, through a service-oriented communication framework.
  • The telematics AP of the telematics system can translate between different service-oriented communication methods.
  • The services provided by the cloud and the mobile device may be exposed in the TCU through IoTivity.
  • Services found in the TCU may be provided via SOME/IP or DDS on the in-vehicle network.
  • A camera-based application service can be verified by comparison across cloud, mobile, and embedded computing.
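  • A rough sketch of the translation step (service IDs, URIs, and the message layout are invented for illustration; real SOME/IP and OCF stacks have their own APIs):

        # Sketch: map an IoTivity/OCF-style resource request onto a SOME/IP-style call.
        def iotivity_to_someip(resource_uri, query):
            table = {
                "/vehicle/camera/front": (0x1234, 0x01),  # hypothetical service/method IDs
                "/vehicle/position":     (0x5678, 0x02),
            }
            service_id, method_id = table[resource_uri]
            return {"service": service_id, "method": method_id, "payload": query}

        def dispatch(request, in_vehicle_bus):
            """Translate a cloud/mobile request and forward it on the in-vehicle network."""
            msg = iotivity_to_someip(request["uri"], request.get("query", {}))
            return in_vehicle_bus.call(msg["service"], msg["method"], msg["payload"])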
  • The telematics system can serve as the protocol endpoint of a media multicast service through OCF and an external interface.
  • The SW platform of the telematics system according to an aspect of the present invention can be designed from the SOA perspective; its main features may be as follows.
  • The E/E architecture is based on the manufacturer's requirements, and the TCU can be integrated or separated. The SW components of the telematics system according to the present invention can operate flexibly in any ECU, regardless of the manufacturer's E/E architecture.
  • FIG. 18 is a view for explaining a telematics system according to an aspect of the present invention. Referring to FIG. 18, it can be seen that a cloud and an in-vehicle network must be supported to implement a time sensitive network (TSN) through 5G.
  • TSN is a deterministic network targeting time-sensitive applications. TSN was developed to provide a way to deliver information from source to destination in a i) fixed, ii) predictable time. TSN focuses on control data. TSN features may include i) time synchronization (e.g., better than 1 us) and guaranteed end-to-end latency, ii) resource reservation, iii) extraordinarily low packet loss ratios (10^-6 to 10^-9), and iv) convergence of all data streams.
  • FIG. 19 illustrates a relationship between a telematics system and a cloud according to an aspect of the present invention.
  • the cloud service may be provided like an in-vehicle service based on the SOA framework.
  • 5G-based cloud computing is provided. That is, according to the present invention, a service that is difficult to process in the embedded system can be processed through the cloud. Basically, the framework of the cloud must be the same as the embedded framework in the vehicle to enable communication.
  • Services and applications that can be provided according to the present invention may include a vehicle location determination service, a camera video relay service, an IoT service, and the latest DSM engine.
  • all services that the vehicle or the cloud can provide may be registered in advance.
  • API Gateway can provide interoperability between vehicles and devices by means of RESTful APIs.
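  • For illustration, a device-side interaction with such a gateway might look as follows (the host name and endpoint paths are assumptions, not a published interface):

        # Sketch: discover pre-registered services and invoke one over a RESTful API.
        import json
        import urllib.request

        GATEWAY = "https://api-gateway.example.com"   # hypothetical gateway address

        def list_services():
            with urllib.request.urlopen(f"{GATEWAY}/v1/services") as resp:
                return json.load(resp)

        def call_service(name, **params):
            req = urllib.request.Request(
                f"{GATEWAY}/v1/services/{name}",
                data=json.dumps(params).encode(),
                headers={"Content-Type": "application/json"},
                method="POST")
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)

        # e.g., call_service("vehicle-position", vehicle_id="VIN123")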
  • FIG. 20 is a view for explaining the relationship between the framework and the cloud of the telematics system according to an aspect of the present invention.
  • The cloud and the telematics AP may be connected through the Open Connectivity Foundation (OCF) framework for IoT communication.
  • The telematics system of the vehicle must have i) an OCF framework for IoT communication and ii) an SOA protocol for embedded communication (e.g., SOME/IP or DDS).
  • FIG. 21 is a diagram for describing the media cast center of a telematics system according to an aspect of the present invention. Specifically, efficient sharing of media data via broadband Ethernet will be described.
  • The basic media-processing functions performed by the Media Cast Center are as follows: i) relay (e.g., unicast relay or multicast relay), ii) de-multiplexing (separating one stream into two media streams, video and audio), iii) media data transport, iv) media streaming control, v) interfaces between services, including an out-vehicle OCF interface and a SOME/IP interface, and vi) encoder/decoder.
  • Use cases that may be provided through the Media Cast Center may include i) see-through and ii) synchronized playing on multiple CE devices. With see-through, it is possible to share in-vehicle media data (e.g., front camera video) with external vehicles, or with A/V devices connected to the vehicle via the cloud.
  • The media source may include i) a media file in the cloud or on the web, or ii) a media stream URL (Universal Resource Locator) provided by a streaming server or a device connected to the vehicle.
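  • A minimal sketch of the per-device URL handout (host, port, and path scheme are invented for illustration): the media cast center takes one source and issues a distinct access URL for each target device, so every display fetches the same relayed stream independently.

        # Sketch: one media source in, one relay URL per target device out.
        import uuid

        class MediaCastCenter:
            def __init__(self, host="192.168.0.1", port=8554):
                self.host, self.port = host, port
                self.sessions = {}   # token -> (device, source_url)

            def publish(self, source_url, target_devices):
                urls = {}
                for device in target_devices:
                    token = uuid.uuid4().hex[:8]
                    self.sessions[token] = (device, source_url)
                    urls[device] = f"rtsp://{self.host}:{self.port}/relay/{token}"
                return urls

        center = MediaCastCenter()
        print(center.publish("https://cdn.example.com/movie.mp4", ["CID", "RSE-L", "RSE-R"]))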
  • FIG. 22 illustrates a positioning service provided by a telematics system according to an aspect of the present invention.
  • High precision positioning of the vehicle can be provided through the integration of GPS / GNSS and Visual-inertial Odometry (VIO).
  • Main features of the positioning service provided by the telematics system according to an aspect of the present invention may be as follows.
  • Sensor collection: sensor data is collected from the GPS, accelerometer, gyroscope, and camera, and this data should be time-synchronized.
  • VIO algorithm: estimates the relative position and orientation of the vehicle using the camera and the inertial sensors.
  • The camera processing block detects features and tracks them.
  • The inertial data processing block samples the sensors at a very high frequency (100 Hz or higher).
  • GPS + VIO fusion: GPS/GNSS measurements and local/relative coordinate measurements from VIO are tightly coupled to achieve a highly accurate global position.
  • Position handler: an SOA-framework-based protocol is used to transfer the global position to the Vehicle Position Manager in the cloud.
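  • The sketch below illustrates the fusion idea only in a loosely coupled form (a production system would tightly couple raw GNSS measurements with VIO inside a filter or optimizer; the weighting scheme here is a simplified stand-in):

        # Sketch: blend a GPS fix with a VIO dead-reckoned position by error weight.
        def vio_predict(prev_global_pos, vio_delta):
            """Apply the VIO relative displacement (meters, local frame)."""
            return (prev_global_pos[0] + vio_delta[0], prev_global_pos[1] + vio_delta[1])

        def fuse(gps_pos, gps_accuracy_m, vio_pos, vio_drift_m):
            """Weight each source by the inverse of its estimated error."""
            wg = 1.0 / max(gps_accuracy_m, 1e-3)
            wv = 1.0 / max(vio_drift_m, 1e-3)
            return tuple((wg * g + wv * v) / (wg + wv) for g, v in zip(gps_pos, vio_pos))

        pos = (100.0, 200.0)                            # last fused position (local meters)
        pos = vio_predict(pos, vio_delta=(0.9, 0.1))    # high-rate VIO update
        pos = fuse((102.0, 200.5), gps_accuracy_m=3.0,  # low-rate GPS correction
                   vio_pos=pos, vio_drift_m=0.2)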
  • DSDA (Dual SIM Dual Active) may be supported.
  • SIM #1 is used by the vehicle manufacturer for OTA, V2X, big data, and maintenance.
  • SIM #2 allows the user to select a carrier for the Wi-Fi hotspot, web access, and video streaming.
  • From a 5G telematics point of view, DSDA has low 5G dependency on the technical side, and the interests of carriers and vehicle manufacturers are key to commercializing the technology. To date, no 5G modem vendor prototype has been published.
  • FIG. 23 is a diagram for describing an electronic horizon provided by a telematics system according to an aspect of the present disclosure.
  • The telematics system can provide geometry information about the road ahead of the vehicle. This is called the Electronic Horizon and can be provided through the ADASIS protocol.
  • A telematics system can act as a horizon provider. Details are as follows.
  • The Electronic Horizon provides geometry information to other ECUs.
  • The Electronic Horizon requires the vehicle's position (GPS) and a digital map (from a cloud service).
  • The horizon information includes position, path, lane, road attributes (tunnel, bridge), speed limit, traffic signs, and warnings (e.g., under construction).
  • The ADASIS Forum defines the ADASIS protocol to standardize the interface between a digital map and ADAS applications.
  • ADASIS v2 is designed for CAN bus communication, restricted to 8-byte messages.
  • ADASIS v2 consists of a Horizon Provider, a Horizon Reconstructor, the ADASIS protocol, and ADAS applications.
  • ADASIS v3 is designed for higher-bandwidth communication and provides additional, more detailed content (lane-level information). ADASIS v3 supports detailed inputs (e.g., HD-GPS, HD maps, sensors, and V2X).
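  • To make the 8-byte constraint concrete, the sketch below packs a horizon position update into a single classic CAN data field. The field layout is a hypothetical illustration of the size budget, not the actual ADASIS v2 bit assignment.

        # Sketch: fit a horizon position message into an 8-byte CAN frame.
        import struct

        def pack_horizon_position(path_id, offset_m, lane, speed_limit_kph):
            # 1 byte path id + 4 bytes offset (cm) + 1 byte lane + 2 bytes speed limit
            return struct.pack(">BIBH", path_id, int(offset_m * 100), lane, speed_limit_kph)

        frame = pack_horizon_position(path_id=1, offset_m=123.45, lane=2, speed_limit_kph=80)
        assert len(frame) == 8   # fits the classic CAN 8-byte data field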
  • FIGS. 24 to 31 are views for explaining implementation examples of a telematics system according to an aspect of the present invention. More specifically, FIGS. 24 to 25 are diagrams for explaining route forecasting, FIGS. 26 to 27 are diagrams for explaining crowd-eye sourcing, FIGS. 28 to 29 are diagrams for explaining take-out cinema, and FIGS. 30 and 31 are diagrams for explaining adaptive guidance.
  • route forecasting refers to a 5G navigation service capable of predicting road conditions in real time by searching for images of vehicles in front of the vehicle.
  • Key values for implementing route forecasting may be real-time video of a desired road, linkage with a route, and video information search. Through route forecasting, real-time video can be shared between vehicles and the shared video can be analyzed. Route forecasting can make use of cloud computing, HD positioning, and AR. As specific implementation examples, i) when the road is blocked, road conditions can be predicted by retrieving images from the vehicles ahead, ii) it can be used to brief the driver on a difficult road section before the vehicle starts, and iii) it can be used for route guidance based on images of surrounding landmarks. Referring to FIG. 25, the technical elements required to implement route forecasting may include the aforementioned TSN, DPDK, SOA framework, advanced positioning, cloud server, media streaming, and AR engine.
  • Crowd-eye sourcing refers to a service that uses various nearby vehicles to obtain images of the user's own vehicle captured from various angles.
  • Crowd-eye sourcing enables real-time video sharing and editing between vehicles, and can make use of cloud computing and the IoT manager. As specific implementation examples, i) a system can obtain black-box footage of surrounding vehicles from various angles and immediately combine it for the driver and the insurance company in case of an accident, and ii) a vehicle can receive and utilize images of itself taken by surrounding vehicles while traveling. Referring to FIG. 27, the technical elements required to implement crowd-eye sourcing may include the aforementioned TSN, DPDK, advanced positioning, cloud server, and media streaming.
  • Take-out cinema refers to a solution for viewing ultra-high-definition streaming content outside the vehicle by connecting the 5G infotainment system to peripheral devices.
  • Key values for implementing take-out cinema can be the extension of the car HMI role to an outdoor media center, utilization of the IVI interface on tethered devices, and easier and more efficient tethering compared to smartphones.
  • a take-out cinema can be used to build an integrated connectivity system that can utilize peripheral multimedia devices.
  • Take-out cinema can be used for Media Multicast Manager and AR.
  • As specific implementation examples, i) a media device such as a projector or a speaker can be connected to enjoy high-definition content outdoors, ii) a high-quality, high-definition device can be used in the vehicle to turn it into a media room, and iii) a VR device can be connected to provide a surrounding-video viewing service while on board.
  • technical elements required to implement take-out cinema may include the above-described media streaming and cloud server.
  • Adaptive Guidance refers to an autonomous driving system that receives the road situation from vehicles ahead, sets a driving pattern adapted to occupant activity, and provides attention notifications.
  • Key values of Adaptive Guidance may be path/speed control based on occupant activity, advance notice and warning of precautions, and DMS utilization with analysis of major activities.
  • Adaptive Guidance enables sharing of real-time road situation information and analysis of occupant activity.
  • Adaptive Guidance can be used for cloud computing, data managers, and E-horizon.
  • As specific implementation examples, i) in manual driving, the driver's forward attention can be monitored to provide traffic-light change notifications, ii) in automatic driving, objects related to the occupant's personal work inside the vehicle can be monitored to warn of starting/stopping/lane changes, and iii) if the user is sleeping during automatic driving, the route can be changed to avoid uneven sections or sudden starts/stops.
  • technical elements for implementing adaptive guidance may include TSN, DPDK, Advanced Positioning, Cloud server, DMS, Road forecasting, and IoTivity.
  • FIGS. 32 to 34 are views for explaining the technical effects of a telematics system according to an aspect of the present invention.
  • The Media Cast Center and the cloud server of the telematics system shown in FIG. 32 are characterized by high throughput.
  • The Media Cast Center features relay, media data transport, media streaming control, and encoder/decoder functions.
  • Restrictions of the high-throughput scenario may include i) no consideration of a secure channel to the content server, and ii) no consideration of content copyright.
  • the TCU and cloud server of the telematics system shown in FIG. 33 are characterized by low latency.
  • 360° media processing is possible through the SOA framework (SOME/IP or IoTivity) of the TCU and the cloud server.
  • Low-latency constraints may include i) the need for edge computing by the MNO, ii) 360° or VR camera performance, iii) the 360° or VR player (software / required HW), and iv) the player interface.
  • the telematics system shown in FIG. 34 is characterized by a vehicle to network (V2N) see-through.
  • the telematics system according to an aspect of the present invention may provide a see-through function through an SOA framework, a cloud server, and a media cast center.
  • FIGS. 35 to 36 illustrate a signal transmission and reception sequence between a TCU and a cloud server (or a network server) in a telematics system according to an aspect of the present invention.
  • the signal transmission and reception sequence of the network server and the TCU will be described with reference to FIG. 35.
  • A. A remote monitoring start event occurs on the server.
  • B. The Camera Cast Service of the server checks and opens the port number on which it will receive video from the Media Cast Service (hereinafter, MCS) of the 5G TS (initialization of the server).
  • D. The Vehicle Event Manager of the 5G TS forwards the request information of step C to the MCS.
  • E. The MCS prepares to turn on the camera and deliver the video to the server, based on the information delivered by the Vehicle Event Manager of the 5G TS.
  • G. If necessary, the camera video is re-packetized and sent to the server.
  • H. A remote monitoring stop event occurs on the server.
  • I. The MQTT broker on the server asks the Vehicle Event Manager of the 5G TS to stop sending the camera video.
  • J. The Vehicle Event Manager of the 5G TS sends a stop request to the MCS.
  • K. The MCS stops the camera video transmission of step J.
  • L. The MCS turns off the camera.
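  • A rough vehicle-side sketch of this sequence using the paho-mqtt client (1.x-style API; the topic name, payloads, and MCS calls are assumptions for illustration):

        # Sketch: the Vehicle Event Manager reacting to remote monitoring start/stop.
        import paho.mqtt.client as mqtt

        def mcs_start_camera_stream(dest):      # placeholder for steps E to G
            print(f"camera on, streaming to {dest}")

        def mcs_stop_camera_stream():           # placeholder for steps K and L
            print("camera stream stopped, camera off")

        def on_message(client, userdata, msg):
            # Steps D and I/J: the broker relays the server's request, and the
            # Vehicle Event Manager forwards it to the MCS.
            if msg.topic == "vehicle/remote-monitoring":
                if msg.payload == b"start":
                    mcs_start_camera_stream(dest=userdata["server"])
                elif msg.payload == b"stop":
                    mcs_stop_camera_stream()

        client = mqtt.Client(userdata={"server": "203.0.113.10:5004"})
        client.on_message = on_message
        client.connect("broker.example.com")        # hypothetical broker address
        client.subscribe("vehicle/remote-monitoring")
        client.loop_forever()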
  • As the transmission protocol, RTP (Real-time Transport Protocol) over UDP (User Datagram Protocol) may be used.
  • As the streaming control protocol, RTSP (Real Time Streaming Protocol) may be used.
  • As the QoS protocol, RTCP (Real-time Transport Control Protocol) may be used.
  • A/V is output in synchronized form by using the reference time information (in RTCP) transmitted periodically and the timestamp in the RTP packet header.
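  • The synchronization rule can be sketched as follows: an RTCP sender report pairs a wallclock (NTP) time with an RTP timestamp, and every receiver maps later RTP timestamps onto wallclock time, so all displays compute the same playout instant (the fixed delay and the values used below are illustrative).

        # Sketch: derive a common playout time from the last RTCP sender report.
        CLOCK_RATE = 90_000   # typical RTP clock rate for video (90 kHz)

        def playout_time(rtp_ts, sr_ntp_seconds, sr_rtp_ts, fixed_delay=0.200):
            """Wallclock playout time of an RTP packet, identical on every device."""
            elapsed = ((rtp_ts - sr_rtp_ts) & 0xFFFFFFFF) / CLOCK_RATE  # handle wrap-around
            return sr_ntp_seconds + elapsed + fixed_delay

        # Two displays receiving the same stream compute the same instant:
        t = playout_time(rtp_ts=1_234_567, sr_ntp_seconds=3_900_000_000.0, sr_rtp_ts=1_200_000)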
  • the definition and role of each component shown in Figure 35 may be as shown in Table 1 below.
  • FIG. 36 is a flowchart illustrating an interworking scenario for using the media cast center between a telematics system and a cloud server according to an aspect of the present invention.
  • The 5G TS of the vehicle prepares to transmit the request to the cloud server.
  • The vehicle controls the frequency output through an antenna unit connected to the 5G TS by a coaxial cable.
  • The vehicle transmits signals to the cloud server through the gNB connected by an Ethernet cable.
  • The vehicle receives content of the media cast center of the cloud server from the cloud server.
  • The vehicle forwards the content received through the antenna to the 5G TS through the cable.
  • The information is then transmitted to the AVN, which is connected to the 5G TS by an Ethernet cable.
  • Embodiments of the present invention described above may be implemented through various means.
  • embodiments of the present invention may be implemented by hardware, firmware, software, or a combination thereof.
  • A method according to embodiments of the present invention may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and the like.
  • the method according to the embodiments of the present invention may be implemented in the form of a module, a procedure, or a function that performs the functions or operations described above.
  • the software code may be stored in a memory unit and driven by a processor.
  • the memory unit may be located inside or outside the processor, and may exchange data with the processor by various known means.
  • the present invention described above can be embodied as computer readable codes on a medium in which a program is recorded.
  • The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of computer-readable media include hard disk drives (HDDs), solid state disks (SSDs), silicon disk drives (SDDs), ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices, and the like. This also includes implementations in the form of carrier waves (e.g., transmission over the Internet).
  • the computer may include the controller 180 of the terminal. Accordingly, the above detailed description should not be construed as limiting in all aspects and should be considered as illustrative. The scope of the invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention are included in the scope of the invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

Proposed is a method for controlling a telematics system provided in a vehicle. Specifically, proposed is a method for controlling a telematics system provided in a vehicle, the method comprising the steps of: receiving a request for content output; requesting, from an external server, information associated with the content through a base station connected to the telematics system; and receiving, from the external server, the information through the base station.

Description

Telematics system provided in a vehicle and a method of controlling the same
The present invention relates to a telematics system provided in a vehicle and a method for controlling the same, and more particularly, to a method by which the telematics system transmits and receives signals to and from an external server through a base station.
A vehicle traditionally functions as a means of transportation for its user, but for the user's convenience it is equipped with various sensors, electronic devices, and the like to provide driving convenience. In particular, the development of advanced driver assistance systems (ADAS) and, further, autonomous vehicles for the user's driving convenience is actively under way.
Recent driver assistance systems and autonomous vehicles provide various display devices for the convenience of passengers as well as for the driving convenience of the user. This is referred to as in-car entertainment (ICE) or in-vehicle infotainment (IVI). ICE or IVI can be described as automotive hardware and software that provide audio or video entertainment.
Meanwhile, research has recently been actively conducted on mobile communication technologies, particularly technologies that provide ICE or IVI services through 5G NR (New Radio Access Technology). In particular, development of 5G use cases that provide communication with external devices through a telematics application, an ECU (Electronic Control Unit), a 5G modem, and the like is in progress. 5G use cases include Intersection Movement Assistant, Real-time Situational Awareness & HD Maps, Cooperative Lane Change of Automated Vehicles, See-Through, Vulnerable Road User Discovery, Shared Vision, Collective Sensory, Hybrid Intelligence, Journey Studio, Media Center on Wheels, and the like.
In order to provide the above-described 5G use cases, a vehicle may be provided with a plurality of display devices. However, according to the related art, there is a problem in that content cannot be efficiently provided to the plurality of display devices in a vehicle through a telematics control unit (TCU).
In order to solve the above problems, an object of the present invention is to provide a telematics system provided in a vehicle and a method of controlling the same.
The technical objects to be achieved in the present invention are not limited to the above technical object, and other technical objects not mentioned will be clearly understood by those of ordinary skill in the art from the description below.
In an aspect of the present invention for achieving the above technical object, a method of controlling a telematics system provided in a vehicle is proposed, the method comprising: receiving a request for content output; requesting information related to the content from an external server through a base station connected to the telematics system; and receiving the information from the external server through the base station. The received information may be delivered to each of a plurality of target devices connected to the telematics system.
The information is received from the external server via a predetermined protocol, and the predetermined protocol may include a transmission protocol, a streaming control protocol, and a quality of service (QoS) protocol.
If the received information is media content provided by the external server, the telematics system may generate a URL (Universal Resource Locator) that each of the plurality of target devices can access, and deliver the generated URLs to the plurality of target devices, respectively.
The delivered information may be output in synchronization on each of the plurality of target devices, based on reference time information included in the QoS protocol and time information included in the packets constituting the information received from the external server.
The plurality of target devices may include a CID (Center Information Display), an AVN (Audio Video Navigation) system, and an RSE (Rear Seat Entertainment) provided inside the vehicle, as well as a CE (Consumer Electronics) device provided inside or outside the vehicle.
The content received from the external server may correspond to media content or to a camera image of another vehicle.
An aspect of the present invention can solve the above-described problem of the related art, namely the problem that content cannot be efficiently provided to a plurality of display devices in a vehicle through a TCU (Telematics Control Unit). Specifically, the telematics system according to an aspect of the present invention can provide audio/video-synchronized content on a plurality of display devices in the vehicle.
Effects obtainable in the present invention are not limited to the above-mentioned effects, and other effects not mentioned will be clearly understood by those of ordinary skill in the art from the description below.
The accompanying drawings, which are included as part of the detailed description to help understanding of the present invention, provide embodiments of the present invention and, together with the detailed description, explain the technical idea of the present invention.
FIG. 1 is a view showing the appearance of a vehicle according to an aspect of the present invention.
FIG. 2 is a view of the vehicle according to an aspect of the present invention, seen from various external angles.
FIGS. 3 to 4 are views illustrating the interior of a vehicle according to an aspect of the present invention.
FIGS. 5 to 6 are views referred to for explaining objects according to an aspect of the present invention.
FIG. 7 is a block diagram referred to for describing a vehicle according to an aspect of the present invention.
FIGS. 8 to 10 are diagrams for explaining the structure of a telematics system according to an aspect of the present invention.
FIGS. 11 to 14 are diagrams for explaining implementation examples of a telematics system according to an aspect of the present invention.
FIG. 15 is a view for explaining the characteristics of the telematics system architecture according to an aspect of the present invention.
FIGS. 16 to 17 are diagrams for explaining the SW platform of the telematics system according to an aspect of the present invention.
FIG. 18 is a view for explaining a telematics system according to an aspect of the present invention.
FIG. 19 illustrates the relationship between a telematics system and a cloud according to an aspect of the present invention.
FIG. 20 is a view for explaining the relationship between the framework of the telematics system and the cloud according to an aspect of the present invention.
FIG. 21 is a diagram for describing the media cast center of a telematics system according to an aspect of the present invention.
FIG. 22 illustrates a positioning service provided by a telematics system according to an aspect of the present invention.
FIG. 23 is a diagram for describing the electronic horizon provided by a telematics system according to an aspect of the present invention.
FIGS. 24 to 31 are views for explaining implementation examples of a telematics system according to an aspect of the present invention.
FIGS. 32 to 34 are views for explaining technical effects of a telematics system according to an aspect of the present invention.
FIGS. 35 to 36 are diagrams for explaining a signal transmission and reception sequence between a TCU and a cloud server in a telematics system according to an aspect of the present invention.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings. The detailed description given below with the accompanying drawings is intended to explain exemplary embodiments of the present invention and is not intended to represent the only embodiments in which the present invention may be practiced. The following detailed description includes specific details in order to provide a thorough understanding of the present invention. However, those skilled in the art will appreciate that the present invention may be practiced without these specific details.
Referring to FIGS. 1 to 7, the vehicle 100 may include wheels rotated by a power source and a steering input device 510 for adjusting the traveling direction of the vehicle 100.
The vehicle 100 may be an autonomous vehicle. The vehicle 100 may be switched to an autonomous driving mode or a manual mode based on a user input. For example, the vehicle 100 may be switched from the manual mode to the autonomous driving mode, or from the autonomous driving mode to the manual mode, based on a user input received through the user interface device 200.
The vehicle 100 may be switched to the autonomous driving mode or the manual mode based on driving situation information. The driving situation information may include at least one of object information outside the vehicle, navigation information, and vehicle state information.
For example, the vehicle 100 may be switched from the manual mode to the autonomous driving mode, or from the autonomous driving mode to the manual mode, based on driving situation information generated by the object detection device 300. For example, the vehicle 100 may be switched from the manual mode to the autonomous driving mode, or from the autonomous driving mode to the manual mode, based on driving situation information received through the communication device 400.
The vehicle 100 may be switched from the manual mode to the autonomous driving mode, or from the autonomous driving mode to the manual mode, based on information, data, and signals provided from an external device.
When the vehicle 100 is driven in the autonomous driving mode, the autonomous vehicle 100 may be operated based on the operation system 700. For example, the autonomous vehicle 100 may be operated based on information, data, or signals generated by the driving system 710, the taking-out system 740, and the parking system 750.
When the vehicle 100 is driven in the manual mode, the autonomous vehicle 100 may receive a user input for driving through the driving manipulation device 500. The vehicle 100 may be operated based on the user input received through the driving manipulation device 500.
The overall length means the length from the front to the rear of the vehicle 100, the overall width means the width of the vehicle 100, and the overall height means the length from the bottom of the wheels to the roof. In the following description, the overall-length direction L means the direction serving as the reference for measuring the overall length of the vehicle 100, the overall-width direction W means the direction serving as the reference for measuring the overall width of the vehicle 100, and the overall-height direction H means the direction serving as the reference for measuring the overall height of the vehicle 100.
As illustrated in FIG. 7, the vehicle 100 may include a user interface device 200, an object detection device 300, a communication device 400, a driving manipulation device 500, a vehicle driving device 600, an operation system 700, a navigation system 770, a sensing unit 120, an interface unit 130, a memory 140, a controller 170, and a power supply unit 190.
Depending on the embodiment, the vehicle 100 may further include components other than those described herein, or may not include some of the described components. The sensing unit 120 may sense the state of the vehicle. The sensing unit 120 may include an attitude sensor (e.g., a yaw sensor, a roll sensor, a pitch sensor), a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight sensor, a heading sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle internal temperature sensor, a vehicle internal humidity sensor, an ultrasonic sensor, an illuminance sensor, an accelerator pedal position sensor, a brake pedal position sensor, and the like.
The sensing unit 120 may acquire sensing signals for vehicle attitude information, vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle internal temperature information, vehicle internal humidity information, steering wheel rotation angle, illuminance outside the vehicle, pressure applied to the accelerator pedal, pressure applied to the brake pedal, and the like.
The sensing unit 120 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
The sensing unit 120 may generate vehicle state information based on the sensing data. The vehicle state information may be information generated based on data sensed by the various sensors provided in the vehicle.
For example, the vehicle state information may include vehicle attitude information, vehicle speed information, vehicle tilt information, vehicle weight information, vehicle direction information, vehicle battery information, vehicle fuel information, vehicle tire pressure information, vehicle steering information, vehicle interior temperature information, vehicle interior humidity information, pedal position information, vehicle engine temperature information, and the like.
The interface unit 130 may serve as a passage to various types of external devices connected to the vehicle 100. For example, the interface unit 130 may include a port connectable to a mobile terminal and may be connected to the mobile terminal through the port. In this case, the interface unit 130 may exchange data with the mobile terminal.
Meanwhile, the interface unit 130 may serve as a passage for supplying electrical energy to the connected mobile terminal. When the mobile terminal is electrically connected to the interface unit 130, the interface unit 130 may provide the mobile terminal with electrical energy supplied from the power supply unit 190, under the control of the controller 170.
The memory 140 is electrically connected to the controller 170. The memory 140 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 140 may be any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 140 may store various data for the overall operation of the vehicle 100, such as programs for processing or control by the controller 170. Depending on the embodiment, the memory 140 may be formed integrally with the controller 170 or may be implemented as a subcomponent of the controller 170.
The controller 170 may control the overall operation of each unit in the vehicle 100. The controller 170 may be referred to as an ECU (Electronic Control Unit). The power supply unit 190 may supply the power required for the operation of each component under the control of the controller 170. In particular, the power supply unit 190 may receive power from a battery inside the vehicle.
One or more processors and the controller 170 included in the vehicle 100 may be implemented using at least one of ASICs (application specific integrated circuits), DSPs (digital signal processors), DSPDs (digital signal processing devices), PLDs (programmable logic devices), FPGAs (field programmable gate arrays), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
In addition, the sensing unit 120, the interface unit 130, the memory 140, the power supply unit 190, the user interface device 200, the object detection device 300, the communication device 400, the driving manipulation device 500, the vehicle driving device 600, the operation system 700, and the navigation system 770 may have individual processors or may be integrated into the controller 170.
The user interface device 200 is a device for communication between the vehicle 100 and a user. The user interface device 200 may receive a user input and provide the user with information generated in the vehicle 100. The vehicle 100 may implement UI (User Interfaces) or UX (User Experience) through the user interface device 200.
The user interface device 200 may include an input unit 210, an internal camera 220, a biometric sensing unit 230, an output unit 250, and a processor 270. Each component of the user interface device 200 may be structurally and functionally separated from, or integrated with, the interface unit 130 described above.
Depending on the embodiment, the user interface device 200 may further include components other than those described, or may not include some of the described components.
The input unit 210 is for receiving information from the user, and the data collected by the input unit 210 may be analyzed by the processor 270 and processed as a control command of the user.
The input unit 210 may be disposed inside the vehicle. For example, the input unit 210 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of a seat, one area of each pillar, one area of a door, one area of the center console, one area of the headlining, one area of the sun visor, one area of the windshield, one area of a window, or the like.
The input unit 210 may include a voice input unit 211, a gesture input unit 212, a touch input unit 213, and a mechanical input unit 214.
The voice input unit 211 may convert the user's voice input into an electrical signal. The converted electrical signal may be provided to the processor 270 or the controller 170. The voice input unit 211 may include one or more microphones.
The gesture input unit 212 may convert the user's gesture input into an electrical signal. The converted electrical signal may be provided to the processor 270 or the controller 170. The gesture input unit 212 may include at least one of an infrared sensor and an image sensor for detecting the user's gesture input.
Depending on the embodiment, the gesture input unit 212 may detect the user's three-dimensional gesture input. To this end, the gesture input unit 212 may include a light output unit that outputs a plurality of infrared beams, or a plurality of image sensors. The gesture input unit 212 may detect the user's three-dimensional gesture input through a TOF (Time of Flight) method, a structured-light method, or a disparity method.
The touch input unit 213 may convert the user's touch input into an electrical signal. The converted electrical signal may be provided to the processor 270 or the controller 170. The touch input unit 213 may include a touch sensor for detecting the user's touch input. Depending on the embodiment, the touch input unit 213 may be formed integrally with the display unit 251 to implement a touch screen. Such a touch screen may provide both an input interface and an output interface between the vehicle 100 and the user.
The mechanical input unit 214 may include at least one of a button, a dome switch, a jog wheel, and a jog switch. The electrical signal generated by the mechanical input unit 214 may be provided to the processor 270 or the controller 170. The mechanical input unit 214 may be disposed on the steering wheel, the center fascia, the center console, the cockpit module, a door, or the like.
The processor 270 may start a learning mode of the vehicle 100 in response to a user input to at least one of the voice input unit 211, the gesture input unit 212, the touch input unit 213, and the mechanical input unit 214 described above. In the learning mode, the vehicle 100 may learn its driving path and its surrounding environment. The learning mode will be described in detail below in the parts related to the object detection device 300 and the operation system 700.
The internal camera 220 may acquire an image of the vehicle interior. The processor 270 may detect the user's state based on the vehicle interior image. The processor 270 may acquire the user's gaze information from the vehicle interior image. The processor 270 may detect the user's gesture in the vehicle interior image.
The biometric sensing unit 230 may acquire biometric information of the user. The biometric sensing unit 230 may include a sensor capable of acquiring the user's biometric information and may use the sensor to acquire the user's fingerprint information, heart rate information, and the like. The biometric information may be used for user authentication.
The output unit 250 is for generating output related to vision, hearing, touch, or the like. The output unit 250 may include at least one of a display unit 251, a sound output unit 252, and a haptic output unit 253.
The display unit 251 may display graphic objects corresponding to various kinds of information. The display unit 251 may include at least one of a liquid crystal display (LCD), a thin-film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
The display unit 251 may form a mutual layer structure with the touch input unit 213 or be formed integrally with it, thereby implementing a touch screen. The display unit 251 may be implemented as a HUD (Head Up Display). When the display unit 251 is implemented as a HUD, the display unit 251 may include a projection module and output information through an image projected onto the windshield or a window. The display unit 251 may include a transparent display. The transparent display may be attached to the windshield or a window.
The transparent display may display a predetermined screen while having a predetermined transparency. In order to have transparency, the transparent display may include at least one of a transparent TFEL (Thin Film Electroluminescent) display, a transparent OLED (Organic Light-Emitting Diode) display, a transparent LCD (Liquid Crystal Display), a transmissive transparent display, and a transparent LED (Light Emitting Diode) display. The transparency of the transparent display can be adjusted.
Meanwhile, the user interface device 200 may include a plurality of display units 251a to 251g.
The display unit 251 may be disposed in one area of the steering wheel, one area 251a, 251b, 251e of the instrument panel, one area 251d of a seat, one area 251f of each pillar, one area 251g of a door, one area of the center console, one area of the headlining, or one area of the sun visor, or may be implemented in one area 251c of the windshield or one area 251h of a window.
The sound output unit 252 converts an electrical signal provided from the processor 270 or the controller 170 into an audio signal and outputs it. To this end, the sound output unit 252 may include one or more speakers.
The haptic output unit 253 generates a tactile output. For example, the haptic output unit 253 may vibrate the steering wheel, the seat belts, and the seats 110FL, 110FR, 110RL, 110RR so that the user can perceive the output.
The processor 270 may control the overall operation of each unit of the user interface device 200. Depending on the embodiment, the user interface device 200 may include a plurality of processors 270 or may not include a processor 270.
When the user interface device 200 does not include a processor 270, the user interface device 200 may be operated under the control of a processor of another device in the vehicle 100 or the controller 170. Meanwhile, the user interface device 200 may be referred to as a vehicle display device. The user interface device 200 may be operated under the control of the controller 170.
The object detection apparatus 300 is a device for detecting objects located outside the vehicle 100. It may generate object information based on sensing data.
The object information may include information on the presence or absence of an object, the position of the object, the distance between the vehicle 100 and the object, and the relative speed between the vehicle 100 and the object. An object may be any of various things related to the driving of the vehicle 100.
Referring to FIGS. 5 and 6, objects O may include lanes OB10, other vehicles OB11, pedestrians OB12, two-wheeled vehicles OB13, traffic signals OB14 and OB15, light, roads, structures, speed bumps, geographic features, animals, and the like.
A lane OB10 may be the driving lane, a lane adjacent to the driving lane, or a lane in which an oncoming vehicle travels. A lane OB10 may be understood to include the left and right lines that form the lane.
Another vehicle OB11 may be a vehicle traveling in the vicinity of the vehicle 100, for example a vehicle located within a predetermined distance from the vehicle 100. For instance, the other vehicle OB11 may be a vehicle preceding or following the vehicle 100.
A pedestrian OB12 may be a person located in the vicinity of the vehicle 100, for example a person within a predetermined distance from the vehicle 100, such as a person on a sidewalk or on the roadway.
A two-wheeled vehicle OB13 refers to a vehicle located in the vicinity of the vehicle 100 that moves on two wheels, for example one within a predetermined distance from the vehicle 100, such as a motorcycle or bicycle on a sidewalk or on the roadway.
A traffic signal may include a traffic light OB15, a traffic sign OB14, and a pattern or text drawn on the road surface. Light may be light generated by a lamp of another vehicle, light from a street lamp, or sunlight. A road may include the road surface, curves, and slopes such as uphill and downhill grades. A structure may be an object located around the road and fixed to the ground; for example, a street lamp, a roadside tree, a building, a utility pole, a traffic light, or a bridge. Geographic features may include mountains, hills, and the like.
Meanwhile, objects may be classified into moving objects and fixed objects. For example, moving objects may include other vehicles and pedestrians, and fixed objects may include traffic signals, roads, and structures.
The object detection apparatus 300 may include a camera 310, a radar 320, a lidar 330, an ultrasonic sensor 340, an infrared sensor 350, and a processor 370. Each component of the object detection apparatus 300 may be structurally and functionally separated from, or integrated with, the sensing unit 120 described above.
Depending on the embodiment, the object detection apparatus 300 may further include components other than those described, or may omit some of the described components.
The camera 310 may be located at an appropriate position on the exterior of the vehicle to acquire images of the vehicle's surroundings. The camera 310 may be a mono camera, a stereo camera 310a, an around-view monitoring (AVM) camera 310b, or a 360-degree camera.
The camera 310 may acquire the position of an object, the distance to the object, or the relative speed with respect to the object using various image-processing algorithms.
For example, the camera 310 may obtain the distance to an object and the relative speed from the change in the apparent size of the object over time in the acquired images.
For example, the camera 310 may obtain distance and relative speed information through a pinhole camera model, road-surface profiling, and the like.
For example, the camera 310 may obtain distance and relative speed information based on disparity information from a stereo image acquired by the stereo camera 310a.
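As a concrete illustration of the stereo-disparity approach, the minimal sketch below computes depth from a rectified stereo pair using Z = f·B/d. The focal length, baseline, and disparity values are hypothetical and are not taken from the disclosure.

```python
# Minimal sketch of disparity-based distance measurement for a rectified
# stereo pair. All numeric values are illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth Z = f * B / d for a rectified stereo camera."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible object")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 0.30 m baseline, 15 px disparity -> 20 m.
print(depth_from_disparity(1000.0, 0.30, 15.0))  # 20.0
```

Differencing such depth estimates over successive frames would likewise yield the relative speed mentioned above.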
For example, to acquire images ahead of the vehicle, the camera 310 may be disposed inside the cabin close to the front windshield. Alternatively, the camera 310 may be disposed around the front bumper or the radiator grille.
For example, to acquire images behind the vehicle, the camera 310 may be disposed inside the cabin close to the rear glass. Alternatively, the camera 310 may be disposed around the rear bumper, the trunk, or the tailgate.
For example, to acquire images to the side of the vehicle, the camera 310 may be disposed inside the cabin close to at least one of the side windows. Alternatively, the camera 310 may be disposed around a side mirror, a fender, or a door.
The camera 310 may provide the acquired images to the processor 370.
The radar 320 may include an electromagnetic-wave transmitter and receiver. In terms of its radio-emission principle, the radar 320 may be implemented as a pulse radar or a continuous-wave radar. Among continuous-wave radars, it may be implemented as a frequency-modulated continuous-wave (FMCW) radar or a frequency-shift-keying (FSK) radar, depending on the signal waveform.
Using electromagnetic waves, the radar 320 may detect an object based on a time-of-flight (TOF) or phase-shift method and determine the position of the detected object, the distance to it, and the relative speed.
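The time-of-flight principle referenced here reduces to range = c·t/2 for a round-trip echo, and differencing successive ranges gives the relative speed. The sketch below, with hypothetical timing values, illustrates this; the same arithmetic applies to the lidar and ultrasonic sensors described later, with the appropriate propagation speed.

```python
# Illustrative time-of-flight (TOF) ranging: the round-trip delay of the
# echo gives the range; consecutive ranges give the relative speed.
# Timing values are hypothetical.

C = 299_792_458.0  # propagation speed of an electromagnetic wave, m/s

def tof_range_m(round_trip_s: float) -> float:
    """Range = c * t / 2 (the pulse travels out and back)."""
    return C * round_trip_s / 2.0

def relative_speed_mps(r1_m: float, r2_m: float, dt_s: float) -> float:
    """Relative speed from two consecutive range measurements."""
    return (r2_m - r1_m) / dt_s

r1 = tof_range_m(400e-9)  # ~59.96 m
r2 = tof_range_m(396e-9)  # ~59.36 m, measured 0.1 s later
print(relative_speed_mps(r1, r2, 0.1))  # ~ -6 m/s (object closing in)
```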
The radar 320 may be disposed at an appropriate position on the exterior of the vehicle to detect objects located in front of, behind, or to the side of the vehicle.
The lidar 330 may include a laser transmitter and receiver. The lidar 330 may be implemented using a time-of-flight (TOF) or phase-shift method.
The lidar 330 may be implemented as a driven or non-driven type. When implemented as a driven type, the lidar 330 is rotated by a motor and can detect objects around the vehicle 100. When implemented as a non-driven type, the lidar 330 can detect objects located within a predetermined range of the vehicle 100 by optical steering. The vehicle 100 may include a plurality of non-driven lidars 330.
Using laser light, the lidar 330 may detect an object based on a TOF or phase-shift method and determine the position of the detected object, the distance to it, and the relative speed. The lidar 330 may be disposed at an appropriate position on the exterior of the vehicle to detect objects located in front of, behind, or to the side of the vehicle.
The ultrasonic sensor 340 may include an ultrasonic transmitter and receiver. It may detect an object based on ultrasonic waves and determine the position of the detected object, the distance to it, and the relative speed. The ultrasonic sensor 340 may be disposed at an appropriate position on the exterior of the vehicle to detect objects located in front of, behind, or to the side of the vehicle.
The infrared sensor 350 may include an infrared transmitter and receiver. It may detect an object based on infrared light and determine the position of the detected object, the distance to it, and the relative speed. The infrared sensor 350 may be disposed at an appropriate position on the exterior of the vehicle to detect objects located in front of, behind, or to the side of the vehicle.
The processor 370 may control the overall operation of each unit of the object detection apparatus 300. The processor 370 may detect or classify an object by comparing data sensed by the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 with previously stored data.
The processor 370 may detect and track an object based on the acquired images. Through image-processing algorithms, the processor 370 may perform operations such as calculating the distance to an object and the relative speed with respect to it.
For example, the processor 370 may obtain distance and relative speed information from the change in the apparent size of an object over time in the acquired images.
For example, the processor 370 may obtain distance and relative speed information through a pinhole camera model, road-surface profiling, and the like.
For example, the processor 370 may obtain distance and relative speed information based on disparity information from a stereo image acquired by the stereo camera 310a.
The processor 370 may detect and track an object based on transmitted electromagnetic waves reflected back from the object. Based on the electromagnetic waves, the processor 370 may calculate the distance to the object, the relative speed with respect to it, and so on.
The processor 370 may detect and track an object based on transmitted laser light reflected back from the object. Based on the laser light, the processor 370 may calculate the distance to the object, the relative speed with respect to it, and so on.
The processor 370 may detect and track an object based on transmitted ultrasonic waves reflected back from the object. Based on the ultrasonic waves, the processor 370 may calculate the distance to the object, the relative speed with respect to it, and so on.
The processor 370 may detect and track an object based on transmitted infrared light reflected back from the object. Based on the infrared light, the processor 370 may calculate the distance to the object, the relative speed with respect to it, and so on.
As described above, when the learning mode of the vehicle 100 is initiated in response to user input through the input unit 210, the processor 370 may store the data sensed by the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 in the memory 140.
Each step of the learning mode based on analysis of the stored data, and the operating mode that follows the learning mode, are described in detail in the part relating to the driving system 700 below. Depending on the embodiment, the object detection apparatus 300 may include a plurality of processors 370 or may not include a processor 370 at all. For example, the camera 310, the radar 320, the lidar 330, the ultrasonic sensor 340, and the infrared sensor 350 may each include an individual processor.
When the object detection apparatus 300 does not include a processor 370, it may operate under the control of a processor of another device in the vehicle 100 or of the controller 170. The object detection apparatus 300 may operate under the control of the controller 170.
The communication device 400 is a device for communicating with external devices. An external device may be another vehicle, a mobile terminal, or a server. To perform communication, the communication device 400 may include at least one of a transmit antenna, a receive antenna, a radio-frequency (RF) circuit capable of implementing various communication protocols, and an RF element.
The communication device 400 may include a short-range communication unit 410, a location information unit 420, a V2X communication unit 430, an optical communication unit 440, a broadcast transceiver 450, an intelligent transport systems (ITS) communication unit 460, and a processor 470. Depending on the embodiment, the communication device 400 may further include components other than those described, or may omit some of the described components.
The short-range communication unit 410 is a unit for short-range communication. It may support short-range communication using at least one of Bluetooth™, radio-frequency identification (RFID), Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee, near-field communication (NFC), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, and Wireless Universal Serial Bus (Wireless USB) technologies. The short-range communication unit 410 may form wireless area networks to perform short-range communication between the vehicle 100 and at least one external device.
The location information unit 420 is a unit for acquiring location information of the vehicle 100. For example, it may include a Global Positioning System (GPS) module or a Differential Global Positioning System (DGPS) module.
The V2X communication unit 430 is a unit for performing wireless communication with a server (V2I: vehicle-to-infrastructure), another vehicle (V2V: vehicle-to-vehicle), or a pedestrian (V2P: vehicle-to-pedestrian). The V2X communication unit 430 may include an RF circuit capable of implementing the V2I, V2V, and V2P protocols.
The optical communication unit 440 is a unit for communicating with external devices via light. It may include an optical transmitter that converts electrical signals into optical signals and emits them, and an optical receiver that converts received optical signals back into electrical signals. Depending on the embodiment, the optical transmitter may be formed integrally with a lamp included in the vehicle 100.
The broadcast transceiver 450 is a unit for receiving broadcast signals from an external broadcast management server, or transmitting broadcast signals to such a server, over a broadcast channel. Broadcast channels may include satellite and terrestrial channels; broadcast signals may include TV, radio, and data broadcast signals.
The ITS communication unit 460 may exchange information, data, or signals with a traffic system. It may provide acquired information and data to the traffic system and may receive information, data, or signals from it. For example, the ITS communication unit 460 may receive road traffic information from the traffic system and provide it to the controller 170, or receive a control signal from the traffic system and provide it to the controller 170 or to a processor provided in the vehicle 100.
The processor 470 may control the overall operation of each unit of the communication device 400. Depending on the embodiment, the communication device 400 may include a plurality of processors 470 or may not include a processor 470 at all. When no processor 470 is included, the communication device 400 may operate under the control of a processor of another device in the vehicle 100 or of the controller 170.
Meanwhile, the communication device 400 may implement a vehicle display device together with the user interface device 200. In that case, the vehicle display device may be called a telematics device or an audio-video-navigation (AVN) device. The communication device 400 may operate under the control of the controller 170.
The driving manipulation device 500 is a device that receives user input for driving. In manual mode, the vehicle 100 may be driven based on signals provided by the driving manipulation device 500. The driving manipulation device 500 may include a steering input device 510, an acceleration input device 530, and a brake input device 570.
The steering input device 510 may receive the user's input for the heading of the vehicle 100. The steering input device 510 is preferably formed in the shape of a wheel so that steering input can be made by rotation. Depending on the embodiment, the steering input device may instead take the form of a touch screen, touch pad, or button.
The acceleration input device 530 may receive the user's input for accelerating the vehicle 100, and the brake input device 570 may receive the user's input for decelerating it. The acceleration input device 530 and the brake input device 570 are preferably formed as pedals. Depending on the embodiment, the acceleration or brake input device may instead take the form of a touch screen, touch pad, or button.
The driving manipulation device 500 may operate under the control of the controller 170.
The vehicle drive device 600 is a device that electrically controls the driving of various apparatuses in the vehicle 100. The vehicle drive device 600 may include a powertrain driver 610, a chassis driver 620, a door/window driver 630, a safety apparatus driver 640, a lamp driver 650, and an air-conditioning driver 660. Depending on the embodiment, the vehicle drive device 600 may further include components other than those described, or may omit some of the described components. Meanwhile, the vehicle drive device 600 may include a processor, and each unit of the vehicle drive device 600 may include its own individual processor.
The powertrain driver 610 may control the operation of the powertrain apparatus. The powertrain driver 610 may include a power source driver 611 and a transmission driver 612.
The power source driver 611 may control the power source of the vehicle 100. For example, when a fossil-fuel engine is the power source, the power source driver 611 may perform electronic control of the engine, thereby controlling the engine's output torque and the like. The power source driver 611 may adjust the engine output torque under the control of the controller 170.
For example, when an electric motor is the power source, the power source driver 611 may control the motor, adjusting its rotational speed, torque, and the like under the control of the controller 170.
The transmission driver 612 may control the transmission and adjust its state to drive (D), reverse (R), neutral (N), or park (P). Meanwhile, when an engine is the power source, the transmission driver 612 may adjust the gear engagement state in the drive (D) state.
The chassis driver 620 may control the operation of the chassis apparatus. The chassis driver 620 may include a steering driver 621, a brake driver 622, and a suspension driver 623.
The steering driver 621 may perform electronic control of the steering apparatus in the vehicle 100 and may change the heading of the vehicle.
The brake driver 622 may perform electronic control of the brake apparatus in the vehicle 100. For example, it may reduce the speed of the vehicle 100 by controlling the operation of the brakes disposed at the wheels.
Meanwhile, the brake driver 622 may control each of a plurality of brakes individually and may apply different braking forces to different wheels.
The suspension driver 623 may perform electronic control of the suspension apparatus in the vehicle 100. For example, when the road surface is uneven, the suspension driver 623 may control the suspension apparatus so as to reduce vibration of the vehicle 100. Meanwhile, the suspension driver 623 may control each of a plurality of suspensions individually.
The door/window driver 630 may perform electronic control of the door apparatus or window apparatus in the vehicle 100. The door/window driver 630 may include a door driver 631 and a window driver 632.
The door driver 631 may control the door apparatus. The door driver 631 may control the opening and closing of the plurality of doors included in the vehicle 100, the opening or closing of the trunk or tailgate, and the opening or closing of the sunroof.
The window driver 632 may perform electronic control of the window apparatus and may control the opening or closing of the plurality of windows included in the vehicle 100.
The safety apparatus driver 640 may perform electronic control of the various safety apparatuses in the vehicle 100. The safety apparatus driver 640 may include an airbag driver 641, a seat-belt driver 642, and a pedestrian protection apparatus driver 643.
The airbag driver 641 may perform electronic control of the airbag apparatus in the vehicle 100. For example, upon detecting danger, the airbag driver 641 may control the airbags to deploy.
The seat-belt driver 642 may perform electronic control of the seat-belt apparatus in the vehicle 100. For example, upon detecting danger, the seat-belt driver 642 may use the seat belts to secure the occupants to the seats 110FL, 110FR, 110RL, and 110RR.
The pedestrian protection apparatus driver 643 may perform electronic control of the hood lift and the pedestrian airbag. For example, upon detecting a collision with a pedestrian, it may control the hood to lift up and the pedestrian airbag to deploy.
The lamp driver 650 may perform electronic control of the various lamp apparatuses in the vehicle 100.
The air-conditioning driver 660 may perform electronic control of the air conditioner in the vehicle 100. For example, when the temperature inside the vehicle is high, the air-conditioning driver 660 may operate the air conditioner so that cool air is supplied into the vehicle.
The vehicle drive device 600 may include a processor, and each of its units may include its own individual processor. The vehicle drive device 600 may operate under the control of the controller 170.
The driving system 700 is a system that controls various operations of the vehicle 100. The driving system 700 may operate in the autonomous driving mode.
The driving system 700 may include a traveling system 710, an exit system 740, and a parking system 750. Depending on the embodiment, the driving system 700 may further include components other than those described, or may omit some of the described components. Meanwhile, the driving system 700 may include a processor, and each of its units may include its own individual processor.
Meanwhile, the driving system 700 may control driving in the autonomous mode based on learning. In that case, a learning mode and an operating mode premised on the completion of learning may be performed. How the processor of the driving system 700 performs the learning mode and the operating mode is described below.
The learning mode may be performed in the manual mode described above. In the learning mode, the processor of the driving system 700 may learn the driving route and the surrounding environment of the vehicle 100.
Driving-route learning may include generating map data for the route the vehicle 100 travels. In particular, the processor of the driving system 700 may generate the map data based on information detected through the object detection apparatus 300 while the vehicle 100 travels from an origin to a destination.
Surrounding-environment learning may include storing and analyzing information about the surroundings of the vehicle 100 during driving and parking. In particular, the processor of the driving system 700 may store and analyze information about the surroundings of the vehicle 100 based on information detected through the object detection apparatus 300 during parking, for example the position and size of a parking space and information about fixed (or non-fixed) obstacles.
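One way such learned surroundings could be recorded is sketched below; the disclosure does not fix a data format, so the schema and every field name are assumptions chosen purely for illustration.

```python
# Hypothetical record for the surrounding-environment learning step:
# what the driving system might persist per learned parking space.
from dataclasses import dataclass, field

@dataclass
class ParkingSpaceRecord:
    space_id: str
    location: tuple                       # (x, y) in the learned map frame
    size_m: tuple                         # (width, length) of the space
    fixed_obstacles: list = field(default_factory=list)
    transient_obstacles: list = field(default_factory=list)

record = ParkingSpaceRecord(
    space_id="home-garage-1",
    location=(12.4, -3.1),
    size_m=(2.5, 5.2),
    fixed_obstacles=["pillar"],
)
```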
The operating mode may be performed in the autonomous driving mode described above. The operating mode is described on the premise that driving-route learning or surrounding-environment learning has been completed through the learning mode.
The operating mode may be initiated in response to user input through the input unit 210, or automatically when the vehicle 100 reaches a driving route or parking space for which learning is complete.
The operating mode may include a semi-autonomous operating mode, which requires some user manipulation of the driving manipulation device 500, and a fully autonomous operating mode, which requires no user manipulation of the driving manipulation device 500 at all.
Meanwhile, depending on the embodiment, the processor of the driving system 700 may control the traveling system 710 in the operating mode to drive the vehicle 100 along a learned driving route.
Meanwhile, depending on the embodiment, the processor of the driving system 700 may control the exit system 740 in the operating mode to take the parked vehicle 100 out of a learned parking space.
Meanwhile, depending on the embodiment, the processor of the driving system 700 may control the parking system 750 in the operating mode to park the vehicle 100 from its current position into a learned parking space. Meanwhile, depending on the embodiment, when the driving system 700 is implemented in software, it may be a sub-concept of the controller 170.
Meanwhile, depending on the embodiment, the driving system 700 may be a concept that includes at least one of the user interface device 270, the object detection apparatus 300, the communication device 400, the driving manipulation device 500, the vehicle drive device 600, the navigation system 770, the sensing unit 120, and the controller 170.
The traveling system 710 may perform driving of the vehicle 100. The traveling system 710 may receive navigation information from the navigation system 770 and provide control signals to the vehicle drive device 600 to drive the vehicle 100.
The traveling system 710 may receive object information from the object detection apparatus 300 and provide control signals to the vehicle drive device 600 to drive the vehicle 100. The traveling system 710 may receive signals from an external device through the communication device 400 and provide control signals to the vehicle drive device 600 to drive the vehicle 100.
The traveling system 710 may be a system concept that includes at least one of the user interface device 270, the object detection apparatus 300, the communication device 400, the driving manipulation device 500, the vehicle drive device 600, the navigation system 770, the sensing unit 120, and the controller 170, and that performs driving of the vehicle 100. Such a traveling system 710 may be referred to as a vehicle traveling control device.
The exit system 740 may take the vehicle 100 out of a parking space. The exit system 740 may receive navigation information from the navigation system 770 and provide control signals to the vehicle drive device 600 to take the vehicle 100 out.
The exit system 740 may receive object information from the object detection apparatus 300 and provide control signals to the vehicle drive device 600 to take the vehicle 100 out.
The exit system 740 may receive signals from an external device through the communication device 400 and provide control signals to the vehicle drive device 600 to take the vehicle 100 out.
The exit system 740 may be a system concept that includes at least one of the user interface device 270, the object detection apparatus 300, the communication device 400, the driving manipulation device 500, the vehicle drive device 600, the navigation system 770, the sensing unit 120, and the controller 170, and that takes the vehicle 100 out of a parking space.
Such an exit system 740 may be referred to as a vehicle exit control device.
The parking system 750 may perform parking of the vehicle 100. The parking system 750 may receive navigation information from the navigation system 770 and provide control signals to the vehicle drive device 600 to park the vehicle 100.
The parking system 750 may receive object information from the object detection apparatus 300 and provide control signals to the vehicle drive device 600 to park the vehicle 100.
The parking system 750 may receive signals from an external device through the communication device 400 and provide control signals to the vehicle drive device 600 to park the vehicle 100.
The parking system 750 may be a system concept that includes at least one of the user interface device 270, the object detection apparatus 300, the communication device 400, the driving manipulation device 500, the vehicle drive device 600, the navigation system 770, the sensing unit 120, and the controller 170, and that performs parking of the vehicle 100.
Such a parking system 750 may be referred to as a vehicle parking control device.
The navigation system 770 may provide navigation information. The navigation information may include at least one of map information, set-destination information, route information according to the set destination, information about various objects on the route, lane information, and current position information of the vehicle.
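For illustration only, the navigation information enumerated above can be pictured as a simple container; the field names below are assumptions, not an API defined by the disclosure.

```python
# Hypothetical container for the navigation information listed above.
from dataclasses import dataclass
from typing import Optional

@dataclass
class NavigationInfo:
    map_info: dict
    destination: Optional[str]
    route: list                # ordered waypoints toward the destination
    objects_on_route: list
    lane_info: dict
    current_position: tuple    # e.g. (latitude, longitude)
```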
The navigation system 770 may include a memory and a processor. The memory may store the navigation information, and the processor may control the operation of the navigation system 770.
Depending on the embodiment, the navigation system 770 may receive information from an external device through the communication device 400 and update the stored information. Depending on the embodiment, the navigation system 770 may be classified as a subcomponent of the user interface device 200.
FIG. 8 is a diagram for explaining a telematics system according to an aspect of the present invention. The telematics system according to an aspect of the present invention may provide communication with external devices through a telematics application, an electronic control unit (ECU), a 5G modem, and the like.
The telematics system (or 5G telematics system) according to an aspect of the present invention may be provided in a vehicle together with an audio/video/navigation (AVN) unit and ECUs. In addition, an occupant's consumer electronics (CE) device may be present in the vehicle. The telematics system may be connected to the AVN unit and the ECUs via Ethernet, and to the CE device via the Open Connectivity Foundation (OCF) protocol, known as one of the IoT protocols. The internal configuration of the telematics system according to an aspect of the present invention, for example the Service Framework, Framework, Platform Service, OS layer, and HW, is described in detail below.
FIGS. 9 and 10 are diagrams for explaining a telematics system according to an aspect of the present invention. The telematics system architecture should be considered from the perspective of the electrical/electronic (E/E) architecture. In addition, the software of the telematics system should be designed to be compatible with, and applicable to, heterogeneous architectures and any ECU.
Referring to FIG. 9, the telematics control unit (TCU) of the telematics system may be implemented with the modem as a separate structure (case 1). That is, in case 1 the modem and the telematics part are separated within the TCU and communicate only through an interface. Unlike case 1, in case 2 the modem and the application processor (AP) are implemented in a single SoC, on top of which the entire TCU is implemented. Meanwhile, as shown on the right side of FIG. 9, a method of implementing the modem and the telematics system separately and interworking them may also be considered.
The TCU can be considered in two main forms. First, the modem and the application processor may be implemented as physically distinct chipsets, with the two chipsets together constituting the TCU. Second, the modem and the application processor may be physically integrated into a single chipset, in which case no special interface (I/F) configuration is needed. Apart from these two forms, the modem and the application processor may also be implemented as two chipsets that are not mounted on a single board; this variant, however, is similar to the first form in terms of its software configuration.
In the first configuration described above, the modem serves only as a communication module and may be implemented to deliver all data to the application processor (AP) over PCIe. The AP can forward the data to the other connected ECUs. Within the TCU, services with special functions may be configured according to purpose: for example, a device-driver configuration for high-speed data processing related to data routing, or a configuration that assigns software priorities according to the characteristics of the interworking ECUs.
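A minimal sketch of the AP-side routing just described is given below; the ECU names and the priority table are assumptions chosen to illustrate software-prioritized forwarding and are not part of the disclosure.

```python
# Hypothetical AP-side routing for the two-chipset TCU: frames arriving
# from the modem (e.g. over PCIe) are queued and dispatched to connected
# ECUs with a per-destination software priority.
import heapq

PRIORITY = {"adas_ecu": 0, "avn": 1, "ce_device": 2}  # lower = more urgent

class Router:
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker keeps FIFO order within one priority

    def on_modem_frame(self, dest: str, payload: bytes) -> None:
        heapq.heappush(self._queue, (PRIORITY[dest], self._seq, dest, payload))
        self._seq += 1

    def dispatch_next(self):
        _, _, dest, payload = heapq.heappop(self._queue)
        return dest, payload  # would be handed to the ECU link driver

router = Router()
router.on_modem_frame("avn", b"media chunk")
router.on_modem_frame("adas_ecu", b"v2x alert")
print(router.dispatch_next()[0])  # 'adas_ecu' is served first
```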
In the second configuration described above, the modem and the AP may be configured as a single module. Data can be transferred through internal memory without a PCIe interface. Meanwhile, because of the modem's priority and the division of performance for data processing, the role of the application processor may be reduced.
Meanwhile, there can be several ways to implement software that supports both the first and the second configuration. For efficient post-development maintenance and for effective horizontal deployment and improvement of development results, the software should be implemented so as to maximize reusability. Removing platform dependency can be the basic means to this end. Methods for enabling communication between software located in physically different places must also be considered.
Meanwhile, a protocol oriented toward a service-oriented architecture (SOA) should be used so that the implemented software can operate regardless of its location. This should allow software implementations (for example, SW5 and SW6) to operate regardless of their physical location. As SOA-oriented protocols, SOME/IP and DDS can be provided as frameworks, and they must be designed to run on various E/E architectures. The functions of the components of the telematics system architecture are described below.
A service-oriented architecture (SOA) i) allows complete control as a fully integrated system, ii) can respond quickly to rapidly changing business conditions or manufacturer-specific platforms, and iii) is advantageous in terms of cost.
As an SOA-based architecture, SOME/IP (Scalable service-Oriented Middleware over IP) or DDS (Data Distribution Service) may be used. Both SOME/IP and DDS are SOA-based middleware solutions. In SOME/IP and DDS, service discovery is used to find service instances and to determine whether a service instance is running. SOME/IP was implemented in vehicles before DDS; DDS was first deployed in the aviation and military fields and may be advantageous for V2X.
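The find/offer discovery pattern shared by SOME/IP service discovery and DDS can be reduced to the toy registry below. This sketch shows the pattern only; it is not the wire protocol of either middleware, and the service names and endpoint are hypothetical.

```python
# Toy service registry illustrating SOA-style discovery: a provider
# offers a service instance, and a client asks whether an instance is
# running (and where) before using it.

class ServiceRegistry:
    def __init__(self):
        self._offers = {}  # (service, instance) -> endpoint

    def offer(self, service: str, instance: str, endpoint: str) -> None:
        self._offers[(service, instance)] = endpoint

    def find(self, service: str, instance: str):
        return self._offers.get((service, instance))  # None if not running

registry = ServiceRegistry()
registry.offer("MediaCast", "instance-1", "tcp://tcu.local:30509")
print(registry.find("MediaCast", "instance-1"))  # endpoint if offered
```

Because lookup goes through the registry rather than hard-coded addresses, a service implementation can move between the TCU, another ECU, or a network server without changing its clients, which is the location independence the text calls for.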
An AGL (Automotive Grade Linux)-based platform can be used as the platform. AGL does not differ greatly from other Linux platforms such as GENIVI, and compared to platforms operated by individual manufacturers, it is a specification that allows not only broad deployment but also wide-ranging modification and improvement.

TSN (Time Sensitive Network) is a specification that can serve as an upgraded version of eAVB, but the standard is still in progress. TSN is a technology that becomes efficient only when applied across an entire network, and supporting it up to commercial backbone networks will take considerable time, so its benefits can be secured first through in-vehicle application.

The MultiLink Controller is a module for managing data transmission over various networks. It can manage the radio channels available for transmission, such as modem communication (LTE/5G) and Wi-Fi. Modem communication can be used for data that requires real-time transmission, while monitoring and backup data can be controlled, depending on the memory configuration, to be transmitted only over Wi-Fi. Each link must be considered in terms of mobility and billing, and the design can be flexibly changed according to the services provided in the vehicle.
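A minimal sketch of the kind of policy the MultiLink Controller could apply is shown below. The data classes, link names, and the pick_link helper are hypothetical; they only illustrate routing real-time traffic to the modem while confining monitoring/backup traffic to Wi-Fi.

```python
# Hypothetical link-selection policy for a MultiLink Controller.
# Data classes and link names are illustrative, not from the specification.

LINKS = {"modem": {"up": True, "metered": True},   # LTE/5G
         "wifi":  {"up": False, "metered": False}}

def pick_link(data_class):
    """Choose a radio link based on the class of data to transmit."""
    if data_class == "realtime":
        # Real-time data (e.g., camera up-streaming) goes over the modem.
        return "modem" if LINKS["modem"]["up"] else None
    if data_class in ("monitoring", "backup"):
        # Bulk data waits for an unmetered Wi-Fi link to avoid charges.
        return "wifi" if LINKS["wifi"]["up"] else None  # else: buffer and retry
    return "modem" if LINKS["modem"]["up"] else "wifi"

print(pick_link("realtime"))    # 'modem'
print(pick_link("monitoring"))  # None -> hold in a buffer until Wi-Fi is up
```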
The Media Cast Center represents the multimedia services received or transmitted through the TCU. It can handle basic streaming protocols and data compression/encoding/decoding, as well as the processing of various multimedia sources. When a 5G network is used, a wide range of multimedia services is expected to grow, making this an important service not only in the TCU but also in the AVN specification.

IoT2V acts as an IoT service gateway capable of interworking with various IoT services; it corresponds to a gateway for interworking with the AI services of service providers (for example, Amazon, Google, Naver).

The Cloud Service Manager is a service manager that exposes services provided by an interworking network server as if they were in-vehicle services. When functions that cannot be embedded, given the varying TCU capabilities and support ranges, are instead provided over the network, an SOA-supporting service in the form of a proxy service can be considered to provide the same functions inside the vehicle.
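The proxy-service idea can be sketched as follows: a local object satisfies the in-vehicle service interface but forwards each call to the network server. The interface, URL, and method names here are hypothetical placeholders, not part of the specification.

```python
# Hypothetical proxy service: exposes a cloud-hosted function through the
# same interface an embedded in-vehicle service would implement.
import json
import urllib.request

class SpeechRecognitionService:
    """In-vehicle service interface (illustrative)."""
    def transcribe(self, audio_bytes: bytes) -> str:
        raise NotImplementedError

class CloudProxySpeechRecognition(SpeechRecognitionService):
    """Same interface, but the work is done by a network server."""
    def __init__(self, server_url: str):
        self.server_url = server_url  # e.g., an MEC or cloud endpoint

    def transcribe(self, audio_bytes: bytes) -> str:
        req = urllib.request.Request(
            self.server_url, data=audio_bytes,
            headers={"Content-Type": "application/octet-stream"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["text"]

# Callers inside the vehicle use the service without knowing where it runs.
service: SpeechRecognitionService = CloudProxySpeechRecognition(
    "https://example.invalid/asr")
```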
In addition, a server on the network denoted as a Cloud Server can provide functions equivalent to embedded services when it supports Mobile Edge Computing. For this, the Cloud Server (network server) must also be provided as a framework supporting SOA. Meanwhile, service interworking between SOA frameworks may be affected more by the framework's protocol transmission scheme than by physical location.

Referring to FIG. 10, from an E/E architecture perspective it is necessary to consider the telematics AP system, the modem, SoC integration into the IVI, and the TCU location in view of harnessing with the antenna. Specifically, in terms of consolidation, integrating TCU functions into other ECUs, integrating other functions into the TCU, and integrating the TCU AP with the modem can be considered. From a wiring-harness perspective, the optimal in-vehicle location should be chosen in connection with consolidation, taking harnessing such as RF cables and Ethernet into account.
도 11은 본 발명의 일 측면에 따른 텔레매틱스 시스템의 구현 예를 설명하기 위한 도면이다.11 is a view for explaining an implementation example of a telematics system according to an aspect of the present invention.
본 발명의 일 측면에 따른 텔레매틱스 시스템에 따르면, 카메라 영상 up-streaming을 통한 5G throughput과 관련하여, camera 영상의 up-streaming과 cloud computing을 통한 카메라 기반 응용이 가능하다.According to the telematics system according to an aspect of the present invention, with respect to 5G throughput through camera image up-streaming, camera-based applications through up-streaming and cloud computing of camera images are possible.
한편, 본 발명의 일 측면에 따른 텔레매틱스 시스템에 따르면 cloud 연산 성능 강화에 따른 Cloud Computing 사용성 (feasibility) 향상이 가능하다. 또한, embedded computing과의 TCU의 CPU/GPU 점유율을 비교함으로써 Low-cost SoC 설계가 가능하다.On the other hand, according to the telematics system according to an aspect of the present invention it is possible to improve Cloud Computing usability (feasibility) according to the enhancement of cloud computing performance. In addition, low-cost SoC design is possible by comparing TCU's CPU / GPU share with embedded computing.
한편, 본 발명의 일 측면에 따른 텔레매틱스 시스템에 따르면 SOA (Service Oriented Architecture) 기반의 동적 configuration을 통해 embedded computing에서 cloud computing으로의 동적 전환이 가능하다.On the other hand, according to the telematics system according to an aspect of the present invention it is possible to dynamically switch from embedded computing to cloud computing through a dynamic configuration based on SOA (Service Oriented Architecture).
도 12는 본 발명의 일 측면에 따른 텔레매틱스 시스템의 구현 예를 설명하기 위한 도면이다. 본 발명의 일 측면에 따르면, 텔레매틱스 시스템을 구비한 차량의 전방에 주행 중인 차량들의 영상을 탐색함으로써 도로 상황을 실시간으로 인지하는 것이 가능하다.12 is a view for explaining an implementation example of a telematics system according to an aspect of the present invention. According to one aspect of the invention, it is possible to recognize the road situation in real time by searching for the image of the vehicle running in front of the vehicle equipped with a telematics system.
도 13은 본 발명의 일 측면에 따른 텔레매틱스 시스템의 구현 예를 설명하기 위한 도면이다. 본 발명의 일 측면에 따르면, in-vehicle Ethernet에 연결된 디스플레이 장치들 (e.g., mobile, CID (Center Information Display), RSE (Rear Seat Entertainment))에 대한 컨텐트의 동시 재생 (synchronized playback)이 가능하다. 구체적으로, i) 고해상도 영상을 down-streaming하여 차량 내 멀티 디스플레이로 재전송함으로써 동기화 재생이 가능하다. ii) IoT 프로토콜을 이용하여 CE Device들을 텔레매틱스 시스템과 연동시키는 것이 가능하다.FIG. 13 is a diagram for describing an implementation example of a telematics system according to an aspect of the present disclosure. According to an aspect of the present invention, synchronized playback of content for display devices (e.g., mobile, CID (Center Information Display), RSE (Rear Seat Entertainment)) connected to in-vehicle Ethernet is possible. Specifically, i) down-streaming a high resolution image and re-transmitting it to a multi display in a vehicle, thereby enabling synchronized playback. ii) It is possible to integrate CE devices with telematics system using IoT protocol.
FIG. 14 is a diagram for explaining an implementation example of the telematics system according to an aspect of the present invention. According to this aspect, i) HD positioning based on VIO (Visual-Inertial Odometry) and ii) Here LiveSight AR can be provided. VIO can be implemented with a monocular camera system and inertial sensors (e.g., accelerometer, gyroscope). Using VIO together with smartphone-grade GPS can provide centimeter-level positioning on a map at a lower cost than an RTK GPS system.

Specifically, for VIO-based HD positioning, i) the TCU can perform VIO-based HD positioning that fuses external camera video with the inertial sensors and GPS, and ii) the TCU can deliver the positioning information to the AVN so that its accuracy can be compared with ordinary GPS information. To provide Here LiveSight AR, the AVN can compare the AR mapping accuracy obtained through HD positioning with that obtained through ordinary positioning.
FIG. 15 is a diagram for explaining the characteristics of the telematics system architecture according to an aspect of the present invention. Specifically, through a service-oriented communication framework, the telematics system can combine iii) embedded computing and distributed processing with the computing power of i) the cloud or ii) a mobile device.

First, the Telematics AP of the telematics system can translate between different service-oriented communication methods. Specifically, services provided by the cloud and by mobile devices can be made available at the TCU through IoTivity, and services discovered at the TCU can be provided over the in-vehicle network through SOME/IP or DDS. Second, according to the telematics system, camera-based application services can be validated based on a comparison across cloud, mobile, and embedded computing. Third, the telematics system can serve, through OCF and external interfaces, as the protocol basis for a media multicast service.
FIGS. 16 and 17 are diagrams for explaining the SW platform of the telematics system according to an aspect of the present invention. The SW platform can be designed from an SOA perspective, and its main features may be as follows.

i) Plug/Play: since the E/E architecture is based on each manufacturer's requirements, the TCU may be integrated or separated, and the SW components of the telematics system according to the present invention can operate flexibly on any ECU regardless of the manufacturer's E/E architecture.

ii) Loose coupling: through a publish/subscribe model from the service perspective, the dependency between ECUs can be reduced by abstracting the communication layer. The telematics unit, as a data hub, can be designed to publish all data arriving from outside the vehicle and to subscribe to all data generated inside it (a minimal sketch of such a hub follows below).
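Below is a minimal, dependency-free sketch of that publish/subscribe data hub. The DataHub class and the topic names are hypothetical and only illustrate how the communication layer can be abstracted away from the ECUs.

```python
# Hypothetical publish/subscribe data hub (loose coupling between ECUs).
from collections import defaultdict

class DataHub:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> callbacks

    def subscribe(self, topic, callback):
        # An ECU registers interest in a topic, not in another ECU.
        self._subscribers[topic].append(callback)

    def publish(self, topic, payload):
        # The publisher does not know (or care) who consumes the data.
        for cb in self._subscribers[topic]:
            cb(payload)

hub = DataHub()
hub.subscribe("vehicle/position", lambda p: print("AVN got", p))
hub.subscribe("vehicle/position", lambda p: print("cloud uplink got", p))
hub.publish("vehicle/position", {"lat": 37.56, "lon": 126.97})
```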
In terms of data management, scheduling and QoS management for data that must be processed first, data-queue management, and guaranteeing the real-time properties of the data must be considered. In terms of data storage, buffer management for high-volume data, lifetime management of the flash storage used as the data store, and data-log management must be considered. From a 5G telematics perspective, SOA-based SW design, a flexible E/E architecture design spanning the TCU and heterogeneous ECUs (for example, PnP support and a 10 Gbps network), data-flow management that guarantees real-time behavior, and extending flash-storage lifetime should all be considered.
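One way to read the scheduling requirement is as priority queueing of outbound data. The sketch below uses Python's heapq for that; the priority classes are chosen here purely for illustration.

```python
# Hypothetical priority scheduling of outbound data (QoS-aware queueing).
import heapq
import itertools

PRIORITY = {"control": 0, "realtime_video": 1, "monitoring": 2, "backup": 3}
_counter = itertools.count()  # tie-breaker keeps FIFO order within a class

queue = []

def enqueue(data_class, payload):
    heapq.heappush(queue, (PRIORITY[data_class], next(_counter), payload))

def dequeue():
    # Always transmit the highest-priority (lowest number) item first.
    _, _, payload = heapq.heappop(queue)
    return payload

enqueue("backup", "yesterday's logs")
enqueue("control", "brake status frame")
enqueue("realtime_video", "camera packet #1")
print(dequeue())  # 'brake status frame' is sent before everything else
```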
FIG. 18 is a diagram for explaining the telematics system according to an aspect of the present invention. Referring to FIG. 18, it can be seen that both the cloud and the in-vehicle network must be supported to implement a TSN (Time Sensitive Network) over 5G.

TSN is a deterministic network and always targets time-sensitive applications. TSN was developed to provide a way to deliver information from source to destination within i) a fixed and ii) a predictable time, and it focuses on control data. The characteristics of TSN include i) time synchronization (for example, below 1 µs) and guaranteed end-to-end latency, ii) resource reservation, iii) extraordinarily low packet loss ratios (10⁻⁶ to 10⁻⁹), and iv) the convergence of all data streams.
FIG. 19 is a diagram for explaining the relationship between the telematics system according to an aspect of the present invention and the cloud. According to the cloud system architecture shown in FIG. 19, cloud services can be provided like in-vehicle services based on the SOA framework. Conventionally, development has focused on software embedded in the vehicle, but according to the present invention, 5G-based cloud computing is provided. That is, services that are difficult to handle in the embedded system can be processed through the cloud. Fundamentally, the cloud framework must be the same as the one embedded in the vehicle for the two to communicate.

The services and applications that can be provided according to the present invention may include a vehicle location determination service, a camera video relay service, an IoT service, and the latest DSM engine. All services that the vehicle or the cloud can provide may be registered in advance in a database (service DB). The API gateway can provide interoperability between vehicles and devices by means of RESTful APIs.
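RESTful registration and lookup against such a service DB could look like the following sketch. The endpoint paths, JSON fields, and host are hypothetical assumptions that only illustrate the API-gateway pattern.

```python
# Hypothetical RESTful calls to an API gateway's service DB.
# Endpoint paths and payload fields are illustrative assumptions.
import requests

GATEWAY = "https://gateway.example.invalid/api/v1"

# Register a service the vehicle (or cloud) can provide.
requests.post(f"{GATEWAY}/services", json={
    "name": "camera-video-relay",
    "provider": "vehicle",          # or "cloud"
    "protocol": "RTSP",
    "endpoint": "rtsp://10.0.0.2:8554/front",
}, timeout=5)

# Look up all providers of a given service.
resp = requests.get(f"{GATEWAY}/services",
                    params={"name": "camera-video-relay"}, timeout=5)
for svc in resp.json():
    print(svc["provider"], svc["endpoint"])
```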
FIG. 20 is a diagram for explaining the relationship between the framework of the telematics system according to an aspect of the present invention and the cloud. As shown in FIG. 20, the cloud and the Telematics AP can be connected through the OCF (Open Connectivity Foundation) framework for IoT communication. For this, the vehicle's telematics unit must be equipped with i) the OCF framework for IoT communication and ii) an SOA protocol for embedded communication (for example, SOME/IP or DDS).

Through the structure shown in FIG. 20, interoperability and seamless connectivity between devices can be enhanced. As a result, i) secure and reliable device discovery and ii) connectivity across multiple OSs and platforms can be provided.
FIG. 21 is a diagram for explaining the Media Cast Center of the telematics system according to an aspect of the present invention, specifically the efficient sharing of media data over broadband Ethernet.

The basic media-processing functions performed by the Media Cast Center are as follows: i) relay (for example, unicast relay or multicast relay); ii) de-multiplexing (separating one stream into two media streams, video and audio); iii) media data transport; iv) media streaming control; v) interfaces between services, including the out-vehicle OCF interface and the SOME/IP interface; and vi) encoding/decoding.

Use cases that can be provided through the Media Cast Center include i) see-through and ii) synchronized playing on multiple CE devices. Through see-through, in-vehicle media data (for example, front camera video) can be shared via the cloud with external vehicles or with A/V devices connected to the vehicle.

Through synchronized playing, a shared screen can be provided on multiple CE devices connected to the vehicle. The media source may be i) a media file in the cloud or on the web, or ii) a media stream URL (Universal Resource Locator) provided by a streaming server or a device connected to the vehicle.
FIG. 22 is a diagram for explaining the positioning service provided by the telematics system according to an aspect of the present invention. High-precision positioning of the vehicle can be provided through the integration of GPS/GNSS and Visual-Inertial Odometry (VIO). The main features of the positioning service may be as follows.
i) Sensor collection: sensor data is collected from the GPS, accelerometer, gyroscope, and camera, and this data should be time-synchronized.

ii) VIO operation: the VIO algorithm estimates the relative position and orientation of the vehicle using the camera and the inertial sensors. The camera processing block detects features and tracks them; the inertial data processing block samples the sensors at a very high frequency (100 Hz or higher).

iii) GPS + VIO fusion: GPS/GNSS measurements and the local/relative coordinate measurements from VIO are tightly coupled to achieve a highly accurate global position (a minimal fusion sketch follows after this list).

iv) Position handler: the SOA framework-based protocol is used to transfer the global position to the Vehicle Position Manager in the cloud.
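The sketch below illustrates the fusion step in its simplest loosely-coupled form: VIO supplies smooth relative displacements, and each GPS fix pulls the integrated estimate back toward the global frame. The gain value and two-dimensional state are illustrative assumptions; a production system would use a tightly coupled estimator (e.g., an EKF over raw GNSS measurements).

```python
# Minimal GPS + VIO fusion sketch (complementary-filter style).
# Real systems tightly couple raw GNSS measurements in an EKF; this only
# shows the idea of correcting integrated VIO drift with absolute fixes.

GPS_GAIN = 0.2  # illustrative: how strongly a GPS fix corrects the estimate

def fuse(est_xy, vio_delta_xy, gps_xy=None):
    """Advance the position estimate by one step."""
    # Dead-reckon with the VIO relative displacement (high rate, drifts).
    x = est_xy[0] + vio_delta_xy[0]
    y = est_xy[1] + vio_delta_xy[1]
    # When a GPS fix is available (low rate, absolute), blend toward it.
    if gps_xy is not None:
        x += GPS_GAIN * (gps_xy[0] - x)
        y += GPS_GAIN * (gps_xy[1] - y)
    return (x, y)

est = (0.0, 0.0)
est = fuse(est, vio_delta_xy=(1.0, 0.1))                     # VIO only
est = fuse(est, vio_delta_xy=(1.0, 0.1), gps_xy=(2.2, 0.0))  # VIO + GPS fix
print(est)
```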
Meanwhile, DSDA (Dual SIM Dual Active) means installing two SIMs in one TCU and using the services of two carriers simultaneously. For example, SIM #1 may be used by the vehicle manufacturer for OTA, V2X, big data, and maintenance, while SIM #2 lets the user choose a carrier for the Wi-Fi hotspot, web access, and video streaming. From a 5G telematics perspective, DSDA has little technical dependence on 5G; the interests of carriers and vehicle manufacturers are the main key to commercializing the technology. To date, no 5G modem vendor has announced a prototype.
FIG. 23 is a diagram for explaining the electronic horizon provided by the telematics system according to an aspect of the present disclosure. The telematics system can provide geometry information for the road ahead of the vehicle. This may be called the Electronic Horizon and can be provided through the ADASIS protocol. The telematics system can act as the horizon provider; the details are as follows.
i) Electronic Horizon

The Electronic Horizon provides geometry information to other ECUs (ADAS, navigation, and so on). It requires the vehicle's position (GPS) and a digital map (from a cloud service). The horizon's information comprises position, path, lane, road (tunnel, bridge), speed limit, traffic signs, and warnings (e.g., under construction). The ADASIS forum maintains the ADASIS protocol to standardize the interface between a digital map and ADAS applications.

ii) ADASIS v2

ADASIS v2 is designed for CAN bus communication, which is restricted to 8-byte messages (an illustrative sketch of fitting horizon data into 8-byte frames follows after this list). ADASIS v2 consists of the Horizon Provider, the Horizon Reconstructor, the ADASIS protocol, and the ADAS application.

iii) ADASIS v3

ADASIS v3 is designed for higher-bandwidth communication. It provides additional data with more detailed content (lane-level information) and supports detailed information sources (e.g., HD GPS, HD maps, sensors, and V2X).
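The following sketch illustrates only the 8-byte constraint that shapes ADASIS v2, packing a hypothetical speed-limit profile point into a single CAN frame with Python's struct module. The field layout is invented for illustration and is not the ADASIS v2 message format.

```python
# Illustrative packing of a horizon "speed limit" point into one 8-byte
# CAN frame. The field layout is hypothetical, NOT the ADASIS v2 format.
import struct

MSG_TYPE_SPEED_LIMIT = 0x03

def pack_speed_limit(offset_m: int, speed_kph: int, path_id: int) -> bytes:
    # <: little-endian | B: msg type | B: path id | I: offset [m]
    # | B: speed limit [km/h] | B: reserved
    frame = struct.pack("<BBIBB", MSG_TYPE_SPEED_LIMIT, path_id,
                        offset_m, speed_kph, 0)
    assert len(frame) == 8  # CAN 2.0 data field is at most 8 bytes
    return frame

def unpack_speed_limit(frame: bytes):
    msg_type, path_id, offset_m, speed_kph, _ = struct.unpack("<BBIBB", frame)
    return {"type": msg_type, "path": path_id,
            "offset_m": offset_m, "speed_kph": speed_kph}

frame = pack_speed_limit(offset_m=1200, speed_kph=80, path_id=1)
print(frame.hex(), unpack_speed_limit(frame))
```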
FIGS. 24 to 31 are diagrams for explaining implementation examples of the telematics system according to an aspect of the present invention. More specifically, FIGS. 24 and 25 illustrate route forecasting, FIGS. 26 and 27 illustrate crowd-eye sourcing, FIGS. 28 and 29 illustrate the take-out cinema, and FIGS. 30 and 31 illustrate adaptive guidance.

Referring to FIGS. 24 and 25, route forecasting refers to a 5G navigation service that can predict road conditions in real time, in advance, by browsing the video of vehicles driving ahead.

The key values for implementing route forecasting may be linking real-time video of the desired road with the route, and searching the video information. Through route forecasting, real-time video can be shared between vehicles and the shared video can be analyzed; route forecasting can make use of cloud computing, HD positioning, and AR. As concrete implementation examples, i) when the road is congested, road conditions can be predicted by browsing the video from the vehicles ahead; ii) before departure, the driver can be briefed on sections that are difficult to drive; and iii) route guidance can make use of images of nearby landmarks. Referring to FIG. 25, the technical elements required to implement route forecasting may include the aforementioned TSN, DPDK, SOA framework, advanced positioning, cloud server, media streaming, and AR engine.
Referring to FIGS. 26 and 27, crowd-eye sourcing refers to a service that secures video of one's own vehicle captured from multiple angles by surrounding vehicles and uses it in various situations.

The key values for implementing crowd-eye sourcing may be automatically collecting relevant video in specific situations (for example, an accident), real-time editing of multi-view/multi-channel accident video, and B2B-linked services for insurers, car-rental companies, and the like. Crowd-eye sourcing enables real-time video sharing and editing between vehicles, and can make use of cloud computing and the IoT manager. As concrete implementation examples, i) when an accident occurs, a system can secure dash-cam video from surrounding vehicles at multiple angles and immediately combine it for the driver and the insurer, and ii) while traveling, video of one's own vehicle taken by surrounding vehicles can be received and used. Referring to FIG. 27, the technical elements required to implement crowd-eye sourcing may include the aforementioned TSN, DPDK, advanced positioning, cloud server, and media streaming.
Referring to FIGS. 28 and 29, the take-out cinema refers to a solution for enjoying ultra-high-definition streaming content outside the vehicle by connecting the 5G infotainment system to nearby devices.

The key values for implementing the take-out cinema may be extending the role of the car HMI to an outdoor media center, using the IVI interface on tethered devices, and tethering that is easier and more efficient than a smartphone's. Through the take-out cinema, an integrated connectivity system that can use nearby multimedia devices can be built; it can make use of the Media Multicast Manager and AR. As concrete implementation examples, i) high-definition content can be enjoyed outdoors by connecting to media devices such as projectors and speakers; ii) high-fidelity audio and high-definition video devices can be mounted in the vehicle to use it as a media room; and iii) a service for viewing the surrounding scenery can be provided to passengers by connecting a VR device. Referring to FIG. 29, the technical elements required to implement the take-out cinema may include the aforementioned media streaming and cloud server.
Referring to FIGS. 30 and 31, adaptive guidance refers to an autonomous driving system that receives road conditions from the vehicles ahead, sets a driving pattern suited to the occupants' activities, and provides caution notifications.

The key values for implementing adaptive guidance may be route/speed adjustment based on occupant activity, advance notification and warning of points of caution, and the use of a DMS together with analysis of key activities. Adaptive guidance enables the sharing of real-time road-condition information and the analysis of occupant activity, and can make use of cloud computing, the data manager, and E-horizon. As concrete implementation examples, i) in manual driving, whether the driver is watching the road ahead can be monitored to provide traffic-light change notifications; ii) in automatic driving, objects related to the occupant's personal tasks inside the vehicle can be monitored to warn of departures, sudden stops, or lane changes; and iii) in automatic driving, when the user is sleeping, the route can be changed to avoid bumpy sections or sudden starts and stops. Referring to FIG. 31, the technical elements for implementing adaptive guidance may include TSN, DPDK, advanced positioning, cloud server, DMS, road forecasting, and IoTivity.
FIGS. 32 to 34 are diagrams for explaining the technical effects of the telematics system according to an aspect of the present invention.

The Media Cast Center and cloud server of the telematics system shown in FIG. 32 are characterized by high throughput. In particular, the Media Cast Center features relay, media data transport, media streaming control, and encoding/decoding. The resulting advantages may be i) a system with increased bandwidth efficiency, ii) media service scalability, and iii) high-quality media services. The constraints accompanying high throughput may be that i) a secure channel to the content server is not considered and ii) content copyright is not considered.

The TCU and cloud server of the telematics system shown in FIG. 33 are characterized by low latency. In particular, an MMI (Multi Media Interface) and 360° media processing are possible through the SOA framework (SOME/IP or IoTivity) of the TCU and the cloud server. The constraints accompanying low latency may be i) the need for edge computing from the MNO, ii) 360° or VR camera performance, iii) a 360° or VR player (software and required hardware), and iv) the player interface.

The telematics system shown in FIG. 34 is characterized by V2N (Vehicle to Network) see-through. As described above with reference to FIG. 21, the telematics system according to an aspect of the present invention can provide the see-through function through the SOA framework, the cloud server, and the Media Cast Center.
FIGS. 35 and 36 are diagrams for explaining the signal transmission and reception sequence between the TCU and the cloud server (or network server) in the telematics system according to an aspect of the present invention. First, the signal transmission and reception sequence between the network server and the TCU is described with reference to FIG. 35.
1. Interworking between the 5G telematics system (hereinafter, 5G TS) and the cloud server (hereinafter, the server) for remote monitoring of camera video

1.1 Camera video upload

A. A remote-monitoring start event occurs at the server. B. The server's Camera Cast Service checks and opens the port number on which it will receive video from the 5G TS's Media Cast Service (hereinafter, MCS) (server initialization). C. The server's MQTT (Message Queuing Telemetry Transport) broker requests camera video from the 5G TS's Vehicle Event Manager (request parameters: protocol type (UDP/TCP), port number, camera type (front, 360, cabin)). D. The 5G TS's Vehicle Event Manager forwards the request information of step C to the MCS. E. Based on the information delivered by the Vehicle Event Manager, the MCS turns on the corresponding camera and prepares to deliver the video to the server. F. The MCS starts acquiring the camera video. G. The camera video of step F is re-packetized if necessary and transmitted to the server.
1.2 Stopping the camera video upload

H. A remote-monitoring stop event occurs at the server. I. The server's MQTT broker requests the 5G TS's Vehicle Event Manager to stop the camera transmission. J. The Vehicle Event Manager forwards the stop request of step I to the MCS. K. The MCS stops the camera video transmission of step J. L. The MCS turns the camera off. A sketch of the broker-side request messages is given below.
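As a rough illustration of steps C and I, the following sketch publishes the request and stop messages with the paho-mqtt client. The topic names, JSON fields, and broker address are hypothetical, since the specification does not define them.

```python
# Hypothetical MQTT messages for the camera request (step C) and the
# stop request (step I). Topics and payload fields are illustrative.
import json
import paho.mqtt.client as mqtt

try:  # paho-mqtt >= 2.0 requires an explicit callback API version
    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
except AttributeError:  # paho-mqtt 1.x
    client = mqtt.Client()

client.connect("broker.example.invalid", 1883)

# Step C: request camera video from the Vehicle Event Manager, carrying
# the protocol type, port number, and camera type.
client.publish("vehicle/VIN123/camera/request", json.dumps({
    "protocol": "UDP",
    "port": 50000,
    "camera": "front",   # one of: front, 360, cabin
}))

# Step I: ask the Vehicle Event Manager to stop the camera transmission.
client.publish("vehicle/VIN123/camera/stop", json.dumps({"camera": "front"}))

client.disconnect()
```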
2. Using the Media Sharing Service for synchronized media playback

A. The AVN or RSE delivers the media streaming URL to be played synchronously (for example, a server URL or an in-vehicle streaming URL) to the 5G TS's Media Sharing Service (hereinafter, MSS). B. The MSS validates the media streaming URL of step A. C. The MSS constructs a streaming pipeline connected to that media streaming URL. D. The MSS generates a URL through which the pipeline of step C can be accessed (RTSP). E. The URL is delivered to the target devices. F. Each target device accesses the URL of step D and plays the media stream. G. For synchronized playback, each target device outputs the A/V in synchronized form using the periodically transmitted reference time information (in RTCP) and the time stamps in the RTP packet headers.
Meanwhile, in step C, RTP (Real-time Transport Protocol) over UDP (User Datagram Protocol) can be used as the transport protocol, RTSP (Real Time Streaming Protocol) can be used as the streaming control protocol, and RTCP (Real-time Transport Control Protocol) can be used as the QoS protocol. The definitions and roles of the components shown in FIG. 35 may be as shown in Table 1 below; a streaming-pipeline sketch for steps C and D is given after Table 1.
(Table 1: definitions and roles of the components shown in FIG. 35)
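Steps C and D can be approximated with GStreamer's RTSP server bindings: a pipeline wraps the source URL, and the mount point becomes the generated RTSP URL handed to the target devices. This is a sketch under the assumption that GStreamer and its Python (PyGObject) bindings are available; the mount path, port, and source URL are arbitrary.

```python
# Sketch of MSS steps C-D: wrap a source URL in a streaming pipeline and
# expose it at an RTSP URL that the target devices can all connect to.
# Assumes GStreamer with PyGObject bindings; mount path/port are arbitrary.
import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstRtspServer", "1.0")
from gi.repository import Gst, GstRtspServer, GLib

SOURCE_URL = "https://media.example.invalid/movie.mp4"  # from step A

Gst.init(None)
server = GstRtspServer.RTSPServer()
server.set_service("8554")  # RTSP port

factory = GstRtspServer.RTSPMediaFactory()
factory.set_launch(
    f"( uridecodebin uri={SOURCE_URL} ! videoconvert "
    f"! x264enc tune=zerolatency ! rtph264pay name=pay0 pt=96 )")
factory.set_shared(True)  # one pipeline feeds every connected target device

server.get_mount_points().add_factory("/share", factory)
server.attach(None)

# Step D/E: this is the URL delivered to the CID/AVN/RSE/CE devices.
print("rtsp://<tcu-address>:8554/share")
GLib.MainLoop().run()
```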
FIG. 36 is a flowchart for explaining the interworking scenario between the telematics system according to an aspect of the present invention and the cloud server for using the Media Cast Center.

When a media information request is input to the vehicle, the vehicle's 5G TS prepares to transmit the request to the cloud server. The vehicle controls the radio-frequency output through an antenna unit connected to the 5G TS by a coaxial cable, and transmits the signal to the cloud server through a gNB connected by an Ethernet cable. The vehicle receives the content of the cloud server's Media Cast Center from the cloud server; the content received through the antenna is delivered over the cable to the 5G TS, and the information is then delivered to the AVN, which is connected to the 5G TS by an Ethernet cable.
The embodiments of the present invention described above can be implemented through various means. For example, the embodiments may be implemented by hardware, firmware, software, or a combination thereof.

In the case of implementation by hardware, the method according to the embodiments may be implemented by one or more ASICs (Application Specific Integrated Circuits), DSPs (Digital Signal Processors), DSPDs (Digital Signal Processing Devices), PLDs (Programmable Logic Devices), FPGAs (Field Programmable Gate Arrays), processors, controllers, microcontrollers, microprocessors, and the like.

In the case of implementation by firmware or software, the method according to the embodiments may be implemented in the form of a module, procedure, or function that performs the functions or operations described above. The software code may be stored in a memory unit and executed by a processor. The memory unit may be located inside or outside the processor and may exchange data with the processor by various known means.
The detailed description of the preferred embodiments of the invention disclosed above is provided to enable any person skilled in the art to implement and practice the invention. Although the invention has been described with reference to its preferred embodiments, those skilled in the art will understand that the invention can be variously modified and changed without departing from the spirit and scope of the invention set forth in the claims below. Accordingly, the invention is not intended to be limited to the embodiments shown herein, but is to be accorded the widest scope consistent with the principles and novel features disclosed herein. Moreover, although the preferred embodiments of this specification have been shown and described above, this specification is not limited to the specific embodiments described; various modifications can of course be made by those of ordinary skill in the art without departing from the gist of this specification as claimed, and such modifications should not be understood separately from the technical idea or outlook of this specification.

In this specification, both an apparatus invention and a method invention are described, and the descriptions of both inventions may be applied supplementarily as needed.

Various embodiments for carrying out the invention have been described above in the best mode for carrying out the invention.

The above description should not be construed as limiting in any respect and should be considered illustrative. The scope of the invention should be determined by a reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention fall within the scope of the invention.

The present invention described above can also be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored; examples include an HDD (Hard Disk Drive), an SSD (Solid State Disk), an SDD (Silicon Disk Drive), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it also covers implementation in the form of a carrier wave (for example, transmission over the Internet). The computer may also include the controller 180 of a terminal. Accordingly, the above detailed description should not be construed as limiting in any respect and should be considered illustrative; the scope of the invention should be determined by a reasonable interpretation of the appended claims, and all changes within the equivalent scope of the invention fall within the scope of the invention.

Claims (12)

1. A method of controlling a telematics system provided in a vehicle, the method comprising:

receiving a request for content output; requesting information related to the content from an external server through a base station connected to the telematics system; and receiving the information from the external server through the base station,

wherein the received information is delivered to each of a plurality of target devices connected to the telematics system.

2. The method of claim 1, wherein the information is received from the external server through a predetermined protocol, and the predetermined protocol includes a transport protocol, a streaming control protocol, and a QoS (Quality of Service) protocol.

3. The method of claim 1, wherein, when the received information is media content provided by the external server, the telematics system generates a URL (Universal Resource Locator) accessible by each of the plurality of target devices and delivers the generated URL to each of the plurality of target devices.

4. The method of claim 2, wherein the delivered information is output in synchronization at each of the plurality of target devices based on reference time information included in the QoS protocol and time information included in the packets constituting the information received from the external server.

5. The method of claim 1, wherein the plurality of target devices include a CID (Center Information Display), an AVN (Audio Video Navigation), and an RSE (Rear Seat Entertainment) provided inside the vehicle, and a CE (Consumer Electronics) device provided inside or outside the vehicle.

6. The method of claim 1, wherein the content received from the external server corresponds to media content or camera video of another vehicle.
7. A telematics system provided in a vehicle, the telematics system comprising:

an RF (Radio Frequency) unit; and

a processor configured to receive a request for content output, control the RF unit to request information related to the content from an external server through a base station connected to the telematics system, and control the RF unit to receive the information from the external server through the base station,

wherein the received information is delivered to each of a plurality of target devices connected to the telematics system.

8. The telematics system of claim 7, wherein the information is received from the external server through a predetermined protocol, and the predetermined protocol includes a transport protocol, a streaming control protocol, and a QoS (Quality of Service) protocol.

9. The telematics system of claim 7, wherein, when the received information is media content provided by the external server, the processor generates a URL (Universal Resource Locator) accessible by each of the plurality of target devices and delivers the generated URL to each of the plurality of target devices.

10. The telematics system of claim 8, wherein the delivered information is output in synchronization at each of the plurality of target devices based on reference time information included in the QoS protocol and time information included in the packets constituting the information received from the external server.

11. The telematics system of claim 7, wherein the plurality of target devices include a CID (Center Information Display), an AVN (Audio Video Navigation), and an RSE (Rear Seat Entertainment) provided inside the vehicle, and a CE (Consumer Electronics) device provided inside or outside the vehicle.

12. The telematics system of claim 7, wherein the content received from the external server corresponds to media content or camera video of another vehicle.
PCT/KR2019/001975 2018-06-25 2019-02-19 Telematics system provided in vehicle and method for controlling same WO2020004767A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862689254P 2018-06-25 2018-06-25
US62/689,254 2018-06-25

Publications (1)

Publication Number Publication Date
WO2020004767A1

Family

ID=68985747

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/001975 WO2020004767A1 (en) 2018-06-25 2019-02-19 Telematics system provided in vehicle and method for controlling same

Country Status (1)

Country Link
WO (1) WO2020004767A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100750583B1 (en) * 2006-08-30 2007-08-20 엠큐브웍스(주) Network system of supplying streaming service
KR20120040496A (en) * 2010-10-19 2012-04-27 주식회사 칼리 A system and a method for controlling telematics middleware
KR20130113283A (en) * 2012-04-05 2013-10-15 엘지전자 주식회사 Acquiring method vehicle contents, displaying method vehicle contents, displaying system for vehicle contents and automotive electronic device
KR20150071807A (en) * 2013-12-18 2015-06-29 현대자동차주식회사 Cloud System For A Vehicle
KR20170114051A (en) * 2016-04-01 2017-10-13 현대엠엔소프트 주식회사 Multimedia apparatus for vehicle

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111541757A (en) * 2020-04-17 2020-08-14 一汽解放汽车有限公司 Vehicle-mounted interaction method, device, equipment and storage medium
WO2021235567A1 (en) * 2020-05-19 2021-11-25 엘지전자 주식회사 Method for v2x service, and server using same
US20220094457A1 (en) * 2020-09-19 2022-03-24 Ibiquity Digital Corporation Content Linking Multicast Streaming for Broadcast Radio
CN113335018B (en) * 2021-06-11 2022-12-06 华东理工大学 Vehicle-mounted air conditioner service calling system based on SOME/IP
CN113335018A (en) * 2021-06-11 2021-09-03 华东理工大学 Vehicle-mounted air conditioner service calling system based on SOME/IP
CN114268666A (en) * 2021-12-08 2022-04-01 东软睿驰汽车技术(沈阳)有限公司 Universal domain controller, vehicle and interactive system supporting service oriented architecture SOA
CN114268666B (en) * 2021-12-08 2024-05-03 东软睿驰汽车技术(沈阳)有限公司 Universal domain controller supporting Service Oriented Architecture (SOA), vehicle and interaction system
CN114553873A (en) * 2022-02-27 2022-05-27 重庆长安汽车股份有限公司 SOA-based vehicle cloud cooperative control system and method and readable storage medium
CN114670902A (en) * 2022-04-28 2022-06-28 中车青岛四方车辆研究所有限公司 Remote reset and emergency brake remote release processing method and system
CN114979231B (en) * 2022-05-30 2023-05-26 重庆长安汽车股份有限公司 Vehicle control method and system based on whole vehicle DDS protocol and automobile
CN114979231A (en) * 2022-05-30 2022-08-30 重庆长安汽车股份有限公司 Mobile terminal real-time vehicle control method and system based on whole vehicle DDS protocol and automobile
CN116828000A (en) * 2023-08-28 2023-09-29 山东未来互联科技有限公司 Bus order processing system and method based on deterministic network and SDN network
CN116828000B (en) * 2023-08-28 2023-11-17 山东未来互联科技有限公司 Bus order processing system and method based on deterministic network and SDN network


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19824446; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 19824446; Country of ref document: EP; Kind code of ref document: A1)