KR20170020620A - Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor - Google Patents


Info

Publication number
KR20170020620A
Authority
KR
South Korea
Prior art keywords
information
vehicle
camera
radar sensor
module
Prior art date
Application number
KR1020150114235A
Other languages
Korean (ko)
Other versions
KR101734985B1 (en)
Inventor
Shin Sa-im (신사임)
Jang Dal-won (장달원)
Kim Seong-dong (김성동)
Yoon Kyoung-ro (윤경로)
Lee Jong-seol (이종설)
Jang Se-jin (장세진)
Original Assignee
Korea Electronics Technology Institute (전자부품연구원)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Korea Electronics Technology Institute
Priority to KR1020150114235A
Priority to PCT/KR2015/009850
Publication of KR20170020620A
Application granted
Publication of KR101734985B1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60Q ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00 Arrangement or adaptation of acoustic signal devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/10 Interpretation of driver requests or demands
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/04 Systems determining presence of a target
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/42 Image sensing, e.g. optical camera
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00 Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/52 Radar, Lidar

Abstract

The present invention relates to a method and apparatus for providing vehicle status information in panoramic form using multiple cameras and radar sensor information. A plurality of adjacent vehicles are linked so that the camera image information and radar sensing information obtained by each vehicle are shared; the situation information providing device judges the situation of each vehicle based on the received image information and sensing information and reconstructs an interface suited to each driver's situation. Because information about the vehicle's surroundings can thus be obtained widely and from various viewpoints, the driver's understanding of, and responsiveness to, the vehicle's surroundings is increased, contributing to safe driving.


Description

TECHNICAL FIELD [0001] The present invention relates to a method and apparatus for providing panoramic vehicle status information using multi-camera and radar sensor information.

The present invention relates to a method and apparatus for providing vehicle status information in panoramic form using multiple cameras and radar sensor information. More particularly, a plurality of adjacent vehicles are linked so that camera image information and radar sensing information are shared among them; the situation information providing apparatus judges the situation of each vehicle based on the received image information and sensing information, reconstructs the interface accordingly, and transmits the reconstructed interface to the vehicle. The invention thus relates to a vehicle status information providing method and apparatus that configure a dynamic vehicle interface according to each driver's situation.

The contents described in this section merely provide background information on the present embodiment and do not constitute prior art.

Recently, technologies for processing sensed information measured by sensors and techniques for processing image information captured by cameras have been developed and applied in various fields.

In particular, in a vehicle information providing system, multiple cameras and a radar sensor are attached to a vehicle, and the information obtained from them is provided to the driver, supplying information about areas of the vehicle's surroundings that the driver cannot see or may miss while driving and thereby inducing safe driving.

In addition, such a vehicle information providing system can reproduce the real-world driving environment in a virtual world as it is, and can convey intuitive and effective information using composite multimedia (video and text information).

In recent years, multiple cameras have been attached to the four sides of a vehicle, and the obtained image information has been used to construct 360-degree panoramic image information inside the vehicle, enhancing the driver's understanding of, and responsiveness to, real-time situations.

However, in the conventional vehicle information providing system, the information obtained in one specific vehicle is reconfigured by the system installed in that vehicle and provided back to the same vehicle. Such a system cannot cover a wide range of driving conditions around the vehicle; the image information obtained by a camera blocked by an obstacle may be inaccurate; and because only images captured from a single viewpoint are provided, the vehicle's surroundings cannot be presented from various viewpoints.

Korean Patent Laid-Open No. 10-2015-0011629 ("Around view system for displaying the outline of an object and method for providing an around view of a vehicle", published 2015.02.02)

In order to solve the above-described problems, the present invention links a plurality of adjacent vehicles so that the image information and sensing information obtained by each vehicle are shared, and configures an interface for each driver's situation; an object of the invention is to provide a vehicle status information providing method by which information about the vehicle's surroundings can be obtained widely.

In particular, the vehicle situation information providing apparatus receives information from a plurality of neighboring vehicles through a communication network, estimates each driver's situation, and configures a panorama-type interface to match the estimated situation, thereby contributing to safe driving.

However, the objects of the present invention are not limited to those mentioned above, and other unmentioned objects will be clearly understood from the following description.

According to an aspect of the present invention, there is provided a situation information providing apparatus including: a communication module for transmitting and receiving data to and from at least one vehicle through a communication network; a sensor module for receiving information collected from a plurality of cameras and a radar sensor installed in a specific vehicle; a vehicle status determination module for estimating the driver's situation or the surroundings of the specific vehicle based on the information received from the sensor module and at least one of the camera image information and radar sensor information received from the at least one vehicle; and an interface configuration module for configuring, in panorama form, an interface to be displayed on the in-vehicle output module based on the estimated situation information, the information received from the sensor module, and at least one of the camera image information and radar sensor information received from the at least one vehicle.

At this time, the apparatus may further include a storage module for storing the received information. The vehicle condition determination module may control transmission of emergency notification information to the specific vehicle when the specific vehicle is determined to be in an emergency, and the interface configuration module may, at the driver's request, reconfigure and transmit the driver's situation or the surrounding situation for a predetermined time based on the stored information.

The camera image information may include at least one of camera identification information (Camera), time information (TimeStamp), relative direction information (RelativeOrientation), and relative position information (RelativeLocation); the radar sensor information may include at least one of time information (TimeStamp), information on detected objects (DetectedObject), distance information (Distance), direction information (Orientation), and relative speed information (RelativeSpeed).

The apparatus may further include a storage module for storing the valid range of the camera image information that can be processed by the interface configuration module, the valid range including at least one of the number of camera sensors (NumberOfCameraCapability), the specifications of each camera sensor (CameraCapability), and the access point (AccessPoint) for accessing the image information of each camera. The storage module may also store the valid range of the radar sensor information that can be processed by the interface configuration module, including at least one of the valid distance information for detecting an object (DistanceCapability), the valid direction information for detecting an object (OrientationCapability), and the valid relative speed information for detecting an object (RelativeSpeedCapability).

The storage module may also store information processing reference information used by the interface configuration module when processing the camera image information to configure the interface, including at least one of the resolution (Resolution), width (Width), and height (Height) of the images to be processed. Likewise, the reference information used when the interface configuration module processes the radar sensor information may include at least one of the maximum number of objects that can be detected (maxValue), distance information (Distance) recording the maximum/minimum values and unit of the distances to be processed, direction information (Orientation) recording the range and unit of the angle information to be processed, and relative speed information (RelativeSpeed) recording the maximum/minimum values and unit of the relative speeds to be processed.

According to another aspect of the present invention, there is provided a vehicle status information providing method including: receiving at least one of camera image information and radar sensor information from at least one vehicle; estimating the driver's situation or the surroundings of a specific vehicle based on the received information; configuring, in panorama form, an interface to be displayed on the in-vehicle output module based on the estimated situation information and at least one of the camera image information and radar sensor information collected from a plurality of cameras and a radar sensor installed in the specific vehicle; and transmitting the configured interface information to the specific vehicle.

According to the present invention, a plurality of adjacent vehicles are linked, the image information and sensing information obtained by each vehicle are shared, and an interface is configured for each driver's situation, so that information about the vehicle's surroundings can be obtained widely and from various viewpoints. The driver's understanding of the vehicle's situation and ability to cope with driving situations are thereby improved.

In addition, various effects other than those described above may be disclosed directly or implicitly in the detailed description of the embodiments of the present invention below.

FIG. 1 is a diagram showing the configuration of a system for providing vehicle status information according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of a vehicle situation information providing apparatus according to the present invention.
FIG. 3 is a flowchart for explaining a method of providing vehicle status information according to an embodiment of the present invention.
FIG. 4 is a diagram for explaining an information processing process and method according to an embodiment of the present invention.
FIGS. 5 and 6 are exemplary diagrams for explaining an embodiment of a vehicle information providing method according to the present invention.

For a more complete understanding of the nature and advantages of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings.

In the following description and the accompanying drawings, detailed descriptions of well-known functions or constructions that could obscure the subject matter of the present invention are omitted. Note that the same constituent elements are denoted by the same reference numerals wherever possible throughout the drawings.

The terms and words used in the following description and drawings should not be construed in their ordinary or dictionary senses. On the principle that an inventor may properly define terms to describe his or her own invention in the best way, they should be construed with meanings and concepts consistent with the technical idea of the present invention. The embodiments described in this specification and the configurations shown in the drawings are therefore merely the most preferred embodiments of the present invention and do not represent all of its technical ideas, so it should be understood that various equivalents and modifications are possible.

Also, terms including ordinal numbers such as first and second are used to describe various elements, but those elements are not limited by the terms; the terms are used only to distinguish one element from another. For example, without departing from the scope of the present invention, a second component may be referred to as a first component, and similarly, a first component may be referred to as a second component.

Also, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. Terms such as "comprising" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, and do not preclude the presence or addition of other features, numbers, steps, operations, elements, parts, or combinations thereof.

Also, the terms "part," "module," and the like, which are described in the specification, refer to a unit for processing at least one function or operation, and may be implemented by hardware or software or a combination of hardware and software. Also, the terms " a or ", "one "," the ", and the like are synonyms in the context of describing the invention (particularly in the context of the following claims) May be used in a sense including both singular and plural, unless the context clearly dictates otherwise.

In addition to the above-described terms, certain terms used in the following description are provided to facilitate understanding of the present invention, and the use of such specific terms may be changed to other forms without departing from the technical idea of the present invention.

Now, a vehicle status information providing method according to an embodiment of the present invention will be described in detail with reference to the drawings.

1 is a diagram showing a configuration of a system for providing vehicle status information according to an embodiment of the present invention.

Referring to FIG. 1, a system for providing vehicle condition information according to an embodiment of the present invention includes a plurality of vehicles 200 adjacent to each other, cameras and radar sensors 210 attached to each vehicle, and a situation information providing device 300. The plurality of adjacent vehicles 200 are linked through a communication network 100.

Hereinafter, each component will be schematically described with reference to FIG.

First, the communication network 100 transmits and receives data between adjacent vehicles 200, and may employ wireless communication methods such as WLAN (Wireless LAN), Wi-Fi, WiBro, WiMAX, HSDPA (High Speed Downlink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution Advanced), as well as short-range communication technologies such as Bluetooth, ZigBee, UWB (Ultra-WideBand), ANT, and various other PAN (Personal Area Network) communication technologies.

In addition to the above-described communication methods, other widely known or later-developed communication methods may be used.

Each of the plurality of vehicles 200 may be provided with a plurality of cameras and radar sensors 210 on its four outer sides, and may transmit the information collected from the installed cameras and radar sensors 210 through the communication network 100 to the situation information providing device 300 of a specific vehicle.

At this time, the plurality of vehicles 200 may transmit the information collected from the plurality of cameras and radar sensors 210 directly to the specific vehicle through a PAN-type local area network, or may transmit the collected information to a server through a wireless communication network so that the server retransmits it to the specific vehicle through the wireless communication network.

The plurality of cameras and radar sensors 210 provided on the plurality of vehicles 200 collect information about the circumstances of each vehicle.

That is, the cameras photograph the situation at the positions where they are installed and transmit the captured image information to the situation information providing device 300 in each vehicle 200, and the radar sensors transmit information about approaching objects, such as relative speed and relative position, to the situation information providing device 300.

The situation information providing device 300 estimates the driver's situation or the surroundings of the specific vehicle based on at least one of the image information and radar sensor information collected by the specific vehicle and the camera image information and radar sensor information received from at least one vehicle adjacent to the specific vehicle; configures, in panorama form, an interface to be displayed on the in-vehicle output module based on the estimated situation information and at least one of the image information and radar sensor information collected by the specific vehicle; and outputs the configured panoramic interface to the in-vehicle output module.

At this time, when it is determined that the specific vehicle is in an emergency situation, the situation information providing device 300 may transmit emergency notification information to the output module so that it is output there, and, at the driver's request, may reconstruct the driver's situation or the surrounding situation for a predetermined time based on the information stored in the situation information providing device 300.

Meanwhile, the situation information providing apparatus 300 may be implemented in the vehicle as shown in FIG. 1, or may be implemented outside the vehicle in the form of a server that transmits the information to a specific vehicle through the communication network 100.

The configuration of the vehicle condition information providing system according to the embodiment of the present invention has been described above.

Hereinafter, the configuration and operation method of the situation information providing apparatus 300 according to the present invention will be described.

FIG. 2 is a block diagram showing a main configuration of the context information providing apparatus 300, and FIG. 3 is a flowchart for explaining an operation method of the context information providing apparatus 300 according to the present invention.

Referring to FIG. 2, the apparatus 300 for providing status information according to the present invention may include a communication module 310, a storage module 330, a control module 350, and a sensor module 370.

The communication module 310 is for communicating with at least one vehicle 200b to 200d adjacent to the specific vehicle 200a. The communication network for performing the functions of the communication module 310 may employ wireless communication methods such as WLAN (Wireless LAN), Wi-Fi, WiBro, WiMAX, and HSDPA, as well as short-range technologies such as Bluetooth, ZigBee, UWB (Ultra-WideBand), ANT, and other PAN (Personal Area Network) communication technologies.

In particular, in the present invention, the communication module 310 receives camera image information capturing the surroundings of at least one vehicle 200b to 200d adjacent to the specific vehicle 200a, together with the sensed radar sensor information, and can transmit camera image information capturing the surroundings of the specific vehicle 200a and its sensed radar sensor information to the at least one vehicle 200b to 200d.

Here, the radar sensor information refers to a RADAR sensor type that senses a moving or fixed target by continuously observing the relative speed, arrival angle, and distance of objects around the specific vehicle 200a. The information may include at least one of time information (TimeStamp), information on detected objects (DetectedObject), distance information (Distance), direction information (Orientation), and relative speed information (RelativeSpeed).

<Syntax example 1> <complexType name = "RADARSensorType">
<complexContent>
<extension base = "iidl: SensedInfoBaseType">
<sequence maxOccurs = "unbounded">
<element name = "DetectedObject">
<complexType>
<sequence>
<element name = "Distance" type = "siv: FValueWithUnitType"/>
<element name = "Orientation" type = "siv: FValueWithUnitType"/>
<element name = "RelativeSpeed" type = "siv: FValueWithUnitType"/>
</ sequence>
</ complexType>
</ element>
</ sequence>
</ extension>
</ complexContent>
</ complexType>

<complexType name = "FValueWithUnitType">
<simpleContent>
<extension base = "float">
<attribute name = "unit" type = "mpegvct: unitType"/>
</ extension>
</ simpleContent>
</ complexType>

The above <Syntactic Example 1> is an example for describing a declaration for radar sensor information.

Here, TimeStamp is time information indicating when the information was sensed; DetectedObject indicates information about a detected object; Distance is the distance to the sensed object; Orientation is the angle (direction) of the sensed object; and RelativeSpeed is the relative speed of the detected object.
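For concreteness, the radar sensed-information fields above can be mirrored in a small data structure. The following Python sketch is illustrative only; the class and field names are our own and are not part of the MPEG-V schema:

from dataclasses import dataclass
from typing import List

@dataclass
class DetectedObject:
    distance_m: float          # Distance to the sensed object, in meters
    orientation_deg: float     # Orientation (arrival angle) of the object, in degrees
    relative_speed_kmh: float  # RelativeSpeed of the object, in km/h

@dataclass
class RadarSensedInfo:
    timestamp_pts: int             # TimeStamp presentation time, in clock ticks
    time_scale: int                # Clock ticks per second
    objects: List[DetectedObject]  # One entry per DetectedObject

# The object instance of <Syntax Example 2> below: sensed at pts 6000 with a
# time scale of 100 (i.e., 60 seconds), 100 m away, at 30 degrees, closing at 60 km/h.
info = RadarSensedInfo(6000, 100, [DetectedObject(100.0, 30.0, 60.0)])
print(info.timestamp_pts / info.time_scale)  # 60.0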

<Syntax Example 2> mdfg-v: 2010: 01-CT-NS "xmlns: siv = murn: mpeg-v: 2010: 01-SIV-NS "xmlns: iidl =" urn: mpeg: mpeg-v : 2010: 01-IIDL-NS "xsi: schemaLocation =" urn: mpeg: mpeg-v: 2010: 01-SIV-NS MPEG-V-SIV.xsd ">
<iidl: SensedInfoList>
<iidl: SensedInfo xsi: type = "siv: RADARSensorType" value = "100" detected = "true" unit = "meter" id = "RST001"
<iidl: TimeStamp xsi: type = "mpegvct: ClockTickTimeType" timeScale = "100" pts = "6000"/>
<siv: Distance unit = "meter"> 100 </ siv: Distance>
<siv: Orientation unit = "degree"> 30 </ siv: Orientation>
<siv: RelativeSpeed unit = "kmperhour"> 60 </ siv: RelativeSpeed>
</ iidl: SensedInfo>
</ iidl: SensedInfoList>
</ iidl: InteractionInfo>

The above <Syntax Example 2> is an example sentence showing information about the object detected by the radar sensor of the above <Syntactic Example 1>.

The sensor's ID is "RST001", the sensor is active (activate="true"), and the distance to the detected object is 100 meters. The radar sensor has a TimeStamp (pts) value of 6000 at a time scale of 100 clock ticks per second, i.e., the information was sensed at 60 seconds. The angle of the object detected by the radar sensor is 30 degrees, and its relative speed is 60 km/h.

In addition, the camera image information refers to an array camera sensor type that senses the real world using an array of cameras. The array camera sensor type can be defined as an extension of the camera sensor type with added relative orientation and relative position information, and the camera image information may include at least one of the identification information (Camera), time information (TimeStamp), relative direction information (RelativeOrientation), and relative position information (RelativeLocation) of each camera.

<Syntax Example 3> <complexType name = "ArrayCameraSensorType">
<sequence maxOccurs = "unbounded">
<element name = "Camera">
<complexType>
<complexContent>
<extension base = "siv: CameraSensorType">
<sequence>
<element name = "RelativeOrientation" type = "siv: OrientationSensorType"/>
<element name = "RelativeLocation">
<complexType>
<complexContent>
<extension base = "mpegvct: Float3DVectorType">
<attribute name = "unit" type = "mpegvct: unitType"/>
</ extension>
</ complexContent>
</ complexType>
</ element>
</ sequence>
</ extension>
</ complexContent>
</ complexType>
</ element>
</ sequence>
</ complexType>

The above <Syntax Example 3> is an example of a declaration for camera image information.

ArrayCameraSensorType is a tool for describing an array camera sensor. Camera is the identification information of each camera sensor; TimeStamp indicates the time at which information was sensed; RelativeOrientation describes the relative orientation of each camera sensor; and RelativeLocation describes the relative position of each camera sensor. If a relative orientation is specified, the CameraOrientation described in CameraSensorType is ignored. All camera sensors are set relative to the first camera sensor.

<Syntax Example 4> mdfg-v: 2010: 01-CT-NS "xmlns: siv_ =" iidl: InteractionInfo xmlns: xsi = "http://www.w3.org/2001/XMLSchema-instance" xmlns: mpegvct = "urn: mpeg: murn: mpeg-v: 2010: 01-SIV-NS "xmlns: iidl =" urn: mpeg: mpeg-v : 2010: 01-IIDL-NS "xsi: schemaLocation =" urn: mpeg: mpeg-v: 2010: 01-SIV-NS MPEG-V-SIV.xsd ">
<iidl: SensedInfoList>
<iidl: SensedInfo xsi: type = "siv: ArrayCameraSensorType" id = "ACST001" activate = "true">
<siv: Camera focalLength = "10" aperture = "2.8" id = "CID000">
<iidl: TimeStamp xsi: type = "mpegvct: ClockTickTimeType" timeScale = "100" pts = "6000"/>
<siv: RelativeOrientation unit = "degree">
<siv: Orientation>
<mpegvct: X> 0 </ mpegvct: X>
<mpegvct: Y> 0 </ mpegvct: Y>
<mpegvct: Z> 0 </ mpegvct: Z>
</ siv: Orientation>
</ siv: RelativeOrientation>
<siv: RelativeLocation unit = "cm">
<mpegvct: X> 0 </ mpegvct: X>
<mpegvct: Y> 0 </ mpegvct: Y>
<mpegvct: Z> 0 </ mpegvct: Z>
</ siv: RelativeLocation>
</ siv: Camera>
<siv: Camera focalLength = "10" aperture = "2.8" id = "CID001">
<iidl: TimeStamp xsi: type = "mpegvct: ClockTickTimeType" timeScale = "100" pts = "6000"/>
<siv: RelativeOrientation unit = "degree">
<siv: Orientation>
<mpegvct: X> 0 </ mpegvct: X>
<mpegvct: Y> 0 </ mpegvct: Y>
<mpegvct: Z> 90 </ mpegvct: Z>
</ siv: Orientation>
</ siv: RelativeOrientation>
<siv: RelativeLocation unit = "cm">
<mpegvct: X> 90 </ mpegvct: X>
<mpegvct: Y> -250 </ mpegvct: Y>
<mpegvct: Z> 0 </ mpegvct: Z>
</ siv: RelativeLocation>
</ siv: Camera>
<siv: Camera focalLength = "10" aperture = "2.8" id = "CID002">
<iidl: TimeStamp xsi: type = "mpegvct: ClockTickTimeType" timeScale = "100" pts = "6000"/>
<siv: RelativeOrientation unit = "degree">
<siv: Orientation>
<mpegvct: X> 0 </ mpegvct: X>
<mpegvct: Y> 0 </ mpegvct: Y>
<mpegvct: Z> 180 </ mpegvct: Z>
</ siv: Orientation>
</ siv: RelativeOrientation>
<siv: RelativeLocation unit = "cm">
<mpegvct: X> 0 </ mpegvct: X>
<mpegvct: Y> -500 </ mpegvct: Y>
<mpegvct: Z> 0 </ mpegvct: Z>
</ siv: RelativeLocation>
</ siv: Camera>
<siv: Camera focalLength = "10" aperture = "2.8" id = "CID003">
<iidl: TimeStamp xsi: type = "mpegvct: ClockTickTimeType" timeScale = "100" pts = "6000"/>
<siv: RelativeOrientation unit = "degree">
<siv: Orientation>
<mpegvct: X> 0 </ mpegvct: X>
<mpegvct: Y> 0 </ mpegvct: Y>
<mpegvct: Z> 270 </ mpegvct: Z>
</ siv: Orientation>
</ siv: RelativeOrientation>
<siv: RelativeLocation unit = "cm">
<mpegvct: X> -90 </ mpegvct: X>
<mpegvct: Y> -250 </ mpegvct: Y>
<mpegvct: Z> 0 </ mpegvct: Z>
</ siv: RelativeLocation>
</ siv: Camera>
</ iidl: SensedInfo>
</ iidl: SensedInfoList>
</ iidl: InteractionInfo>

<Syntax Example 4> is an example sentence describing the camera arrangement of <Syntax Example 3>.

The array camera sensor has an ID of "ACST001" and four sub-cameras with IDs "CID000" to "CID003". The camera sensor is active, and its time information value is 6000 at a detection rate of 100 clock ticks per second.

The first camera sensor is located at (0, 0, 0) with an orientation of (0, 0, 0). The remaining camera sensors are located at (90, -250, 0), (0, -500, 0), and (-90, -250, 0), with orientations of (0, 0, 90), (0, 0, 180), and (0, 0, 270), respectively. The orientation of the cameras varies only about the Z-axis so that together they cover the entire exterior of the vehicle.
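As a minimal illustration of how these relative orientations could be used, the Python sketch below (hypothetical helper names, not part of the patent) orders the four camera feeds of <Syntax Example 4> by their Z-axis (yaw) rotation, which is the order in which their frames would be stitched side by side into a 360-degree panorama:

from dataclasses import dataclass

@dataclass
class ArrayCamera:
    cam_id: str
    yaw_deg: float       # Z component of RelativeOrientation, in degrees
    location_cm: tuple   # RelativeLocation (X, Y, Z), in centimeters

# The four sub-cameras of <Syntax Example 4>
cameras = [
    ArrayCamera("CID000", 0, (0, 0, 0)),
    ArrayCamera("CID002", 180, (0, -500, 0)),
    ArrayCamera("CID001", 90, (90, -250, 0)),
    ArrayCamera("CID003", 270, (-90, -250, 0)),
]

def panorama_order(cams):
    # Sort the feeds by yaw so that adjacent views sit next to each other
    # when the panoramic image is composed.
    return sorted(cams, key=lambda c: c.yaw_deg % 360)

print([c.cam_id for c in panorama_order(cameras)])
# ['CID000', 'CID001', 'CID002', 'CID003']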

The storage module 330 is a device for storing data; it includes a main storage device and an auxiliary storage device and stores the application programs required for the functional operation of the situation information providing device 300. The storage module 330 may include a program area and a data area. When the user requests activation of a function, the situation information providing device 300 executes the corresponding application program under the control of the control module 350 to provide that function.

In particular, the storage module 330 according to the present invention stores the valid range of the camera image information and radar sensor information that can be processed by the interface configuration module 351, and may include the information processing reference information (adaptation preference) used by the interface configuration module 351 as a reference when processing the information to configure the interface.

The array camera sensor capability type of the camera image information may include the number of camera sensors (NumberOfCameraCapability), the specifications of each camera sensor (CameraCapability), and the access information (AccessPoint) for accessing the image information of each camera.

<Syntax Example 5> <complexType name = "ArrayCameraSensorCapabilityType">
<complexContent>
<extension base = "cidl: SensorCapabilityBaseType">
<sequence>
<element name = "NumberOfCameraCapability">
<complexType>
<complexContent>
<extension base = "cidl: SensorCapabilityBaseType"/>
</ complexContent>
</ complexType>
</ element>
<element name = "CameraCapability" maxOccurs = "unbounded">
<complexType>
<complexContent>
<extension base = "scdv: CameraSensorCapabilityType">
<sequence>
<element name = "AccessPoint" type = "anyURI" minOccurs = "0"/>
</ sequence>
</ extension>
</ complexContent>
</ complexType>
</ element>
</ sequence>
</ extension>
</ complexContent>
</ complexType>

<Syntax Example 5> describes the capability of the array camera. The array camera capability specifies the valid sensing range of each camera sensor as an extension of the sensor capability base type.

ArrayCameraSensorCapabilityType is a tool for describing the valid range of the array camera sensor; NumberOfCameraCapability describes the number of available camera sensors; CameraCapability describes the capability of each camera sensor; and AccessPoint describes the access point used to retrieve the raw content of a sensor, such as a video stream.

<Syntax Example 6> <cidl: SensorDeviceCapability xsi: type = "scdv: ArrayCameraSensorCapabilityType" id = "ACSCT001">
<scdv: NumberOfCameraCapability maxValue = "4" minValue = "0"/>
<scdv: CameraCapability id = "CC000">
<scdv: AccessPoint> http://vpv.keti.re.kr/C000 </ scdv: AccessPoint>
</ scdv: CameraCapability>
<scdv: CameraCapability id = "CC001">
<scdv: AccessPoint> http://vpv.keti.re.kr/C001 </ scdv: AccessPoint>
</ scdv: CameraCapability>
<scdv: CameraCapability id = "CC002">
<scdv: AccessPoint> http://vpv.keti.re.kr/C002 </ scdv: AccessPoint>
</ scdv: CameraCapability>
<scdv: CameraCapability id = "CC003">
<scdv: AccessPoint> http://vpv.keti.re.kr/C003 </ scdv: AccessPoint>
</ scdv: CameraCapability>
</ cidl: SensoryDeviceCapabilityList>

<Syntax Example 6> is an example for illustrating a description of the array camera sensor valid range in the above <Syntax Example 5>.

The sensor valid range has an ID of "ACSCT001", and the array camera has four sub-cameras, each with an access point for retrieving the raw content from the camera.

In addition, the RADAR sensor capability type of the radar sensor information may include at least one of the valid distance information for detecting an object (DistanceCapability), the valid direction information for detecting an object (OrientationCapability), and the valid relative speed information for detecting an object (RelativeSpeedCapability).

<Syntax Example 7> <complexType name = "RADARSensorCapabilityType">
<complexContent>
<extension base = "cidl: SensorCapabilityBaseType">
<sequence>
<element name = "DistanceCapability">
<complexType>
<complexContent>
<extension base = "cidl: SensorCapabilityBaseType"/>
</ complexContent>
</ complexType>
</ element>
<element name = "OrientationCapability">
<complexType>
<complexContent>
<extension base = "cidl: SensorCapabilityBaseType"/>
</ complexContent>
</ complexType>
</ element>
<element name = "RelativeSpeedCapability">
<complexType>
<complexContent>
<extension base = "cidl: SensorCapabilityBaseType"/>
</ complexContent>
</ complexType>
</ element>
</ sequence>
</ extension>
</ complexContent>
</ complexType>

The above <Syntax Example 7> describes the valid range over which a radar sensor senses a moving or fixed target by continuously observing the distance, steering angle, and relative speed of surrounding objects.

RADARSensorCapabilityType is a tool for describing the RADAR sensor valid range; DistanceCapability describes the valid distance of objects sensed by the radar sensor; OrientationCapability describes the valid direction of objects sensed by the radar sensor; and RelativeSpeedCapability describes the valid relative speed of objects detected by the radar sensor.

<Syntax Example 8> <cidl: SensorDeviceCapability xsi: type = "RADARSensorCapabilityType" id = "RSCT001">
<scdv: DistanceCapability maxValue = "500" minValue = "0.5" unit = "meter"/>
<scdv: OrientationCapability maxValue = "100" minValue = "0" unit = "degree"/>
<scdv: RelativeSpeedCapability maxValue = "250" minValue = "- 250" unit = "kmperhour"/>
</ cidl: SensoryDeviceCapability>

The <Syntactic Example 8> is an example for illustrating the description of the valid range of the radar sensor according to the <Syntactic Example 7>. The effective range of the sensor is "RSCT001", and the sensor can measure objects between 0.5 and 500 meters. The orientation angle information of the detected object is provided between 0 and 100 degrees, and the relative speed information can be provided between -250 km / h and 250 km / h.

Meanwhile, the array camera sensor adaptation preference type of the camera image information may include the resolution (Resolution) of the images processed by the interface configuration module 351, the width (Width) of the images to be processed, and the height (Height) of the images to be processed.

<Syntax Example 9> <! - ##################################################### ## ->
<! - Array Camera Sensor Adaptation Preference type ->
<! - ##################################################### ## ->
<complexType name = "ArrayCameraAdaptationPrefType">
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType">
<sequence maxOccurs = "unbounded">
<element name = "Camera" type = "sapv: CameraAdaptationPrefType"/>
</ sequence>
</ extension>
</ complexContent>
</ complexType>

<complexType name = "CameraAdaptationPrefType">
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType">
<sequence>
<element name = "Resolution">
<complexType>
<sequence>
<element name = "Width">
<complexType>
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType"/>
</ complexContent>
</ complexType>
</ element>
<element name = "Height">
<complexType>
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType"/>
</ complexContent>
</ complexType>
</ element>
</ sequence>
</ complexType>
</ element>
</ sequence>
</ extension>
</ complexContent>
</ complexType>

The <Syntactic Example 9> specifies the syntax of the user's setting criteria for selectively adjusting the sensed data from the array camera sensor.

ArrayCameraAdaptationPrefType is a tool for describing the processing criteria for information sensed by an array camera. Camera describes the information processing criteria of each camera sensor; Resolution describes the resolution required for the camera sensing information; Width describes the horizontal size range; and Height describes the vertical size range.

<Syntax Example 10> <cidl: SAPreference xsi: type = "sapv: ArrayCameraAdaptationPrefType" id = "ACSAP001" sensorAdaptationMode = "strict">
<sapv: Camera id = "CID000">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
<sapv: Camera id = "CID001">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
<sapv: Camera id = "CID002">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
<sapv: Camera id = "CID003">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
</ cidl: SAPreference>

The above <Syntax Example 10> is an example showing a processing standard of the information sensed by the array camera described in the <Syntax Example 9>.

This processing criterion has an ID of "ACSAP001", and application criteria are described for each of the four camera sensors. Each criterion is defined by a horizontal size range and a vertical size range: the horizontal size range has a maximum value of 4096 and a minimum value of 2048, divided into five steps between the maximum and minimum, and the vertical size range has a maximum value of 2160 and a minimum value of 1080, likewise divided into five steps.
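One plausible reading of numOfLevels is that a requested value is snapped to one of a fixed number of evenly spaced levels between the minimum and maximum. The following Python sketch illustrates that reading for the width range above; it is an assumption for illustration, not a normative definition:

def quantize(value, min_v, max_v, num_levels):
    # Clamp value to [min_v, max_v], then snap it to the nearest of
    # num_levels evenly spaced levels.
    value = max(min_v, min(max_v, value))
    step = (max_v - min_v) / (num_levels - 1)
    return min_v + round((value - min_v) / step) * step

# Width range of <Syntax Example 10>: 2048..4096 in five levels,
# i.e., levels at 2048, 2560, 3072, 3584, and 4096.
print(quantize(3000, 2048, 4096, 5))  # 3072.0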

Meanwhile, the RADAR sensor adaptation preference type may include the maximum number of objects that can be detected (maxValue), distance information (Distance) recording the maximum/minimum values and unit of the distances processed by the interface configuration module, direction information (Orientation) recording the range and unit of the angle information to be processed, and relative speed information (RelativeSpeed) recording the maximum/minimum values and unit of the relative speeds to be processed.

<Syntax Example 11> <! - ######################################### ->
<! - RADAR Sensor Adaptation Preference type ->
<! - ######################################### ->
<complexType name = "RADARAdaptationPrefType">
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType">
<sequence>
<element name = "Distance" type = "sapv: DistanceAdaptationPrefType" minOccurs = "0"/>
<element name = "Orientation" type = "sapv: OrientationAdaptationPrefType" minOccurs = "0"/>
<element name = "RelativeSpeed" type = "sapv: VelocityAdaptationPrefType" minOccurs = "0"/>
</ sequence>
</ extension>
</ complexContent>
</ complexType>

Syntax Example 11 specifies the syntax of the user's setting criteria for selectively adjusting the sensed data from the radar sensor.

RADARAdaptationPrefType is a tool for describing the criteria for processing the information sensed by the radar. maxValue is the maximum number of objects that can be detected; if more objects are detected, the objects nearest the sensor are returned, up to that maximum. Distance describes the distance criterion of the sensed objects, Orientation describes the angle criterion of the sensed objects, and RelativeSpeed describes the relative speed criterion of the sensed objects.

<Syntax Example 12> <cidl: SAPreference xsi: type = "sapv: RADARAdaptationPrefType" id = "RADARSAP001" sensorAdaptationMode = "strict" maxValue = "3">
<Distance unit = "meter" maxValue = "100.0" minValue = "0.0" numOfLevels = "100"/>
<Orientation unit = "degree" numOfLevels = "50">
<OrientationRange>
<YawMin> 0 </ YawMin>
<YawMax> 100 </ YawMax>
<PitchMin> 0 </ PitchMin>
<PitchMax> 0 </ PitchMax>
<RollMin> 0 </ RollMin>
<RollMax> 0 </ RollMax>
</ OrientationRange>
</ Orientation>
<RelativeSpeed unit = "kmperhour" maxValue = "250" minValue = "250" numOfLevels = "500"/>
</ cidl: SAPreference>

The <Syntax Example 12> shows an example of the information processing standard of <Syntax Example 11>.

The information processing criterion has an ID of "RADARSAP001", and the maximum number of objects that can be detected by the radar is 3. The maximum distance criterion is 100 meters and the minimum is 0 meters, and the distance value is divided into 100 steps between the maximum and minimum.

The direction angle information is described only by the yaw value, with a maximum of 100 degrees and a minimum of 0 degrees, divided into 50 steps between the maximum and minimum.

The relative speed criterion has a maximum value of 250 km/h and a minimum value of -250 km/h, divided into 500 steps between the maximum and minimum.
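Under the reading above that the objects nearest the sensor are kept when more than maxValue objects are detected, a filter might look like the following Python sketch (an illustration under that assumption, with hypothetical field names):

def limit_detections(detections, max_objects=3):
    # Keep only the max_objects detections nearest to the sensor, matching
    # maxValue="3" in <Syntax Example 12>.
    return sorted(detections, key=lambda d: d["distance_m"])[:max_objects]

detections = [{"id": i, "distance_m": d} for i, d in enumerate([80.0, 12.5, 45.0, 3.2])]
print([d["id"] for d in limit_detections(detections)])  # [3, 1, 2]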

Meanwhile, the storage module 330 stores various types of information received through the sensor module 370 or the communication module 310.

The control module 350 may be an operating system (OS) and a processing device for driving each component.

In particular, the control module 350 according to the present invention may include an interface configuration module 351 and a vehicle condition determination module 353.

The vehicle condition determination module 353 estimates the driver's situation or the surroundings of the specific vehicle based on at least one of the camera image information and radar sensor information received from at least one of the vehicles 200b to 200d adjacent to the specific vehicle 200a and the information received from the sensor module 370.

When judging the driver's situation or the vehicle's surroundings, the vehicle condition determination module 353 can determine what occurred before the present time based on the time information included in the radar sensor information, and can use the direction information and relative speed information of a sensed object to predict from which direction, and after how long, the object will approach the specific vehicle.

In addition, the size and shape of a sensed object can be measured based on the information about the sensed object included in the radar sensor information and the camera image information, and the object approaching the specific vehicle 200a can be estimated based on the object information database stored in the storage module 330.

If it is determined that the specific vehicle 200a is in an emergency state, the vehicle condition determination module 353 may control transmission of the emergency notification information to the specific vehicle 200a.

Here, the specific vehicle 200a may be judged to be in an emergency state when an object whose collision with the specific vehicle 200a could cause damage is approaching. For the approach of such an object, the time required for the sensed object to collide with the specific vehicle 200a is calculated probabilistically using the time information, distance information, direction information, and relative speed information included in the radar sensor information; when that time indicates an imminent collision, the vehicle status determination module 353 may determine that the specific vehicle 200a is currently in an emergency situation and control transmission of the emergency alert information.
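For illustration, the core of such a time-to-collision check can be sketched in a few lines of Python. The threshold value is hypothetical (the patent does not fix one), and a positive relative speed is assumed here to mean the object is approaching:

EMERGENCY_THRESHOLD_S = 5.0  # hypothetical threshold, not specified by the patent

def time_to_collision(distance_m, closing_speed_kmh):
    # Seconds until impact for an object closing at closing_speed_kmh;
    # None if the object is not approaching.
    closing_mps = closing_speed_kmh / 3.6
    if closing_mps <= 0:
        return None
    return distance_m / closing_mps

# The object of <Syntax Example 2>: 100 m away, closing at 60 km/h -> 6.0 s
ttc = time_to_collision(100, 60)
if ttc is not None and ttc < EMERGENCY_THRESHOLD_S:
    print("transmit emergency notification information")
else:
    print(f"time to collision: {ttc} s")  # 6.0 s here, so no emergency yet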

The interface configuration module 351 configures an interface to be displayed on the in-vehicle output module based on the estimated situation information, the information received from the sensor module 370, and at least one of the camera image information and radar sensor information received from at least one of the vehicles 200b to 200d adjacent to the specific vehicle 200a.

In this case, the interface may be configured using the information included in the camera image information.

For example, images captured at the current time can be selected as the candidate group of image information constituting the interface based on the time information, and the position each image takes in the interface configuration can be determined based on the identification information of each camera.

That is, when a camera image is received from the rear camera of a surrounding vehicle in front of the specific vehicle 200a, the position of that rear camera corresponds to the forward view from the specific vehicle 200a, so the rear camera image information of the surrounding vehicle can be displayed in the portion of the interface showing the forward situation.

In addition, it is possible to determine which of the received camera image information is to be exposed at a designated position based on the relative direction information or relative position information of the sensed object.

That is, it may be advantageous to the driver to expose on the interface the image information captured closest to the specific vehicle 200a based on the relative direction information or relative position information, so such image information can be selected and exposed.

In addition, when an accident occurs around the specific vehicle 200a, the image information of adjacent vehicles photographed from the front, back, left, and right of the area where the accident occurred can be transmitted to the specific vehicle 200a and exposed in panoramic form.

That is, if there is an accident vehicle, the rear camera image information of an adjacent vehicle photographed from in front of the accident vehicle is exposed in the interface portion corresponding to the forward situation of the specific vehicle 200a, and the right-side camera image information of an adjacent vehicle can be exposed in the interface portion corresponding to the left-side situation of the specific vehicle 200a.
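The placement rules in the two examples above can be summarized as a lookup table. The Python sketch below is our own illustration: the front and left entries follow the text directly, while the rear and right entries merely extend the stated rules symmetrically and are assumptions:

# Which interface slot a neighbor's camera image fills: the rear camera of
# the vehicle ahead shows the host's forward view, and the right camera of
# the vehicle on the left shows the host's left-side view (per the text);
# the remaining two entries are symmetric assumptions.
VIEW_MAP = {
    ("front", "rear_camera"): "front_slot",
    ("left", "right_camera"): "left_slot",
    ("rear", "front_camera"): "rear_slot",
    ("right", "left_camera"): "right_slot",
}

def place_image(neighbor_position, camera_position):
    # Return the panoramic-interface slot for a neighbor's camera image,
    # or None if no placement rule applies.
    return VIEW_MAP.get((neighbor_position, camera_position))

print(place_image("front", "rear_camera"))  # front_slot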

In addition, if an approaching object identified from the sensed-object information included in the radar sensor information matches an object identified from the image information, information about the specific object (e.g., size, vehicle model, price, fuel consumption) can be displayed in a specific display window; the distance information and direction information can be converted into text and output; and the predicted collision time or the probabilistically calculated collision probability can be converted into text and exposed to the driver.

If the collision probability becomes equal to or greater than a predetermined value, an alert such as a flashing warning light can be shown, together with the emergency alarm message transmitted by the vehicle condition determination module 353, on the display corresponding to the position where the collision is expected.

That is, when a collision is expected on the left side of the specific vehicle 200a, a warning light may blink on the display corresponding to the left side, together with the left-side situation, to raise the driver's attention.

In addition, the interface configuration module 351 can, at the driver's request, reconfigure and transmit the driver's situation or the surrounding situation for a predetermined time based on the information stored in the storage module 330.

The predetermined time can be changed according to the driver's setting; the camera image information can be extracted based on the time information included in it; and the surrounding situation can be reconstructed in the same manner as the interface configuration described above.

The sensor module 370 is a device for receiving sensing information collected by sensors. In the present invention, it receives the sensing information collected by the cameras and radar sensors 210a to 210d installed in the specific vehicle 200a.

The collected sensing information includes the same information as the camera image information or the radar sensor information received through the communication module 310.

Referring to FIG. 3, the operation of the situation information providing apparatus 300 will be described. The situation information providing apparatus 300 receives the information collected by the cameras or radar sensors installed in the specific vehicle 200a and in the plurality of vehicles 200b to 200d adjacent to it (S101). Having received the information, the situation information providing apparatus 300 estimates the driver's situation or the surrounding situation based on at least one piece of that information (S103). At this time, if it is determined based on the estimated situation information that the specific vehicle 200a is in an emergency situation, emergency alert information may be transmitted to the specific vehicle 200a.

The situation information providing apparatus 300 then constructs a panorama-type interface to be displayed in the vehicle based on the estimated status information and one or more pieces of the received information (S105 to S107).

Meanwhile, the situation information providing device 300 stores the received information, and, if the driver requests it, the driver's situation or the surrounding situation for a predetermined time can be reconstructed based on the stored information.

For example, when an accident occurs while driving, the driver can request the situation information providing device 300 to reconstruct the surroundings for a certain period before and after the accident in order to determine how the accident occurred, and the accident situation at the time of occurrence can be reconstructed and provided to the driver.

The configuration and operation method of the situation information providing apparatus 300 according to the present invention have been described above.

Hereinafter, an embodiment of a panorama-type interface configuration method according to the present invention will be described.

FIG. 4 is a view for explaining the process in which the situation information providing device 300 receives the camera image information and radar sensor information, configures it as a panorama-type interface, and displays it, and FIGS. 5 and 6 show embodiments in which the configured interface is exposed on various display devices.

Referring to FIG. 4, when the situation information providing device 300 receives the camera and radar sensor information of the specific vehicle 200a and the camera and radar sensor information 400 of an external vehicle, it may configure a panorama type interface based on the received information, using the valid range information and the information processing reference information stored in the storage module 330.
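For illustration, the following sketch shows one way the stored valid range information could gate received radar readings before interface construction, using the distance, direction, and relative speed bounds described in the claims; the field names are assumptions, not the patent's schema.

```python
# Keep only radar readings that fall inside the stored effective range,
# so the interface configuration module processes only valid data.
from dataclasses import dataclass

@dataclass
class RadarReference:
    min_distance_m: float     # max/min value of the distance to be processed
    max_distance_m: float
    min_angle_deg: float      # range of the angle information to be processed
    max_angle_deg: float
    max_rel_speed_mps: float  # RelativeSpeed: max relative speed handled

def within_valid_range(reading: dict, ref: RadarReference) -> bool:
    """Return True if a reading can be processed for the interface."""
    return (ref.min_distance_m <= reading["distance_m"] <= ref.max_distance_m
            and ref.min_angle_deg <= reading["angle_deg"] <= ref.max_angle_deg
            and abs(reading["rel_speed_mps"]) <= ref.max_rel_speed_mps)

ref = RadarReference(0.5, 150.0, -90.0, 90.0, 60.0)
readings = [
    {"distance_m": 12.0, "angle_deg": -30.0, "rel_speed_mps": 5.0},
    {"distance_m": 400.0, "angle_deg": 10.0, "rel_speed_mps": 3.0},  # out of range
]
valid = [r for r in readings if within_valid_range(r, ref)]
print(valid)
```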

Referring to FIG. 5, in one example of the configured interface, image information is received from the four cameras 210a to 210d installed in the vehicle 200 and reconfigured into a panorama form such as the image 210e.
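The patent does not specify a stitching algorithm, but as an illustrative assumption the four camera views could be merged into a single panorama such as the image 210e with OpenCV's generic stitcher (requires opencv-python and four overlapping frames on disk; the file names are hypothetical):

```python
# Merge four camera views into one panorama using OpenCV's stitcher,
# as a stand-in for the patent's unspecified reconfiguration step.
import cv2

frames = [cv2.imread(p) for p in
          ["cam_210a.jpg", "cam_210b.jpg", "cam_210c.jpg", "cam_210d.jpg"]]

stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("panorama_210e.jpg", panorama)  # reconfigured panorama view
else:
    print(f"stitching failed with status {status}")
```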

The interface thus configured can be transmitted to and displayed on the actuator 230 including various display devices.

Referring to FIG. 6, examples in which the reconfigured panorama type interface is exposed on various display devices are shown in FIGS. 6A to 6C. The interface can be displayed on devices such as a head-up display 230a, a vehicle instrument panel 230b, and a rearview mirror 230c.

As described above, this specification contains details of a number of specific implementations, but they should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features described herein in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features described in the context of a single embodiment may also be implemented in multiple embodiments, either individually or in any suitable subcombination. Further, although features may be described above as operating in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.

Likewise, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve the desired result. In certain cases, multitasking and parallel processing may be advantageous. Also, the separation of the various system components in the above-described embodiments should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together into a single software product or packaged into multiple software products.

Certain embodiments of the subject matter described herein have been described. Other embodiments are within the scope of the following claims. For example, the operations recited in the claims may be performed in a different order and still achieve desirable results. By way of example, the processes illustrated in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to obtain the desired results. In certain implementations, multitasking and parallel processing may be advantageous.

The description sets forth the best mode of the present invention and provides examples to illustrate the invention and to enable a person skilled in the art to make and use the invention. The written description is not intended to limit the invention to the specific terminology presented. Thus, while the present invention has been described in detail with reference to the above examples, those skilled in the art will recognize that modifications, changes, and variations can be made thereto without departing from the scope of the present invention.

Therefore, the scope of the present invention should not be limited by the described embodiments but should be defined by the claims.

The present invention relates to a method and apparatus for providing vehicle status information in the form of a panorama using multiple cameras and radar sensor information. A plurality of adjacent vehicles share the camera image information and radar sensing information obtained by each vehicle, and the situation information providing apparatus judges the situation of each vehicle, reconstructs an interface based on the received image information and sensing information, and transmits the reconstructed interface to the vehicle, thereby configuring a dynamic vehicle interface according to each driver's situation.

According to the present invention, by connecting a plurality of adjacent vehicles, sharing the image information and sensing information obtained from each vehicle, and configuring an interface for each driver's situation, information about the surroundings of a vehicle can be acquired widely from various viewpoints. This improves the driver's understanding of the surrounding situation of the vehicle, thereby contributing to safe driving and to the development of the automobile industry.

In addition, since the present invention can not only be marketed or operated but also be clearly practiced in reality, it is industrially applicable.

100: communication network 200: vehicle 210: camera and radar sensor
300: vehicle situation information providing server

Claims (10)

A situation information providing apparatus comprising:
a communication module for transmitting and receiving data with at least one vehicle in cooperation with a communication network;
a sensor module for receiving at least one of camera image information and radar sensor information from a plurality of cameras and radar sensors installed in a specific vehicle;
a vehicle condition determination module for estimating a driver condition or a surrounding condition of the specific vehicle based on at least one of the information received from the sensor module, the camera image information received from the at least one vehicle, and the radar sensor information; and
an interface configuration module for configuring an interface to be displayed on an in-vehicle output module based on at least one of the estimated situation information, the information received from the sensor module, the camera image information received from the at least one vehicle, and the radar sensor information.
The situation information providing apparatus according to claim 1,
wherein emergency notification information is transmitted to the specific vehicle when it is determined that the specific vehicle is in an emergency situation.
The situation information providing apparatus according to claim 1, further comprising:
a storage module for storing the received information,
wherein the interface configuration module, when requested by the driver, reconstructs the driver's situation or the surrounding situation for a predetermined time based on the stored information.
The situation information providing apparatus according to claim 1,
wherein the camera image information includes at least one of identification information (Camera), time information (TimeStamp), relative direction information (RelativeOrientation), and relative position information (RelativeLocation) of each camera.
The situation information providing apparatus according to claim 1,
wherein the radar sensor information includes at least one of time information, detected object information, distance information, direction information, and relative speed information.
The situation information providing apparatus according to claim 1,
further comprising a storage module for storing an effective range of the camera image information that can be processed by said interface configuration module,
wherein the effective range of the camera image information includes information on the number of camera sensors that can be accessed and processed.
The situation information providing apparatus according to claim 1,
further comprising a storage module for storing an effective range of the radar sensor information that can be processed by said interface configuration module,
wherein the effective range of the radar sensor information includes at least information on the distance within which an object can be sensed.
The situation information providing apparatus according to claim 1,
further comprising a storage module for storing information processing reference information that serves as a reference when the interface configuration module processes the camera image information to configure the interface,
wherein the reference information of the camera image includes at least one of resolution information of the image to be processed by the interface configuration module, width information of the image to be processed, and height information of the image to be processed.
The situation information providing apparatus according to claim 1,
further comprising a storage module for storing information processing reference information that serves as a reference when the interface configuration module processes the radar sensor information to configure the interface,
wherein the reference information of the radar sensor includes at least one of distance information recording the maximum/minimum value and unit of the distance to be processed by the interface configuration module, direction information recording the range of the angle to be processed, and relative speed information (RelativeSpeed) recording the maximum/minimum value and unit of the relative speed to be processed.
A method of providing vehicle status information to a vehicle, comprising:
receiving at least one of camera image information and radar sensor information from a plurality of cameras and radar sensors installed in at least one vehicle and in a specific vehicle;
estimating a driver condition or a surrounding condition of the specific vehicle based on the received information;
configuring an interface to be displayed on an in-vehicle output module based on at least one of the estimated situation information, the camera image information received from the at least one camera and radar sensor, and the radar sensor information; and
transmitting the configured interface information to the specific vehicle.
KR1020150114235A 2014-10-17 2015-08-13 Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor KR101734985B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150114235A KR101734985B1 (en) 2015-08-13 2015-08-13 Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor
PCT/KR2015/009850 WO2016060384A1 (en) 2014-10-17 2015-09-21 Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150114235A KR101734985B1 (en) 2015-08-13 2015-08-13 Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor

Publications (2)

Publication Number Publication Date
KR20170020620A true KR20170020620A (en) 2017-02-23
KR101734985B1 KR101734985B1 (en) 2017-05-25

Family

ID=59050759

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150114235A KR101734985B1 (en) 2014-10-17 2015-08-13 Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor

Country Status (1)

Country Link
KR (1) KR101734985B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109552315A (en) * 2017-09-26 2019-04-02 纵目科技(上海)股份有限公司 Full visual field camera engine control system
CN112346040A (en) * 2019-08-09 2021-02-09 大众汽车股份公司 Method and device for determining parallax problem in sensor data of two sensors
KR102612227B1 (en) * 2023-07-12 2023-12-11 노지효 System for smart lighting for traffic safety and control method thereof

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023204319A1 (en) * 2022-04-18 2023-10-26 ㈜와토시스 Apparatus and method for processing vehicle image data and system comprising apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005115484A (en) * 2003-09-17 2005-04-28 Fujitsu Ten Ltd Driving support device
JP4952421B2 (en) * 2007-07-18 2012-06-13 株式会社デンソー Driving support system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109552315A (en) * 2017-09-26 2019-04-02 纵目科技(上海)股份有限公司 Full visual field camera engine control system
CN109552315B (en) * 2017-09-26 2022-06-24 纵目科技(上海)股份有限公司 Full-view camera host control system
CN112346040A (en) * 2019-08-09 2021-02-09 大众汽车股份公司 Method and device for determining parallax problem in sensor data of two sensors
KR102612227B1 (en) * 2023-07-12 2023-12-11 노지효 System for smart lighting for traffic safety and control method thereof

Also Published As

Publication number Publication date
KR101734985B1 (en) 2017-05-25

Similar Documents

Publication Publication Date Title
KR101734985B1 (en) Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor
TWI469886B (en) Cooperative event data record system and method
CN108844754B (en) For assessing the test device of Senior Officer's auxiliary system
CA2882693C (en) Method, apparatus and system for performing facial recognition
US20190096215A1 (en) Amber alert monitoring and support
JP5262976B2 (en) In-vehicle display device
CN103786644B (en) Apparatus and method for following the trail of peripheral vehicle location
US10410064B2 (en) System for tracking and identifying vehicles and pedestrians
US20200322585A1 (en) Image processing device, image processing method, and vehicle
CA2594337A1 (en) Method and system for tracking position of an object using imaging and non-imaging surveillance devices
JP2007129560A (en) Object detector
US20150360617A1 (en) Automated Emergency Response Systems for a Vehicle
JP2000244897A (en) State recognition system and state recognition display generation method
JP4643860B2 (en) VISUAL SUPPORT DEVICE AND SUPPORT METHOD FOR VEHICLE
KR20130088480A (en) Integration control system and method using surveillance camera for vehicle
WO2008105935A2 (en) Video surveillance system providing tracking of a moving object in a geospatial model and related methods
CN110738150B (en) Camera linkage snapshot method and device and computer storage medium
US20160267336A1 (en) Method for calibrating a camera for a gaze direction detection in a vehicle, device for a motor vehicle having a camera and at least one further element, and computer program product
CN109249857A (en) Automobile calling system
CN113240939A (en) Vehicle early warning method, device, equipment and storage medium
US10248132B2 (en) Method and apparatus for visualization of an environment of a motor vehicle
US20150258935A1 (en) Surroundings monitoring system for a vehicle
JP6891926B2 (en) Vehicle systems, methods performed on vehicle systems, and driver assistance systems
US8213683B2 (en) Driving support system with plural dimension processing units
WO2016060384A1 (en) Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
AMND Amendment
E90F Notification of reason for final refusal
AMND Amendment
E601 Decision to refuse application
AMND Amendment
X701 Decision to grant (after re-examination)
GRNT Written decision to grant