KR20170020620A - Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor - Google Patents
- Publication number
- KR20170020620A (publication), KR1020150114235A (application), KR20150114235A
- Authority
- KR
- South Korea
- Prior art keywords
- information
- vehicle
- camera
- radar sensor
- module
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q5/00—Arrangement or adaptation of acoustic signal devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R21/00—Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
- B60R21/01—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
- B60R21/013—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
- B60R21/0134—Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/10—Interpretation of driver requests or demands
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/04—Systems determining presence of a target
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/42—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/52—Radar, Lidar
Abstract
The present invention relates to a method and apparatus for providing vehicle status information in panoramic form using multiple cameras and radar sensor information. A situation information providing device judges the situation of each vehicle based on the image information and sensing information received from a plurality of linked vehicles, and reconstructs the interface to suit each driver's situation, so that information about the surroundings of a vehicle can be obtained widely and from various viewpoints. This increases the driver's understanding of, and responsiveness to, the vehicle's surroundings and thereby contributes to safe driving.
Description
The present invention relates to a method and apparatus for providing vehicle status information in panoramic form using multiple cameras and radar sensor information. More particularly, it relates to a vehicle status information providing method and apparatus in which a situation information providing apparatus judges the situation of each vehicle based on received image information and sensing information, reconstructs the interface accordingly, and transmits the reconstructed interface to the vehicle, thereby configuring a dynamic vehicle interface matched to each driver's situation.
The contents described in this section merely provide background information on the present embodiment and do not constitute the prior art.
Recently, technologies for processing sensed information measured by sensors and techniques for processing image information captured by cameras have been developed and applied in various fields.
Particularly, in a vehicle information providing system, multiple cameras and a radar sensor are attached to a vehicle, and the information obtained from them is provided to the driver, supplying information about areas the driver cannot see or may miss while driving and thereby inducing safe driving.
In addition, such a vehicle information providing system can reproduce the real-world driving environment in a virtual world as it is, and transmit intuitive and effective information using complex multimedia (video and text information).
In recent years, multiple cameras have been attached to the four sides of a vehicle, and the image information obtained from them is used to construct 360-degree panoramic image information inside the vehicle, enhancing the driver's understanding of and responsiveness to real-time situations.
However, in the conventional vehicle information providing system, the information obtained by one vehicle is reconfigured by the vehicle information providing system installed in that specific vehicle and provided back to the same vehicle, so it cannot cover a wide range of operating conditions around the vehicle. Moreover, the image information obtained by a camera attached to the vehicle may be inaccurate due to an obstacle, and because only images photographed from a single viewpoint are provided, the surroundings of the vehicle cannot be presented from various viewpoints.
In order to solve the above-described problems, the present invention provides a vehicle status information providing method in which a plurality of adjacent vehicles are linked to share the image information and sensing information obtained by each vehicle and to configure an interface for each driver's situation, so that information about the surroundings of a vehicle can be obtained widely.
In particular, the vehicle situation information providing apparatus receives information from a plurality of neighboring vehicles through a communication network, estimates the situation of each driver, and configures a panorama-type interface to match the estimated situation, thereby contributing to safe driving.
However, the object of the present invention is not limited to the above-mentioned objects, and other objects not mentioned can be clearly understood from the following description.
According to an aspect of the present invention, there is provided a situation information providing apparatus including: a communication module for transmitting and receiving data to and from at least one vehicle over a communication network; a sensor module for receiving at least one of camera image information from a plurality of cameras installed in a specific vehicle and radar sensor information, as well as at least one of camera image information and radar sensor information received from the at least one vehicle; a vehicle status determination module for estimating the driver's situation or the surroundings of the specific vehicle based on the received information; and an interface configuration module for configuring, in panoramic form, the interface to be displayed inside the vehicle based on the estimated status information, the information received from the sensor module, and at least one of the camera image information and radar sensor information received from the at least one vehicle.
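As a rough sketch of how the four modules described above might fit together (all class names, method names, thresholds, and data shapes below are illustrative assumptions, not taken from the patent):

```python
# Hypothetical sketch of the apparatus's modules; names and the
# emergency threshold are assumptions for illustration only.

class SensorModule:
    """Pools camera image info and radar sensor info from the host
    vehicle and from neighboring vehicles."""
    def __init__(self):
        self.readings = []

    def receive(self, source_vehicle, camera_info=None, radar_info=None):
        self.readings.append(
            {"vehicle": source_vehicle, "camera": camera_info, "radar": radar_info}
        )

class VehicleStatusDeterminationModule:
    """Estimates the driver's situation from the pooled readings."""
    EMERGENCY_DISTANCE_M = 5.0  # assumed threshold, not from the patent

    def estimate(self, readings):
        distances = [r["radar"]["Distance"] for r in readings if r["radar"]]
        if distances and min(distances) < self.EMERGENCY_DISTANCE_M:
            return "emergency"
        return "normal"

class InterfaceConfigurationModule:
    """Arranges the shared camera views into a panorama-style interface."""
    def compose(self, status, readings):
        views = [r["camera"] for r in readings if r["camera"]]
        return {"status": status, "panorama_views": views}

# Usage: pool readings from the host vehicle and one neighbor.
sensors = SensorModule()
sensors.receive("host", camera_info="front.jpg", radar_info={"Distance": 3.2})
sensors.receive("neighbor1", camera_info="rear.jpg", radar_info={"Distance": 40.0})
status = VehicleStatusDeterminationModule().estimate(sensors.readings)
interface = InterfaceConfigurationModule().compose(status, sensors.readings)
```

In this sketch the communication module is implied by the `receive` calls; a real implementation would deliver neighbor data over the communication network described above.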
At this time, the vehicle condition determination module may control transmission of emergency notification information to the specific vehicle when the specific vehicle is determined to be in an emergency, and the apparatus may further include a storage module for storing the received information. When the driver so requests, the interface configuration module may reconfigure and transmit the driver's situation or the surrounding situation for a predetermined time based on the stored information.
The camera image information may include at least one of camera identification information (Camera), time information (TimeStamp), relative direction information (RelativeOrientation), and relative position information (RelativeLocation), and the radar sensor information may include at least one of time information (TimeStamp), information about the detected object (DetectedObject), distance information (Distance), direction information (Orientation), and relative speed information (RelativeSpeed).
The apparatus may further include a storage module for storing the valid range of the camera image information that can be processed by the interface configuration module; this valid range may include at least one of the number of camera sensors (NumberOfCameraCapability), the specifications of each camera sensor (CameraCapability), and an access point (AccessPoint) for accessing the image information of each camera. The storage module may also store the valid range of the radar sensor information that can be processed by the interface configuration module; this valid range may include at least one of the valid distance information for detecting an object (DistanceCapability), the valid direction information for detecting an object (OrientationCapability), and the valid relative speed information for detecting an object (RelativeSpeedCapability).
The apparatus may further include a storage module for storing information processing reference information used when the interface configuration module processes the camera image information to configure the interface, including resolution information (Resolution) with the width (Width) and height (Height) of the image to be processed. When the interface configuration module processes the radar sensor information, the reference information may include at least one of the maximum number of objects that can be detected (MaxValue), distance information (Distance) recording the maximum/minimum values and unit of the distances to be processed, direction information (Orientation) recording the range and unit of the direction information, and relative speed information (RelativeSpeed) recording the maximum/minimum values and unit of the relative speeds to be processed.
According to another aspect of the present invention, there is provided a vehicle status information providing method comprising: receiving at least one of camera image information and radar sensor information from at least one vehicle; estimating the driver's situation or the surroundings of a specific vehicle based on the received information; configuring, in panoramic form, an interface to be displayed on the in-vehicle output module based on the estimated situation information and at least one of the camera image information and radar sensor information from the plurality of cameras installed in the specific vehicle; and transmitting the configured interface information to the specific vehicle.
According to the present invention, a plurality of adjacent vehicles are linked to share the image information and sensing information obtained by each vehicle and to configure an interface for each driver's situation, so that information about the vehicle's surroundings can be acquired widely and from various viewpoints; the driver can thus better understand the vehicle's situation and cope with driving situations more effectively.
In addition, various effects other than those described above may be directly or implicitly disclosed in the detailed description of the embodiments of the present invention below.
FIG. 1 is a diagram showing the configuration of a system for providing vehicle information according to an embodiment of the present invention.
FIG. 2 is a block diagram showing the configuration of a vehicle situation information providing apparatus according to the present invention.
FIG. 3 is a flowchart illustrating a method of providing vehicle status information according to an embodiment of the present invention.
FIG. 4 is a diagram illustrating an information processing process and method according to an embodiment of the present invention.
FIGS. 5 and 6 are exemplary diagrams illustrating an embodiment of a vehicle information providing method according to the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS For a more complete understanding of the nature and advantages of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings, in which:
In the following description and the accompanying drawings, detailed description of well-known functions or constructions that may obscure the subject matter of the present invention will be omitted. It should be noted that the same constituent elements are denoted by the same reference numerals as possible throughout the drawings.
The terms and words used in the following description and drawings should not be construed in their ordinary or dictionary senses; on the principle that an inventor may properly define terms in order to describe his or her invention in the best way, they should be construed as having meanings and concepts consistent with the technical idea of the present invention. Therefore, the embodiments described in this specification and the configurations shown in the drawings are merely the most preferred embodiments of the present invention and do not represent all of its technical ideas; it should be understood that various equivalents and modifications are possible.
Also, terms including ordinal numbers such as first and second are used to describe various elements only for the purpose of distinguishing one element from another, and are not used to limit those elements. For example, without departing from the scope of the present invention, the second component may be referred to as the first component, and similarly, the first component may also be referred to as the second component.
Also, the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly dictates otherwise. Terms such as "comprising" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, parts, or combinations thereof, and do not preclude the presence or addition of other features, numbers, steps, operations, elements, parts, or combinations thereof.
Also, the terms "part," "module," and the like described in the specification refer to a unit that processes at least one function or operation, and may be implemented by hardware, software, or a combination of the two. Terms such as "a," "an," and "the" used in describing the invention (particularly in the context of the following claims) may be used in a sense including both the singular and the plural unless the context clearly dictates otherwise.
In addition to the above-described terms, certain terms used in the following description are provided to facilitate understanding of the present invention, and the use of such specific terms may be changed to other forms without departing from the technical idea of the present invention.
Now, a vehicle status information providing method according to an embodiment of the present invention will be described in detail with reference to the drawings.
1 is a diagram showing a configuration of a system for providing vehicle status information according to an embodiment of the present invention.
Referring to FIG. 1, a system for providing vehicle condition information according to an embodiment of the present invention includes a plurality of
Hereinafter, each component will be schematically described with reference to FIG.
First, the
In addition to the above-described communication methods, other widely known or later-developed communication methods may be included.
The plurality of
At this time, the plurality of
A plurality of cameras and a
That is, a plurality of cameras photographs the situation of the position where the camera is installed, and transmits the photographed image information to the situation
Based on at least one of the image information and radar sensor information collected by the specific vehicle and the camera image information and radar sensor information received from at least one vehicle adjacent to the specific vehicle, the driver's situation or the surroundings of the specific vehicle is estimated; an interface to be displayed on the in-vehicle output module is then configured in panoramic form based on the estimated situation information and at least one of the image information and radar sensor information collected by the specific vehicle, and the panorama-form interface is output to the in-vehicle output module.
At this time, when it is determined that the specific vehicle is in an emergency situation, the situation
Meanwhile, the situation
The configuration of the vehicle condition information providing system according to the embodiment of the present invention has been described above.
Hereinafter, the configuration and operation method of the situation
FIG. 2 is a block diagram showing a main configuration of the context
Referring to FIG. 2, the
The
Particularly, in the present invention, the
Here, the radar sensor information refers to a RADAR sensor type for sensing a moving or fixed target by continuously observing a relative speed, an arrival angle and a distance of an object around the
<complexContent>
<extension base="iidl:SensedInfoBaseType">
<sequence maxOccurs="unbounded">
<element name="DetectedObject">
<complexType>
<sequence>
<element name="Distance" type="siv:FValueWithUnitType"/>
<element name="Orientation" type="siv:FValueWithUnitType"/>
<element name="RelativeSpeed" type="siv:FValueWithUnitType"/>
</sequence>
</complexType>
</element>
</sequence>
</extension>
</complexContent>
</complexType>
<complexType name="FValueWithUnitType">
<simpleContent>
<extension base="float">
<attribute name="unit" type="mpegvct:unitType"/>
</extension>
</simpleContent>
</complexType>
The above <Syntax Example 1> shows a declaration for radar sensor information.
Here, TimeStamp is time information indicating when the information was sensed, DetectedObject is object information indicating the detected object, Distance is distance information indicating the distance to the sensed object, Orientation is the angle (direction) of the detected object, and RelativeSpeed is the relative speed of the detected object.
<iidl:SensedInfoList>
<iidl:SensedInfo xsi:type="siv:RADARSensorType" value="100" detected="true" unit="meter" id="RST001">
<iidl:TimeStamp xsi:type="mpegvct:ClockTickTimeType" timeScale="100" pts="6000"/>
<siv:Distance unit="meter">100</siv:Distance>
<siv:Orientation unit="degree">30</siv:Orientation>
<siv:RelativeSpeed unit="kmperhour">60</siv:RelativeSpeed>
</iidl:SensedInfo>
</iidl:SensedInfoList>
</iidl:InteractionInfo>
The above <Syntax Example 2> is an example showing information about an object detected by the radar sensor declared in <Syntax Example 1>.
The sensor's ID is "RST001", the sensor is active (detected="true"), and the distance to the detected object is 100 meters. The radar sensor has a TimeStamp value of 6000 at a detection rate of 100 clock ticks per second. The angle of the object detected by the radar sensor is 30 degrees, and its relative speed is 60 km/h.
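A consumer of this sensed-info description would extract the Distance, Orientation, and RelativeSpeed values together with their units. The following sketch parses a fragment shaped like <Syntax Example 2>; the namespace URIs are placeholders (the real ones would come from the referenced MPEG-V schema):

```python
# Parse a radar SensedInfo fragment; namespace URIs are illustrative
# placeholders, not the real MPEG-V namespaces.
import xml.etree.ElementTree as ET

NS = {"siv": "urn:example:siv", "iidl": "urn:example:iidl"}

xml_text = """
<iidl:SensedInfo xmlns:iidl="urn:example:iidl" xmlns:siv="urn:example:siv"
                 id="RST001">
  <siv:Distance unit="meter">100</siv:Distance>
  <siv:Orientation unit="degree">30</siv:Orientation>
  <siv:RelativeSpeed unit="kmperhour">60</siv:RelativeSpeed>
</iidl:SensedInfo>
"""

root = ET.fromstring(xml_text)
detection = {
    tag: (float(root.find(f"siv:{tag}", NS).text),
          root.find(f"siv:{tag}", NS).get("unit"))
    for tag in ("Distance", "Orientation", "RelativeSpeed")
}
# detection maps each field to a (value, unit) pair, e.g.
# detection["Distance"] == (100.0, "meter")
```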
In addition, the camera image information refers to a camera sensor type that senses the real world through an array of cameras. The array camera sensor type can be defined as an extension of the camera image information with added relative position information, and the camera image information may include at least one of the identification information (Camera), time information (TimeStamp), relative direction information (RelativeOrientation), and relative position information (RelativeLocation) of each camera.
<sequence maxOccurs="unbounded">
<element name="Camera">
<complexType>
<complexContent>
<extension base="siv:CameraSensorType">
<sequence>
<element name="RelativeOrientation" type="siv:OrientationSensorType"/>
<element name="RelativeLocation">
<complexType>
<complexContent>
<extension base="mpegvct:Float3DVectorType">
<attribute name="unit" type="mpegvct:unitType"/>
</extension>
</complexContent>
</complexType>
</element>
</sequence>
</extension>
</complexContent>
</complexType>
</element>
</sequence>
</complexType>
The above <Syntax Example 3> is an example of a declaration for camera image information.
ArrayCameraSensorType is a tool for describing an array camera sensor: Camera is the identification information of each camera sensor, TimeStamp indicates the time when the information was sensed, and RelativeOrientation describes the relative direction of each camera sensor. If a relative orientation is specified, the CameraOrientation described in CameraSensorType is ignored. All camera sensors are positioned relative to the first camera sensor.
<iidl:SensedInfoList>
<iidl:SensedInfo xsi:type="siv:ArrayCameraSensorType" id="ACST001" activate="true">
<siv:Camera focalLength="10" aperture="2.8" id="CID000">
<iidl:TimeStamp xsi:type="mpegvct:ClockTickTimeType" timeScale="100" pts="6000"/>
<siv:RelativeOrientation unit="degree">
<siv:Orientation>
<mpegvct:X>0</mpegvct:X>
<mpegvct:Y>0</mpegvct:Y>
<mpegvct:Z>0</mpegvct:Z>
</siv:Orientation>
</siv:RelativeOrientation>
<siv:RelativeLocation unit="cm">
<mpegvct:X>0</mpegvct:X>
<mpegvct:Y>0</mpegvct:Y>
<mpegvct:Z>0</mpegvct:Z>
</siv:RelativeLocation>
</siv:Camera>
<siv:Camera focalLength="10" aperture="2.8" id="CID001">
<iidl:TimeStamp xsi:type="mpegvct:ClockTickTimeType" timeScale="100" pts="6000"/>
<siv:RelativeOrientation unit="degree">
<siv:Orientation>
<mpegvct:X>0</mpegvct:X>
<mpegvct:Y>0</mpegvct:Y>
<mpegvct:Z>90</mpegvct:Z>
</siv:Orientation>
</siv:RelativeOrientation>
<siv:RelativeLocation unit="cm">
<mpegvct:X>90</mpegvct:X>
<mpegvct:Y>-250</mpegvct:Y>
<mpegvct:Z>0</mpegvct:Z>
</siv:RelativeLocation>
</siv:Camera>
<siv:Camera focalLength="10" aperture="2.8" id="CID002">
<iidl:TimeStamp xsi:type="mpegvct:ClockTickTimeType" timeScale="100" pts="6000"/>
<siv:RelativeOrientation unit="degree">
<siv:Orientation>
<mpegvct:X>0</mpegvct:X>
<mpegvct:Y>0</mpegvct:Y>
<mpegvct:Z>180</mpegvct:Z>
</siv:Orientation>
</siv:RelativeOrientation>
<siv:RelativeLocation unit="cm">
<mpegvct:X>0</mpegvct:X>
<mpegvct:Y>-500</mpegvct:Y>
<mpegvct:Z>0</mpegvct:Z>
</siv:RelativeLocation>
</siv:Camera>
<siv:Camera focalLength="10" aperture="2.8" id="CID003">
<iidl:TimeStamp xsi:type="mpegvct:ClockTickTimeType" timeScale="100" pts="6000"/>
<siv:RelativeOrientation unit="degree">
<siv:Orientation>
<mpegvct:X>0</mpegvct:X>
<mpegvct:Y>0</mpegvct:Y>
<mpegvct:Z>270</mpegvct:Z>
</siv:Orientation>
</siv:RelativeOrientation>
<siv:RelativeLocation unit="cm">
<mpegvct:X>-90</mpegvct:X>
<mpegvct:Y>-250</mpegvct:Y>
<mpegvct:Z>0</mpegvct:Z>
</siv:RelativeLocation>
</siv:Camera>
</iidl:SensedInfo>
</iidl:SensedInfoList>
</iidl:InteractionInfo>
<Syntax Example 4> is an example sentence describing the camera arrangement of <Syntax Example 3>.
The camera sensor has an ID of "ACST001" and four sub-cameras with IDs "CID000" to "CID003". The camera sensor is active, and the sensor has a time information value of 6000 at a detection rate of 100 clock ticks per second.
The first camera sensor is located at (0, 0, 0) with an orientation of (0, 0, 0). The remaining camera sensors are located at (90, -250, 0), (0, -500, 0), and (-90, -250, 0), with orientations of (0, 0, 90), (0, 0, 180), and (0, 0, 270), respectively. The orientation of the sub-cameras varies only about the Z-axis, so that together they cover the entire exterior of the vehicle.
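With the four sub-cameras facing outward at Z-axis yaws of 0, 90, 180, and 270 degrees as in <Syntax Example 4>, a panorama composer could select the camera whose facing is closest to an object's bearing. This is an illustrative sketch, not a procedure stated in the patent:

```python
# Yaw angles (degrees about the Z-axis) of the four sub-cameras
# from the array camera example; IDs taken from <Syntax Example 4>.
CAMERA_YAWS = {"CID000": 0, "CID001": 90, "CID002": 180, "CID003": 270}

def closest_camera(bearing_deg):
    """Return the ID of the sub-camera whose facing direction is
    angularly closest to the given bearing (degrees)."""
    def angular_gap(yaw):
        d = abs(bearing_deg - yaw) % 360
        return min(d, 360 - d)  # wrap-around angular distance
    return min(CAMERA_YAWS, key=lambda cid: angular_gap(CAMERA_YAWS[cid]))
```

For example, an object detected by the radar at a 30-degree bearing would fall in the first camera's view, while one at 200 degrees would fall in the rear camera's view.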
The
Particularly, the
The array camera sensor capability type (ArrayCameraSensorCapabilityType) of the camera image information may include the number of camera sensors (NumberOfCameraCapability), the specifications of each camera sensor (CameraCapability), and access information (AccessPoint).
<complexContent>
<extension base="cidl:SensorCapabilityBaseType">
<sequence>
<element name="NumberOfCameraCapability">
<complexType>
<complexContent>
<extension base="cidl:SensorCapabilityBaseType"/>
</complexContent>
</complexType>
</element>
<element name="CameraCapability" maxOccurs="unbounded">
<complexType>
<complexContent>
<extension base="scdv:CameraSensorCapabilityType">
<sequence>
<element name="AccessPoint" type="anyURI" minOccurs="0"/>
</sequence>
</extension>
</complexContent>
</complexType>
</element>
</sequence>
</extension>
</complexContent>
</complexType>
<Syntax Example 5> describes the capability of the array camera. The array camera capability specifies a valid sensing range for each camera sensor as an extension of the sensor capability base type.
ArrayCameraSensorCapabilityType is a tool for describing the valid range of the array camera sensor: NumberOfCameraCapability describes the number of available camera sensors, CameraCapability describes the capability of each camera sensor, and AccessPoint describes an access point used to retrieve the raw contents of the sensor, such as a video stream.
<scdv:NumberOfCameraCapability maxValue="4" minValue="0"/>
<scdv:CameraCapability id="CC000">
<scdv:AccessPoint>http://vpv.keti.re.kr/C000</scdv:AccessPoint>
</scdv:CameraCapability>
<scdv:CameraCapability id="CC001">
<scdv:AccessPoint>http://vpv.keti.re.kr/C001</scdv:AccessPoint>
</scdv:CameraCapability>
<scdv:CameraCapability id="CC002">
<scdv:AccessPoint>http://vpv.keti.re.kr/C002</scdv:AccessPoint>
</scdv:CameraCapability>
<scdv:CameraCapability id="CC003">
<scdv:AccessPoint>http://vpv.keti.re.kr/C003</scdv:AccessPoint>
</scdv:CameraCapability>
</cidl:SensoryDeviceCapabilityList>
<Syntax Example 6> is an example illustrating a description of the valid range of the array camera sensor defined in <Syntax Example 5>.
The sensor capability has an ID of "ACSCT001", and the array camera has four sub-cameras, each with an access point for retrieving the raw content from the camera.
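The interface configuration module would use such a capability description to discover how many camera streams it may open and where to fetch them. The sketch below reads a fragment shaped like <Syntax Example 6>; the namespace URI and wrapper element are placeholder assumptions:

```python
# Read access points from an array-camera capability description.
# The namespace URI and wrapper element name are illustrative only.
import xml.etree.ElementTree as ET

NS = {"scdv": "urn:example:scdv"}

xml_text = """
<scdv:ArrayCameraSensorCapability xmlns:scdv="urn:example:scdv">
  <scdv:NumberOfCameraCapability maxValue="4" minValue="0"/>
  <scdv:CameraCapability id="CC000">
    <scdv:AccessPoint>http://vpv.keti.re.kr/C000</scdv:AccessPoint>
  </scdv:CameraCapability>
  <scdv:CameraCapability id="CC001">
    <scdv:AccessPoint>http://vpv.keti.re.kr/C001</scdv:AccessPoint>
  </scdv:CameraCapability>
</scdv:ArrayCameraSensorCapability>
"""

root = ET.fromstring(xml_text)
max_cameras = int(root.find("scdv:NumberOfCameraCapability", NS).get("maxValue"))
# Map each camera capability ID to its stream access point, honoring
# the declared maximum number of camera sensors.
access_points = {
    cap.get("id"): cap.find("scdv:AccessPoint", NS).text
    for cap in root.findall("scdv:CameraCapability", NS)[:max_cameras]
}
```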
In addition, the radar sensor capability type (RADARSensorCapabilityType) of the radar sensor information may include at least one of the valid distance information for detecting an object (DistanceCapability), the valid direction information for detecting an object (OrientationCapability), and the valid relative speed information for detecting an object (RelativeSpeedCapability).
<complexContent>
<extension base="cidl:SensorCapabilityBaseType">
<sequence>
<element name="DistanceCapability">
<complexType>
<complexContent>
<extension base="cidl:SensorCapabilityBaseType"/>
</complexContent>
</complexType>
</element>
<element name="OrientationCapability">
<complexType>
<complexContent>
<extension base="cidl:SensorCapabilityBaseType"/>
</complexContent>
</complexType>
</element>
<element name="RelativeSpeedCapability">
<complexType>
<complexContent>
<extension base="cidl:SensorCapabilityBaseType"/>
</complexContent>
</complexType>
</element>
</sequence>
</extension>
</complexContent>
</complexType>
The above <Syntax Example 7> describes the valid range within which a moving or fixed target can be sensed by continuously observing the distance, steering angle, and relative speed of objects around the radar sensor.
The RADARSensorCapabilityType is a tool for describing the RADAR sensor valid range, the DistanceCapability describes the effective distance of the object sensed by the radar sensor, the OrientationCapability describes the effective direction of the object sensed by the radar sensor, and RelativeSpeedCapability It describes the effective relative velocity of the object detected by the radar sensor.
<cidl:SensoryDeviceCapability xsi:type="scdv:RADARSensorCapabilityType" id="RSCT001">
<scdv:DistanceCapability maxValue="500" minValue="0.5" unit="meter"/>
<scdv:OrientationCapability maxValue="100" minValue="0" unit="degree"/>
<scdv:RelativeSpeedCapability maxValue="250" minValue="-250" unit="kmperhour"/>
</cidl:SensoryDeviceCapability>
<Syntax Example 8> is an example illustrating a description of the radar sensor valid range according to <Syntax Example 7>. The valid range has an ID of "RSCT001"; the sensor can measure objects between 0.5 and 500 meters, provides orientation angle information for detected objects between 0 and 100 degrees, and provides relative speed information between -250 km/h and 250 km/h.
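To make the role of the capability description concrete, the following is a minimal sketch (not part of the described schema) of how a consumer of the ranges in <Syntax Example 8> might check whether a raw radar reading lies inside the advertised valid ranges before using it. The dictionary and function names are illustrative assumptions.

```python
# Valid ranges taken from <Syntax Example 8>: distance 0.5-500 m,
# orientation 0-100 degrees, relative speed -250 to 250 km/h.
CAPABILITY = {
    "distance": (0.5, 500.0),           # meters
    "orientation": (0.0, 100.0),        # degrees
    "relative_speed": (-250.0, 250.0),  # km/h
}

def within_capability(kind: str, value: float) -> bool:
    """Return True if the reading lies inside the sensor's valid range."""
    lo, hi = CAPABILITY[kind]
    return lo <= value <= hi
```

A reading outside these bounds (for example, a distance of 0.2 m) would be discarded rather than passed to the interface configuration step.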
On the other hand, the Array Camera Sensor Adaptation Preference type information of the camera image information may include the resolution (Resolution) of the image processed by the interface configuration module, together with the width (Width) and height (Height) ranges of the image to be processed.
<!-- Array Camera Sensor Adaptation Preference type -->
<!-- ########################################################### -->
<complexType name="ArrayCameraAdaptationPrefType">
<complexContent>
<extension base="cidl:SensorAdaptationPreferenceBaseType">
<sequence maxOccurs="unbounded">
<element name="Camera" type="sapv:CameraAdaptationPrefType"/>
</sequence>
</extension>
</complexContent>
</complexType>
<complexType name = "CameraAdaptationPrefType">
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType">
<sequence>
<element name = "Resolution">
<complexType>
<sequence>
<element name = "Width">
<complexType>
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType"/>
</ complexContent>
</ complexType>
</ element>
<element name = "Height">
<complexType>
<complexContent>
<extension base = "cidl: SensorAdaptationPreferenceBaseType"/>
</ complexContent>
</ complexType>
</ element>
</ sequence>
</ complexType>
</ element>
</ sequence>
</ extension>
</ complexContent>
</ complexType>
<Syntax Example 9> specifies the syntax of the user's setting criteria for selectively adjusting the data sensed by the array camera sensor.
ArrayCameraAdaptationPrefType is a tool for describing the processing criteria for information sensed by an array camera: Camera describes the processing criteria for each camera sensor, Resolution describes the resolution required for the camera's sensed information, Width describes the horizontal size range, and Height describes the vertical size range.
<sapv: Camera id = "CID000">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
<sapv: Camera id = "CID001">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
<sapv: Camera id = "CID002">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
<sapv: Camera id = "CID003">
<sapv: Resolution>
<sapv: Width maxValue = "4096" minValue = "2048" numOfLevels = "5"/>
<sapv: Height maxValue = "2160" minValue = "1080" numOfLevels = "5"/>
</ sapv: Resolution>
</ sapv: Camera>
</ cidl: SAPreference>
The above <Syntax Example 10> is an example showing the processing criteria for the information sensed by the array camera described in <Syntax Example 9>.
This processing criterion has an ID of "ACSAP001", and application criteria are described for each of the four camera sensors. Each criterion is defined by a horizontal size range and a vertical size range: the horizontal size range has a maximum of 4096 and a minimum of 2048 and is required to be divided into five steps between them, and the vertical size range has a maximum of 2160 and a minimum of 1080, likewise divided into five steps between the maximum and minimum.
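The five-step division above can be sketched as follows. This is an illustrative interpretation, assuming numOfLevels counts evenly spaced selectable values including both endpoints; the function name is hypothetical, not part of the described schema.

```python
def adaptation_levels(min_value: int, max_value: int, num_of_levels: int) -> list[int]:
    """Return the evenly spaced values between minValue and maxValue
    that a renderer may choose from, per the numOfLevels attribute."""
    if num_of_levels < 2:
        return [max_value]
    step = (max_value - min_value) / (num_of_levels - 1)
    return [round(min_value + i * step) for i in range(num_of_levels)]
```

For the width range in <Syntax Example 10> this yields the candidate widths 2048, 2560, 3072, 3584, and 4096.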
On the other hand, the RADAR Sensor Adaptation Preference type may include a maximum number of detectable objects (maxValue), distance information (Distance) recording the maximum/minimum value and unit of the distance processed by the interface configuration module, direction information (Orientation) recording the range and unit of the angle information to be processed, and relative speed information (RelativeSpeed) recording the maximum/minimum value and unit of the relative speed to be processed.
<!-- RADAR Sensor Adaptation Preference type -->
<!-- ######################################### -->
<complexType name="RADARAdaptationPrefType">
<complexContent>
<extension base="cidl:SensorAdaptationPreferenceBaseType">
<sequence>
<element name="Distance" type="sapv:DistanceAdaptationPrefType" minOccurs="0"/>
<element name="Orientation" type="sapv:OrientationAdaptationPrefType" minOccurs="0"/>
<element name="RelativeSpeed" type="sapv:VelocityAdaptationPrefType" minOccurs="0"/>
</sequence>
</extension>
</complexContent>
</complexType>
<Syntax Example 11> specifies the syntax of the user's setting criteria for selectively adjusting the data sensed by the radar sensor.
RADARAdaptationPrefType is a tool for describing the criteria for processing information sensed by the radar. maxValue is the maximum number of objects to report; if more objects are detected, only the objects closest to the sensor, up to that maximum, are returned. Distance describes the distance criterion for a sensed object, Orientation describes the angle criterion for a sensed object, and RelativeSpeed describes the relative speed criterion for a sensed object.
<Distance unit = "meter" maxValue = "100.0" minValue = "0.0" numOfLevels = "100"/>
<Orientation unit = "degree" numOfLevels = "50">
<OrientationRange>
<YawMin> 0 </ YawMin>
<YawMax> 100 </ YawMax>
<PitchMin> 0 </ PitchMin>
<PitchMax> 0 </ PitchMax>
<RollMin> 0 </ RollMin>
<RollMax> 0 </ RollMax>
</ OrientationRange>
</ Orientation>
<RelativeSpeed unit = "kmperhour" maxValue = "250" minValue = "250" numOfLevels = "500"/>
</ cidl: SAPreference>
<Syntax Example 12> shows an example of the information processing criteria of <Syntax Example 11>.
The processing criteria have an ID of "RADARSAP001", and the maximum number of objects that can be detected by the radar is 3. The maximum distance criterion is 100 meters and the minimum is 0 meters, and the distance value is divided into 100 steps between the maximum and minimum.
The orientation angle information is described only by the Yaw value, with a maximum of 100 degrees and a minimum of 0 degrees, divided into 50 steps between the maximum and minimum.
The relative speed criterion has a maximum of 250 km/h and a minimum of -250 km/h, divided into 500 steps between the maximum and minimum.
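The selection behavior described for <Syntax Example 12> can be sketched as follows: detections outside the preferred distance range are dropped, and when more objects remain than maxValue allows, only the closest ones are kept. The detection record layout is an assumption made for illustration.

```python
MAX_OBJECTS = 3                    # maxValue in Syntax Example 12
DIST_MIN, DIST_MAX = 0.0, 100.0    # preferred distance range, meters

def adapt_detections(detections: list[dict]) -> list[dict]:
    """Filter radar detections according to the adaptation preference:
    keep only in-range objects, closest first, up to MAX_OBJECTS."""
    in_range = [d for d in detections if DIST_MIN <= d["distance"] <= DIST_MAX]
    in_range.sort(key=lambda d: d["distance"])  # closest first
    return in_range[:MAX_OBJECTS]
```

With five detections at 120, 5, 50, 30, and 80 meters, the 120 m object is out of range and the three closest survivors (5, 30, 50 m) are reported.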
Meanwhile, the
The
In particular, the
At least one of the camera image information and the radar sensor information received from at least one of the vehicles can be used to estimate the driver condition or the surrounding condition of the specific vehicle.
When the vehicle
In addition, the information on the sensed object included in the radar sensor information, and the size and shape of the sensed object measured from the camera image information, can be stored in the storage module.
If it is determined that the
Here, the case where the
The
In this case, the interface may be configured using information included in the camera image information.
For example, images captured at the current time can be selected, based on their time information, as the candidate group of image information constituting the interface, and the position of each image within the interface can be determined based on the identification information of each camera.
That is, when a camera image is received from the rear camera of a surrounding vehicle ahead of the specific vehicle, it can be placed at the interface position corresponding to the view in front of the specific vehicle.
In addition, it is possible to determine which of the received camera image information is to be exposed at the designated position based on the relative direction information or the relative position information of the sensed object.
That is, it may be advantageous to the driver to expose on the interface the information captured closest to the specific vehicle.
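The placement rule described above can be sketched as follows: each received image's relative direction decides which panel of the panoramic interface it occupies, and when several sources map to the same panel, the closer vehicle is preferred. The panel names, bearing convention, and record layout are illustrative assumptions, not taken from the patent.

```python
PANELS = ["front", "right", "rear", "left"]

def panel_for(bearing_deg: float) -> str:
    """Map a relative bearing (0 = dead ahead, clockwise) to a panel,
    using 90-degree sectors centered on each direction."""
    idx = int(((bearing_deg + 45) % 360) // 90)
    return PANELS[idx]

def choose_sources(sources: list[dict]) -> dict:
    """Pick, per panel, the source vehicle closest to the ego vehicle."""
    chosen: dict = {}
    for s in sources:
        p = panel_for(s["bearing"])
        if p not in chosen or s["distance"] < chosen[p]["distance"]:
            chosen[p] = s
    return chosen
```

A source at bearing 350 degrees and 8 m away would win the front panel over one at bearing 10 degrees and 20 m away.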
In addition, when an accident occurs around the
That is, if there is an accident vehicle, the rear camera image information of an adjacent vehicle, photographed from in front of the accident vehicle, can be exposed on the part of the interface corresponding to the situation in front of the specific vehicle.
In addition, if a specific object is determined to be approaching, based on the information about the sensed object included in the radar sensor information and on the image information, then information about that object (e.g., size, vehicle model, price, fuel consumption, etc.) can be displayed in a specific display window; the distance information and direction information can be converted into text and output; and the predicted collision time, or a probabilistically calculated collision probability, can be converted into text and exposed to the driver.
If the collision probability becomes equal to or greater than a predetermined value, an alert such as a flashing warning light can be shown, together with an emergency alarm message, on the display corresponding to the position at which the collision is predicted.
That is, when a collision is predicted on the left side of the specific vehicle, the alert can be exposed on the display located on the left side.
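The collision-prediction text described above can be sketched with a simple time-to-collision estimate from the sensed distance and relative (closing) speed. The sign convention, the 3-second threshold, and the message format are assumptions made for illustration, not values given in the patent.

```python
from typing import Optional

def time_to_collision(distance_m: float, relative_speed_kmh: float) -> Optional[float]:
    """Return seconds until collision, or None if the object is not closing.

    A negative relative speed is taken to mean the object is approaching.
    """
    if relative_speed_kmh >= 0:
        return None
    closing_mps = -relative_speed_kmh / 3.6  # km/h -> m/s
    return distance_m / closing_mps

def alert_text(distance_m: float, relative_speed_kmh: float,
               threshold_s: float = 3.0) -> str:
    """Convert the prediction into driver-facing text, as the description suggests."""
    ttc = time_to_collision(distance_m, relative_speed_kmh)
    if ttc is None or ttc > threshold_s:
        return "clear"
    return f"WARNING: collision predicted in {ttc:.1f} s"
```

An object 50 m away closing at 90 km/h (25 m/s) gives a TTC of about 2 seconds, which would trigger the warning under the assumed threshold.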
In addition, the
The predetermined time can be changed according to the driver's settings, the camera image information can be extracted based on the time information it contains, and the surrounding situation can be reconstructed in the same manner as described above.
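The extraction step above can be sketched as a timestamp-window query over the stored records: frames whose TimeStamp falls within the driver-configured window preceding an event are selected, in order, for replay. The record layout is an assumption made for illustration.

```python
def select_window(records: list[dict], event_time: float, window_s: float) -> list[dict]:
    """Return stored records whose timestamp lies in
    [event_time - window_s, event_time], sorted chronologically."""
    lo = event_time - window_s
    return sorted(
        (r for r in records if lo <= r["timestamp"] <= event_time),
        key=lambda r: r["timestamp"],
    )
```

With records timestamped 1 through 10, an event at t=8 and a 3-second window yields the records at 5, 6, 7, and 8.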
The
The collected sensing information includes the same information as the camera image information or the radar sensor information received through the communication network.
Referring to FIG. 3, the operation of the situation information providing server will be described.
The situation
Meanwhile, the situation
For example, when an accident occurs in the driver's driving, the situation
The configuration and operation method of the situation
Hereinafter, an embodiment of a panorama-type interface configuration method according to the present invention will be described.
FIG. 4 is a view for explaining a process in which the situation information providing server receives the camera image information and the radar sensor information.
4, when the situation
Referring to FIG. 5, an example of the configured interface is shown, in which image information is received from four cameras.
The interface thus configured can be transmitted to, and displayed on, the output module of the vehicle.
Referring to FIG. 6, examples in which the reconfigured panorama-type interface is exposed on various display devices are shown in FIGS. 6A to 6C.
As described above, this specification contains details of a number of specific implementations, but these should not be construed as limitations on the scope of any invention or of what may be claimed, but rather as descriptions of features that may be specific to particular embodiments of particular inventions. Certain features described herein in the context of separate embodiments may also be implemented in combination in a single embodiment. Conversely, various features described in the context of a single embodiment may also be implemented in multiple embodiments, either individually or in any suitable subcombination. Furthermore, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination may in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or a variation of a subcombination.
Likewise, although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown, or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain cases, multitasking and parallel processing may be advantageous. Moreover, the separation of the various system components in the above-described embodiments should not be understood as requiring such separation in all embodiments, and it should be understood that the described program components and systems can generally be integrated together into a single software product or packaged into multiple software products.
Certain embodiments of the subject matter described herein have been described. Other embodiments are within the scope of the following claims. For example, the operations recited in the claims may be performed in a different order and still achieve desirable results. As one example, the processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In certain implementations, multitasking and parallel processing may be advantageous.
The description sets forth the best mode of the present invention and provides examples to illustrate the invention and to enable a person skilled in the art to make and use the invention. The written description is not intended to limit the invention to the specific terminology presented. Thus, while the present invention has been described in detail with reference to the above examples, those skilled in the art will recognize that modifications and changes can be made to these examples without departing from the scope of the present invention.
Therefore, the scope of the present invention should not be limited by the described embodiments but should be defined by the claims.
The present invention relates to a method and apparatus for providing vehicle status information in the form of a panorama using multiple cameras and radar sensor information. More particularly, a situation information providing apparatus judges the situation of each vehicle, reconstructs an interface based on the received image information and sensing information, and transmits the reconstructed interface to the vehicle, thereby configuring a dynamic vehicle interface according to each driver's situation.
According to the present invention, a plurality of adjacent vehicles are connected, the image information and sensing information obtained from each vehicle are shared, and an interface is configured for each driver's situation, so that information about a vehicle's surroundings can be acquired widely and from various viewpoints. This improves the driver's understanding of the vehicle's surroundings, thereby contributing to safe driving and to the development of the automobile industry.
In addition, the present invention is industrially applicable because it can not only be marketed and operated but also be clearly practiced in reality.
100: communication network 200: vehicle 210: camera and radar sensor
300: vehicle situation information providing server
Claims (10)
A sensor module for receiving at least one of camera image information and radar sensor information from a plurality of cameras and radar sensors installed in a specific vehicle;
A vehicle condition determination module for estimating a driver condition or a surrounding condition of a specific vehicle based on at least one of information received from the sensor module, camera image information received from at least one vehicle, and radar sensor information; And
An interface to be displayed on the in-vehicle output module based on at least one of the estimated situation information, the information received from the sensor module, the camera image information received from at least one vehicle, and the radar sensor information, Interface configuration module
Wherein the status information providing apparatus comprises:
And transmits the emergency notification information to the specific vehicle when it is determined that the specific vehicle is in an emergency situation.
A storage module for storing the received information;
Further comprising:
The interface configuration module
And when the driver is requested, reconstructs the driver's situation or surrounding situation for a predetermined time based on the stored information.
And at least one of identification information (Camera), time information (TimeStamp), relative direction information (RelativeOrientation) and relative position information (RelativeLocation) of each camera.
Wherein the radar sensor information includes at least one of time information (TimeStamp), detected object information (DetectedObject), distance information (Distance), orientation information (Orientation), and relative speed information (RelativeSpeed).
Further comprising a storage module for storing an effective range of camera image information that can be processed by said interface configuration module,
The effective range of the camera image information is
Information on at least one of the number of camera sensors and the access point (AccessPoint) of each camera sensor.
Further comprising a storage module for storing an effective range of the radar sensor information that can be processed by said interface configuration module,
The effective range of the radar sensor information is
Information on at least one of effective distance information for sensing an object (DistanceCapability), effective direction information for sensing an object (OrientationCapability), and effective relative speed information for sensing an object (RelativeSpeedCapability).
Further comprising a storage module for storing information processing reference information that becomes a reference when the interface configuration module processes the camera image information to configure the interface,
The reference information of the camera image is
At least one of resolution information (Resolution) of the image processed by the interface configuration module, width information (Width) of the image to be processed, and height information (Height) of the image to be processed.
Further comprising a storage module for storing information processing reference information that is a reference when the interface configuration module processes the radar sensor information to configure the interface,
The reference information of the radar sensor
Distance information (Distance) recording the maximum/minimum value and unit of the distance processed by the interface configuration module, direction information (Orientation) recording the range and unit of the angle information to be processed, and relative speed information (RelativeSpeed) recording the maximum/minimum value and unit of the relative speed to be processed.
Estimating a driver condition or a surrounding condition of the specific vehicle based on the received information;
Configuring an interface to be displayed on the in-vehicle output module based on at least one of the estimated situation information, the camera image information received from the at least one camera, and the radar sensor information received from the radar sensor; And
Transmitting the configured interface information to the specific vehicle
And the vehicle status information is provided to the vehicle.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150114235A KR101734985B1 (en) | 2015-08-13 | 2015-08-13 | Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor |
PCT/KR2015/009850 WO2016060384A1 (en) | 2014-10-17 | 2015-09-21 | Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150114235A KR101734985B1 (en) | 2015-08-13 | 2015-08-13 | Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170020620A true KR20170020620A (en) | 2017-02-23 |
KR101734985B1 KR101734985B1 (en) | 2017-05-25 |
Family
ID=59050759
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150114235A KR101734985B1 (en) | 2014-10-17 | 2015-08-13 | Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101734985B1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109552315A (en) * | 2017-09-26 | 2019-04-02 | 纵目科技(上海)股份有限公司 | Full visual field camera engine control system |
CN112346040A (en) * | 2019-08-09 | 2021-02-09 | 大众汽车股份公司 | Method and device for determining parallax problem in sensor data of two sensors |
KR102612227B1 (en) * | 2023-07-12 | 2023-12-11 | 노지효 | System for smart lighting for traffic safety and control method thereof |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2023204319A1 (en) * | 2022-04-18 | 2023-10-26 | ㈜와토시스 | Apparatus and method for processing vehicle image data and system comprising apparatus |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005115484A (en) * | 2003-09-17 | 2005-04-28 | Fujitsu Ten Ltd | Driving support device |
JP4952421B2 (en) * | 2007-07-18 | 2012-06-13 | 株式会社デンソー | Driving support system |
-
2015
- 2015-08-13 KR KR1020150114235A patent/KR101734985B1/en active IP Right Grant
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109552315A (en) * | 2017-09-26 | 2019-04-02 | 纵目科技(上海)股份有限公司 | Full visual field camera engine control system |
CN109552315B (en) * | 2017-09-26 | 2022-06-24 | 纵目科技(上海)股份有限公司 | Full-view camera host control system |
CN112346040A (en) * | 2019-08-09 | 2021-02-09 | 大众汽车股份公司 | Method and device for determining parallax problem in sensor data of two sensors |
KR102612227B1 (en) * | 2023-07-12 | 2023-12-11 | 노지효 | System for smart lighting for traffic safety and control method thereof |
Also Published As
Publication number | Publication date |
---|---|
KR101734985B1 (en) | 2017-05-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101734985B1 (en) | Method and Apparatus for Providing Panoramic Vehicle Status Information using Multiple Cameras and Radar Sensor | |
TWI469886B (en) | Cooperative event data record system and method | |
CN108844754B (en) | For assessing the test device of Senior Officer's auxiliary system | |
CA2882693C (en) | Method, apparatus and system for performing facial recognition | |
US20190096215A1 (en) | Amber alert monitoring and support | |
JP5262976B2 (en) | In-vehicle display device | |
CN103786644B (en) | Apparatus and method for following the trail of peripheral vehicle location | |
US10410064B2 (en) | System for tracking and identifying vehicles and pedestrians | |
US20200322585A1 (en) | Image processing device, image processing method, and vehicle | |
CA2594337A1 (en) | Method and system for tracking position of an object using imaging and non-imaging surveillance devices | |
JP2007129560A (en) | Object detector | |
US20150360617A1 (en) | Automated Emergency Response Systems for a Vehicle | |
JP2000244897A (en) | State recognition system and state recognition display generation method | |
JP4643860B2 (en) | VISUAL SUPPORT DEVICE AND SUPPORT METHOD FOR VEHICLE | |
KR20130088480A (en) | Integration control system and method using surveillance camera for vehicle | |
WO2008105935A2 (en) | Video surveillance system providing tracking of a moving object in a geospatial model and related methods | |
CN110738150B (en) | Camera linkage snapshot method and device and computer storage medium | |
US20160267336A1 (en) | Method for calibrating a camera for a gaze direction detection in a vehicle, device for a motor vehicle having a camera and at least one further element, and computer program product | |
CN109249857A (en) | Automobile calling system | |
CN113240939A (en) | Vehicle early warning method, device, equipment and storage medium | |
US10248132B2 (en) | Method and apparatus for visualization of an environment of a motor vehicle | |
US20150258935A1 (en) | Surroundings monitoring system for a vehicle | |
JP6891926B2 (en) | Vehicle systems, methods performed on vehicle systems, and driver assistance systems | |
US8213683B2 (en) | Driving support system with plural dimension processing units | |
WO2016060384A1 (en) | Method and device for providing panoramic vehicle situation information using multiple cameras and radar sensor information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E90F | Notification of reason for final refusal | ||
AMND | Amendment | ||
E601 | Decision to refuse application | ||
AMND | Amendment | ||
X701 | Decision to grant (after re-examination) | ||
GRNT | Written decision to grant |