CN110296715A - Electronic device and its control method, computer readable recording medium - Google Patents
Electronic device and its control method, computer readable recording medium
- Publication number
- CN110296715A (application No. CN201910530448.7A)
- Authority
- CN
- China
- Prior art keywords
- lane
- vehicle
- change
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3658—Lane guidance
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/21—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays
- B60K35/22—Display screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/20—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
- B60K35/28—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/10—Path keeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
- B60W40/06—Road conditions
- B60W40/072—Curvature of the road
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
- B60W40/105—Speed
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3626—Details of the output of route guidance instructions
- G01C21/3647—Guidance involving output of stored or live camera images or video streams
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K2360/00—Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
- B60K2360/20—Optical features of instruments
- B60K2360/21—Optical features of instruments using cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W2050/0001—Details of the control system
- B60W2050/0002—Automatic control, details of type of controller or control system architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2556/00—Input parameters relating to data
- B60W2556/45—External transmission of data to or from the vehicle
- B60W2556/50—External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Y—INDEXING SCHEME RELATING TO ASPECTS CROSS-CUTTING VEHICLE TECHNOLOGY
- B60Y2300/00—Purposes or special features of road vehicle drive control systems
- B60Y2300/10—Path keeping
Landscapes
- Engineering & Computer Science (AREA)
- Automation & Control Theory (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Mathematical Physics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Navigation (AREA)
- Traffic Control Systems (AREA)
- Instructional Devices (AREA)
Abstract
The present invention discloses an electronic device, a control method thereof, and a computer-readable recording medium. The control method of the electronic device of the invention includes: determining, from driving-related image data of a vehicle, the type and color of the lane lines on both sides of the lane in which the vehicle is traveling; comparing the determined type and color of the two side lane lines with a stored lane determination table, thereby determining lane position information of the traveling vehicle; and, when a turn point exists on the route within a predetermined distance ahead of the vehicle, comparing the determined lane position information with the path direction of the turn point to determine whether a lane change of the vehicle is needed, wherein the stored lane determination table is a table that defines the type and color of the lane lines on both sides according to the lane of the vehicle.
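As a purely illustrative, non-authoritative sketch of the decision flow summarized above, the following Python fragment compares a lane position determined from a lane determination table with the path direction of an upcoming turn to decide whether a lane change is needed. All names, table entries, and conditions are the editor's assumptions, not the claimed implementation.

```python
# Hypothetical sketch of the abstract's decision flow; not the patented implementation.

# Simplified lane determination table: (left line, right line) -> lane position label.
LANE_TABLE = {
    (("solid", "yellow"), ("dashed", "white")): "first",   # center line on the left (assumed)
    (("dashed", "white"), ("dashed", "white")): "middle",
    (("dashed", "white"), ("solid", "white")): "last",     # road edge on the right (assumed)
}

def determine_lane_position(left_line, right_line):
    """Look up the lane position from the type/color of both side lane lines."""
    return LANE_TABLE.get((left_line, right_line), "unknown")

def lane_change_needed(lane_position, turn_direction):
    """Compare the current lane with the path direction of the turn ahead."""
    if turn_direction == "left":
        return lane_position != "first"    # assume left turns require the first lane
    if turn_direction == "right":
        return lane_position != "last"     # assume right turns require the last lane
    return False

# Example: vehicle in a middle lane, right turn within the look-ahead distance.
position = determine_lane_position(("dashed", "white"), ("dashed", "white"))
print(position, lane_change_needed(position, "right"))   # middle True
```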
Description
This application is a divisional application of the Chinese invention patent application filed on June 17, 2015, with application No. 201510336818.5 and entitled "Electronic device and its control method".
Technical field
The present invention relates to an electronic device, a control method thereof, and a computer-readable recording medium, and more particularly, to an electronic device capable of recognizing the lane in which a vehicle is located and performing driving-related guidance based on the recognized lane, a control method thereof, and a computer-readable recording medium.
Background art
When a vehicle is driven, safe driving and the prevention of traffic accidents are of the greatest importance. To this end, vehicles are equipped with various auxiliary devices that control the posture of the vehicle and the functions of its structural components, as well as safety devices such as seat belts and airbags.
In addition, devices such as black boxes installed in vehicles have recently come to store the driving images of the vehicle and the data transmitted from various sensors, so that the cause of an accident can be determined when the vehicle is involved in a traffic accident; there is thus a trend of installing such devices in vehicles. Since black box or navigation applications can also be installed on portable terminals such as smartphones and tablet computers, such terminals are increasingly used as vehicle devices of this kind.
In practice, however, the utilization of driving images in such vehicle devices is currently very low. More specifically, even though a driving image of the vehicle is obtained through a vision sensor such as a camera mounted on the vehicle, the electronic device of the vehicle merely displays or transmits the image, or at most provides simple surrounding notifications such as whether the vehicle has departed from a lane line.
In addition, head-up displays (HUD) and augmented reality interfaces have been proposed as new vehicle electronic devices attracting attention, but in these devices as well the utilization of the driving image of the vehicle remains at the level of simply displaying it or generating simple notification information.
Summary of the invention
The present invention has been proposed to solve the problems described above, and an object of the present invention is to provide an electronic device that generates lane position information indicating the lane in which a vehicle is located from driving-related image data of the vehicle and performs driving-related guidance based on that information, and a control method thereof.
Another object of the present invention is to provide an electronic device capable of effectively performing driving-related guidance based on augmented reality, and a control method thereof.
A control method of an electronic device according to an embodiment of the present invention for achieving the above objects includes: identifying a lane line region from driving-related image data of a vehicle; generating, from the image data of the identified lane line region, lane line information corresponding to the lane in which the vehicle is located; generating lane position information of the vehicle using at least one of the generated lane line information and lane information of the road on which the vehicle is located; and performing driving-related guidance for the vehicle using the generated lane position information.
The lane line information may include lane mark type information and lane line color information corresponding to each of the lane lines located on both sides of the lane in which the vehicle is located.
Generating the lane position information may include: obtaining the lane information of the road on which the vehicle is located from map data; determining, using the generated lane line information, whether the vehicle is located in the first lane or the last lane of the road; and, when the vehicle is located in the first lane or the last lane, generating the lane position information of the vehicle by reflecting the lane information of the road.
Generating the lane position information may further include: when the lane in which the vehicle is located is changed to a lane between the first lane and the last lane as the vehicle changes lanes, updating the generated lane position information to the lane position information after the change.
The method may further include: when the lane in which the vehicle is located is changed from a lane between the first lane and the last lane to the first lane or the last lane as the vehicle changes lanes, regenerating the lane position information of the vehicle by reflecting the lane information of the road on which the vehicle is located.
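The anchoring-and-updating behavior described above can be illustrated with the following hedged Python sketch. The class, method names, and anchoring conditions are the editor's assumptions: the absolute lane index is fixed whenever the side lane lines identify the first or last lane, and is only adjusted relatively for changes into intermediate lanes.

```python
# Hedged sketch: track an absolute lane index from first/last-lane anchors
# and relative lane-change events. All names are illustrative only.

class LanePositionTracker:
    def __init__(self, road_lane_count):
        self.road_lane_count = road_lane_count  # number of lanes, taken from map data
        self.lane_index = None                  # 1 = first lane, N = last lane

    def on_lane_line_info(self, left_type, right_type):
        """Anchor the index when the side lines identify the first or last lane."""
        if left_type == "center_line":          # e.g. a solid yellow line on the left
            self.lane_index = 1
        elif right_type == "road_edge":         # e.g. a solid line at the right road edge
            self.lane_index = self.road_lane_count

    def on_lane_change(self, direction):
        """Update the index relatively for changes into intermediate lanes."""
        if self.lane_index is None:
            return
        self.lane_index += 1 if direction == "right" else -1
        self.lane_index = max(1, min(self.road_lane_count, self.lane_index))

tracker = LanePositionTracker(road_lane_count=4)
tracker.on_lane_line_info("center_line", "dashed_white")  # anchored to lane 1
tracker.on_lane_change("right")                           # now lane 2
print(tracker.lane_index)                                 # 2
```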
Performing the driving-related guidance for the vehicle may include outputting lane change guidance using the guidance route of the vehicle and the lane position information.
Performing the driving-related guidance for the vehicle may also include outputting guidance on the lane in which the vehicle is located using the lane position information.
The method may further include selecting and outputting appropriate lane departure guidance according to the type of the lane lines on both sides of the lane in which the vehicle is located, identified based on the lane line information.
The outputting may include: generating an indicator for performing the driving-related guidance; and outputting the generated indicator through augmented reality.
Meanwhile, an electronic device according to an embodiment of the present invention for achieving the above objects includes: a lane line information generation unit that identifies a lane line region from driving-related image data of a vehicle and generates, from the image data of the identified lane line region, lane line information corresponding to the lane in which the vehicle is located; a lane position information generation unit that generates lane position information of the vehicle using at least one of the generated lane line information and lane information of the road on which the vehicle is located; and a control unit that performs driving-related guidance for the vehicle using the generated lane position information.
The lane line information may include lane mark type information and lane line color information corresponding to each of the lane lines located on both sides of the lane in which the vehicle is located.
The lane position information generation unit may obtain the lane information of the road on which the vehicle is located from map data, determine, using the generated lane line information, whether the vehicle is located in the first lane or the last lane of the road, and, when the vehicle is located in the first lane or the last lane, generate the lane position information of the vehicle by reflecting the lane information of the road.
When the lane in which the vehicle is located is changed to a lane between the first lane and the last lane as the vehicle changes lanes, the lane position information generation unit may update the generated lane position information to the lane position information after the change.
When the lane in which the vehicle is located is changed from a lane between the first lane and the last lane to the first lane or the last lane as the vehicle changes lanes, the lane position information generation unit may regenerate the lane position information of the vehicle by reflecting the lane information of the road on which the vehicle is located.
The control unit may control an output unit to output lane change guidance using the guidance route of the vehicle and the lane position information.
The control unit may also control the output unit to output guidance on the lane in which the vehicle is located using the lane position information.
The control unit may also control the output unit to select and output appropriate lane departure guidance according to the type of the lane lines on both sides of the lane in which the vehicle is located, identified based on the lane line information.
The control unit may also control the output unit to generate an indicator for performing the driving-related guidance and to output the generated indicator through augmented reality.
Meanwhile, a control method of an electronic device according to an embodiment of the present invention for achieving the above objects includes: receiving a user input requesting route guidance; generating a route guide line based on destination information corresponding to the request; correcting the generated route guide line by reflecting the turning radius of the driving trajectory of an actually traveling vehicle; performing variable three-dimensional conversion so that the height of the corrected route guide line varies according to the distance from the vehicle; generating a route guidance indicator by mapping a texture onto the three-dimensional data generated by the variable three-dimensional conversion; and outputting the generated route guidance indicator on a screen through augmented reality.
Correcting the route guide line may include: adding vertices for keeping the route guide line ahead of the vehicle straight; adding curve vertices for forming curved sections to the curved sections of the route guide line; and generating, using the added vertices, a route guide line that reflects the turning radius of the driving trajectory of an actually traveling vehicle.
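A minimal sketch of this kind of vertex processing is given below, under the assumption that the curve correction can be approximated by corner-cutting (Chaikin-style) smoothing: vertices on straight runs remain collinear, so the line ahead of the vehicle stays visually straight, while each corner is replaced by pairs of vertices that approximate a turning arc. The names and the choice of smoothing scheme are the editor's, not the patent's.

```python
# Hypothetical route-guide-line correction: keep straight segments, round corners.

def correct_guide_line(points, rounds=2):
    """points: list of (x, y) route vertices. Returns a smoothed polyline whose
    corners approximate the turning radius of an actually driven trajectory."""
    for _ in range(rounds):
        smoothed = [points[0]]                       # keep the first vertex (near the vehicle)
        for p, q in zip(points[:-1], points[1:]):
            # Replace each segment by two interior vertices (25% / 75%), which
            # cuts corners while leaving collinear straight runs unchanged in shape.
            smoothed.append((0.75 * p[0] + 0.25 * q[0], 0.75 * p[1] + 0.25 * q[1]))
            smoothed.append((0.25 * p[0] + 0.75 * q[0], 0.25 * p[1] + 0.75 * q[1]))
        smoothed.append(points[-1])                  # keep the final (destination) vertex
        points = smoothed
    return points

# Example: a sharp 90-degree corner in the raw route becomes a rounded corner.
raw = [(0.0, 0.0), (0.0, 50.0), (30.0, 50.0)]
corrected = correct_guide_line(raw)
print(len(corrected), corrected[:3])
```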
Performing the variable three-dimensional conversion may include: generating virtual route guide lines on both sides of the corrected route guide line; calculating height values such that the height value of each vertex of the corrected route guide line increases in proportion to its distance from the vehicle; and performing three-dimensional conversion, using polygons, on the vertices of the route guide line for which the height values have been calculated and the vertices of the virtual route guide lines.
In generating the route guidance indicator, a texture that is displaced according to the speed of the vehicle may be mapped onto the three-dimensional data to generate the route guidance indicator.
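The following hedged Python sketch illustrates these two steps under simplified assumptions made by the editor (a fixed lateral offset for the side lines, a linear height-per-metre factor, and a texture that scrolls by a v-coordinate offset proportional to speed); it is not the disclosed implementation.

```python
# Hedged sketch of the "variable 3D" step: heights grow with distance from the
# vehicle, side vertices form polygon quads, and the texture scrolls with speed.
import math

def variable_3d(guide_line, half_width=1.5, height_per_meter=0.02):
    """guide_line: list of (x, y) vertices ordered from the vehicle outward.
    Returns (left, center, right) vertex lists with z growing with distance."""
    left, center, right = [], [], []
    dist = 0.0
    for i, (x, y) in enumerate(guide_line):
        if i > 0:
            px, py = guide_line[i - 1]
            dist += math.hypot(x - px, y - py)
        z = height_per_meter * dist                 # height proportional to distance
        # Virtual guide lines on both sides of the corrected line (fixed x-offset here).
        left.append((x - half_width, y, z))
        center.append((x, y, z))
        right.append((x + half_width, y, z))
    return left, center, right

def quads(left, right):
    """Pair up side vertices into polygons (quads) for the indicator mesh."""
    return [(left[i], right[i], right[i + 1], left[i + 1]) for i in range(len(left) - 1)]

def texture_v_offset(prev_offset, speed_mps, dt, metres_per_repeat=5.0):
    """Scroll the texture in proportion to vehicle speed so the indicator appears to flow."""
    return (prev_offset + speed_mps * dt / metres_per_repeat) % 1.0

line = [(0.0, float(y)) for y in range(0, 50, 10)]
l, c, r = variable_3d(line)
print(len(quads(l, r)), texture_v_offset(0.0, speed_mps=20.0, dt=0.033))
```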
The method may further include: performing calibration for estimating camera parameters corresponding to the camera from an image captured by the camera; and generating a virtual three-dimensional space from the image captured by the camera based on the camera parameters, wherein, in the outputting, the generated route guidance indicator may be combined with the generated three-dimensional space and output.
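The calibration and virtual-three-dimensional-space steps can be summarized with a generic pinhole-camera sketch: intrinsics are assumed from the image size and an estimated field of view, and road-plane points are projected into the two-dimensional image so the indicator can be drawn over the camera frame. This projection model and all names are chosen by the editor, not taken from the patent's specific calibration procedure.

```python
# Hypothetical pinhole projection used to place the 3D indicator on the 2D image.
import math

def camera_matrix(image_w, image_h, horizontal_fov_deg):
    """Assume intrinsics from image size and an estimated horizontal field of view."""
    fx = (image_w / 2.0) / math.tan(math.radians(horizontal_fov_deg) / 2.0)
    return fx, fx, image_w / 2.0, image_h / 2.0     # fx, fy, cx, cy

def project(point, intrinsics, cam_height=1.2):
    """Project a road point (x right, y forward, z up, metres) into pixel coordinates
    for a camera at cam_height looking straight ahead (no rotation, for simplicity)."""
    fx, fy, cx, cy = intrinsics
    x, y, z = point
    zc = y                                          # forward distance becomes camera depth
    if zc <= 0.1:
        return None                                 # behind or too close to the camera
    u = cx + fx * x / zc
    v = cy - fy * (z - cam_height) / zc
    return u, v

K = camera_matrix(1280, 720, horizontal_fov_deg=90.0)
print(project((0.0, 20.0, 0.0), K))                 # a point 20 m ahead on the road surface
```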
Meanwhile, an electronic device according to an embodiment of the present invention for achieving the above objects may include: an input unit that receives a user input requesting route guidance; a route guide line generation unit that generates a route guide line based on destination information corresponding to the request; a route guidance indicator generation unit that generates a route guidance indicator for route guidance in augmented reality using the generated route guide line; and a display unit that outputs the generated route guidance indicator on a screen through augmented reality. The route guidance indicator generation unit may include: a route guide line processing unit that corrects the generated route guide line by reflecting the turning radius of the driving trajectory of an actually traveling vehicle; a route guide line three-dimensional conversion unit that performs variable three-dimensional conversion so that the height of the corrected route guide line varies according to the distance from the vehicle; and a texture mapping unit that generates the route guidance indicator by mapping a texture onto the three-dimensional data generated by the variable three-dimensional conversion.
The route guide line processing unit may add vertices for keeping the route guide line ahead of the vehicle straight, add curve vertices for forming curved sections to the curved sections of the route guide line, and generate, using the added vertices, a route guide line that reflects the turning radius of the driving trajectory of an actually traveling vehicle. The route guide line three-dimensional conversion unit may generate virtual route guide lines on both sides of the corrected route guide line, calculate height values such that the height value of each vertex of the corrected route guide line increases in proportion to its distance from the vehicle, and perform three-dimensional conversion, using polygons, on the vertices of the route guide line for which the height values have been calculated and the vertices of the virtual route guide lines.
The texture mapping unit may generate the route guidance indicator by mapping, onto the three-dimensional data, a texture that is displaced according to the speed of the vehicle.
The electronic device may further include: a calibration unit that performs calibration for estimating camera parameters corresponding to the camera from an image captured by the camera; and a three-dimensional space generation unit that generates a virtual three-dimensional space from the image captured by the camera based on the camera parameters, wherein the display unit may combine the generated route guidance indicator with the generated three-dimensional space and output the result.
Meanwhile, a recording medium according to an embodiment of the present invention for achieving the above objects may record program code for executing the above-described control method of the electronic device on a computer.
According to the various embodiments of the present invention described above, lane line information corresponding to the lane in which the vehicle is located can be generated from the image data of the lane line region, and the necessary processing can be performed. Accordingly, various kinds of information processing can be performed, including outputting a lane line interface that uses the lane line information and generating an augmented reality interface.
In addition, according to the various embodiments of the present invention, even when a road has multiple central lanes (for example, when the number of lanes is four or more), the lane in which the vehicle is located among the multiple central lanes can be accurately determined.
In addition, according to the various embodiments of the present invention, the lane in which the vehicle is located can be determined and guided to the driver, thereby assisting the driver.
In addition, according to the various embodiments of the present invention, lane change notification can be performed accurately using the route information of the navigation function and the lane in which the vehicle is located, thereby improving user convenience.
In addition, according to the various embodiments of the present invention, an appropriate lane change notification is selectively performed according to the type of the lane lines on both sides of the lane in which the vehicle is traveling, using the lane line information, so that the performance of the lane change notification can be improved and richer information can be provided.
In addition, according to the various embodiments of the present invention, a route guidance indicator suitable for augmented reality is constructed in real time through three-dimensional processing of the route guide line, so that a three-dimensional route guidance indicator can be displayed realistically and effectively on a two-dimensional camera image. That is, the route guide line can be displayed as if it lay on the actual road, rather than as the simple route guide line of a conventional augmented reality navigation device.
Brief description of the drawings
Fig. 1 is a block diagram illustrating an electronic device according to an embodiment of the present invention.
Fig. 2 is a diagram for explaining a system network connected to the electronic device according to an embodiment of the present invention.
Fig. 3 is a flowchart illustrating a lane line information generation method of the electronic device according to an embodiment of the present invention.
Fig. 4 is a flowchart illustrating in detail a lane line information generation method of the electronic device according to an embodiment of the present invention.
Fig. 5 is a diagram illustrating grayscale conversion and a lane line region detection process according to an embodiment of the present invention.
Fig. 6 is a diagram illustrating a lane mark type region of interest in a grayscale image according to an embodiment of the present invention.
Fig. 7 is a diagram illustrating binarization and one-dimensional mapping of the lane mark type region of interest in the grayscale image according to an embodiment of the present invention.
Fig. 8 is a flowchart illustrating in detail a lane position information generation method according to an embodiment of the present invention.
Fig. 9 is a diagram illustrating a lane determination table according to an embodiment of the present invention.
Fig. 10 is a diagram illustrating a lane determination table according to another embodiment of the present invention.
Fig. 11 is a flowchart illustrating a control method of the electronic device according to an embodiment of the present invention.
Fig. 12 is a detailed block diagram of an augmented reality providing unit according to an embodiment of the present invention.
Fig. 13 is a diagram comparing a route guide line before and after processing.
Fig. 14 is a diagram illustrating a route guide line three-dimensional conversion process according to an embodiment of the present invention.
Fig. 15 is a flowchart illustrating an augmented reality route guidance method according to an embodiment of the present invention.
Fig. 16 is a diagram illustrating a route guidance screen according to an embodiment of the present invention.
Fig. 17 is a diagram illustrating an implementation in which the camera and the electronic device according to an embodiment of the present invention are separate devices.
Fig. 18 is a diagram illustrating an implementation in which the camera and the electronic device according to an embodiment of the present invention are integrated.
Fig. 19 is a diagram illustrating an implementation using a head-up display and the electronic device according to an embodiment of the present invention.
Detailed description of the embodiments
The following description merely illustrates the principles of the present invention. Therefore, even if they are not explicitly described or shown in this specification, those skilled in the art can devise various devices that embody the principles of the invention and fall within its concept and scope. In addition, all conditional terms and embodiments recited herein are, in principle, intended solely to aid understanding of the concept of the invention, and should be understood as not limiting the invention to the embodiments and states specifically recited.
Further, all detailed descriptions reciting specific embodiments, as well as the principles, aspects, and embodiments of the invention, should be understood to encompass structural and functional equivalents thereof. Such equivalents should be understood to include not only currently known equivalents but also equivalents to be developed in the future, that is, all elements invented to perform the same function regardless of their structure.
Thus, for example, the block diagrams in this specification should be understood as representing conceptual views of illustrative circuitry that embodies the principles of the invention. Similarly, all flowcharts, state transition diagrams, pseudocode, and the like should be understood as representing various processes that may be substantially represented in a computer-readable medium and executed by a computer or a processor, whether or not such a computer or processor is explicitly shown.
The functions of the various elements shown in the drawings, including processors or functional blocks represented by concepts similar to processors, may be provided not only as dedicated hardware but also as hardware capable of executing software in association with appropriate software. When provided by a processor, the functions may be provided by a single dedicated processor, a single shared processor, or a plurality of individual processors, some of which may be shared.
Moreover, the explicit use of terms presented as concepts similar to processing or control should not be interpreted as referring exclusively to hardware capable of executing software, and should be understood, without limitation, to implicitly include digital signal processor (DSP) hardware as well as read-only memory (ROM), random access memory (RAM), and non-volatile memory for storing software. Other well-known conventional hardware may also be included.
In the claims of this specification, an element expressed as a means for performing a function described in the detailed description encompasses, for example, any combination of circuit elements performing that function, or software in any form, including firmware or microcode, combined with appropriate circuitry for executing that software so as to perform the function. Since the invention defined by the claims combines the functions provided by the variously recited means in the manner required by the claims, any means capable of providing those functions should be understood as equivalent to the means understood from this specification.
The above objects, features, and advantages will become more apparent from the following detailed description taken in conjunction with the accompanying drawings, so that those of ordinary skill in the art to which the invention pertains can easily implement the technical idea of the invention. In describing the present invention, detailed descriptions of well-known technologies will be omitted if they are judged likely to obscure the gist of the invention.
Hereinafter, various embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 is a block diagram illustrating an electronic device according to an embodiment of the present invention. Referring to Fig. 1, the electronic device 100 includes all or part of a storage unit 110, an input unit 120, an output unit 130, a lane line information generation unit 140, a lane position information generation unit 150, an augmented reality providing unit 160, a control unit 170, a communication unit 180, and a detection unit 190.
Here, the electronic device 100 may be embodied as any of various devices capable of providing driving-related guidance to the driver of a vehicle, such as a smartphone, a tablet computer, a laptop computer, a personal digital assistant (PDA), a portable multimedia player (PMP), smart glasses, augmented reality glasses, a navigation device, or a black box.
Here, the driving state of the vehicle may include various states in which the vehicle is operated by the driver, such as a stopped state of the vehicle, a traveling state of the vehicle, and a parked state of the vehicle.
The driving-related guidance may include various kinds of guidance for assisting the driver in driving the vehicle, such as route guidance, lane departure guidance, front vehicle departure guidance, traffic light change guidance, front collision prevention guidance, lane change guidance, and lane guidance.
Here, the route guidance may include: augmented reality route guidance, which performs route guidance by combining various information, such as the position and direction of the user, with an image capturing the area ahead of the vehicle while driving; and two-dimensional (2D) or three-dimensional (3D) route guidance, which performs route guidance by combining various information, such as the position and direction of the user, with 2D or 3D map data. Here, route guidance should be understood to include not only the case in which the user drives while on board a vehicle, but also the case in which the user moves on foot by walking or running.
The lane departure guidance may guide whether the traveling vehicle has departed from a lane line.
The front vehicle departure guidance may guide whether a vehicle located ahead of a stopped vehicle has started to move.
The traffic light change guidance may guide whether a traffic light located ahead of a stopped vehicle has changed. As an example, when the state changes from a lit red light indicating a stop signal to a lit green light indicating a start signal, this change may be guided.
The front collision prevention guidance may be guidance provided, when the distance between a stopped or traveling vehicle and a vehicle located ahead of it is within a predetermined distance, in order to prevent a collision with the vehicle ahead.
The lane change guidance may guide the vehicle to change from the lane in which it is located to another lane in order to guide the route to the destination.
The lane guidance may be guidance on the lane in which the vehicle is currently located.
Such driving-related images, which enable various kinds of guidance, may be captured in real time by a camera mounted facing the front of the vehicle. Here, the camera may be a camera that is formed integrally with the electronic device 100 mounted in the vehicle and captures the area ahead of the vehicle. In this case, the camera may be formed integrally with a smartphone, a navigation device, or a black box, and the electronic device 100 may receive the image captured by the integrated camera.
As another example, the camera may be a camera that is mounted in the vehicle separately from the electronic device 100 and captures the area ahead of the vehicle. In this case, the camera may be a separate black box mounted facing the front of the vehicle; the electronic device 100 may receive the image captured by the separately mounted black box through wired or wireless communication, or may receive the image captured by the black box when a storage medium storing that image is inserted into the electronic device 100.
Hereinafter, based on the above, the electronic device 100 according to an embodiment of the present invention will be described in more detail.
The storage unit 110 serves to store various data and applications required for the operation of the electronic device 100. In particular, the storage unit 110 may store data required for the operation of the electronic device 100, such as an operating system (OS), a route search application, and map data. The storage unit 110 may also store data generated by the operation of the electronic device 100, such as searched route data and received images. The storage unit 110 may also store positional relationship information of the signals included in a traffic light, a lane determination table, and the like.
Here, the storage unit 110 may be embodied not only as a built-in memory element such as a random access memory (RAM), a flash memory, a read-only memory (ROM), an erasable programmable ROM (EPROM), an electronically erasable and programmable ROM (EEPROM), a register, a hard disk, a removable disk, a memory card, or a universal subscriber identity module (USIM), but also as a removable memory element such as a USB memory.
The input unit 120 serves to convert a physical input from outside the electronic device 100 into a specific electrical signal. Here, the input unit 120 may include all or part of a user input unit 121 and a microphone unit 123.
The user input unit 121 may receive user input such as a touch or a push action. Here, the user input unit 121 may be embodied using at least one of various button types, a touch sensor that receives a touch input, and a proximity sensor that receives an approaching motion.
The microphone unit 123 may receive the voice of the user and sounds generated inside and outside the vehicle.
The output unit 130 is a device by which the electronic device 100 outputs data. Here, the output unit 130 may include all or part of a display unit 131 and an audio output unit 133.
The display unit 131 is a device by which the electronic device 100 outputs data that can be recognized visually. The display unit 131 may be embodied as a display provided on the front of the housing of the electronic device 100. Here, the display unit 131 may be formed integrally with the electronic device 100 to output visual recognition data, or may be provided separately from the electronic device 100, like a head-up display, to output visual recognition data.
The audio output unit 133 is a device by which the electronic device 100 outputs data that can be recognized audibly. The audio output unit 133 may be embodied as a speaker that expresses, as sound, the data of the electronic device 100 that should be brought to the user's attention.
The communication unit 180 may be provided for the electronic device 100 to communicate with other devices. The communication unit 180 may include all or part of a location data unit 181, a wireless internet unit 183, a broadcast transceiver unit 185, a mobile communication unit 186, a short-range communication unit 187, and a wired communication unit 189.
The location data unit 181 is a device that obtains location data through a global navigation satellite system (GNSS). GNSS refers to a navigation system that can calculate the position of a receiving terminal using radio signals received from satellites. Specific examples of GNSS include, depending on the operating entity, GPS (Global Positioning System), Galileo, GLONASS (Global Orbiting Navigational Satellite System), COMPASS (BeiDou), IRNSS (Indian Regional Navigational Satellite System), and QZSS (Quasi-Zenith Satellite System). The location data unit 181 of the electronic device 100 according to an embodiment of the present invention may obtain location information by receiving a GNSS signal that serves the area in which the electronic device 100 is used.
The wireless internet unit 183 is a device that obtains data or transmits information by connecting to the wireless internet. The wireless internet that can be connected through the wireless internet unit 183 may be WLAN (Wireless LAN), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
The broadcast transceiver unit 185 is a device that transmits and receives broadcast signals through various broadcasting systems. The broadcasting systems through which the broadcast transceiver unit 185 can transmit and receive may be DMB-T (Digital Multimedia Broadcasting Terrestrial), DMB-S (Digital Multimedia Broadcasting Satellite), MediaFLO (Media Forward Link Only), DVB-H (Digital Video Broadcast Handheld), ISDB-T (Integrated Services Digital Broadcast Terrestrial), and the like. The broadcast signals transmitted and received through the broadcast transceiver unit 185 may include traffic data, life data, and the like.
The mobile communication unit 186 may connect to and communicate with a mobile communication network according to various mobile communication standards such as 3G (3rd Generation), 3GPP (3rd Generation Partnership Project), and LTE (Long Term Evolution).
The short-range communication unit 187 is a device for short-range communication. As described above, the short-range communication unit 187 may communicate through Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra WideBand), ZigBee, NFC (Near Field Communication), Wi-Fi (Wireless Fidelity), and the like.
The wired communication unit 189 is an interface device through which the electronic device 100 can be connected to other devices in a wired manner. The wired communication unit 189 may be a USB module capable of communicating through a USB port.
The communication unit 180 may communicate with other devices using at least one of the location data unit 181, the wireless internet unit 183, the broadcast transceiver unit 185, the mobile communication unit 186, the short-range communication unit 187, and the wired communication unit 189.
As an example, when the electronic device 100 does not include a camera function, an image captured by a vehicle camera such as a black box may be received using at least one of the short-range communication unit 187 and the wired communication unit 189.
As another example, when communicating with a plurality of devices, one device may communicate through the short-range communication unit 187 while another device communicates through the wired communication unit 189.
The detection unit 190 is a device capable of detecting the current state of the electronic device 100. The detection unit 190 may include all or part of a motion detection unit 191 and a light detection unit 193.
The motion detection unit 191 may detect motion of the electronic device 100 in three-dimensional space. The motion detection unit 191 may include a three-axis geomagnetic sensor and a three-axis acceleration sensor. The motion data obtained through the motion detection unit 191 may be combined with the location data obtained through the location data unit 181 to calculate a more accurate trajectory of the vehicle to which the electronic device 100 is attached.
The light detection unit 193 is a device that measures the ambient illuminance of the electronic device 100. Using the illuminance data obtained through the light detection unit 193, the brightness of the display unit 131 can be changed in accordance with the ambient brightness.
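For illustration only, a trivial Python sketch of such an illuminance-to-brightness mapping follows; the thresholds and brightness levels are arbitrary assumptions by the editor, not values from this disclosure.

```python
# Hedged example: pick a display brightness level from ambient illuminance (lux).
def display_brightness(lux):
    if lux < 10:       # night / tunnel
        return 0.3
    if lux < 1000:     # indoor / overcast
        return 0.6
    return 1.0         # daylight

print(display_brightness(5), display_brightness(20000))
```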
The power supply unit 195 is a device that supplies the power required for the operation of the electronic device 100 or for the operation of other devices connected to the electronic device 100. The power supply unit 195 may be a device that receives power from a battery built into the electronic device 100 or from an external power source such as the vehicle. Depending on the form in which power is received, the power supply unit 195 may be embodied as the wired communication module 119 or as a device that receives power wirelessly.
Meanwhile, the control unit 170 controls the overall operation of the electronic device 100. Specifically, the control unit 170 may control all or part of the storage unit 110, the input unit 120, the output unit 130, the lane line information generation unit 140, the lane position information generation unit 150, the augmented reality providing unit 160, the communication unit 180, and the detection unit 190.
In particular, the control unit 170 may control the lane line information generation unit 140 and the lane position information generation unit 150 to identify a lane line region from the driving-related image data of the vehicle, generate, from the image data of the identified lane line region, lane line information corresponding to the lane in which the vehicle is located, and generate lane position information of the vehicle using at least one of the generated lane line information and the lane information of the road on which the vehicle is located.
Here, a lane line may mean each of the lines that form both sides of the lane in which the vehicle is located. A lane may mean the roadway on which a vehicle travels, formed by such lane lines, such as a first lane, a second lane, ..., an N-th lane.
The lane line information generation unit 140 may identify a lane line region from image data captured in a driving state of the vehicle such as traveling or stopping, and may generate, from the image data of the lane line region, lane line information corresponding to each of the lane lines located on both sides of the lane in which the vehicle is located. The lane line information may include lane mark type information and lane line color information corresponding to each of the lane lines located on both sides of the lane in which the vehicle is located.
Here, in order to generate the lane line information corresponding to the lane in which the vehicle is located, the lane line information generation unit 140 may binarize the image data of the lane line region and obtain the lane mark type information from the binarized image data. Specifically, the lane line information generation unit 140 may analyze the binarized image data using at least one of the time continuity information and the speed information of the lane lines, thereby identifying whether each of the two lane lines of the lane in which the vehicle is located is a solid line or a dashed line.
The lane line information generation unit 140 may also extract, from the image data, color information corresponding to each lane line whose type has been identified, to generate the lane line information.
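As a hedged illustration of the binarization and solid/dashed decision, the sketch below binarizes a grayscale region of interest and classifies a line as solid or dashed from how consistently marking pixels appear along the line and across recent frames, a rough stand-in for the time-continuity analysis mentioned above. The thresholds, the use of NumPy, and the per-row fill-ratio heuristic are the editor's assumptions, not the disclosed algorithm.

```python
# Hedged sketch of binarization and solid/dashed classification. Requires NumPy.
import numpy as np

def binarize(gray_roi, threshold=180):
    """Binarize a grayscale lane-line region of interest (bright paint -> 1)."""
    return (gray_roi >= threshold).astype(np.uint8)

def fill_ratio(binary_roi):
    """Fraction of rows along the line direction that contain any marking pixels
    (a crude one-dimensional mapping of the region onto the driving direction)."""
    return float((binary_roi.sum(axis=1) > 0).mean())

def classify_line(recent_rois, solid_ratio=0.8):
    """A line whose fill ratio stays high over recent frames is treated as solid;
    intermittent coverage is treated as dashed."""
    ratios = [fill_ratio(binarize(roi)) for roi in recent_rois]
    return "solid" if np.mean(ratios) >= solid_ratio else "dashed"

# Example: a fully painted strip vs. a strip painted only on every fourth row.
solid_roi = np.full((40, 8), 255, dtype=np.uint8)
dashed_roi = np.zeros((40, 8), dtype=np.uint8)
dashed_roi[::4] = 255
print(classify_line([solid_roi] * 3), classify_line([dashed_roi] * 3))
```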
Under the control of the control unit 170, the lane position information generation unit 150 generates lane position information of the vehicle using at least one of the lane line information generated by the lane line information generation unit 140 and the lane information of the road on which the vehicle is located.
Specifically, the lane position information generation unit 150 obtains the lane information of the road on which the vehicle is located from map data, determines, using the generated lane line information, whether the vehicle is located in the first lane or the last lane of the road, and, when the vehicle is located in the first lane or the last lane, may generate the lane position information of the vehicle by reflecting the lane information of the road. Thereafter, when the lane in which the vehicle is located is changed to a lane between the first lane and the last lane as the vehicle changes lanes, the lane position information generation unit 150 updates the generated lane position information to the changed lane position information. Thereafter, when the lane in which the vehicle is located is changed from a lane between the first lane and the last lane to the first lane or the last lane as the vehicle changes lanes, the lane position information generation unit 150 regenerates the lane position information of the vehicle by reflecting the lane information of the road.
Here, the lane information of the road on which the vehicle is located may include information on the number of lanes of the road, road category information (for example, expressway, urban expressway, national road, ordinary road), and the like. The lane information of the road may be obtained from the map data stored in the storage unit 110 of the electronic device 100, from an external map database (DB) outside the electronic device 100, or from another electronic device 100. As an example, when the electronic device 100 is embodied as a black box, the black box may obtain the lane information of the road from an external navigation device communicatively connected to the black box.
Meanwhile, since the lane line conventions used in each country differ, the driveway location information generation unit 150 may generate the driveway location information using a driveway judgment table based on the traffic laws of each country. That is, the driveway location information generation unit 150 may generate the driveway location information based on the driveway judgment table mapped to the country information that has been set.
Meanwhile, the control unit 170 performs the driving-related guidance of the vehicle using the information generated by the lane line information generation unit 140 and the driveway location information generation unit 150.
As an example, the control unit 170 may select an appropriate lane departure guidance according to the types of the lane lines on either side of the driveway in which the vehicle is located, identified on the basis of the lane line information, and output it to the user. Specifically, when the electronic device 100 provides a lane departure guidance function, the control unit 170 may provide different guidance depending on the type and color of the departed lane line. For example, the control unit 170 may select and output a different image or announcement voice for each of the following cases: the vehicle crosses the center line, the vehicle crosses a solid white line, the vehicle crosses a white dotted line, or the vehicle crosses a blue line.
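As an illustrative sketch only (Python; the message texts and the type/color labels are assumptions, not taken from the patent), the selection of a departure guidance by crossed line type and color could look like this:

```python
# Hypothetical lookup: choose a departure warning by the type and color of the crossed line.
DEPARTURE_MESSAGES = {
    ("solid", "yellow"): "Warning: crossing the center line.",
    ("solid", "white"):  "Warning: crossing a solid white line.",
    ("dotted", "white"): "Changing lanes across a white dotted line.",
    ("solid", "blue"):   "Warning: crossing a blue line.",
}

def departure_guidance(line_type: str, line_color: str) -> str:
    """Return the guidance message matching the crossed line's type and color."""
    return DEPARTURE_MESSAGES.get((line_type, line_color), "Lane departure detected.")
```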
As another example, the control unit 170 may output driveway guidance through the output unit 130 using the driveway location information generated by the driveway location information generation unit 150. Specifically, the control unit 170 may output, by image or voice, which driveway the vehicle is currently in, for example the first driveway, the second driveway, ..., the N-th driveway.
As yet another example, the control unit 170 may output driveway change guidance through the output unit 130 using the driveway location information generated by the driveway location information generation unit 150. Specifically, when the electronic device 100 provides a navigation function for the vehicle, the control unit 170 may output driveway change guidance, by image or announcement voice, according to the path to the destination and the judged driveway location. That is, when the distance to a point at which a left or right turn must be made is within a predetermined distance, the control unit 170 may judge whether the turn can be made from the current driveway, and output driveway change guidance to the user accordingly.
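A minimal sketch of this decision, with assumed inputs and an assumed trigger distance (the patent does not fix concrete values), could be:

```python
# Decide whether to announce a driveway change when a turn lies within a preset distance.
def lane_change_guidance(turn_direction: str, distance_to_turn_m: float,
                         current_lane: int, lane_count: int,
                         trigger_distance_m: float = 300.0):
    """Return a guidance string if the current driveway does not suit the upcoming turn."""
    if distance_to_turn_m > trigger_distance_m:
        return None  # the turn point is still too far away
    if turn_direction == "left" and current_lane != 1:
        return f"Move left toward lane 1 for the left turn in {int(distance_to_turn_m)} m."
    if turn_direction == "right" and current_lane != lane_count:
        return f"Move right toward lane {lane_count} for the right turn in {int(distance_to_turn_m)} m."
    return None  # the turn is already possible from the current driveway
```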
Meanwhile, the control unit 170 may control the augmented reality supply unit 160 so that the electronic device 100 performs the driving-related guidance on the basis of augmented reality. Here, augmented reality is a method of providing additional information (for example, a graphical element indicating a point of interest (POI), a graphical element indicating the path to the destination, and the like) visually overlapped on a view of the real world that the user actually sees. In this case, the control unit 170 works in conjunction with the augmented reality supply unit 160 to generate an indicator for performing the driving-related guidance, and outputs the generated indicator through the output unit 130. As an example, augmented reality may be provided by superimposing the image on the front windshield of the vehicle using a head-up display, or on the output of another image output device; accordingly, the augmented reality supply unit 160 may generate a reality image or an interface image to be superimposed on the glass. In this way, an augmented reality navigator, a vehicle infotainment system, or the like can be embodied.
In particular, according to an embodiment of the present invention, a route guidance indicator suitable for augmented reality can be constructed in real time through three-dimensional processing of the route guide line, so that a three-dimensional route guidance indicator can be displayed realistically and effectively on a two-dimensional camera image. This is described later with reference to the accompanying drawings.
Fig. 2 is a diagram for explaining the network of a system connected with the electronic device of an embodiment of the present invention. Referring to Fig. 2, the electronic device 100 of an embodiment of the present invention may be embodied as a navigator, a black box, a smartphone, another vehicle augmented reality interface providing device, or any other device installed in a vehicle, and may be connected with other electronic devices 61, 62, 63, 64 through various communication networks.
Also, the electronic device 100 may calculate the current position and the current time in conjunction with the Global Positioning System according to radio signals received from the satellites 20. Each satellite 20 may transmit L-band signals in different frequency bands, and the electronic device 100 may calculate the current position based on the time required for the L-band signal transmitted from each satellite 20 to reach the electronic device 100.
Meanwhile, the electronic device 100 may be wirelessly connected to the network 30 through the communication unit 180 via a control station 40 (ACR), a base station 50 (RAS), and the like. When the electronic device 100 is connected to the network 30, it may also be indirectly connected to the other electronic devices 61, 62 connected to the network 30 and exchange data with them.
The electronic device 100 may also be connected to the network 30 indirectly through another device 63 having a communication function. For example, when the electronic device 100 does not have a module that can connect to the network 30, it may communicate with the other device 63 having a communication function through short-range communication or the like.
Fig. 3 is a flowchart illustrating the lane line information generating method of the electronic device of an embodiment of the present invention. Referring to Fig. 3, first, the electronic device 100 may identify a lane line region from the driving-related image data of the vehicle (step S101). Specifically, the lane line information generation unit 140 converts the driving-related image into a grayscale image and performs a lane line detection algorithm to determine, as the lane line region, the region in which each lane line on either side of the vehicle can be identified. Here, the driving-related image of the vehicle may include images related to parking and to traveling of the vehicle. The driving-related image may be an image captured by a camera module included in the electronic device 100 or an image captured by another device and received by the electronic device 100, and it may be an RGB (Red Green Blue) color image.
Then, the electronic device 100 may generate, from the image data of the identified lane line region, lane line information corresponding to the driveway in which the vehicle is located (step S102). Specifically, the lane line information generation unit 140 analyzes the pattern information and the color information of the lane lines in the detected lane line region to generate the lane line information. The lane line information may include at least one of line type information and line color information corresponding to each lane line located on either side of the driveway in which the vehicle is located.
Hereinafter, the lane line information generating method is described in more detail with reference to Figs. 4 to 7.
Fig. 4 is a flowchart illustrating in detail the lane line information generating method of the electronic device of an embodiment of the present invention. Referring to Fig. 4, first, the electronic device 100 converts the color image data into a grayscale image (step S201), and may detect the lane line region from the converted grayscale image (step S202).
Specifically, the lane line information generation unit 140 may extract, from the captured driving-related image, a region for detecting lane lines. If part of the road is affected by shadow, it becomes difficult for the lane line information generation unit 140 to detect the lane lines; therefore, the light source of the original image may be adjusted in advance to minimize the influence of shadow.
Further, the lane line information generation unit 140 may detect, as the lane line region, a region in which lane lines are likely to exist, according to the preset position or mounting angle of the camera. For example, the lane line information generation unit 140 may determine the lane line region by taking the position at which a lane line can start as the starting point. It may also estimate the position at which the lane line region starts and the length of the lane line region from the width of the driveway in the driving-related image (the maximum width between the left lane line region and the right lane line region) and the angle of view of the camera.
Further, the lane line information generation unit 140 converts the grayscale image corresponding to the lane line detection region into an edge image, and may detect the lane line region based on straight-line positions extracted from the converted edge image. More specifically, the driving-related image can be converted into an edge image by any of a number of well-known algorithms, and the edge image may include edges representing multiple straight lines. The lane line information generation unit 140 may then identify the positions of the detected straight lines as lane lines. Also, among multiple candidate straight lines, the lane line information generation unit 140 may determine the lane line region based on the positions of straight lines whose spacing matches a prescribed lane width.
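A minimal sketch of this detection chain, assuming OpenCV is used (the patent does not name a library, and the slope thresholds below are assumptions):

```python
# Grayscale conversion -> edge image -> straight-line candidates -> left/right lane line regions.
import cv2
import numpy as np

def detect_lane_line_regions(bgr_frame, roi_top_ratio=0.55):
    h, w = bgr_frame.shape[:2]
    gray = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2GRAY)          # grayscale image
    roi = gray[int(h * roi_top_ratio):, :]                      # region where lane lines can appear
    edges = cv2.Canny(roi, 50, 150)                             # edge image
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=40,
                            minLineLength=40, maxLineGap=20)    # candidate straight lines
    if lines is None:
        return [], []
    left, right = [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        if x2 == x1:
            continue
        slope = (y2 - y1) / (x2 - x1)
        if slope < -0.3:                                        # leaning toward the vehicle's left
            left.append((x1, y1, x2, y2))
        elif slope > 0.3:                                       # leaning toward the vehicle's right
            right.append((x1, y1, x2, y2))
    return left, right                                          # left / right lane line regions
```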
Fig. 5 shows this grayscale conversion and lane line region detection process. Referring to Fig. 5, the initially input driving-related image is converted into a grayscale image 200, and linear lane line regions 201, 202 are detected by a lane line detection algorithm such as edge detection. The lane line regions may be divided, on the basis of the position of the vehicle, into a left lane line region 201 and a right lane line region 202.
Referring again to Fig. 4.
Thereafter, when a lane line region is detected, the lane line information generation unit 140 may set a lane line type region of interest based on the lane line region (step S203). Specifically, when a lane line region is detected, the lane line information generation unit 140 may set a lane line type region of interest (ROI) on the basis of the detected lane line region. The lane line type region of interest means the part of the driving-related image that includes the lane line whose line type and color are to be judged together with a predetermined area around that lane line.
More specifically, Fig. 6 shows the lane line type regions of interest in the grayscale image. As shown in Fig. 6, the lane line type regions of interest 210, 220 may include the previously detected lane line region and a part of its surrounding area. The lane line type regions of interest may be divided, on the basis of the direction of travel of the vehicle, into a left lane line type region of interest 210 and a right lane line type region of interest 220.
For example, where the detected lane line region is linear and is expressed by an equation such as y = a × x + b, the lane line type region of interest may be expressed as the region bounded by y = a × x + b + m and y = a × x + b − m. This differs from the conventional simple lane line detection approach: because it is a method for generating specific and varied traveling lane line information, the lane line information generation unit 140 extends the detected straight-line lane line region so that the surrounding area of the straight lane line region is also set as a region of interest.
Referring again to Fig. 4.
Thereafter, the electronic device 100 may binarize the lane line type region of interest (step S204), map the binarized partial grayscale image onto a one-dimensional region (step S205), and identify the type of each line using at least one of temporal continuity and speed (step S206).
The lane line information generation unit 140 may extract the partial grayscale image of the lane line type region of interest from the converted grayscale image and binarize it. The reference value for binarization may be determined from the average gray value of the partial grayscale image of the region of interest. The traveling lane line information generation module 180 can thereby clearly separate, within the partial grayscale image, only the parts judged to be lane lines.
The lane line information generation unit 140 may then map each line identified in the binarized partial grayscale image (left and right) onto a one-dimensional region. By analyzing the pattern of each line mapped onto the one-dimensional region, the type of the line can be identified.
More specifically, Fig. 7 shows the binarization and one-dimensional mapping of the lane line type region of interest in the grayscale image. As shown in Fig. 7, a binarized image 300 is obtained when the lane line type region of interest is binarized. In the binarized image 300, the parts displayed in white can be identified as lane lines, and the remaining parts can be identified as black.
Each line recognized in the binarized image 300 may then be mapped onto a one-dimensional region. The lane line information generation unit 140 may easily discriminate the type of a line using the image 310 mapped onto the one-dimensional region.
For example, the lane line information generation unit 140 may judge whether a line is dotted or solid based on the starting point and the length characteristics of each line mapped onto one dimension. It may also judge whether the line is dotted or solid using the temporal continuity and the speed of each line mapped onto one dimension. Furthermore, the lane line information generation unit 140 may first make a preliminary decision on whether the line is dotted or solid according to the starting point and length characteristics, and then make the final decision using the temporal continuity and the speed.
More specifically, the lane line information generation unit 140 may first decide whether a line is dotted or solid by comparing the lengths of the lines measured from their starting point positions. In this case, the lane line information generation unit 140 can judge whether a line is dotted or solid from a single image frame alone.
Also, the lane line information generation unit 140 may judge more clearly whether a line is dotted or solid according to whether the line is formed continuously over time. For example, the lane line information generation unit 140 may preset a continuity degree for the movement speed of a line within the image, and judge the line to be dotted when the continuity of the line is less than the preset continuity degree.
Therefore, according to an embodiment of the present invention, whether a line is dotted or solid can be pre-classified from a single frame, and this result can then be verified over consecutive frames to finally judge the type of the line.
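A minimal sketch of this per-frame classification plus temporal verification (the fill and agreement thresholds are assumptions, and the 1-D profile is assumed to be a NumPy occupancy vector):

```python
# Binarize the type ROI, collapse a line to a 1-D profile, classify per frame, then confirm over frames.
import numpy as np

def binarize_roi(gray_roi):
    """Binarize using the ROI's mean gray value as the reference threshold."""
    return (gray_roi > gray_roi.mean()).astype(np.uint8)

def classify_line(profile_1d, min_fill_for_solid=0.85):
    """profile_1d: per-row occupancy (0/1) of one lane line mapped onto one dimension."""
    return "solid" if profile_1d.mean() >= min_fill_for_solid else "dotted"

def confirm_over_time(per_frame_labels, window=10, agreement=0.7):
    """Verify the single-frame decision with the continuity of recent frames."""
    recent = per_frame_labels[-window:]
    solid_ratio = sum(1 for label in recent if label == "solid") / max(len(recent), 1)
    return "solid" if solid_ratio >= agreement else "dotted"
```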
Referring again to Fig. 4, the electronic device 100 may detect, from the original color image data, the color of the part of the line whose type has been identified (step S207). The lane line information generation unit 140 may detect the color of the part corresponding to the previously identified line by analyzing the color image, and classify it. For example, the lane line information generation unit 140 may classify the detected color as white, yellow, or blue.
Thereafter, the electronic device 100 may generate the lane line information corresponding to the driveway in which the vehicle is located, based on the identified line type and the classified color (step S208).
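A minimal sketch of the color classification, assuming an HSV conversion with OpenCV (the hue/saturation thresholds below are assumptions, not values from the patent):

```python
# Classify the pixels under the identified line as white, yellow, or blue.
import cv2
import numpy as np

def classify_line_color(bgr_frame, line_mask):
    """line_mask: boolean mask of pixels belonging to the identified lane line."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    h, s, v = [hsv[..., i][line_mask].mean() for i in range(3)]
    if s < 40 and v > 150:
        return "white"                  # low saturation, bright
    if 15 <= h <= 35:
        return "yellow"                 # approximate OpenCV hue range for yellow
    if 90 <= h <= 130:
        return "blue"
    return "unknown"
```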
Fig. 8 is a flowchart illustrating in detail the driveway location information generation method of an embodiment of the present invention. Referring to Fig. 8, the electronic device 100 may obtain the driveway information of the road on which the vehicle is located from map data (step S301). Here, the driveway information of the road may be the driveway information of the road on which the vehicle is currently traveling, and may include the number of driveways of that road. The driveway information of the road may be obtained from the map data stored in the storage unit 110 of the electronic device 100, from an external map database (DB) outside the electronic device 100, or from another electronic device 100.
Then, the electronic device 100 may judge, using the generated lane line information, whether the vehicle is located on the first driveway or on the end driveway of the road (step S302). Specifically, the driveway location information generation unit 150 may apply the lane line information corresponding to the driveway in which the vehicle is located to a driveway judgment table such as that shown in Fig. 9, and thereby judge whether the vehicle is on the first driveway or the end driveway of the road.
That is, the driveway judgment table may define the first driveway and the end driveway according to the country, the type and color of the left line, and the type and color of the right line. The driveway judgment table shown in Fig. 10 is illustrative; different values may be set for different configurations, countries, or situations.
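As an illustrative sketch only (the table entries below are hypothetical examples, not the contents of Figs. 9 and 10), such a per-country judgment table keyed by the left/right line type and color could be expressed as:

```python
# Hypothetical per-country driveway judgment table: (left line, right line) -> edge driveway.
JUDGMENT_TABLE = {
    "KR": {
        (("solid", "yellow"), ("dotted", "white")): "first",  # e.g. center line on the left
        (("dotted", "white"), ("solid", "white")):  "end",    # e.g. road edge line on the right
    },
    # entries for other countries would follow their own traffic laws
}

def judge_edge_driveway(country, left_line, right_line):
    """Return 'first', 'end', or None when the vehicle is on an inner driveway."""
    return JUDGMENT_TABLE.get(country, {}).get((left_line, right_line))
```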
Meanwhile, when the vehicle is located on the first driveway or the end driveway, the electronic device 100 may generate the driveway location information of the vehicle by reflecting the driveway information (step S303). For example, if the vehicle is judged to be on the end driveway, the driveway location information generation unit 150 may generate the driveway location information as the N-th driveway; and if the number of driveways according to the driveway information is 5, this may be reflected so that the N-th driveway is generated as the 5th driveway.
Further, if the vehicle changes driveway so that it moves to a driveway between the first driveway and the end driveway, the electronic device 100 may update the generated driveway location information to the location information of the changed driveway (step S304). In this case, the driveway location information generation unit 150 may judge whether a lane line has been departed using the lane line information, and judge on that basis whether the driveway has changed. For example, if the vehicle is judged to have moved one driveway to the left from the 5th driveway, the driveway location information generation unit 150 reflects this and updates the driveway location information from the 5th driveway to the 4th driveway.
Further, if the vehicle changes driveway so that it moves from a driveway between the first driveway and the end driveway to the first driveway or the end driveway, the electronic device 100 may re-acquire the driveway information of the road on which the vehicle is located (step S305), reflect the re-acquired driveway information, and regenerate the driveway location information of the vehicle (step S306). For example, if the vehicle is judged to have moved one driveway to the right from the 4th driveway, it has moved to the 5th driveway, which was previously set as the end driveway, and the driveway information of the road on which the vehicle is currently located can be obtained. If the re-acquired driveway information indicates four driveways, the driveway location information of the vehicle can be regenerated as the 4th driveway.
Meanwhile, the driveway location information generation method of an embodiment of the present invention is not limited to the above-described Fig. 9; the above sequence may be partly changed in other embodiments. As an example, the step of obtaining the driveway information of the road on which the vehicle is located may be performed at step S304. In that case, if the vehicle is located on the first driveway or the end driveway, the electronic device 100 may generate the driveway location information of the vehicle (step S303); for example, if the vehicle is judged to be on the end driveway, the driveway location information generation unit 150 may generate the driveway location information as the N-th driveway.
Then, if the vehicle changes driveway so that it moves to a driveway between the first driveway and the end driveway, the electronic device 100 may update the driveway location information using the generated driveway location information and the obtained driveway information of the road (step S304). For example, if the vehicle is judged to have moved one driveway to the left from the N-th driveway corresponding to the end driveway, the driveway location information generation unit 150 applies N = 5, the number of driveways, to the (N−1)-th driveway, so that the driveway location information can be updated to the 4th driveway.
Also, according to another embodiment of the present invention, the driveway location information generation unit 150 may apply the lane line information corresponding to the driveway in which the vehicle is located to a driveway judgment table such as that shown in Fig. 10, and thereby judge whether the vehicle is located on the first driveway, a central driveway, or the end driveway of the road. In this case, however, when the road has multiple central driveways (for example, when the number of driveways is 4 or more), the exact driveway in which the vehicle is located among the multiple central driveways cannot be known; therefore, an embodiment of the present invention preferably uses the method shown in Fig. 9. A sketch of the position tracking logic follows below.
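A minimal sketch of the position tracking described for Fig. 8 (the state handling and function names are assumptions): initialize the position from the edge-driveway judgment and the lane count from map data, shift it on detected driveway changes, and re-acquire it when an edge driveway is reached again.

```python
# Keep the driveway position up to date from edge-driveway judgments, lane count and lane changes.
def init_position(edge, lane_count):
    """edge: 'first', 'end', or None (judged from the lane line judgment table)."""
    if edge == "first":
        return 1
    if edge == "end":
        return lane_count
    return None                              # position cannot be fixed yet

def update_position(position, lane_change, edge, lane_count):
    """lane_change: -1 (moved left), +1 (moved right), or 0."""
    if position is None:
        return init_position(edge, lane_count)
    if edge in ("first", "end"):
        # re-acquire the road's driveway information when the vehicle reaches an edge driveway
        return init_position(edge, lane_count)
    position += lane_change
    return min(max(position, 1), lane_count)
```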
Fig. 11 is a flowchart illustrating the control method of the electronic device of an embodiment of the present invention. Referring to Fig. 11, first, the electronic device 100 may identify a lane line region from the driving-related image data of the vehicle (step S401). Then, lane line information corresponding to the driveway in which the vehicle is located may be generated from the image data of the identified lane line region (step S402). Then, driveway location information of the vehicle is generated using at least one of the generated lane line information and the driveway information of the road on which the vehicle is located (step S403). Then, driving-related guidance of the vehicle is performed using the obtained driveway information (step S404).
Here, the step of performing the driving-related guidance of the vehicle (step S404) may include a step of outputting driveway change guidance using the guide path of the vehicle and the driveway location information. The step of performing the driving-related guidance of the vehicle (step S404) may also include a step of outputting guidance on the driveway in which the vehicle is located, using the driveway location information.
Meanwhile, the control method of the electronic device of an embodiment of the present invention may further include a step of selecting and outputting an appropriate lane departure guidance according to the types of the lane lines on either side of the driveway of the vehicle identified on the basis of the lane line information. Here, the output may be performed by generating an indicator for performing the driving-related guidance and outputting it through augmented reality.
Meanwhile, conventional augmented reality navigators have used various techniques for route guidance, but because it is difficult to synthesize the route guide line with the actual road environment, there are limits to how the route guide line can be expressed. According to an embodiment of the present invention, however, a route guidance indicator suitable for augmented reality can be constructed in real time through three-dimensional processing of the route guide line, so that a three-dimensional route guidance indicator can be displayed realistically and effectively on a two-dimensional camera image. Hereinafter, the augmented reality supply unit of an embodiment of the present invention that achieves this is described in detail.
Fig. 12 is a block diagram illustrating in detail the augmented reality supply unit 160 of an embodiment of the present invention. Referring to Fig. 12, the augmented reality supply unit 160 may include all or part of a calibration unit 161, a three-dimensional space generation unit 162, an indicator generation unit 163, and a mapping unit 164.
The calibration unit 161 may perform calibration for estimating, from an image captured by the camera, the camera parameters corresponding to that camera. Here, the camera parameters are the parameters that make up the camera matrix, which is information representing the relationship between the real space and the photograph.
The three-dimensional space generation unit 162 may generate a virtual three-dimensional space based on the image captured by the camera. Specifically, the three-dimensional space generation unit 162 may obtain depth information from the image captured by the camera based on the camera parameters estimated by the calibration unit 161, and generate the virtual three-dimensional space based on the obtained depth information and the captured image.
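A minimal sketch of how image points can be placed in such a virtual space, assuming a simple pinhole model with estimated intrinsics and a per-pixel depth value (the patent does not specify the model):

```python
# Back-project a pixel into camera-space XYZ using estimated camera parameters and depth.
import numpy as np

def backproject(u, v, depth, fx, fy, cx, cy):
    """Map pixel (u, v) with depth along the optical axis to a 3-D point in camera space."""
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return np.array([x, y, depth])
```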
The indicator generation unit 163 may generate indicators for guidance on augmented reality, for example a route guidance indicator, a driveway change guidance indicator, a lane departure guidance indicator, and the like.
In particular, when the indicator generation unit 163 receives a request for guidance along the path to the destination, input by the user through the input unit 120, it may generate a route guidance indicator for performing route guidance on augmented reality. Here, the route guidance indicator generation unit may include a route guide line processing unit 163-1, a route guide line three-dimensionalization unit 163-2, and a dynamic texture mapping unit 163-3.
When the route guide line to the destination is generated according to the user's route guidance request, the route guide line processing unit 163-1 may process the route guide line so that it reflects the turning radius of the driving trajectory of an actual traveling vehicle.
Specifically, the electronic device 100 generates the route guide line to the destination requested by the user using map data obtained from the storage unit 110 or from an external map database other than the electronic device 100. Here, the generated route guide line may consist of nodes and connecting lines, and may take the form shown in part (a) of Fig. 13. That is, referring to part (a) of Fig. 13, the generated route guide line 1301 may have, in its curve section 1302, a straight-line structure that does not resemble the driving trajectory of a vehicle. Therefore, if augmented reality were provided by synthesizing the route guide line of part (a) of Fig. 13 with the camera image, the displayed result would differ from the actual driving trajectory of the vehicle.
Therefore, the route guide line processing unit 163-1 of an embodiment of the present invention may process the route guide line so that it reflects the turning radius of the driving trajectory of an actual traveling vehicle. Specifically, the route guide line processing unit 163-1 may remove unnecessary vertices from the generated route guide line, such as vertices corresponding to regions that are not displayed on the current screen and duplicated points. It may then add vertices for maintaining the straightness of the route guide line ahead of the vehicle, and add vertices to the curve sections of the route guide line for realizing the curvature of those sections. Using the added vertices, the route guide line processing unit 163-1 may generate a route guide line that reflects the turning radius of the driving trajectory of an actual traveling vehicle.
In this way, according to an embodiment of the present invention, the route guide line shown in part (b) of Fig. 13 can be generated. That is, referring to part (b) of Fig. 13, the processed route guide line 1303 can take, in its curve section 1304, a smooth form similar to the driving trajectory of a vehicle.
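A minimal sketch of this vertex processing (the angle tolerance, the subdivision count, and the quadratic Bezier used to approximate the turning radius are all assumptions):

```python
# Drop hidden/duplicate vertices, then insert bend vertices on curve sections of the guide line.
import numpy as np

def process_guide_line(vertices, visible, angle_threshold_deg=15.0, subdivisions=4):
    """vertices: list of (x, y); visible(p) tells whether p falls on the current screen."""
    pts = [p for i, p in enumerate(vertices)
           if visible(p) and (i == 0 or p != vertices[i - 1])]   # remove hidden / repeated points
    if len(pts) < 3:
        return pts
    out = [pts[0]]
    for a, b, c in zip(pts, pts[1:], pts[2:]):
        v1, v2 = np.subtract(b, a), np.subtract(c, b)
        cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        if angle > angle_threshold_deg:                          # curve section: add bend vertices
            for t in np.linspace(0.0, 1.0, subdivisions + 1)[1:]:
                p = (1 - t) ** 2 * np.array(a) + 2 * (1 - t) * t * np.array(b) + t ** 2 * np.array(c)
                out.append(tuple(p))
        else:
            out.append(b)
    out.append(pts[-1])
    return out
```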
The route guide line three-dimensionalization unit 163-2 may perform variable three-dimensionalization by giving the route guide line generated by the processing unit 163-1 heights that vary with distance. This is described in detail with reference to Fig. 14.
Referring to Fig. 14, the route guide line three-dimensionalization unit 163-2 may generate virtual route guide lines 1402, 1403 on both sides of the route guide line 1401 generated by the route guide line processing unit 163-1. Specifically, the route guide line three-dimensionalization unit 163-2 may calculate the unit vectors at the vertices of the route guide line 1401 generated by the route guide line processing unit 163-1 and, through an inner-product operation on the calculated unit vectors, calculate the normal vectors perpendicular to them, thereby generating the virtual route guide lines 1402, 1403.
The route guide line three-dimensionalization unit 163-2 may then calculate the height value of each vertex included in the route guide line 1401. In this case, it may calculate the height values so that the height of a vertex included in the route guide line 1401 increases in proportion to its distance.
The route guide line three-dimensionalization unit 163-2 may then generate faces, as polygons, from the vertices of the route guide line 1401 for which the height values have been calculated and the vertices included in each of the virtual route guide lines 1402, 1403, thereby performing the three-dimensionalization. As a result, among the route guidance indicators displayed on the screen, the route guidance indicator located far away on the vehicle's own driveway can be displayed in a manner that the driver can recognize.
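A minimal sketch of this three-dimensionalization (the half-width and the height-per-metre factor are assumptions): offset each vertex along its normal to form the two virtual guide lines, raise heights in proportion to accumulated distance, and join the lines into quad faces.

```python
# Build virtual side lines by normal offsets, distance-proportional heights, and quad faces.
import numpy as np

def extrude_guide_line(center_pts, half_width=0.5, height_per_metre=0.02):
    center = np.asarray(center_pts, dtype=float)            # (N, 2) points on the road plane
    d = np.gradient(center, axis=0)
    d /= np.linalg.norm(d, axis=1, keepdims=True) + 1e-9     # unit direction vectors
    normal = np.stack([-d[:, 1], d[:, 0]], axis=1)           # perpendicular (normal) vectors
    left = center + half_width * normal
    right = center - half_width * normal
    dist = np.insert(np.cumsum(np.linalg.norm(np.diff(center, axis=0), axis=1)), 0, 0.0)
    height = height_per_metre * dist                         # height grows with distance
    to3d = lambda xy: np.column_stack([xy, height])
    c3, l3, r3 = to3d(center), to3d(left), to3d(right)
    faces = [(l3[i], c3[i], c3[i + 1], l3[i + 1]) for i in range(len(center) - 1)]
    faces += [(c3[i], r3[i], r3[i + 1], c3[i + 1]) for i in range(len(center) - 1)]
    return faces                                             # quads forming the 3-D indicator
```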
The dynamic texture mapping unit 163-3 may map, onto the stereo data generated by the route guide line three-dimensionalization unit 163-2, a texture that is displaced according to the speed of the vehicle. That is, while the vehicle moves along the path, the dynamic texture mapping unit 163-3 changes the mapping position of the texture on the stereo data, thereby generating a texture displaced according to the speed of the vehicle. In this case, the effect of the route guidance indicator displayed on the screen appearing to adhere to the road surface can be maximized.
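A minimal sketch of such a speed-dependent texture offset (the repeat spacing and the update interface are assumptions):

```python
# Scroll the indicator texture in proportion to the distance travelled, so it appears road-fixed.
class DynamicTexture:
    def __init__(self, metres_per_repeat=2.0):
        self.metres_per_repeat = metres_per_repeat
        self.offset_v = 0.0              # texture V-coordinate offset in [0, 1)

    def update(self, speed_mps, dt_s):
        travelled = speed_mps * dt_s
        self.offset_v = (self.offset_v + travelled / self.metres_per_repeat) % 1.0
        return self.offset_v             # feed this offset into the UV mapping of the stereo data
```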
Through this operation, the indicator generation unit 163 can generate the route guidance indicator for route guidance on augmented reality.
Meanwhile, as a display technique for the section beyond a turn, the route guide line three-dimensionalization unit 163-2 may erect the generated three-dimensional structure vertically in order to display the screen more effectively. That is, in the case of a left turn, the route guide line three-dimensionalization unit 163-2 may raise, in the height direction, the virtual route guide line 1403 corresponding to the right side of the route guide line 1401 to determine the stereo data; conversely, in the case of a right turn, it may raise, in the height direction, the virtual route guide line 1402 corresponding to the left side of the route guide line 1401 to determine the stereo data.
Meanwhile, the mapping unit 164 may combine the indicator generated by the indicator generation unit 163 with the virtual three-dimensional space generated by the three-dimensional space generation unit 162.
Fig. 15 is a flowchart illustrating the augmented reality route guidance method of an embodiment of the present invention. Referring to Fig. 15, first, the electronic device 100 may receive a user input requesting route guidance (step S501). Then, the electronic device 100 may generate a route guide line based on the destination information of the route guidance (step S502).
Then, the electronic device 100 may correct the generated route guide line so that it reflects the turning radius of the driving trajectory of an actual traveling vehicle (step S503). Specifically, step S503 may include: a step of removing unnecessary vertices from the generated route guide line, such as vertices corresponding to regions not displayed on the current screen and duplicated points; a step of adding vertices for maintaining the straightness of the route guide line ahead of the vehicle; a step of adding vertices to the curve sections of the route guide line for realizing the curvature of those sections; and a step of generating, using the added vertices, a route guide line reflecting the turning radius of the driving trajectory of an actual traveling vehicle.
Then, the electronic device 100 may perform variable three-dimensionalization based on making the height of the corrected route guide line vary with the distance from the vehicle (step S504). Specifically, step S504 may include: a step of generating virtual route guide lines on both sides of the processed route guide line; a step of calculating height values so that the height of each vertex included in the processed route guide line increases in proportion to its distance; and a step of generating faces, as polygons, from the vertices of the route guide line for which the height values have been calculated and the vertices included in each of the virtual route guide lines, thereby performing the three-dimensionalization.
Then, the electronic device 100 may generate the route guidance indicator by mapping onto the stereo data a texture displaced according to the speed of the vehicle (step S505). Then, the electronic device 100 may output the route guidance indicator on the screen through augmented reality (step S506). The output screen is described in detail with reference to Fig. 16.
Fig. 16 is a diagram illustrating the route guidance screen of an embodiment of the present invention. Referring to Fig. 16, the electronic device 100 of an embodiment of the present invention may display the route guidance screen on augmented reality (left screen) and the route guidance screen on the map (right screen) together. In this case, in order to perform guidance on augmented reality, the augmented reality supply unit 160 may generate an indicator to be superimposed on the augmented reality.
As an example, as shown in Fig. 16, the augmented reality supply unit 160 may generate a route guidance indicator 1601, a driveway change guidance indicator 1602, and a lane departure guidance indicator 1603, and may output the generated indicators on augmented reality.
Fig. 17 is a diagram illustrating an embodiment in which the camera and the electronic device of an embodiment of the present invention are of the separate type. Referring to Fig. 17, a vehicle navigator 100 and a vehicle black box 200 provided separately may constitute the system of an embodiment of the present invention using a wired/wireless communication method.
The vehicle navigator 100 may include a display unit 145 provided on the front surface of the navigator housing 191, a navigator operation key 193, and a navigator microphone 195.
The vehicle black box 200 may obtain data on the vehicle during driving and during parking. That is, it can capture images not only while the vehicle is traveling but also while the vehicle is parked. The clarity of the images obtained through the vehicle black box 200 may be constant or variable. For example, the clarity of the image may be raised before and after an accident occurs and lowered in normal circumstances, so that the required storage space is minimized while the crucial images are stored. The vehicle black box 200 may include a black box camera 222, a black box microphone 224, and an attachment part 281.
Meanwhile, although Fig. 17 shows the separately provided vehicle navigator 100 and vehicle black box 200 connected by wired/wireless communication, the vehicle navigator 100 and the vehicle black box 200 may also not be connected by wired/wireless communication. In that case, when a storage medium that stores the images captured by the black box 200 is inserted into the electronic device 100, the electronic device 100 can receive the captured images. Meanwhile, the vehicle black box 200 may be made to have the function of the vehicle navigator 100, or the vehicle navigator 100 may be provided with a camera so that the two are integrated. This is described in detail with reference to Fig. 18.
Fig. 18 is a diagram illustrating an embodiment in which the camera and the electronic device of an embodiment of the present invention are of the integrated type. Referring to Fig. 18, when the electronic device includes a camera function, the user can place the electronic device so that its camera portion captures the front of the vehicle and its display portion is visible to the user. The system of an embodiment of the present invention can thereby be embodied.
Fig. 19 is a diagram illustrating an embodiment using a head-up display and the electronic device of an embodiment of the present invention. Referring to Fig. 19, the electronic device may be connected to the head-up display by a wired/wireless communication method, and the augmented reality guidance screen may be displayed on the head-up display.
Meanwhile, the control methods of the electronic device of the various embodiments of the present invention described above may be embodied as program code and provided to each server or device in a state stored in various non-transitory computer readable media.
A non-transitory computer readable medium is not a medium that stores data for a short time, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and can be read by a device. Specifically, the various applications or programs described above may be stored in and provided through non-transitory computer readable media such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a read-only memory.
Also, while the preferred embodiments of the present invention have been shown and described above, the invention is not limited to the specific embodiments described; various modifications may be made by a person of ordinary skill in the art to which the invention pertains without departing from the gist of the invention claimed in the claims, and such modified embodiments should not be understood separately from the technical idea or prospect of the invention.
Claims (13)
1. A control method of an electronic device, characterized by comprising:
a step of determining, from driving-related image data of a vehicle, the type and color of the lane lines on both sides of the driveway in which the vehicle is traveling;
a step of comparing the determined type and color of the two-side lane lines with a stored driveway judgment table, thereby determining driveway location information of the traveling vehicle; and
a step of, when a turning point exists on the path within a predetermined distance ahead of the vehicle, comparing the determined driveway location information with the path direction at the turning point and judging whether a driveway change of the vehicle is required,
wherein the stored driveway judgment table is a table defined according to the type and color of the lane lines on both sides of the vehicle.
2. The control method of the electronic device according to claim 1, characterized in that the step of judging whether a driveway change of the vehicle is required comprises:
a step of acquiring the route information to the destination from map data;
a step of obtaining the location information of the vehicle; and
a step of detecting, using the obtained vehicle location information and the obtained route information, a turning point present within a predetermined distance of the location of the vehicle.
3. The control method of the electronic device according to claim 1, characterized by further comprising the step of, when it is judged that the driveway change is required, displaying a first driveway change guidance indicator on an augmented reality image, the first driveway change guidance indicator indicating that a driveway change is required.
4. The control method of the electronic device according to claim 1, characterized by further comprising the step of, when it is judged that the driveway change is required, displaying a second driveway change guidance indicator on an augmented reality image, the second driveway change guidance indicator indicating that a driveway change is required,
wherein the second driveway change guidance indicator has the driveway of the vehicle before the change as its starting point and the driveway after the change as its end point.
5. The control method of the electronic device according to claim 1, characterized by further comprising the step of displaying, on an augmented reality image, a route guidance indicator of the vehicle and a driveway change guidance indicator indicating that the driveway change of the vehicle is required.
6. The control method of the electronic device according to claim 1, characterized by further comprising the step of obtaining, from map data, the number of driveways of the road on which the vehicle is traveling,
wherein the step of judging whether the driveway change of the vehicle is required comprises:
when the vehicle requires a driveway change while traveling on the first driveway or the end driveway, judging, based on the number of driveways, how many driveways need to be changed.
7. An electronic device, characterized by comprising:
a lane line information generation unit that determines, from driving-related image data of a vehicle, the type and color of the lane lines on both sides of the driveway in which the vehicle is traveling;
a driveway location information generation unit that compares the determined type and color of the two-side lane lines with a stored driveway judgment table to determine driveway location information of the traveling vehicle; and
a control unit that, when a turning point exists on the path within a predetermined distance ahead of the vehicle, compares the determined driveway location information with the path direction at the turning point and judges whether a driveway change of the vehicle is required,
wherein the stored driveway judgment table is a table defined according to the type and color of the lane lines on both sides of the vehicle.
8. The electronic device according to claim 7, characterized in that the control unit acquires the route information to the destination from map data, obtains the location information of the vehicle, and, using the obtained vehicle location information and the obtained route information, detects a turning point present within a predetermined distance of the location of the vehicle.
9. The electronic device according to claim 7, characterized by further comprising a display unit that, when it is judged that the driveway change is required, displays a first driveway change guidance indicator on an augmented reality image, the first driveway change guidance indicator indicating that a driveway change is required.
10. The electronic device according to claim 7, characterized by further comprising a display unit that, when it is judged that the driveway change is required, displays a second driveway change guidance indicator on an augmented reality image, the second driveway change guidance indicator indicating that a driveway change is required,
wherein the second driveway change guidance indicator has the driveway of the vehicle before the change as its starting point and the driveway after the change as its end point.
11. The electronic device according to claim 7, characterized by further comprising a display unit that displays, on an augmented reality image, a route guidance indicator of the vehicle and a driveway change guidance indicator indicating that the driveway change of the vehicle is required.
12. The electronic device according to claim 7, characterized in that the control unit obtains, from map data, the number of driveways of the road on which the vehicle is traveling, and, when the vehicle requires a driveway change while traveling on the first driveway or the end driveway, judges, based on the number of driveways, how many driveways need to be changed.
13. A computer readable recording medium for executing a control method of an electronic device, the recording medium characterized in that the control method comprises:
a step of determining, from driving-related image data of a vehicle, the type and color of the lane lines on both sides of the driveway in which the vehicle is traveling;
a step of comparing the determined type and color of the two-side lane lines with a stored driveway judgment table, thereby determining driveway location information of the traveling vehicle; and
a step of, when a turning point exists on the path within a predetermined distance ahead of the vehicle, comparing the determined driveway location information with the path direction at the turning point and judging whether a driveway change of the vehicle is required,
wherein the stored driveway judgment table is a table defined according to the type and color of the lane lines on both sides of the vehicle.
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20140073727 | 2014-06-17 | ||
KR10-2014-0073727 | 2014-06-17 | ||
KR1020140154597A KR102255432B1 (en) | 2014-06-17 | 2014-11-07 | Electronic apparatus and control method thereof |
KR10-2014-0154597 | 2014-11-07 | ||
CN201510336818.5A CN105300401B (en) | 2014-06-17 | 2015-06-17 | Electronic device and its control method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510336818.5A Division CN105300401B (en) | 2014-06-17 | 2015-06-17 | Electronic device and its control method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110296715A true CN110296715A (en) | 2019-10-01 |
Family
ID=55084952
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910530448.7A Pending CN110296715A (en) | 2014-06-17 | 2015-06-17 | Electronic device and its control method, computer readable recording medium |
CN201811382624.9A Active CN109323708B (en) | 2014-06-17 | 2015-06-17 | Electronic device and control method thereof |
CN201510336818.5A Active CN105300401B (en) | 2014-06-17 | 2015-06-17 | Electronic device and its control method |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811382624.9A Active CN109323708B (en) | 2014-06-17 | 2015-06-17 | Electronic device and control method thereof |
CN201510336818.5A Active CN105300401B (en) | 2014-06-17 | 2015-06-17 | Electronic device and its control method |
Country Status (2)
Country | Link |
---|---|
KR (2) | KR102255432B1 (en) |
CN (3) | CN110296715A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113516014A (en) * | 2020-04-10 | 2021-10-19 | 星克跃尔株式会社 | Lane line detection method, lane line detection device, electronic apparatus, computer program, and computer-readable recording medium |
Families Citing this family (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI609807B (en) * | 2016-05-17 | 2018-01-01 | 緯創資通股份有限公司 | Image evaluation method and electronic apparatus thereof |
CN107784864A (en) * | 2016-08-26 | 2018-03-09 | 奥迪股份公司 | Vehicle assistant drive method and system |
MX2019002985A (en) * | 2016-09-27 | 2019-07-04 | Nissan Motor | Self-position estimation method and self-position estimation device. |
WO2018081807A2 (en) | 2016-10-31 | 2018-05-03 | Mobileye Vision Technologies Ltd. | Systems and methods for navigating lane merges and lane splits |
KR20180050823A (en) | 2016-11-07 | 2018-05-16 | 삼성전자주식회사 | Generating method and apparatus of 3d lane model |
US10551840B2 (en) * | 2018-07-02 | 2020-02-04 | Baidu Usa Llc | Planning driven perception system for autonomous driving vehicles |
CN110361021B (en) * | 2018-09-30 | 2021-06-22 | 毫末智行科技有限公司 | Lane line fitting method and system |
CN111460865B (en) * | 2019-01-22 | 2024-03-05 | 斑马智行网络(香港)有限公司 | Driving support method, driving support system, computing device, and storage medium |
US11790613B2 (en) | 2019-01-31 | 2023-10-17 | Lg Electronics Inc. | Image output device |
CN110006440B (en) * | 2019-04-12 | 2021-02-05 | 北京百度网讯科技有限公司 | Map relation expression method and device, electronic equipment and storage medium |
CN110070623B (en) * | 2019-04-16 | 2023-02-24 | 阿波罗智联(北京)科技有限公司 | Guide line drawing prompting method, device, computer equipment and storage medium |
KR102249100B1 (en) * | 2019-12-10 | 2021-05-06 | 한국교통대학교산학협력단 | The vehicle positioning apparatus |
KR102599269B1 (en) * | 2019-12-31 | 2023-11-06 | 현대오토에버 주식회사 | Augmented reality navigation apparatus and control method thereof |
KR20210087271A (en) * | 2020-01-02 | 2021-07-12 | 삼성전자주식회사 | Apparatus and method for displaying navigation information of three dimention augmented reality |
CN111353466B (en) * | 2020-03-12 | 2023-09-22 | 北京百度网讯科技有限公司 | Lane line recognition processing method, equipment and storage medium |
KR102443401B1 (en) * | 2020-06-29 | 2022-09-15 | 주식회사 라이드플럭스 | Method, apparatus and computer program for generating road network data to automatic driving vehicle |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000282423A (en) * | 1999-03-31 | 2000-10-10 | Toshiba Corp | Variable road marker |
CN1710550A (en) * | 2004-06-16 | 2005-12-21 | 上海宝信软件股份有限公司 | Method for dynamically generating crossing channelized picture |
CN101246010A (en) * | 2007-02-13 | 2008-08-20 | 爱信艾达株式会社 | Lane determining device, method, and program |
CN101315283A (en) * | 2007-05-30 | 2008-12-03 | 阿尔派株式会社 | Navigation devices |
CN101540100A (en) * | 2008-03-17 | 2009-09-23 | 上海宝康电子控制工程有限公司 | Device for recording vehicle lane change |
CN102016930A (en) * | 2008-04-30 | 2011-04-13 | 星克跃尔株式会社 | Method and apparatus for creating of 3D direction displaying |
CN102027509A (en) * | 2008-05-14 | 2011-04-20 | 星克跃尔株式会社 | Method and apparatus for 3D path |
CN102027510A (en) * | 2008-05-15 | 2011-04-20 | 星克跃尔株式会社 | System and method for displaying guidance symbol |
CN102057253A (en) * | 2008-06-11 | 2011-05-11 | 三菱电机株式会社 | Navigation device |
US20110164790A1 (en) * | 2008-10-22 | 2011-07-07 | Kazuyuki Sakurai | Lane marking detection apparatus, lane marking detection method, and lane marking detection program |
CN102519475A (en) * | 2011-12-12 | 2012-06-27 | 杨志远 | Intelligent navigation method and equipment based on augmented reality technology |
JP2014037172A (en) * | 2012-08-13 | 2014-02-27 | Alpine Electronics Inc | Display controller and display control method for head-up display |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4211620B2 (en) * | 2004-01-30 | 2009-01-21 | 株式会社デンソー | Car navigation system |
JP4923647B2 (en) * | 2006-03-17 | 2012-04-25 | 株式会社デンソー | Driving support image display device and program |
JP4886597B2 (en) * | 2007-05-25 | 2012-02-29 | アイシン・エィ・ダブリュ株式会社 | Lane determination device, lane determination method, and navigation device using the same |
JP4994256B2 (en) * | 2008-01-28 | 2012-08-08 | 株式会社ジオ技術研究所 | Data structure of route guidance database |
KR100888155B1 (en) * | 2008-05-14 | 2009-03-10 | 팅크웨어(주) | System and method for displaying 3-dimension map using texture mapping |
KR20100130483A (en) * | 2009-06-03 | 2010-12-13 | 엘지전자 주식회사 | Mobile vehicle navigation method and apparatus thereof |
US8503762B2 (en) * | 2009-08-26 | 2013-08-06 | Jacob Ben Tzvi | Projecting location based elements over a heads up display |
DE102010033729B4 (en) * | 2010-08-07 | 2014-05-08 | Audi Ag | Method and device for determining the position of a vehicle on a roadway and motor vehicles with such a device |
KR101688155B1 (en) * | 2010-10-25 | 2016-12-20 | 엘지전자 주식회사 | Information processing apparatus and method thereof |
KR20130015746A (en) * | 2011-08-04 | 2013-02-14 | 엘지전자 주식회사 | Apparatus for detecting lane and method thereof |
KR20130027367A (en) * | 2011-09-07 | 2013-03-15 | 동국대학교 산학협력단 | A navigation apparatus and method for displaying course thereof |
KR101371725B1 (en) * | 2012-05-31 | 2014-03-07 | 현대자동차(주) | Apparatus and method for displaying three-dimensional contents graphic image in navigation |
KR20130135656A (en) * | 2012-06-01 | 2013-12-11 | 현대엠엔소프트 주식회사 | A navigation apparatus, system and method for controlling vehicle using the same |
US20130325343A1 (en) * | 2012-06-05 | 2013-12-05 | Apple Inc. | Mapping application with novel search field |
US9052197B2 (en) * | 2012-06-05 | 2015-06-09 | Apple Inc. | Providing navigation instructions while device is in locked mode |
DE102012215322A1 (en) * | 2012-08-29 | 2014-03-06 | Robert Bosch Gmbh | Method and device for detecting a position of a vehicle on a lane |
CN103234547B (en) * | 2013-04-18 | 2017-03-22 | 易图通科技(北京)有限公司 | Method and device for displaying road scene in vacuum true three-dimensional navigation |
- 2014
  - 2014-11-07 KR KR1020140154597A patent/KR102255432B1/en active IP Right Grant
- 2015
  - 2015-06-17 CN CN201910530448.7A patent/CN110296715A/en active Pending
  - 2015-06-17 CN CN201811382624.9A patent/CN109323708B/en active Active
  - 2015-06-17 CN CN201510336818.5A patent/CN105300401B/en active Active
- 2021
  - 2021-05-17 KR KR1020210063573A patent/KR102348127B1/en active IP Right Grant
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113516014A (en) * | 2020-04-10 | 2021-10-19 | 星克跃尔株式会社 | Lane line detection method, lane line detection device, electronic apparatus, computer program, and computer-readable recording medium |
Also Published As
Publication number | Publication date |
---|---|
CN105300401A (en) | 2016-02-03 |
KR102348127B1 (en) | 2022-01-11 |
KR20210061319A (en) | 2021-05-27 |
KR20150144681A (en) | 2015-12-28 |
CN105300401B (en) | 2019-07-12 |
CN109323708A (en) | 2019-02-12 |
KR102255432B1 (en) | 2021-05-24 |
CN109323708B (en) | 2022-04-05 |
Similar Documents
Publication | Title |
---|---|
CN105300401B (en) | Electronic device and its control method | |
CN105185134B (en) | Electronic device, the control method of electronic device and computer readable recording medium storing program for performing | |
US11543256B2 (en) | Electronic apparatus and control method thereof | |
CN105654030B (en) | Electronic device, the control method of electronic device, computer program and computer readable recording medium | |
CN105185137B (en) | Electronic device, the control method of electronic device and computer readable recording medium storing program for performing | |
CN110091798A (en) | Electronic device, the control method of electronic device and computer readable storage medium | |
US20070009137A1 (en) | Image generation apparatus, image generation method and image generation program | |
CN107341454A (en) | The detection method and device of barrier, electronic equipment in a kind of scene | |
CN106494406A (en) | Bend guidance method, bend guider, electronic installation and program | |
CN107097790A (en) | For the method and apparatus and vehicle of the vehicle-periphery for illustrating vehicle | |
CN114056236A (en) | Augmented reality processing device, route guidance method based on augmented reality, and electronic device | |
CN108335507A (en) | Utilize the driving guiding providing method and device of the filmed image of camera | |
US20200341273A1 (en) | Method, System and Apparatus for Augmented Reality | |
CN105651299B (en) | The control method of electronic device, electronic device | |
US20240345261A1 (en) | Method and apparatus for determining location by correcting global navigation satellite system based location and electronic device thereof | |
CN105651298B (en) | The control method of electronic device, electronic device | |
CN109115232A (en) | The method and apparatus of navigation | |
KR20210094475A (en) | Method, apparatus, electronic device, computer program and computer readable recording medium for measuring inter-vehicle distance based on vehicle image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| TA01 | Transfer of patent application right | Effective date of registration: 20210817; Address after: Seoul, South Korea; Applicant after: Hyundai Motor Co.,Ltd.; Applicant after: Kia Co.,Ltd.; Address before: Gyeonggi Do, South Korea; Applicant before: THINKWARE SYSTEMS Corp. |