CN105522971B - Vehicle exterior image output control device and method - Google Patents
Vehicle exterior image output control device and method
- Publication number
- CN105522971B CN201510684490.6A CN201510684490A
- Authority
- CN
- China
- Prior art keywords
- image
- mark
- outside vehicle
- angle
- driver
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Links
- 238000000034 method Methods 0.000 title claims abstract description 23
- 239000000284 extract Substances 0.000 claims abstract description 8
- 239000011521 glass Substances 0.000 claims description 15
- 230000000007 visual effect Effects 0.000 claims description 15
- 210000003128 head Anatomy 0.000 claims 1
- 230000006870 function Effects 0.000 description 12
- 239000003550 marker Substances 0.000 description 6
- 239000000203 mixture Substances 0.000 description 6
- 238000010586 diagram Methods 0.000 description 5
- 238000004364 calculation method Methods 0.000 description 4
- 238000004590 computer program Methods 0.000 description 3
- 230000000694 effects Effects 0.000 description 3
- 230000003190 augmentative effect Effects 0.000 description 2
- 230000003287 optical effect Effects 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 230000002159 abnormal effect Effects 0.000 description 1
- 230000015572 biosynthetic process Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 238000009434 installation Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 239000004984 smart glass Substances 0.000 description 1
- 238000003786 synthesis reaction Methods 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/10—Front-view mirror arrangements; Periscope arrangements, i.e. optical devices using combinations of mirrors, lenses, prisms or the like ; Other mirror arrangements giving a view from above or under the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/12—Mirror assemblies combined with other articles, e.g. clocks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/12—Mirror assemblies combined with other articles, e.g. clocks
- B60R2001/1253—Mirror assemblies combined with other articles, e.g. clocks with cameras, video cameras or video screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Multimedia (AREA)
- Automation & Control Theory (AREA)
- Human Computer Interaction (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Transportation (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Image Processing (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
The present invention discloses a vehicle exterior image output control device and method that calculate the driver's viewing angle from markers attached to a wearable device and control the output of images of the vehicle's surroundings. The vehicle exterior image output control device according to the invention includes: an image acquisition unit that obtains images of the vehicle exterior; a viewing-angle calculation unit that calculates the driver's viewing angle using markers attached to a wearable device worn by the driver; a target-image generation unit that, based on the viewing angle, extracts an image of at least one direction from the vehicle exterior images and generates it as an output target image; and an image display unit that displays the output target image on the wearable device.
Description
Technical field
The present invention relates to a device and method for controlling image output, and more particularly to a device and method for controlling the output of images of the vehicle exterior.
Background art
Modern vehicles carry various electronic devices, for example: a navigation device that tracks and displays the vehicle's position and outputs a map of the vehicle's driving position on a screen; camera devices that capture images of the vehicle's interior and exterior; and devices that store and output the images captured by the cameras, or that play back media such as CDs or DVDs on an LCD screen.
However, conventional camera devices mounted on vehicles have the following problems.
First, the range over which images can be collected is limited, so only image information of a confined area is gathered. Consequently, images of the area around the driver's seat whose view is blocked by the car body cannot be presented.
Second, when there are three-dimensional structures around the vehicle, the image presented on the screen contains distorted, erroneous information.
Korean Published Patent No. 2013-0013410 proposes a device that outputs images of the vehicle's surroundings. However, that device merely outputs surrounding images according to the driving state of the vehicle and cannot solve the problems described above.
Content of the invention
(Technical problem to be solved)
The present invention is proposed to solve the above problems. Its object is to provide a vehicle exterior image output control device and method that calculate the driver's viewing angle based on markers attached to a wearable device and control the output of images of the vehicle's surroundings.
However, the objects of the present invention are not limited to those mentioned; other objects not mentioned will be clearly understood by practitioners in the art from the following description.
(Means for solving the problem)
To achieve the above object, the present invention provides a vehicle exterior image output control device, including: an image acquisition unit that obtains images of the vehicle exterior; a viewing-angle calculation unit that calculates the driver's viewing angle using markers attached to a wearable device worn by the driver; a target-image generation unit that, based on the viewing angle, extracts an image of at least one direction from the vehicle exterior images and generates it as an output target image; and an image display unit that displays the output target image on the wearable device.
Preferably, the image acquisition unit obtains, as the vehicle exterior images, images of all directions other than the vehicle front, or images of all directions of the vehicle.
Preferably, the viewing-angle calculation unit and the image display unit use a glasses-type device as the wearable device.
Preferably, the viewing-angle calculation unit uses, as the markers, markers attached to the upper ends of both frames of the glasses and markers attached to the lower ends of both frames.
Preferably, the viewing-angle calculation unit calculates the driver's viewing angle from the position and movement direction of the driver's eyes obtained from the markers, and from the marker positions obtained by a camera.
Preferably, the viewing-angle calculation unit uses the markers to track the motion of the driver's eyes in real time within a range of 90 degrees to the left and right of the vehicle front.
Preferably, the viewing-angle calculation unit also uses, as the markers, markers attached to the left ends of both frames of the glasses and markers attached to the right ends of both frames.
Preferably, the target-image generation unit generates the output target image as a stereoscopic image, and after extracting images of at least two directions from the vehicle exterior images, generates the output target image as a panoramic image.
Preferably, the target-image generation unit generates the output target image when it judges that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right with respect to the vehicle front.
The present invention also provides a vehicle exterior image output control method, including: obtaining images of the vehicle exterior; calculating the driver's viewing angle using markers attached to a wearable device worn by the driver; extracting, based on the viewing angle, an image of at least one direction from the vehicle exterior images and generating it as an output target image; and displaying the output target image on the wearable device.
Preferably, the obtaining step obtains, as the vehicle exterior images, images of all directions other than the vehicle front, or images of all directions of the vehicle.
Preferably, the calculating step and the displaying step use a glasses-type device as the wearable device.
Preferably, the calculating step uses, as the markers, markers attached to the upper ends of both frames of the glasses and markers attached to the lower ends of both frames.
Preferably, the calculating step calculates the driver's viewing angle from the position and movement direction of the driver's eyes obtained from the markers, and from the marker positions obtained by a camera.
Preferably, the calculating step uses the markers to track the motion of the driver's eyes in real time within a range of 90 degrees to the left and right of the vehicle front.
Preferably, the calculating step also uses, as the markers, markers attached to the left ends of both frames of the glasses and markers attached to the right ends of both frames.
Preferably, the generating step generates the output target image as a stereoscopic image, and after extracting images of at least two directions from the vehicle exterior images, generates the output target image as a panoramic image.
Preferably, the generating step generates the output target image when it judges that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right with respect to the vehicle front.
(Effects of the invention)
The present invention calculates the driver's viewing angle based on markers attached to a wearable device and thereby controls the output of images of the vehicle's surroundings, with the following effects.
First, a 360-degree surrounding image can be confirmed from the driver's seat, rather than a two-dimensional image of the vehicle's surroundings viewed from above.
Second, using a wearable display device, the driver can intuitively see images of the area around the driver's seat whose view is blocked by the car body.
Brief description of the drawings
Fig. 1 is a conceptual diagram illustrating the internal structure of a 3D AVM system according to an embodiment of the invention.
Fig. 2 is a perspective view of the wearable display device of the 3D AVM system of Fig. 1.
Fig. 3 is a reference diagram for explaining the image collection method of the cameras of the 3D AVM system of Fig. 1.
Fig. 4 is a flowchart illustrating a method of operating the 3D AVM system according to an embodiment of the invention.
Embodiment
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. First, reference symbols are attached to the components of each drawing; where the same component appears in different drawings, the same symbol is used as far as possible. In describing the present invention, detailed explanations of related known configurations or functions are omitted where they could obscure the technical gist of the invention. Further, although preferred embodiments of the invention are described below, the technical idea of the invention is not limited or restricted to them; practitioners can modify and implement it in various forms.
The present invention relates to a 3D AVM (Around View Monitor) system using a glasses-type wearable display device, which collects 360-degree images of the vehicle's surroundings and outputs them through the wearable display device.
Fig. 1 is a conceptual diagram of the internal structure of a 3D AVM system according to an embodiment of the invention.
The 3D AVM system 100 is characterized in that, using a glasses-type wearable display device, the driver can intuitively see images of the area around the driver's seat whose view is blocked by the car body. The glasses-type wearable display device is described later with reference to Fig. 2.
With this feature, the 3D AVM system 100 has the following effects. First, the driver can confirm images of the vehicle exterior simply by looking around through the wearable display device. Second, the driver can confirm the appearance of the vehicle's surroundings as if the vehicle he or she is riding in were made of glass.
The viewing-angle measurement logic unit 110 performs the function of calculating the driver's viewing angle, using the markers attached to the wearable device worn by the driver.
When marker data 111 is input from the markers, the viewing-angle measurement logic unit 110 determines the viewing angle based on the marker data and transmits the measurement data 112 to the additional-information overlay logic unit 130. The viewing-angle measurement logic unit 110 can obtain marker data from four preset points (4-point marking).
When determining the driver's gaze direction, the viewing-angle measurement logic unit 110 does not use camera-based eye tracking but a viewing-angle measurement technique based on markers.
Conventional eye-tracking techniques suffer delays due to system load and cannot track the gaze accurately. Because the viewing-angle measurement logic unit 110 tracks the gaze using the optical markers of the 4-point marking, it carries relatively less load than conventional techniques and can track faster.
Moreover, the viewing-angle measurement logic unit 110 can measure distance and angle in three-dimensional space within 2 ms. It calculates the desired position values and performs position estimation faster than before; by predicting the next gaze angle and reflecting it, it can, depending on conditions, move immediately to the image-mapping step.
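As an illustration of how a viewing angle might be derived from the four markers, the following is a minimal sketch, not the patent's actual algorithm: it assumes the in-cabin camera sees the marker quad roughly orthographically, so the quad's apparent width shrinks with the cosine of the head yaw, and the taller frame edge indicates which side is nearer the camera. The function name, the coordinate convention, and the calibration input `ref_width_px` are all hypothetical.

```python
import math

def estimate_head_yaw(markers, ref_width_px):
    """Estimate head yaw (rotation about the vertical axis, degrees) from
    the four marker points pasted on the glasses frame.

    markers: dict mapping 'top_left', 'top_right', 'bottom_left',
             'bottom_right' to pixel (x, y) tuples.
    ref_width_px: apparent width of the marker quad when the driver faces
                  straight ahead, captured once at calibration.
    """
    width = abs(markers['top_right'][0] - markers['top_left'][0])
    # Clamp: noise can make the observed width slightly exceed the reference.
    ratio = min(width / ref_width_px, 1.0)
    yaw = math.degrees(math.acos(ratio))
    left_h = abs(markers['bottom_left'][1] - markers['top_left'][1])
    right_h = abs(markers['bottom_right'][1] - markers['top_right'][1])
    # The frame edge nearer the camera appears taller; sign convention:
    # a turn to the driver's left is negative.
    return -yaw if right_h > left_h else yaw
```

A production system would instead solve a full perspective pose (e.g. a PnP solution) from the four 2D-3D correspondences; this sketch only shows why four points suffice to recover a turn angle.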
The image collection unit 120 performs the function of obtaining images of the vehicle exterior, collecting pictures of the vehicle exterior as follows.
The image collection unit 120 obtains front/rear/side images of the vehicle's surroundings using a camera 121 at the vehicle front, a camera 122 on the vehicle's left side, a camera 123 on the vehicle's right side, a camera 124 at the vehicle rear, and so on.
Fig. 3 is a reference diagram for explaining the image collection method of the cameras of the 3D AVM system of Fig. 1. As shown in Fig. 3, the image collection unit 120 uses cameras 310 to obtain images 320a, 320b, 320c of the vehicle's surroundings, such as an image of the vehicle front 320a, an image of the vehicle side 320b, and an image of the vehicle rear 320c.
The image collection unit 120 merges the obtained camera images to generate a 360-degree image 125 and stores the generated image in an image or video format. When the driver wearing the wearable display device looks to the side, the rear, and so on, the image collection unit 120 presents on the display (monitor) the image of the part that is blocked by the vehicle interior and cannot otherwise be seen.
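The merge-then-slice idea above can be sketched as follows. This is a deliberately simplified model, not the patent's stitching method: it assumes each camera contributes an equal-width, already-rectified strip (real fisheye stitching needs warping and blending), represents images as plain lists of pixel rows, and uses a hypothetical camera ordering by yaw.

```python
# Camera order around the vehicle by yaw: front 0°, right 90°, rear 180°, left 270°.
CAM_ORDER = ("front", "right", "rear", "left")

def build_panorama(frames):
    """frames: dict name -> image, each image a list of rows of pixels,
    all sharing the same height. Returns one wide 360-degree strip."""
    height = len(frames["front"])
    return [sum((frames[name][r] for name in CAM_ORDER), [])
            for r in range(height)]

def crop_for_yaw(panorama, yaw_deg, fov_deg=90):
    """Return the slice of the panorama centred on the driver's head yaw
    (0° = vehicle front), wrapping across the 360° seam."""
    w = len(panorama[0])
    center = int(((yaw_deg % 360) / 360.0) * w)
    half = int((fov_deg / 360.0) * w) // 2
    cols = [(center + i) % w for i in range(-half, half)]
    return [[row[c] for c in cols] for row in panorama]
```

The wrap-around column indexing is the essential point: a 360-degree strip has no edges, so a rearward gaze that straddles the seam between the "left" and "front" strips still yields a contiguous view.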
The additional-information overlay logic unit 130 performs the function of converting the image generated by the image collection unit 120 based on the viewing angle measured by the viewing-angle measurement logic unit 110. The additional-information overlay logic unit 130 combines and displays images according to the driver's gaze position (131).
The wearable-display image transmission unit 140 outputs the image converted by the additional-information overlay logic unit 130 to the display (141). It also collects the images output to the display as vehicle images and stores them (142).
Through synthesis with an augmented-reality function, the 3D AVM system 100 described above can provide not only a simple frontal augmented-reality data service but also an augmented-reality image service for the sides and rear, because it eliminates the blind zones around the driver.
Also, the 3D AVM system 100 allows the driver to intuitively confirm images of the surroundings when reversing into a parking space.
Also, the 3D AVM system 100 implements a 360-degree image collection function using the cameras installed on the vehicle and can therefore replace a black-box (dashcam) system.
Also, the 3D AVM system 100 uses 4-point marking to measure the driver's seat angle, so it can perform fast position tracking and effectively reduce the sensor IC cost of the wearable device.
Fig. 2 is a perspective view of the wearable display device of the 3D AVM system of Fig. 1.
The method of recognizing the driver's gaze using the glasses-type wearable display device 200 is as follows.
Four point markers 201, 202, 203, 204 are pasted on the front of the glasses 200. First, a camera that captures the driver's face is installed in the vehicle cabin, and a captured image of the four point markers pasted on the glasses is obtained.
Then, the angles and sizes of the four point markers 201, 202, 203, 204 are measured to calculate the driver's gaze angle, the position coordinates of the face, and so on. Here, the point markers are given a specific color, i.e., a color distinguishable from the colors of the surroundings, to improve image-processing recognition.
Then, the driver's distance, rotation angle, and so on are calculated using the positions of the four points and the shape of the glasses frame within the quadrilateral region 205.
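The distance and rotation mentioned above can be illustrated with a pinhole-camera sketch. This assumes only what the text states (two upper markers at a known physical separation, an in-cabin camera); the function name and the calibration inputs `frame_width_mm` and `focal_px` are hypothetical.

```python
import math

def estimate_distance_and_roll(top_left, top_right, frame_width_mm, focal_px):
    """Estimate the driver's distance from the in-cabin camera (mm) and the
    head roll/tilt (degrees) from the two upper markers of the quad.

    Pinhole model: distance = focal_px * real_width / pixel_width.
    Roll is the slope of the line joining the two upper markers.
    """
    dx = top_right[0] - top_left[0]
    dy = top_right[1] - top_left[1]
    pixel_width = math.hypot(dx, dy)
    distance_mm = focal_px * frame_width_mm / pixel_width
    roll_deg = math.degrees(math.atan2(dy, dx))
    return distance_mm, roll_deg
```

The same similar-triangles relation underlies any marker-based range estimate: the nearer the glasses, the wider the quad appears, in inverse proportion to distance.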
Fig. 4 is a flowchart illustrating a method of operating the 3D AVM system according to an embodiment of the invention. The following description refers to Fig. 1 and Fig. 4.
First, the image collection unit 120 obtains images of the vehicle's surroundings and stores 360-degree image information (S405).
Then, the marker-position-information calculation unit of the viewing-angle measurement logic unit 110 collects the marker position information of the wearable device worn by the driver, and the marker storage unit stores the marker position information (S410).
Then, the marker-position-information calculation unit calculates marker position information including the initial positions of the markers mounted on the wearable device (S415).
Then, based on the marker position information, the marker-position-information calculation unit calculates relative information including the current position and current angle of the wearable device (S420).
Then, the wearable-device position calculation unit of the viewing-angle measurement logic unit 110 calculates the driver's viewing angle (S425).
Then, based on the driver's viewing angle, the wearable-device position calculation unit judges whether the driver's gaze direction is toward the side or rear (S430).
If the driver's gaze direction is judged to be toward the side or rear, the additional-information overlay logic unit 130 matches the surrounding image corresponding to the viewing angle (S435) and then outputs the surrounding image to the wearable display unit (S440).
Conversely, if the driver's gaze direction is judged not to be toward the side or rear, the marker next-step position estimation unit of the additional-information overlay logic unit 130 estimates the expected movement position of the markers based on the marker position information (S445).
The marker next-step position estimation unit then judges whether the driver's gaze direction at the next step will be toward the side or rear (S450).
If the gaze direction at the next step is judged to be toward the side or rear, the marker next-step position estimation unit moves to the image-matching step (S435) immediately after obtaining the next captured image (S460), without an unnecessary wearing-position calculation.
Conversely, if the gaze direction at the next step is judged not to be toward the side or rear, the marker next-step position estimation unit saves the surrounding image to the repository (S455).
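The S430-S460 decision flow just described can be condensed into one function. This is a sketch of the control structure only; the callbacks stand in for the units named in the text (the predictor, the image matcher, the wearable display output, the repository) and are hypothetical, as is the use of a ±90° yaw threshold to mean "side or rear".

```python
def avm_step(yaw_deg, predict_next_yaw, match_image, output_image, save_image):
    """One pass of the flowchart's decision logic: display the matched
    surrounding image when the driver looks to the side or rear; otherwise
    predict the next gaze and either jump straight to matching or archive
    the image without further wearing-position calculation."""
    if abs(yaw_deg) > 90:                    # S430 -> S435 -> S440
        output_image(match_image(yaw_deg))
        return "displayed"
    next_yaw = predict_next_yaw(yaw_deg)     # S445: expected marker movement
    if abs(next_yaw) > 90:                   # S450 -> S460 -> S435
        output_image(match_image(next_yaw))
        return "prefetched"
    save_image()                             # S455: gaze stays frontal
    return "saved"
```

The point of the prediction branch is latency: when the predicted gaze already crosses the threshold, the system can proceed to image matching as soon as the next frame arrives instead of recomputing the full wearing position.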
Embodiments of the invention have been described above with reference to Fig. 1 to Fig. 4. A preferred configuration of the invention that can be derived from these embodiments is described below.
A vehicle exterior image output control device according to a preferred embodiment of the invention includes: an image acquisition unit, a viewing-angle calculation unit, a target-image generation unit, an image display unit, a power supply unit, and a main control unit.
The power supply unit performs the function of supplying power to each component of the vehicle exterior image output control device. The main control unit performs the function of controlling the overall operation of each component of the device. Considering that the device must be installed in a vehicle, the power supply unit and the main control unit may be omitted in the present embodiment.
The image acquisition unit performs the function of obtaining vehicle exterior images. It corresponds in concept to the image collection unit 120 of Fig. 1 and Fig. 4.
As the vehicle exterior images, the image acquisition unit can obtain images of all directions other than the vehicle front, or images of all directions of the vehicle.
The viewing-angle calculation unit performs the function of calculating the driver's viewing angle using the markers pasted on the wearable device worn by the driver. It corresponds in concept to the viewing-angle measurement logic unit 110 of Fig. 1 and Fig. 4.
The viewing-angle calculation unit can use a glasses-type device as the wearable device. In the present embodiment, the wearable device can be a glasses-type wearable device such as smart glasses.
As the markers, the viewing-angle calculation unit uses markers pasted on the upper ends of both frames of the glasses and markers pasted on the lower ends of both frames.
The viewing-angle calculation unit calculates the driver's viewing angle from the position and movement direction of the driver's eyes obtained from the markers, and from the marker positions obtained by the camera.
Using the markers, the viewing-angle calculation unit tracks the motion of the driver's eyes in real time within a range of 90 degrees to the left and right of the vehicle front. When it judges that the driver's gaze exceeds 90 degrees to the left or 90 degrees to the right of the vehicle front (0 degrees), it judges that the driver is looking at the side or rear of the vehicle and stops tracking the motion of the driver's eyes.
As the markers, the viewing-angle calculation unit can also use markers pasted on the left ends of both frames of the glasses and markers pasted on the right ends of both frames.
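The ±90° rule described above amounts to a two-way classification of the gaze. A minimal sketch, with the function name and angle normalization as assumptions of this illustration:

```python
def gaze_region(yaw_deg):
    """Classify the driver's gaze relative to the vehicle front (0°):
    within 90° left/right the eyes keep being tracked; beyond that the
    driver is judged to be looking at the side or rear."""
    yaw = ((yaw_deg + 180) % 360) - 180   # normalize to (-180, 180]
    return "front-tracked" if abs(yaw) <= 90 else "side-or-rear"
```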
The target-image generation unit performs the function of extracting an image of at least one direction from the vehicle exterior images based on the driver's viewing angle and generating it as an output target image. It corresponds in concept to the additional-information overlay logic unit 130 of Fig. 1 and Fig. 4.
When extracting an image of one direction from the vehicle exterior images, the target-image generation unit can generate the output target image as a stereoscopic image. When extracting images of at least two directions from the vehicle exterior images, it can generate the output target image as a panoramic image.
The target-image generation unit can generate the output target image when it judges that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right with respect to the vehicle front.
The image display unit performs the function of displaying the output target image on the wearable device. It corresponds in concept to the wearable-display image transmission unit 140 of Fig. 1 and Fig. 4.
As the wearable device, the image display unit can use a glasses-type device.
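The output-mode choice described for the target-image generation unit (one direction yields a stereoscopic image, two or more yield a panorama) reduces to a small dispatch. A hedged sketch, with the function name chosen for illustration:

```python
def choose_output_mode(directions):
    """Pick the output-image form per the embodiment text: a single
    extracted direction can be rendered as a stereoscopic image, while
    two or more extracted directions are joined into a panoramic image."""
    if not directions:
        return None
    return "stereo" if len(directions) == 1 else "panorama"
```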
Next, the method of operating the vehicle exterior image output control device is described.
First, images of the vehicle exterior are obtained.
Then, the driver's viewing angle is calculated using the markers pasted on the wearable device worn by the driver.
Then, based on the driver's viewing angle, an image of at least one direction is extracted from the vehicle exterior images and generated as an output target image.
Then, the output target image is displayed on the wearable device.
Although all the constituent elements of the embodiments of the present invention described above are described as being combined into one or operating in combination, the present invention is not limited to such an embodiment. That is, within the scope of the object of the present invention, one or more of the constituent elements may be selectively combined and operated. Also, although each constituent element may be implemented as independent hardware, some or all of the constituent elements may be selectively combined and implemented as a computer program having program modules that perform some or all of the functions of one or more combined hardware elements. Such a computer program may be stored in computer-readable media (Computer Readable Media) such as a USB memory, a CD disk, or a flash memory, and may be read and executed by a computer to implement embodiments of the present invention. Recording media for the computer program include magnetic recording media, optical recording media, and carrier-wave media.
Unless defined otherwise in the detailed description, all terms used herein, including technical and scientific terms, have the same meaning as commonly understood by a person of ordinary skill in the art to which the present invention belongs. Commonly used terms, such as those defined in dictionaries, should be interpreted as being consistent with their meaning in the context of the related art and, unless explicitly defined in the present invention, shall not be interpreted in an idealized or overly formal sense.
The above description merely illustrates the technical idea of the present invention, and a person of ordinary skill in the art to which the present invention belongs may make various modifications, changes, and substitutions without departing from the essential characteristics of the present invention. Therefore, the embodiments and accompanying drawings disclosed in the present invention are intended to illustrate, not to limit, the technical idea of the present invention, and the scope of the technical idea is not limited by these embodiments and drawings. The protection scope of the present invention should be interpreted according to the following claims, and all technical ideas within their equivalents should be construed as falling within the scope of the present invention.
Claims (15)
- 1. A vehicle-exterior image output control device, characterized by comprising: an image acquisition unit that acquires an image of the vehicle exterior; a viewing-angle calculation unit that calculates the driver's viewing angle using markers attached to a wearable device worn by the driver; an object-image generating unit that extracts an image of at least one direction from the vehicle-exterior image based on the viewing angle and generates it as an output image; and an image display unit that displays the output image on the wearable device, wherein, when the driver's gaze direction is determined not to be toward the side or the rear, the object-image generating unit estimates an anticipated movement position of the markers based on the positional information of the markers, and generates the output image when the driver's gaze direction at the estimated anticipated movement position of the markers is determined to be toward the side or the rear.
- 2. The vehicle-exterior image output control device according to claim 1, characterized in that the image acquisition unit acquires, as the vehicle-exterior image, images of the remaining directions other than the front of the vehicle, or an omnidirectional image of the vehicle.
- 3. The vehicle-exterior image output control device according to claim 1, characterized in that the viewing-angle calculation unit and the image display unit use a glasses-type device as the wearable device.
- 4. The vehicle-exterior image output control device according to claim 3, characterized in that the viewing-angle calculation unit uses, as the markers, markers attached to the upper ends of both sides of the glasses frame and markers attached to the lower ends of both sides of the glasses frame.
- 5. The vehicle-exterior image output control device according to claim 1, characterized in that the viewing-angle calculation unit calculates the driver's viewing angle based on the positions and movement directions of the driver's eyes obtained from the markers and on the positions of the markers obtained from a camera.
- 6. The vehicle-exterior image output control device according to claim 1, characterized in that the viewing-angle calculation unit uses the markers to track the movement of the driver's eyes in real time within a range of 90 degrees to the left and 90 degrees to the right with respect to the front of the vehicle.
- 7. The vehicle-exterior image output control device according to claim 4, characterized in that the viewing-angle calculation unit further uses, as the markers, markers attached to the left end of the glasses frame and markers attached to the right end of the glasses frame.
- 8. The vehicle-exterior image output control device according to claim 1, characterized in that the object-image generating unit generates the output image as a stereoscopic image, and, after extracting images of at least two directions from the vehicle-exterior image, generates the output image as a panoramic image.
- 9. The vehicle-exterior image output control device according to claim 1, characterized in that the object-image generating unit generates the output image if it determines that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right with respect to the front of the vehicle.
- 10. A vehicle-exterior image output control method, characterized by comprising: a step of acquiring an image of the vehicle exterior; a step of calculating the driver's viewing angle using markers attached to a wearable device worn by the driver; a step of extracting an image of at least one direction from the vehicle-exterior image based on the viewing angle and generating it as an output image; and a step of displaying the output image on the wearable device, wherein, in the step of generating the output image, when the driver's gaze direction is determined not to be toward the side or the rear, an anticipated movement position of the markers is estimated based on the positional information of the markers, and the output image is generated when the driver's gaze direction at the estimated anticipated movement position of the markers is determined to be toward the side or the rear.
- 11. The vehicle-exterior image output control method according to claim 10, characterized in that the calculating step and the displaying step use a glasses-type device as the wearable device.
- 12. The vehicle-exterior image output control method according to claim 11, characterized in that the calculating step uses, as the markers, markers attached to the upper ends of both sides of the glasses frame and markers attached to the lower ends of both sides of the glasses frame.
- 13. The vehicle-exterior image output control method according to claim 10, characterized in that the calculating step uses the markers to track the movement of the driver's eyes in real time within a range of 90 degrees to the left and 90 degrees to the right with respect to the front of the vehicle.
- 14. The vehicle-exterior image output control method according to claim 10, characterized in that the generating step generates the output image as a stereoscopic image and, after extracting images of at least two directions from the vehicle-exterior image, generates the output image as a panoramic image.
- 15. The vehicle-exterior image output control method according to claim 10, characterized in that the generating step generates the output image if the viewing angle is determined to exceed 90 degrees to the left or 90 degrees to the right with respect to the front of the vehicle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140142393A KR102270577B1 (en) | 2014-10-21 | 2014-10-21 | Apparatus and method for controlling outputting external image of vehicle |
KR10-2014-0142393 | 2014-10-21 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105522971A CN105522971A (en) | 2016-04-27 |
CN105522971B true CN105522971B (en) | 2018-02-23 |
Family
ID=55765621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510684490.6A Active CN105522971B (en) | 2014-10-21 | 2015-10-20 | Image outside vehicle output-controlling device and method |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102270577B1 (en) |
CN (1) | CN105522971B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105974586A (en) * | 2016-05-12 | 2016-09-28 | 上海擎感智能科技有限公司 | Intelligent glasses and operating method and system therefor |
CN106339980A (en) * | 2016-08-22 | 2017-01-18 | 乐视控股(北京)有限公司 | Automobile-based VR display device and method and automobile |
CN107009962B (en) * | 2017-02-23 | 2019-05-14 | 杭州电子科技大学 | A kind of panorama observation method based on gesture recognition |
CN107745713A (en) * | 2017-09-25 | 2018-03-02 | 南京律智诚专利技术开发有限公司 | A kind of method of work of the vehicle automatic running system based on environmental analysis |
CN108556738A (en) * | 2018-03-30 | 2018-09-21 | 深圳市元征科技股份有限公司 | The display device and method of automobile A-column blind area |
JP7163732B2 (en) * | 2018-11-13 | 2022-11-01 | トヨタ自動車株式会社 | Driving support device, driving support system, driving support method and program |
CN109624857B (en) * | 2018-11-30 | 2022-03-29 | 沈阳工业大学 | Night-vision automobile outside rearview mirror supporting lane change early warning and implementation method |
CN112884941A (en) * | 2021-01-19 | 2021-06-01 | 中国人民解放军32212部队 | A on-vehicle information acquisition system for tank armoured vehicle combat test |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1890128A (en) * | 2003-12-01 | 2007-01-03 | 沃尔沃技术公司 | Method and system for supporting path control |
CN101277432A (en) * | 2007-03-26 | 2008-10-01 | 爱信艾达株式会社 | Driving support method and driving support apparatus |
CN102998797A (en) * | 2011-09-07 | 2013-03-27 | 奥迪股份公司 | Method used for providing display in motor vehicle according to visual direction of vehicle driver |
CN103358996A (en) * | 2013-08-13 | 2013-10-23 | 吉林大学 | Automobile A pillar perspective vehicle-mounted display device |
CN103802728A (en) * | 2013-11-08 | 2014-05-21 | 广东好帮手电子科技股份有限公司 | Driving assistance system and method based on head-mounted display and used for automatic scene selection for display |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5483027B2 (en) * | 2011-02-04 | 2014-05-07 | 国際航業株式会社 | 3D image measurement method and 3D image measurement apparatus |
KR101332253B1 (en) * | 2011-12-13 | 2013-12-02 | 현대자동차주식회사 | Display Appratus for Doesn't Making A Blind Zone |
JP5630518B2 (en) * | 2012-03-14 | 2014-11-26 | 株式会社デンソー | Driving assistance device |
KR20140054926A (en) * | 2012-10-30 | 2014-05-09 | 현대모비스 주식회사 | Display apparatus for rear side view of vehicle and vehicle having the same |
KR20140079947A (en) * | 2012-12-20 | 2014-06-30 | 한국전자통신연구원 | Video recording apparatus for a vehicle and Method of video recording for a vehicle |
- 2014-10-21 KR: application KR1020140142393A filed, patent KR102270577B1/en, active (IP Right Grant)
- 2015-10-20 CN: application CN201510684490.6A filed, patent CN105522971B/en, active
Also Published As
Publication number | Publication date |
---|---|
KR20160046480A (en) | 2016-04-29 |
CN105522971A (en) | 2016-04-27 |
KR102270577B1 (en) | 2021-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105522971B (en) | Image outside vehicle output-controlling device and method | |
WO2021004548A1 (en) | Vehicle speed intelligent measurement method based on binocular stereo vision system | |
WO2021004312A1 (en) | Intelligent vehicle trajectory measurement method based on binocular stereo vision system | |
US10757373B2 (en) | Method and system for providing at least one image captured by a scene camera of a vehicle | |
US10362296B2 (en) | Localized depth map generation | |
CN106575473B (en) | Method and device for non-contact axle counting of vehicle and axle counting system | |
JP6659924B2 (en) | Adjusting the presentation of the head-mounted display | |
US11176704B2 (en) | Object pose estimation in visual data | |
CN103871045B (en) | Display system and method | |
JP2005268847A (en) | Image generating apparatus, image generating method, and image generating program | |
JP6008397B2 (en) | AR system using optical see-through HMD | |
Roth et al. | Dd-pose-a large-scale driver head pose benchmark | |
CN107392103A (en) | The detection method and device of road surface lane line, electronic equipment | |
US20200234398A1 (en) | Extraction of standardized images from a single view or multi-view capture | |
CN107633703A (en) | A kind of drive recorder and its forward direction anti-collision early warning method | |
US20210225038A1 (en) | Visual object history | |
CN106031166B (en) | Vehicle-surroundings image display device and vehicle-surroundings image display method | |
US10573083B2 (en) | Non-transitory computer-readable storage medium, computer-implemented method, and virtual reality system | |
CN108510528A (en) | A kind of method and device of visible light and infrared image registration fusion | |
CN202058221U (en) | Passenger flow statistic device based on binocular vision | |
JP6061334B2 (en) | AR system using optical see-through HMD | |
Li et al. | Durlar: A high-fidelity 128-channel lidar dataset with panoramic ambient and reflectivity imagery for multi-modal autonomous driving applications | |
CN111400423A (en) | Smart city CIM three-dimensional vehicle pose modeling system based on multi-view geometry | |
CN107356916B (en) | Vehicle distance detecting method and device, electronic equipment, computer readable storage medium | |
US20220114748A1 (en) | System and Method for Capturing a Spatial Orientation of a Wearable Device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| C06 | Publication | |
| PB01 | Publication | |
| C10 | Entry into substantive examination | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |