CN105522971A - Apparatus and method for controlling output of an external image of a vehicle - Google Patents
Apparatus and method for controlling output of an external image of a vehicle
- Publication number
- CN105522971A (application CN201510684490.6A)
- Authority
- CN
- China
- Prior art keywords
- image
- outside vehicle
- mark
- visibility
- angle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/10—Front-view mirror arrangements; Periscope arrangements, i.e. optical devices using combinations of mirrors, lenses, prisms or the like ; Other mirror arrangements giving a view from above or under the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/12—Mirror assemblies combined with other articles, e.g. clocks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/12—Mirror assemblies combined with other articles, e.g. clocks
- B60R2001/1253—Mirror assemblies combined with other articles, e.g. clocks with cameras, video cameras or video screens
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/303—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using joined images, e.g. multiple camera images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
Abstract
The invention discloses an apparatus and method for controlling the output of an external image of a vehicle. The driver's viewing angle is calculated from markers attached to a wearable device, and the output of images of the vehicle's surroundings is controlled accordingly. The apparatus comprises an image acquisition unit, a viewing-angle calculation unit, a target-image generation unit, and an image display unit. The image acquisition unit obtains images of the exterior of the vehicle. The viewing-angle calculation unit calculates the driver's viewing angle using the markers attached to the wearable device worn by the driver. Based on the viewing angle, the target-image generation unit extracts an image of at least one direction from the exterior images and generates it as an output target image. The image display unit displays the output target image on the wearable device.
Description
Technical field
The present invention relates to an apparatus and method for controlling image output, and more particularly to an apparatus and method for controlling the output of images of the exterior of a vehicle.
Background technology
Vehicles today are fitted with a wide range of electronic devices. Examples include a navigation device that tracks and displays the vehicle's position and outputs a map of the route being travelled, a camera device that captures images of the interior and exterior of the vehicle, and an LCD that stores and displays the images captured by the camera or plays back discs such as CDs and DVDs.
Conventional camera devices mounted on vehicles, however, have the following problems.
First, the image-capture range is limited, so only image information for a confined area is collected. As a result, such devices cannot present images of the area around the driver's seat whose view is blocked by the vehicle body.
Second, when a three-dimensional structure exists around the vehicle, the image frame presents distorted, misleading information.
Korean Patent Publication No. 2013-0013410 proposes a device that outputs images of the vehicle's surroundings, but that device merely outputs the surrounding images according to the driving state of the vehicle and cannot solve the problems described above.
Summary of the invention
(technical matters that will solve)
The present invention has been made to solve the problems described above. Its object is to provide an apparatus and method for controlling the output of an external image of a vehicle that calculate the driver's viewing angle from markers attached to a wearable device and control the output of images of the vehicle's surroundings accordingly.
The objects of the present invention are not limited to the above; other objects not explicitly mentioned will be clearly understood by those skilled in the art from the following description.
(means of dealing with problems)
To achieve this object, the present invention provides an apparatus for controlling the output of an external image of a vehicle, comprising: an image acquisition unit that obtains images of the exterior of the vehicle; a viewing-angle calculation unit that calculates the viewing angle of the driver using markers attached to a wearable device worn by the driver; a target-image generation unit that, based on the viewing angle, extracts an image of at least one direction from the images of the exterior of the vehicle and generates it as an output target image; and an image display unit that displays the output target image on the wearable device.
Preferably, the image acquisition unit obtains, as the images of the exterior of the vehicle, images of all directions other than the front of the vehicle or images of all directions of the vehicle.
Preferably, the viewing-angle calculation unit and the image display unit use a glasses-type device as the wearable device.
Preferably, the viewing-angle calculation unit uses, as the markers, markers attached to the upper ends of both eyeglass frames and markers attached to the lower ends of both eyeglass frames.
Preferably, the viewing-angle calculation unit calculates the driver's viewing angle based on the position and movement direction of the driver's eyes obtained from the markers and on the positions of the markers obtained by a camera.
Preferably, the viewing-angle calculation unit uses the markers to track the movement of the driver's eyes in real time within a range of 90 degrees to the left and right of the front of the vehicle.
Preferably, the viewing-angle calculation unit further uses, as the markers, a marker attached to the left end of the eyeglass frame and a marker attached to the right end of the eyeglass frame.
Preferably, the target-image generation unit generates the output target image as a stereoscopic image, and, after extracting images of at least two directions from the images of the exterior of the vehicle, generates the output target image as a panoramic image.
Preferably, the target-image generation unit generates the output target image when it determines that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right relative to the front of the vehicle.
The present invention further provides a method for controlling the output of an external image of a vehicle, comprising: obtaining images of the exterior of the vehicle; calculating the viewing angle of the driver using markers attached to a wearable device worn by the driver; extracting, based on the viewing angle, an image of at least one direction from the images of the exterior of the vehicle and generating it as an output target image; and displaying the output target image on the wearable device.
Preferably, the obtaining step obtains, as the images of the exterior of the vehicle, images of all directions other than the front of the vehicle or images of all directions of the vehicle.
Preferably, the calculating step and the displaying step use a glasses-type device as the wearable device.
Preferably, the calculating step uses, as the markers, markers attached to the upper ends of both eyeglass frames and markers attached to the lower ends of both eyeglass frames.
Preferably, the calculating step calculates the driver's viewing angle based on the position and movement direction of the driver's eyes obtained from the markers and on the positions of the markers obtained by a camera.
Preferably, the calculating step uses the markers to track the movement of the driver's eyes in real time within a range of 90 degrees to the left and right of the front of the vehicle.
Preferably, the calculating step uses, as the markers, a marker attached to the left end of the eyeglass frame and a marker attached to the right end of the eyeglass frame.
Preferably, the generating step generates the output target image as a stereoscopic image, and, after extracting images of at least two directions from the images of the exterior of the vehicle, generates the output target image as a panoramic image.
Preferably, the generating step generates the output target image when it is determined that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right relative to the front of the vehicle.
(effect of invention)
The present invention calculates the driver's viewing angle from markers attached to a wearable device and controls the output of images of the vehicle's surroundings accordingly, which has the following effects.
First, the driver can check the full 360-degree surroundings from the driver's seat, rather than a two-dimensional image of the surroundings viewed from above.
Second, using a wearable display device, the driver can intuitively see images of the area around the driver's seat whose view is blocked by the vehicle body.
Accompanying drawing explanation
Fig. 1 is a conceptual diagram illustrating the internal structure of a 3D AVM system according to an embodiment of the present invention.
Fig. 2 is a block diagram of the wearable display device constituting the 3D AVM system of Fig. 1.
Fig. 3 is a reference diagram for explaining the image collection method of the cameras constituting the 3D AVM system of Fig. 1.
Fig. 4 is a flowchart illustrating the operating method of the 3D AVM system according to an embodiment of the present invention.
Detailed description of the invention
Preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings. In assigning reference numerals to the components of each drawing, the same components are given the same numerals as far as possible even when they appear in different drawings. In describing the present invention, detailed descriptions of known configurations or functions are omitted where they might obscure the gist of the invention. Preferred embodiments of the invention are described below, but the technical idea of the invention is not limited to or by them, and those skilled in the art may modify and implement it in various forms.
The present invention relates to a 3D AVM (Around View Monitor) system that uses a glasses-type wearable display device: 360-degree images of the vehicle's surroundings are collected and the surrounding images are output through the wearable display device.
Fig. 1 is a conceptual diagram illustrating the internal structure of the 3D AVM system according to an embodiment of the present invention.
The 3D AVM system 100 is characterized in that, through a glasses-type wearable display device, the driver can intuitively see images of the area around the driver's seat whose view is blocked by the vehicle body. The glasses-type wearable display device is described later with reference to Fig. 2.
With this feature, the 3D AVM system 100 has the following effects. First, simply by looking around while wearing the wearable display device, the driver can check images of the vehicle's exterior. Second, the driver feels as if riding in a vehicle made of glass and can see the appearance of the vehicle's surroundings.
The viewing-angle measurement logic unit 110 calculates the driver's viewing angle. It does so using the markers attached to the wearable device worn by the driver.
When marker data 111 is input from the markers, the viewing-angle measurement logic unit 110 measures the viewing angle based on this marker data and transmits the measurement data 112 to the additional-information overlay logic unit 130. The viewing-angle measurement logic unit 110 can obtain the marker data from four preset points (4-point marking).
When measuring the driver's gaze direction, the viewing-angle measurement logic unit 110 does not use camera-based eye tracking but a viewing-angle measurement technique based on the markers.
Conventional eye-tracking techniques are delayed by the system load and cannot track the gaze accurately. Because the viewing-angle measurement logic unit 110 tracks the gaze using optical markers at four points, it imposes a relatively small load and can track quickly compared with conventional techniques.
Furthermore, the viewing-angle measurement logic unit 110 can measure distance and angle in three-dimensional space within 2 ms. It computes the required position values and estimates the position more quickly than before, and by predicting the next gaze angle and applying the prediction it can, depending on the conditions, move on to the image-mapping step immediately.
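As a simple illustration of this prediction step, the sketch below extrapolates the next gaze yaw from the two most recent marker-based measurements. It is only a minimal sketch assuming a linear motion model; the disclosure does not specify the prediction method, and the function and variable names are placeholders introduced here.

```python
# Minimal sketch of next-gaze-angle prediction (assumed linear model; the
# disclosure does not specify the actual prediction logic of logic unit 110).

def predict_next_yaw(yaw_prev: float, yaw_curr: float,
                     dt_prev: float, dt_next: float) -> float:
    """Linearly extrapolate the next yaw angle in degrees.

    yaw_prev, yaw_curr: yaw measured at the previous and current frames.
    dt_prev: time between those two measurements, in seconds.
    dt_next: time from the current measurement to the predicted frame, in seconds.
    """
    if dt_prev <= 0:
        return yaw_curr                       # no usable history: assume no motion
    rate = (yaw_curr - yaw_prev) / dt_prev    # angular velocity, degrees per second
    return yaw_curr + rate * dt_next


# Example: head turning right at roughly 40 deg/s with a 30 fps cabin camera.
print(predict_next_yaw(yaw_prev=60.0, yaw_curr=61.3, dt_prev=1 / 30, dt_next=1 / 30))
```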
The image collection unit 120 obtains images of the exterior of the vehicle. It collects the exterior images according to the following logic.
The image collection unit 120 obtains front, rear, and side images of the vehicle's surroundings using a camera 121 located at the front of the vehicle, a camera 122 on the left side, a camera 123 on the right side, a camera 124 at the rear, and so on.
Fig. 3 is a reference diagram for explaining the image collection method of the cameras constituting the 3D AVM system of Fig. 1. As illustrated in Fig. 3, the image collection unit 120 uses the cameras 310 to obtain images 320a, 320b, 320c of the vehicle's surroundings, for example an image 320a of the front of the vehicle, an image 320b of the side of the vehicle, and an image 320c of the rear of the vehicle.
The image collection unit 120 merges the obtained camera images to generate a 360-degree image 125 and stores the generated image in an image or video format. When the driver wearing the wearable display device looks to the side, the rear, or elsewhere, the image collection unit 120 presents on the display (monitor) the parts of the scene that are hidden by the vehicle interior and cannot otherwise be seen.
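The sketch below shows one way the four camera frames could be merged into a single 360-degree strip and a gaze-dependent slice extracted from it. This is an illustrative assumption only: the disclosure does not describe the stitching method, and the use of OpenCV and NumPy, the camera ordering, and the 90-degree per-camera field of view are choices made here.

```python
# Sketch of merging four camera frames into a 360-degree strip and cropping the
# slice that corresponds to the driver's gaze. Assumes each camera covers about
# 90 degrees horizontally and that the frames are undistorted and height-aligned.
import numpy as np
import cv2

def build_panorama(front, right, rear, left):
    """Concatenate the four views into one strip; increasing column = increasing yaw."""
    return np.hstack([front, right, rear, left])

def crop_for_yaw(panorama, yaw_deg, out_width=640):
    """Return the horizontal slice of the panorama centered on `yaw_deg`.

    Yaw convention: 0 = vehicle front, positive = toward the driver's right.
    The front camera spans -45..+45 degrees, so column 0 corresponds to -45 degrees.
    """
    w = panorama.shape[1]
    center = int((((yaw_deg + 45.0) % 360.0) / 360.0) * w)
    cols = [(center - out_width // 2 + i) % w for i in range(out_width)]
    return panorama[:, cols]

if __name__ == "__main__":
    # Dummy 90-degree frames standing in for cameras 121-124.
    frames = [np.full((240, 320, 3), shade, np.uint8) for shade in (50, 100, 150, 200)]
    pano = build_panorama(*frames)
    view = crop_for_yaw(pano, yaw_deg=120.0)   # driver looking over the right shoulder
    cv2.imwrite("view.png", view)
```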
The additional-information overlay logic unit 130 converts the image generated by the image collection unit 120 based on the viewing angle measured by the viewing-angle measurement logic unit 110. It composes the display image according to the position of the driver's eyes (131).
The wearable-display image transmission unit 140 outputs the image converted by the additional-information overlay logic unit 130 to the display (141). It also collects the images output to the display as vehicle images and stores them (142).
By combining with augmented-reality functions, the 3D AVM system 100 described above can provide not only a simple augmented-reality data service for the front but also an augmented-reality image service for the sides and rear, because it eliminates the blind spots around the driver.
Furthermore, the 3D AVM system 100 allows the driver to intuitively check the surrounding images when reversing into a parking space.
Furthermore, the 3D AVM system 100 implements a 360-degree image collection function using the cameras mounted on the vehicle and can therefore replace a black-box (dashcam) vision system.
Furthermore, because the 3D AVM system 100 uses four markers to measure the driver's viewing angle, it enables fast position tracking and effectively reduces the sensor IC cost of the wearable device.
Fig. 2 is a block diagram of the wearable display device constituting the 3D AVM system of Fig. 1.
The method of identifying the driver's gaze using the glasses-type wearable display device 200 is as follows.
Four markers 201, 202, 203, 204 are attached to the front of the glasses 200. First, a camera that captures the driver's face is installed inside the vehicle cabin and obtains an image of the four markers attached to the glasses.
Next, the angles and sizes of the four markers 201, 202, 203, 204 are measured to calculate the driver's gaze angle, the position coordinates of the face, and so on. The markers are given a specific color to improve recognition during image processing; here, a specific color means a color that can be distinguished from the surroundings.
Then, the positions of the four points and the shape of the eyeglass frame within the quadrilateral region 205 are used to calculate the driver's distance, rotation angle, and so on.
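A minimal sketch of how the four colored markers could be detected in the cabin-camera image and used to recover the distance and rotation of the glasses is shown below. It assumes OpenCV, a calibrated cabin camera, and a known marker layout on the eyeglass frame; the HSV thresholds, marker spacing, and choice of OpenCV are assumptions for illustration and are not taken from the disclosure.

```python
# Sketch: detect four colored markers on the glasses in the cabin-camera image
# and estimate the distance and rotation of the glasses from their centroids.
import numpy as np
import cv2

# 3D positions of markers 201-204 on the eyeglass frame, in meters, expressed in
# the glasses' own coordinate system (the spacing values are illustrative only).
MARKERS_3D = np.array([
    [-0.07,  0.02, 0.0],   # upper-left corner of the frame
    [ 0.07,  0.02, 0.0],   # upper-right
    [ 0.07, -0.02, 0.0],   # lower-right
    [-0.07, -0.02, 0.0],   # lower-left
], dtype=np.float32)

def find_marker_centers(bgr, hsv_lo=(35, 80, 80), hsv_hi=(85, 255, 255)):
    """Threshold a distinctive marker color (green here) and return up to four centroids."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in sorted(contours, key=cv2.contourArea, reverse=True)[:4]:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centers.append([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    return np.array(centers, dtype=np.float32)

def estimate_glasses_pose(centers_2d, camera_matrix, dist_coeffs):
    """Estimate rotation and distance of the glasses from the four marker centroids.

    centers_2d must be ordered to correspond row-for-row with MARKERS_3D.
    """
    if centers_2d.shape != (4, 2):
        return None                                   # fewer than four markers found
    ok, rvec, tvec = cv2.solvePnP(MARKERS_3D, centers_2d, camera_matrix, dist_coeffs)
    if not ok:
        return None
    yaw_approx = float(np.degrees(rvec.ravel()[1]))   # rough yaw proxy; a full solution
    distance = float(np.linalg.norm(tvec))            # would convert rvec via cv2.Rodrigues
    return yaw_approx, distance
```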
Fig. 4 is a flowchart illustrating the operating method of the 3D AVM system according to an embodiment of the present invention. The following description refers to Fig. 1 and Fig. 4.
First, the image collection unit 120 obtains images of the vehicle's surroundings and stores 360-degree image information (S405).
Next, the marker-position calculation unit constituting the viewing-angle measurement logic unit 110 collects the marker position information of the wearable device worn by the driver, and the marker storage unit stores this marker position information (S410).
Next, the marker-position calculation unit calculates marker position information including the initial positions of the markers mounted on the wearable device (S415).
Next, based on the marker position information, the marker-position calculation unit calculates relative information including the current position and current angle of the wearable device (S420).
Next, the wearable-device position calculation unit constituting the viewing-angle measurement logic unit 110 calculates the driver's viewing angle (S425).
Next, the wearable-device position calculation unit determines from the driver's viewing angle whether the driver's gaze direction is to the side or rear (S430).
If the driver's gaze direction is determined to be to the side or rear, the additional-information overlay logic unit 130 matches the surrounding image corresponding to the viewing angle (S435) and then outputs the surrounding image to the wearable display unit (S440).
Otherwise, if the driver's gaze direction is determined not to be to the side or rear, the marker next-step position estimation unit constituting the additional-information overlay logic unit 130 estimates the expected next positions of the markers from the marker position information (S445).
The marker next-step position estimation unit then determines whether the driver's gaze direction will be to the side or rear in the next step (S450).
If the gaze direction in the next step is determined to be to the side or rear, the marker next-step position estimation unit waits until the next camera image is obtained (S460) and then moves directly to the image-matching step (S435), without performing an unnecessary wearable-position calculation.
Otherwise, if the gaze direction in the next step is determined not to be to the side or rear, the marker next-step position estimation unit stores the surrounding image in the repository (S455).
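To make the flow of steps S405 to S460 easier to follow, the sketch below mirrors the decision structure of Fig. 4. All of the function names are hypothetical placeholders assumed here and do not correspond to disclosed interfaces; the stub wiring at the bottom exists only so the flow can be run end to end.

```python
# Sketch of the control flow of Fig. 4 (S405-S460), wired with placeholders
# only to show the order of the steps and the two gaze checks.

SIDE_OR_REAR_DEG = 90.0  # gaze beyond +/-90 deg of the vehicle front counts as side/rear

def is_side_or_rear(yaw_deg: float) -> bool:
    return abs(yaw_deg) > SIDE_OR_REAR_DEG

def avm_step(get_surround, get_yaw, predict_next_yaw, match_view, show, save):
    surround = get_surround()                       # S405: store 360-degree image
    yaw = get_yaw()                                 # S410-S425: markers -> viewing angle

    if is_side_or_rear(yaw):                        # S430
        show(match_view(surround, yaw))             # S435, S440
        return "displayed"

    if is_side_or_rear(predict_next_yaw(yaw)):      # S445, S450: predicted next gaze
        # S460: wait for the next frame, then go straight to matching (S435)
        show(match_view(surround, yaw))
        return "displayed (predicted)"

    save(surround)                                  # S455
    return "saved"

if __name__ == "__main__":
    # Stub wiring just to exercise the flow; a yaw of 120 degrees means the
    # driver is looking over the right shoulder.
    result = avm_step(
        get_surround=lambda: "360-degree-image",
        get_yaw=lambda: 120.0,
        predict_next_yaw=lambda yaw: yaw,
        match_view=lambda img, yaw: f"{img}@{yaw:.0f}deg",
        show=print,
        save=print,
    )
    print(result)
```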
An example of the present invention has been described above with reference to Figs. 1 to 4. The preferred configuration of the present invention that can be derived from this example is described below.
An apparatus for controlling the output of an external image of a vehicle according to a preferred embodiment of the invention comprises an image acquisition unit, a viewing-angle calculation unit, a target-image generation unit, an image display unit, a power supply unit, and a main control unit.
The power supply unit supplies power to each component of the apparatus, and the main control unit controls the overall operation of each component. Considering that the apparatus is installed in a vehicle, the power supply unit and the main control unit may be omitted in the present embodiment.
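The sketch below outlines how the four mandatory units described in detail below could be organized in software. It is only an illustrative skeleton assumed for this description; the class and method names are not taken from the disclosure, and the method bodies are placeholders.

```python
# Illustrative skeleton of the apparatus: the four mandatory units and a
# controller that wires them together. Method bodies are placeholders only.
from typing import Any, List, Optional

class ImageAcquisitionUnit:
    def capture(self) -> List[Any]:
        """Return the latest exterior frames (e.g. from the four vehicle cameras)."""
        raise NotImplementedError

class ViewingAngleCalculationUnit:
    def calculate(self) -> float:
        """Return the driver's viewing angle in degrees, derived from the glasses markers."""
        raise NotImplementedError

class TargetImageGenerationUnit:
    def generate(self, frames: List[Any], yaw_deg: float) -> Optional[Any]:
        """Extract the view(s) for the given angle and compose the output target image."""
        raise NotImplementedError

class ImageDisplayUnit:
    def show(self, view: Any) -> None:
        """Display the output target image on the wearable device."""
        raise NotImplementedError

class ExteriorImageController:
    def __init__(self, acquisition, angle, generation, display):
        self.acquisition = acquisition
        self.angle = angle
        self.generation = generation
        self.display = display

    def run_once(self) -> None:
        frames = self.acquisition.capture()
        yaw = self.angle.calculate()
        view = self.generation.generate(frames, yaw)
        if view is not None:           # e.g. nothing to show while looking forward
            self.display.show(view)
```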
The image acquisition unit obtains images of the exterior of the vehicle. It corresponds to the image collection unit 120 of Fig. 1 and Fig. 4.
As the images of the exterior of the vehicle, the image acquisition unit may obtain images of all directions other than the front of the vehicle, or images of all directions of the vehicle.
The viewing-angle calculation unit calculates the driver's viewing angle using the markers attached to the wearable device worn by the driver. It corresponds to the viewing-angle measurement logic unit 110 of Fig. 1 and Fig. 4.
The viewing-angle calculation unit may use a glasses-type device as the wearable device. The wearable device in the present embodiment may be implemented as a glasses-type wearable device such as smart glasses.
As the markers, the viewing-angle calculation unit may use markers attached to the upper ends of both eyeglass frames and markers attached to the lower ends of both eyeglass frames.
The viewing-angle calculation unit calculates the driver's viewing angle based on the position and movement direction of the driver's eyes obtained from the markers and on the positions of the markers obtained by the camera.
Using the markers, the viewing-angle calculation unit tracks the movement of the driver's eyes in real time within a range of 90 degrees to the left and right of the front of the vehicle. When it determines that the driver's gaze exceeds 90 degrees to the left or 90 degrees to the right relative to the front of the vehicle (0 degrees), it judges that the driver is looking at the side or rear of the vehicle and no longer tracks the movement of the driver's eyes.
As the markers, the viewing-angle calculation unit may also use a marker attached to the left end of the eyeglass frame and a marker attached to the right end of the eyeglass frame.
Based on the driver's viewing angle, the target-image generation unit extracts an image of at least one direction from the images of the exterior of the vehicle and generates it as the output target image. It corresponds to the additional-information overlay logic unit 130 of Fig. 1 and Fig. 4.
When the target-image generation unit extracts an image of at least one direction from the images of the exterior of the vehicle, it may generate the output target image as a stereoscopic image. When it extracts images of at least two directions, it may generate the output target image as a panoramic image.
The target-image generation unit may generate the output target image when it determines that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right relative to the front of the vehicle.
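As a worked illustration of the selection logic just described, the sketch below produces an output only when the viewing angle exceeds 90 degrees to the left or right and chooses the output form from the number of extracted directions. The data structures are simple placeholders assumed here; image construction itself is stubbed out.

```python
# Sketch of the output-image decision: generate only when the gaze exceeds
# +/-90 deg of the vehicle front; one extracted direction -> stereoscopic image,
# two or more -> panoramic image.
from typing import List, Optional

def generate_output_image(yaw_deg: float, direction_frames: List[str]) -> Optional[dict]:
    if abs(yaw_deg) <= 90.0:
        return None                      # driver still looking forward: nothing to output
    if not direction_frames:
        return None
    if len(direction_frames) == 1:
        kind = "stereoscopic"            # single direction -> stereoscopic view
    else:
        kind = "panoramic"               # multiple directions -> stitched panorama
    return {"kind": kind, "sources": direction_frames, "yaw_deg": yaw_deg}

print(generate_output_image(120.0, ["right"]))
print(generate_output_image(150.0, ["right", "rear"]))
print(generate_output_image(30.0, ["front"]))
```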
The image display unit displays the output target image on the wearable device. It corresponds to the wearable-display image transmission unit 140 of Fig. 1 and Fig. 4.
As the wearable device, the image display unit may use a glasses-type device.
Next, the operating method of the apparatus for controlling the output of an external image of a vehicle is described.
First, images of the exterior of the vehicle are obtained.
Next, the driver's viewing angle is calculated using the markers attached to the wearable device worn by the driver.
Next, based on the driver's viewing angle, an image of at least one direction is extracted from the images of the exterior of the vehicle and generated as the output target image.
Finally, the output target image is displayed on the wearable device.
Although all the components of the embodiments of the invention described above may be combined into one or operated in combination, the invention is not limited to such embodiments; within the scope of the invention, one or more of the components may be selectively combined and operated. Each component may be implemented as independent hardware, or some or all of the components may be selectively combined and implemented as a computer program having program modules that perform some or all of the functions of one or more pieces of combined hardware. Such a computer program may be stored on a computer-readable medium such as a USB memory, a CD, or a flash memory, and read and executed by a computer to implement embodiments of the invention. Recording media for the computer program include magnetic recording media, optical recording media, and carrier-wave recording media.
Unless otherwise defined in the detailed description, all terms used herein, including technical and scientific terms, have the same meanings as generally understood by a person of ordinary skill in the art to which the invention belongs. Terms in general use, such as those defined in dictionaries, should be interpreted as being consistent with their contextual meaning in the related art and, unless expressly defined in the present invention, should not be interpreted in an idealized or overly formal sense.
The above description merely illustrates the technical idea of the present invention, and a person of ordinary skill in the art may make various modifications, changes, and substitutions without departing from the essential characteristics of the invention. Accordingly, the embodiments and drawings disclosed herein are intended to illustrate, not to limit, the technical idea of the invention, and the scope of that technical idea is not restricted by these embodiments and drawings. The scope of protection of the invention should be construed according to the following claims, and all technical ideas within their equivalents should be construed as falling within the scope of the invention.
Claims (15)
1. An apparatus for controlling the output of an external image of a vehicle, characterized by comprising:
an image acquisition unit that obtains images of the exterior of the vehicle;
a viewing-angle calculation unit that calculates the viewing angle of the driver using markers attached to a wearable device worn by the driver;
a target-image generation unit that, based on the viewing angle, extracts an image of at least one direction from the images of the exterior of the vehicle and generates it as an output target image; and
an image display unit that displays the output target image on the wearable device.
2. The apparatus for controlling the output of an external image of a vehicle according to claim 1, characterized in that
the image acquisition unit obtains, as the images of the exterior of the vehicle, images of all directions other than the front of the vehicle or images of all directions of the vehicle.
3. The apparatus for controlling the output of an external image of a vehicle according to claim 1, characterized in that
the viewing-angle calculation unit and the image display unit use a glasses-type device as the wearable device.
4. The apparatus for controlling the output of an external image of a vehicle according to claim 3, characterized in that
the viewing-angle calculation unit uses, as the markers, markers attached to the upper ends of both eyeglass frames and markers attached to the lower ends of both eyeglass frames.
5. The apparatus for controlling the output of an external image of a vehicle according to claim 1, characterized in that
the viewing-angle calculation unit calculates the driver's viewing angle based on the position and movement direction of the driver's eyes obtained from the markers and on the positions of the markers obtained by a camera.
6. The apparatus for controlling the output of an external image of a vehicle according to claim 1, characterized in that
the viewing-angle calculation unit uses the markers to track the movement of the driver's eyes in real time within a range of 90 degrees to the left and right of the front of the vehicle.
7. The apparatus for controlling the output of an external image of a vehicle according to claim 4, characterized in that
the viewing-angle calculation unit further uses, as the markers, a marker attached to the left end of the eyeglass frame and a marker attached to the right end of the eyeglass frame.
8. The apparatus for controlling the output of an external image of a vehicle according to claim 1, characterized in that
the target-image generation unit generates the output target image as a stereoscopic image, and, after extracting images of at least two directions from the images of the exterior of the vehicle, generates the output target image as a panoramic image.
9. The apparatus for controlling the output of an external image of a vehicle according to claim 1, characterized in that
the target-image generation unit generates the output target image when it determines that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right relative to the front of the vehicle.
10. A method for controlling the output of an external image of a vehicle, characterized by comprising:
obtaining images of the exterior of the vehicle;
calculating the viewing angle of the driver using markers attached to a wearable device worn by the driver;
extracting, based on the viewing angle, an image of at least one direction from the images of the exterior of the vehicle and generating it as an output target image; and
displaying the output target image on the wearable device.
11. The method for controlling the output of an external image of a vehicle according to claim 10, characterized in that
the calculating step and the displaying step use a glasses-type device as the wearable device.
12. The method for controlling the output of an external image of a vehicle according to claim 11, characterized in that
the calculating step uses, as the markers, markers attached to the upper ends of both eyeglass frames and markers attached to the lower ends of both eyeglass frames.
13. The method for controlling the output of an external image of a vehicle according to claim 10, characterized in that
the calculating step uses the markers to track the movement of the driver's eyes in real time within a range of 90 degrees to the left and right of the front of the vehicle.
14. The method for controlling the output of an external image of a vehicle according to claim 10, characterized in that
the generating step generates the output target image as a stereoscopic image, and, after extracting images of at least two directions from the images of the exterior of the vehicle, generates the output target image as a panoramic image.
15. The method for controlling the output of an external image of a vehicle according to claim 10, characterized in that
the generating step generates the output target image when it is determined that the viewing angle exceeds 90 degrees to the left or 90 degrees to the right relative to the front of the vehicle.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020140142393A KR102270577B1 (en) | 2014-10-21 | 2014-10-21 | Apparatus and method for controlling outputting external image of vehicle |
KR10-2014-0142393 | 2014-10-21 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105522971A true CN105522971A (en) | 2016-04-27 |
CN105522971B CN105522971B (en) | 2018-02-23 |
Family
ID=55765621
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510684490.6A Active CN105522971B (en) | 2014-10-21 | 2015-10-20 | Image outside vehicle output-controlling device and method |
Country Status (2)
Country | Link |
---|---|
KR (1) | KR102270577B1 (en) |
CN (1) | CN105522971B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105974586A (en) * | 2016-05-12 | 2016-09-28 | 上海擎感智能科技有限公司 | Intelligent glasses and operating method and system therefor |
CN106339980A (en) * | 2016-08-22 | 2017-01-18 | 乐视控股(北京)有限公司 | Automobile-based VR display device and method and automobile |
CN107009962A (en) * | 2017-02-23 | 2017-08-04 | 杭州电子科技大学 | A kind of panorama observation procedure based on gesture recognition |
CN107745713A (en) * | 2017-09-25 | 2018-03-02 | 南京律智诚专利技术开发有限公司 | A kind of method of work of the vehicle automatic running system based on environmental analysis |
CN109624857A (en) * | 2018-11-30 | 2019-04-16 | 沈阳工业大学 | Support the night vision automobile outer rear-view mirror and implementation method of lane change early warning |
CN111169382A (en) * | 2018-11-13 | 2020-05-19 | 丰田自动车株式会社 | Driving support device, driving support system, driving support method, and program |
JP7548196B2 (en) | 2021-11-22 | 2024-09-10 | トヨタ自動車株式会社 | Image Display System |
JP7556344B2 (en) | 2021-11-22 | 2024-09-26 | トヨタ自動車株式会社 | Image Display System |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108556738A (en) * | 2018-03-30 | 2018-09-21 | 深圳市元征科技股份有限公司 | The display device and method of automobile A-column blind area |
CN112884941A (en) * | 2021-01-19 | 2021-06-01 | 中国人民解放军32212部队 | A on-vehicle information acquisition system for tank armoured vehicle combat test |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1890128A (en) * | 2003-12-01 | 2007-01-03 | 沃尔沃技术公司 | Method and system for supporting path control |
CN101277432A (en) * | 2007-03-26 | 2008-10-01 | 爱信艾达株式会社 | Driving support method and driving support apparatus |
CN102998797A (en) * | 2011-09-07 | 2013-03-27 | 奥迪股份公司 | Method used for providing display in motor vehicle according to visual direction of vehicle driver |
WO2013136740A1 (en) * | 2012-03-14 | 2013-09-19 | 株式会社デンソー | Driving assistance device and driving assistance method |
CN103358996A (en) * | 2013-08-13 | 2013-10-23 | 吉林大学 | Automobile A pillar perspective vehicle-mounted display device |
CN103802728A (en) * | 2013-11-08 | 2014-05-21 | 广东好帮手电子科技股份有限公司 | Driving assistance system and method based on head-mounted display and used for automatic scene selection for display |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5483027B2 (en) * | 2011-02-04 | 2014-05-07 | 国際航業株式会社 | 3D image measurement method and 3D image measurement apparatus |
KR101332253B1 (en) * | 2011-12-13 | 2013-12-02 | 현대자동차주식회사 | Display Appratus for Doesn't Making A Blind Zone |
KR20140054926A (en) * | 2012-10-30 | 2014-05-09 | 현대모비스 주식회사 | Display apparatus for rear side view of vehicle and vehicle having the same |
KR20140079947A (en) * | 2012-12-20 | 2014-06-30 | 한국전자통신연구원 | Video recording apparatus for a vehicle and Method of video recording for a vehicle |
- 2014-10-21: Application KR1020140142393A filed in KR (patent KR102270577B1, active, IP Right Grant)
- 2015-10-20: Application CN201510684490.6A filed in CN (patent CN105522971B, active)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1890128A (en) * | 2003-12-01 | 2007-01-03 | 沃尔沃技术公司 | Method and system for supporting path control |
CN101277432A (en) * | 2007-03-26 | 2008-10-01 | 爱信艾达株式会社 | Driving support method and driving support apparatus |
CN102998797A (en) * | 2011-09-07 | 2013-03-27 | 奥迪股份公司 | Method used for providing display in motor vehicle according to visual direction of vehicle driver |
WO2013136740A1 (en) * | 2012-03-14 | 2013-09-19 | 株式会社デンソー | Driving assistance device and driving assistance method |
CN103358996A (en) * | 2013-08-13 | 2013-10-23 | 吉林大学 | Automobile A pillar perspective vehicle-mounted display device |
CN103802728A (en) * | 2013-11-08 | 2014-05-21 | 广东好帮手电子科技股份有限公司 | Driving assistance system and method based on head-mounted display and used for automatic scene selection for display |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105974586A (en) * | 2016-05-12 | 2016-09-28 | 上海擎感智能科技有限公司 | Intelligent glasses and operating method and system therefor |
CN106339980A (en) * | 2016-08-22 | 2017-01-18 | 乐视控股(北京)有限公司 | Automobile-based VR display device and method and automobile |
CN107009962A (en) * | 2017-02-23 | 2017-08-04 | 杭州电子科技大学 | A kind of panorama observation procedure based on gesture recognition |
CN107009962B (en) * | 2017-02-23 | 2019-05-14 | 杭州电子科技大学 | A kind of panorama observation method based on gesture recognition |
CN107745713A (en) * | 2017-09-25 | 2018-03-02 | 南京律智诚专利技术开发有限公司 | A kind of method of work of the vehicle automatic running system based on environmental analysis |
CN111169382A (en) * | 2018-11-13 | 2020-05-19 | 丰田自动车株式会社 | Driving support device, driving support system, driving support method, and program |
CN111169382B (en) * | 2018-11-13 | 2024-01-09 | 丰田自动车株式会社 | Driving support device, driving support system, driving support method, and program |
CN109624857A (en) * | 2018-11-30 | 2019-04-16 | 沈阳工业大学 | Support the night vision automobile outer rear-view mirror and implementation method of lane change early warning |
CN109624857B (en) * | 2018-11-30 | 2022-03-29 | 沈阳工业大学 | Night-vision automobile outside rearview mirror supporting lane change early warning and implementation method |
JP7548196B2 (en) | 2021-11-22 | 2024-09-10 | トヨタ自動車株式会社 | Image Display System |
JP7556344B2 (en) | 2021-11-22 | 2024-09-26 | トヨタ自動車株式会社 | Image Display System |
Also Published As
Publication number | Publication date |
---|---|
KR102270577B1 (en) | 2021-06-29 |
CN105522971B (en) | 2018-02-23 |
KR20160046480A (en) | 2016-04-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105522971A (en) | Apparatus and method for controlling outputting of external image of vehicle | |
JP6659924B2 (en) | Adjusting the presentation of the head-mounted display | |
US11176704B2 (en) | Object pose estimation in visual data | |
US11783443B2 (en) | Extraction of standardized images from a single view or multi-view capture | |
CN103262127B (en) | Object display device and object display method | |
CN106066701B (en) | A kind of AR and VR data processing equipment and method | |
CN109074681A (en) | Information processing unit, information processing method and program | |
CN103871045B (en) | Display system and method | |
US20200258309A1 (en) | Live in-camera overlays | |
CN109118532B (en) | Visual field depth estimation method, device, equipment and storage medium | |
CN109727271A (en) | Method and apparatus for tracking object | |
JP2005268847A (en) | Image generating apparatus, image generating method, and image generating program | |
CN105210113A (en) | Monocular visual SLAM with general and panorama camera movements | |
KR20140080720A (en) | Augmented Reality imaging based sightseeing guide apparatus | |
CN104864849B (en) | Vision navigation method and device and robot | |
CN105210009A (en) | Display control device, display control method, and recording medium | |
CN106031166B (en) | Vehicle-surroundings image display device and vehicle-surroundings image display method | |
CN108510528A (en) | A kind of method and device of visible light and infrared image registration fusion | |
JP6061334B2 (en) | AR system using optical see-through HMD | |
CN114463832B (en) | Point cloud-based traffic scene line of sight tracking method and system | |
EP3038061A1 (en) | Apparatus and method to display augmented reality data | |
CN112513784B (en) | Data glasses for vehicles with automatic hiding display content | |
CN113033426A (en) | Dynamic object labeling method, device, equipment and storage medium | |
CN102740100A (en) | Display control device, display control method, and program | |
US20220114748A1 (en) | System and Method for Capturing a Spatial Orientation of a Wearable Device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |