CN102138163B - Bird's-eye image forming device, bird's-eye image forming method - Google Patents


Info

Publication number
CN102138163B
CN102138163B (application CN200980133675.0A)
Authority
CN
China
Prior art keywords
point group
point
orthograph
eye view
cloud
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200980133675.0A
Other languages
Chinese (zh)
Other versions
CN102138163A (en)
Inventor
吉田光伸
宫雅一
岛嘉宏
泷口纯一
黑崎隆二郎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of CN102138163A
Application granted granted Critical
Publication of CN102138163B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/08: Volume rendering
    • G06T 15/10: Geometric effects
    • G06T 15/30: Clipping
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05: Geographic models
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B 29/12: Relief maps

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Remote Sensing (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Processing Or Creating Images (AREA)
  • Instructional Devices (AREA)
  • Image Processing (AREA)
  • Navigation (AREA)
  • Image Analysis (AREA)

Abstract

Disclosed are a bird's-eye image forming device, a bird's-eye image forming method and a bird's-eye image forming program that produce a road image from which occluding objects such as trees and tunnels have been removed. A distance-azimuth point group (291), camera images (292), GPS observation information (293), gyroscope measurement values (294) and odometer measurement values (295) are acquired while a vehicle carrying a mobile measuring device (200) travels through the target area. A position-attitude localizing device (300) calculates the position and attitude of the vehicle from the GPS observation information (293), the gyroscope measurement values (294) and the odometer measurement values (295). A point cloud generating device (400) generates a point cloud (491) from the camera images (292), the distance-azimuth point group (291) and the position-attitude localization values (391). A point cloud orthoimage generating device (100) removes points higher than the road surface from the point cloud (491), leaving only points near the road surface, and orthographically projects the extracted points onto a horizontal plane to generate a point cloud orthoimage (191). The point cloud orthoimage (191) shows the road surface without occlusion.

Description

Bird's-eye image generating apparatus and bird's-eye image generating method
Technical field
The present invention relates to a bird's-eye image generating apparatus and a bird's-eye image generating method that generate a road orthoimage from a colored laser point cloud.
Background art
The three-dimensional shape of a measured object is restored from a laser point group, measured by a laser scanner, that represents distances and azimuths. Because the restored shape becomes more accurate as more laser points are used, a large number of laser points is acquired.
However, the laser point group also contains points measured on objects other than the restoration target, so the laser points that actually measured the target must be extracted from the large set.
Conventionally, laser points have been extracted by the following methods.
(1) View the laser point group in a three-dimensional viewer and extract the necessary points visually.
(2) Overlay the laser point group on a camera image to aid identification of objects, and extract the necessary points visually.
Method (1) has, for example, the following problems.
(A) The laser points to extract must be specified one by one.
(B) The extracted laser point group cannot be used directly in CAD (Computer Aided Design).
Method (2) has, for example, the following problems.
(A) Objects can be identified only from the laser points within the camera's field of view.
(B) Selecting a suitable camera image takes time.
(C) It is hard to grasp where the object being identified is located.
Both methods require extracting the necessary points one by one by visual inspection, which is fundamentally time-consuming. Automatic recognition techniques are under development, but the objects they can recognize are limited and their recognition rates are insufficient, so visual correction is still needed.
Prior art documents
Patent documents
Patent Document 1: Japanese Patent Laid-Open No. 2007-218705.
Summary of the invention
Problems to be solved by the invention
An object of the present invention is to exclude unnecessary points from the large number of acquired laser points and to extract only the necessary laser points efficiently.
Means for solving the problems
A bird's-eye image generating apparatus of the present invention generates a bird's-eye image of the ground using a three-dimensional point cloud whose points each represent the three-dimensional coordinates of a place on the ground. The apparatus comprises a three-dimensional point group projecting unit that, based on the three-dimensional coordinates represented by each point of the three-dimensional point cloud, uses a CPU (Central Processing Unit) to project each point of the point cloud onto a plane and thereby generate the bird's-eye image of the ground.
The bird's-eye image generating apparatus further comprises a specified-height point group extracting unit that, based on the three-dimensional coordinates represented by each point, uses a CPU to extract from the three-dimensional point cloud, as a specified-height point group, the points whose height lies within a prescribed range. The three-dimensional point group projecting unit then, based on the three-dimensional coordinates of each point of the specified-height point group extracted by the specified-height point group extracting unit, uses a CPU to project each point of the specified-height point group onto the plane and generate the bird's-eye image.
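As an illustration of the specified-height extraction, the following minimal sketch filters a point array by height band. It is not the patented implementation; the NumPy array layout and the single scalar ground level `ground_z` are assumptions made for this example (the patent estimates the ground level per area).

```python
import numpy as np

def extract_height_band(points, ground_z, z_min, z_max):
    """Keep only points whose height above the ground level lies in [z_min, z_max).

    points: (N, 3) array of x, y, z coordinates.
    ground_z: assumed scalar ground level (a simplification).
    """
    height = points[:, 2] - ground_z
    mask = (height >= z_min) & (height < z_max)
    return points[mask]

cloud = np.array([
    [0.0, 0.0, 0.10],   # road surface point
    [1.0, 0.0, 0.30],   # curb point
    [1.0, 1.0, 3.00],   # tree canopy point
])
# Keep only points within 50 cm of the ground, discarding the canopy point.
near_road = extract_height_band(cloud, ground_z=0.0, z_min=0.0, z_max=0.5)
```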
The bird's-eye image generating apparatus further comprises: a point density calculating unit that divides the plane into regions of a prescribed size and uses a CPU to calculate, for each region, the density of the points of the three-dimensional point cloud projected onto the plane by the three-dimensional point group projecting unit; a standing object specifying unit that, based on the point densities calculated by the point density calculating unit, uses a CPU to specify, within the bird's-eye image, the image portions showing standing objects; and a standing object distinguishing unit that uses a CPU to generate a bird's-eye image in which the image portions specified by the standing object specifying unit are distinguished from the other image portions.
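A minimal sketch of the density-based standing-object idea: projected points are binned into grid cells on the plane, and cells whose count exceeds a threshold are flagged, since a vertical object such as a pole stacks many points over one spot. The cell size and threshold below are assumed example values, not figures from the patent.

```python
from collections import Counter
import math

def grid_density(points_xy, cell_size):
    """Count how many projected points fall in each cell of a horizontal grid."""
    cells = [(math.floor(x / cell_size), math.floor(y / cell_size))
             for x, y in points_xy]
    return Counter(cells)

# A vertical pole stacks many points over one spot; a flat road spreads them out.
pole = [(0.05, 0.05)] * 30
road = [(1.0 + 0.5 * i, 0.0) for i in range(10)]
density = grid_density(pole + road, cell_size=0.25)
standing_cells = [cell for cell, n in density.items() if n >= 10]  # threshold assumed
```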
Effects of the invention
According to the present invention, the laser point group representing the road surface (the specified-height point group) can be extracted without visual inspection, and a bird's-eye image of the road from which coverings such as tunnels and trees have been removed can be generated.
Likewise, the laser point group representing standing objects such as utility poles can be extracted without visual inspection, and a bird's-eye image in which standing objects are shown distinguished from the road surface can be generated.
Brief description of the drawings
Fig. 1 is a configuration diagram of the point cloud orthoimage generation system 800 of Embodiment 1.
Fig. 2 shows the external appearance of the mobile measuring device 200 of Embodiment 1.
Fig. 3 shows an example of the hardware resources of the point cloud orthoimage generating apparatus 100 of Embodiment 1.
Fig. 4 is a flowchart of the point cloud orthoimage generation method of Embodiment 1.
Fig. 5 is a road map of the area traveled by the mobile measuring device 200 (the target area).
Fig. 6 is the point cloud orthoimage 191 of the target area (Fig. 5).
Fig. 7 is an example of a bird's-eye image rendered from the point cloud 491.
Fig. 8 is an example of a bird's-eye image rendered from the point cloud 491.
Fig. 9 is a configuration diagram of the point cloud orthoimage generating apparatus 100 of Embodiment 2.
Fig. 10 is a flowchart of the point cloud orthoimage generation processing (S140) of Embodiment 2.
Fig. 11 is the point cloud orthoimage 191 of target area b (Fig. 6) obtained by orthographically projecting the specified-height point group 129a whose height above the ground level 139a is 50 cm or less.
Fig. 12 is the point cloud orthoimage 191 of target area b (Fig. 6) obtained by orthographically projecting the specified-height point group 129a whose height above the ground level 139a is 50 cm or less.
Fig. 13 is a configuration diagram of the point cloud orthoimage generating apparatus 100 of Embodiment 3.
Fig. 14 is a flowchart of the point cloud orthoimage generation processing (S140) of Embodiment 3.
Fig. 15 is the point cloud orthoimage 191 of target area b (Fig. 6) obtained by orthographically projecting the specified-height point group 129a whose height above the ground level 139a is 50 cm or more.
Fig. 16 is a partial enlarged view of target area b.
Fig. 17 is the point cloud orthoimage 191 of target area a (Fig. 6) obtained by orthographically projecting the specified-height point group 129a whose height above the ground level 139a is 50 cm or more.
Fig. 18 is a partial enlarged view of target area a.
Fig. 19 illustrates the ground level 139a specifying method of Embodiment 4 (Example 1).
Fig. 20 illustrates the ground level 139a specifying method of Embodiment 4 (Example 2).
Fig. 21 illustrates the curb point group specifying method of Embodiment 4 (Example 2).
Fig. 22 shows a screen displaying an image of the point cloud 491 representing a road and the curbs on its left and right.
Fig. 23 is a flowchart of the curb point group specifying method of Embodiment 4 (Example 2).
Fig. 24 shows the curb point group specified by the curb point group specifying method of Embodiment 4 (Example 2).
Fig. 25 illustrates the ground level 139a specifying method of Embodiment 4 (Example 3(1)).
Fig. 26 illustrates the ground level 139a specifying method of Embodiment 4 (Example 3(2)).
Fig. 27 is a configuration diagram of the map data generation system 801 of Embodiment 5.
Embodiment
Embodiment 1
Embodiment 1 describes a bird's-eye image generating apparatus that generates a bird's-eye image of the ground using a three-dimensional point cloud whose points each represent the three-dimensional coordinates of a place on the ground.
Fig. 1 is a configuration diagram of the point cloud orthoimage generation system 800 of Embodiment 1.
The configuration of the point cloud orthoimage generation system 800 of Embodiment 1 is described below based on Fig. 1.
The point cloud orthoimage generation system 800 comprises: a mobile measuring device 200; a position-attitude localizing device 300; a point cloud generating device 400; and a point cloud orthoimage generating apparatus 100.
The mobile measuring device 200 is a moving body (for example, a vehicle or an aircraft) equipped with a laser scanner 210, a camera 220, a GPS receiver 230, a gyroscope 240 and an odometer 250.
The mobile measuring device 200 moves on the ground (or through the air) while acquiring the various measurement data from which the three-dimensional point cloud is built.
The laser scanner 210 emits laser light toward each place and observes the laser returned by reflection from the features located there. From the emission direction of the laser it measures the azimuth of each feature, and from the time between emission and observation of the reflected laser it measures the distance to the feature.
A laser scanner is also called a laser radar or a laser range finder (LRF).
Hereinafter, the point group data measured by the laser scanner 210, which represents the distance and azimuth to the feature at each place together with the laser emission direction, is called the "distance-azimuth point group 291".
While the laser scanner 210 measures the distance-azimuth point group 291, the camera 220 photographs the features from the measurement location of the laser scanner 210 (the location of the mobile measuring device 200 at the time of observation).
Hereinafter, the image data obtained by the camera 220 is called the "camera image 292".
While the laser scanner 210 measures the distance-azimuth point group 291, the GPS receiver 230 observes positioning signals transmitted from a plurality of GPS satellites (GPS: Global Positioning System). The GPS receiver 230 obtains the navigation message carried by each positioning signal, the phase of the carrier wave conveying it, the pseudo-range (the distance from the GPS receiver 230 to the satellite, calculated from the signal's transmission time), the positioning result calculated from the pseudo-ranges, and so on.
Hereinafter, the information obtained by the GPS receiver 230 is called the "GPS observation information 293".
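The pseudo-range mentioned above is, in essence, the signal travel time scaled by the speed of light; the measured value also absorbs the receiver clock error, which is why it is called "pseudo". A toy sketch (the travel time is an assumed example value):

```python
C = 299_792_458.0  # speed of light in m/s

def pseudo_range(transmit_time_s, receive_time_s):
    """Distance implied by the signal travel time; the receiver clock error
    is folded into the result, hence 'pseudo-range'."""
    return C * (receive_time_s - transmit_time_s)

rho = pseudo_range(0.0, 0.07)  # ~70 ms of travel time, roughly GPS orbit distance
```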
While the laser scanner 210 measures the distance-azimuth point group 291, the gyroscope 240 measures the angular velocities of the mobile measuring device 200 about its three axes (yaw, pitch, roll).
Hereinafter, the three-axis angular velocities measured by the gyroscope 240 are called the "gyroscope measurement values 294".
While the laser scanner 210 measures the distance-azimuth point group 291, the odometer 250 measures the speed variation of the mobile measuring device 200.
Hereinafter, the speed variation measured by the odometer 250 is called the "odometer measurement values 295".
The measuring device storage unit 290 stores the distance-azimuth point group 291, the camera images 292, the GPS observation information 293, the gyroscope measurement values 294 and the odometer measurement values 295.
Each of these is recorded together with its measurement time, so that the data items can be associated with one another by measurement time.
Fig. 2 shows the external appearance of the mobile measuring device 200 of Embodiment 1.
For example, the mobile measuring device 200 is built on a vehicle 202 as shown in Fig. 2.
The laser scanner 210, camera 220, GPS receivers 230 and gyroscope 240 are fixed to a top plate 201 mounted on the roof of the vehicle 202, and the odometer 250 is installed in the vehicle 202. The laser scanner 210 and the camera 220 may be mounted at the front of the vehicle 202 or at its rear.
The vehicle 202 travels along the roads of the measurement target area.
The laser scanner 210 is mounted at the rear of the vehicle 202 and emits laser light toward the rear and the sides of the vehicle 202 while sweeping through about 240 degrees in the width direction of the vehicle (x-axis direction). It observes the laser reflected back from features behind and beside the vehicle 202, and thereby obtains the distance-azimuth point group 291 measuring the features in the target area.
The camera 220 is mounted at the front of the vehicle 202 and photographs repeatedly in the direction of travel (z-axis direction), obtaining the camera images 292 of the target area.
The GPS receivers 230 are installed at three places on the top plate 201, and each obtains GPS observation information 293 from the positioning signals it receives from GPS satellites.
The gyroscope 240 measures the angular velocities of the vehicle 202 about its x, y and z axes, obtaining the gyroscope measurement values 294.
The odometer 250 counts tire rotations to measure the speed variation of the mobile measuring device 200, obtaining the odometer measurement values 295.
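The odometer's conversion from tire rotation count to speed can be sketched as below; the tire circumference and sampling interval are assumed example values, not specifications from the patent.

```python
def speed_from_rotations(rotations, tire_circumference_m, interval_s):
    """Speed estimated from the tire rotation count over one sampling interval."""
    distance_m = rotations * tire_circumference_m
    return distance_m / interval_s

v = speed_from_rotations(5, 2.0, 1.0)  # 5 rotations of a 2 m tire in 1 s -> 10 m/s
```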
In Fig. 2, point O is the coordinate center of the mobile measuring device 200 (hereinafter called the navigation reference point), and the coordinates of the mobile measuring device 200 mean the coordinates of point O. The displacements from point O to the laser scanner 210, camera 220, GPS receivers 230 and gyroscope 240 (hereinafter called offsets) are measured in advance; adding an offset to the coordinates of point O gives the coordinates of the corresponding sensor.
Hereinafter, for simplicity, the coordinates of the laser scanner 210, camera 220, GPS receiver 230 and gyroscope 240 are assumed to coincide with point O, that is, to equal the coordinates of the mobile measuring device 200.
It is also assumed that the line-of-sight direction of the camera 220 equals the attitude angle of the mobile measuring device 200.
In Fig. 1, the position-attitude localizing device 300 comprises a position-attitude localizing unit 310 and a localizing device storage unit 390, and calculates the position and attitude of the mobile measuring device 200 at measurement time.
Based on the GPS observation information 293, the gyroscope measurement values 294 and the odometer measurement values 295 obtained by the mobile measuring device 200, the position-attitude localizing unit 310 uses a CPU (Central Processing Unit) to calculate the position (latitude, longitude, height [East, North, Up]) and the attitude angles (yaw, pitch, roll) of the mobile measuring device 200 at measurement time.
For example, the position-attitude localizing unit 310 may take the positioning result contained in the GPS observation information 293 as the position of the mobile measuring device 200.
Alternatively, for example, it may calculate pseudo-ranges from the carrier phase contained in the GPS observation information 293 and calculate the position of the mobile measuring device 200 from those pseudo-ranges.
It also calculates, for example, the position and attitude angles of the mobile measuring device 200 by dead reckoning from the gyroscope measurement values 294 and the odometer measurement values 295. Dead reckoning integrates the angular velocity of the attitude angle and the travel speed to obtain the change since some past time, and adds that change to the past position and attitude angle to obtain the present position and attitude angle.
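Dead reckoning as described, reduced to a planar sketch using only yaw rate and speed (a real implementation would integrate all three attitude axes and fuse the result with the GPS solution):

```python
import math

def dead_reckon(x, y, heading, samples, dt):
    """Integrate yaw rate and speed over time, adding each increment to the
    previous position and heading (planar dead reckoning)."""
    for yaw_rate, speed in samples:
        heading += yaw_rate * dt          # attitude change from the gyroscope
        x += speed * math.cos(heading) * dt  # position change from the odometer
        y += speed * math.sin(heading) * dt
    return x, y, heading

# Ten 0.1 s steps at 10 m/s with no rotation: the vehicle should advance 10 m.
x, y, heading = dead_reckon(0.0, 0.0, 0.0, [(0.0, 10.0)] * 10, dt=0.1)
```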
Hereinafter, the position and attitude angles of the mobile measuring device 200 calculated by the position-attitude localizing unit 310 are called the "position-attitude localization values 391". The position-attitude localization values 391 represent the position and attitude angles of the mobile measuring device 200 at each time.
The localizing device storage unit 390 stores the position-attitude localization values 391.
The point cloud generating device 400 comprises a three-dimensional point group generating unit 410, a point cloud generating unit 420 and a point cloud generating device storage unit 490, and generates a three-dimensional point cloud representing the three-dimensional coordinates and color of each place on the ground.
Based on the distance-azimuth point group 291 obtained by the mobile measuring device 200 and the position-attitude localization values 391 calculated by the position-attitude localizing device 300, the three-dimensional point group generating unit 410 uses a CPU to generate a three-dimensional point group 419a. For each point of the distance-azimuth point group 291, it extracts from the position-attitude localization values 391 the position and attitude of the mobile measuring device 200 at that point's measurement time, calculates the three-dimensional coordinates of the place lying at the measured distance and azimuth from that position and attitude, and thereby generates the three-dimensional point group 419a representing the three-dimensional coordinates of each point of the distance-azimuth point group 291.
Based on the three-dimensional point group 419a generated by the three-dimensional point group generating unit 410 and the camera images 292 obtained by the mobile measuring device 200, the point cloud generating unit 420 uses a CPU to generate a point cloud 491. Each point of the point cloud 491 carries a three-dimensional coordinate and a color, so the point cloud is also called a colored laser point group.
To color the points, the point cloud generating unit 420 computes the imaging plane of the camera 220 as the plane orthogonal to the imaging direction (the line of sight of the camera 220) at the focal distance from the photographing location; this imaging plane corresponds to the plane of the camera image 292. Then, based on the three-dimensional coordinates of each point of the three-dimensional point group 419a, it projects each point onto the camera image 292 (the imaging plane) and assigns the point the color of the pixel at the projected position.
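A minimal sketch of assigning each 3D point the color of the camera pixel it projects to, using an ideal pinhole model. The focal length in pixels and the principal point are assumed parameters; the patent's imaging-plane construction from the camera pose is reduced here to points already expressed in camera coordinates.

```python
import numpy as np

def colorize(points_cam, image, focal_px, cx, cy):
    """Assign each 3D point (camera coordinates, z forward) the color of the
    pixel it projects to under a simple pinhole model."""
    colored = []
    h, w, _ = image.shape
    for x, y, z in points_cam:
        if z <= 0:
            continue                           # behind the camera
        u = int(round(focal_px * x / z + cx))  # image column
        v = int(round(focal_px * y / z + cy))  # image row
        if 0 <= u < w and 0 <= v < h:
            colored.append(((x, y, z), tuple(image[v, u])))
    return colored

img = np.zeros((4, 4, 3), dtype=np.uint8)
img[2, 2] = (255, 0, 0)                        # one red pixel at the center
pts = colorize([(0.0, 0.0, 5.0)], img, focal_px=100.0, cx=2.0, cy=2.0)
```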
The point cloud generating device storage unit 490 stores the point cloud 491.
The point cloud orthoimage generating apparatus 100 (an example of a bird's-eye image generating apparatus) comprises a point cloud projecting unit 110 and an image generating apparatus storage unit 190, and generates a bird's-eye image of the target area using the point cloud 491.
The point cloud projecting unit 110 uses a CPU to generate the bird's-eye image of the target area from the point cloud 491 generated by the point cloud generating device 400. It computes the horizontal plane corresponding to the latitude/longitude extent of the target area and orthographically projects each point of the point cloud 491 onto that plane according to its three-dimensional coordinates. That is, as the orthographic projection, the point cloud projecting unit 110 treats the three-dimensional coordinates (x, y, z) of each point as "z (height) = 0" and places each point at the position of the horizontal plane corresponding to its two-dimensional coordinates (x, y).
For example, the point cloud projecting unit 110 assumes a camera shot taken straight downward from a prescribed viewpoint above the target area, computes the corresponding imaging plane, and orthographically projects each point of the point cloud 491 onto it. The prescribed viewpoint takes as its three-dimensional coordinates the latitude/longitude of the center of the measurement area and a prescribed height, and each point of the point cloud 491 is projected to the position in the imaging plane with matching latitude/longitude.
The horizontal-plane orthographic projection of the points of the point cloud 491 represents an image of the measurement area viewed vertically downward from above.
Hereinafter, the bitmap image obtained by orthographically projecting the points of the point cloud 491 onto the horizontal plane is called the "point cloud orthoimage 191" (an example of a bird's-eye image).
However, the plane onto which the points of the point cloud 491 are projected need not be horizontal; it may be inclined with respect to the horizontal. In that case the projection represents an image of the measurement area viewed obliquely downward from above (another example of a bird's-eye image).
Moreover, the projection performed by the point cloud projecting unit 110 is not limited to an orthographic projection; it may be, for example, a central projection.
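The orthographic projection described above, dropping each point's height and rasterizing its (x, y) position onto a top-down image, can be sketched as follows. The extent and resolution parameters are assumptions for illustration; occlusion handling and the resolution of overlapping points are omitted.

```python
import numpy as np

def ortho_project(points, colors, extent, resolution):
    """Drop each point's z and rasterize (x, y) onto a top-down color image.

    extent: (x_min, y_min, x_max, y_max) of the target area in metres.
    resolution: metres per pixel.
    """
    x_min, y_min, x_max, y_max = extent
    w = int(np.ceil((x_max - x_min) / resolution))
    h = int(np.ceil((y_max - y_min) / resolution))
    img = np.zeros((h, w, 3), dtype=np.uint8)
    for (x, y, _z), color in zip(points, colors):
        col = int((x - x_min) / resolution)
        row = int((y_max - y) / resolution)   # image row 0 = far (north) edge
        if 0 <= row < h and 0 <= col < w:
            img[row, col] = color
    return img

pts = [(0.5, 0.5, 12.3)]                      # z is discarded by the projection
img = ortho_project(pts, [(255, 255, 255)], extent=(0, 0, 1, 1), resolution=0.5)
```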
Fig. 3 shows an example of the hardware resources of the point cloud orthoimage generating apparatus 100 of Embodiment 1.
In Fig. 3, the point cloud orthoimage generating apparatus 100 comprises a CPU 911 (Central Processing Unit; also called central processor, processing device, arithmetic device, microprocessor, microcomputer or processor) that executes programs. The CPU 911 is connected via a bus 912 to a ROM 913, a RAM 914, a communication port 915, a display device 901, a keyboard 902, a mouse 903, an FDD 904 (Flexible Disk Drive), a CDD 905 (compact disc drive), a printer device 906, a scanner device 907 and a magnetic disk device 920, and controls these hardware devices. Instead of the magnetic disk device 920, a storage device such as an optical disc device or a memory card reader/writer may be used.
The RAM 914 is an example of a volatile memory. The storage media ROM 913, FDD 904, CDD 905 and magnetic disk device 920 are examples of nonvolatile memories. These are examples of storage equipment, storage devices or storage units.
The communication port 915, keyboard 902, scanner device 907, FDD 904 and the like are examples of input equipment, input devices or input units.
The communication port 915, display device 901, printer device 906 and the like are examples of output equipment, output devices or output units.
The communication port 915 is connected, by wire or wirelessly, to a communication network such as a LAN (Local Area Network), a WAN (wide area network) such as the Internet or ISDN, or a telephone line.
The magnetic disk device 920 stores an OS 921 (operating system), a window system 922, a program group 923 and a file group 924. The programs of the program group 923 are executed by the CPU 911, the OS 921 and the window system 922.
The program group 923 stores the programs that execute the functions described as "~ units" in the description of the embodiments. The programs are read and executed by the CPU 911.
The file group 924 stores, as "~ files" and "~ databases": result data such as "~ determination results", "~ retrieval results" and "~ processing results" produced when the functions of the "~ units" are executed; the data passed between the programs executing those functions; and other information, signal values, variable values and parameters.
The "~ files" and "~ databases" are stored in recording media such as disks and memories. The information, data, signal values, variable values and parameters stored in such media are read out by the CPU 911 via read/write circuits to a main memory or cache memory and used in CPU operations such as extraction, retrieval, reference, comparison, computation, calculation, processing, output, printing and display. During these CPU operations, the information, data, signal values, variable values and parameters are temporarily held in the main memory, cache memory or buffer memory.
The arrows in the flowcharts described in the embodiments mainly represent inputs and outputs of data and signals. The data and signal values are recorded in recording media such as the memory of the RAM 914, the flexible disk of the FDD 904, the compact disc of the CDD 905, the magnetic disk of the magnetic disk device 920, other optical discs, minidiscs and DVDs (Digital Versatile Disc). Data and signals are also transmitted online via the bus 912, signal lines, cables and other transmission media.
What is described as a "~ unit" in the embodiments may be a "~ circuit", "~ device" or "~ equipment", and may also be a "~ step", "~ procedure" or "~ process". That is, a "~ unit" may be realized as firmware stored in the ROM 913, or as software only, or as hardware only (elements, devices, boards, wiring), or as a combination of software and hardware, or further in combination with firmware. Firmware and software are stored as programs in recording media such as magnetic disks, flexible disks, optical discs, compact discs, minidiscs and DVDs. A program is read and executed by the CPU 911; that is, a program makes a computer function as a "~ unit", or makes a computer execute the procedure or method of a "~ unit".
The mobile measuring device 200, the position-attitude localizing device 300 and the point cloud generating device 400, like the point cloud orthoimage generating apparatus 100, each comprise a CPU and memories and execute the functions described as "~ units".
Fig. 4 is a flowchart showing the point cloud orthographic image generation method of Embodiment 1.
The point cloud orthographic image generation method of the point cloud orthographic image generation system 800 of Embodiment 1 is described below with reference to Fig. 4.
The moving body measuring device 200, the position and attitude calibration device 300, the point cloud generating device 400, and the point cloud orthographic image generating device 100 each execute the processing of their respective "~ units" described below using a CPU.
<S110: Distance/azimuth point group measurement process>
First, the vehicle 202 carrying the moving body measuring device 200 travels through the target area.
While the vehicle travels through the target area, the laser scanner 210, the camera 220, the GPS receiver 230, the gyroscope 240, and the speedometer 250 of the moving body measuring device 200 each perform measurements, obtaining the distance/azimuth point group 291, the camera images 292, the GPS observation information 293, the gyroscope measurement values 294, and the speedometer measurement values 295.
<S120: Position and attitude calibration process>
Next, the position and attitude calibrating unit 310 of the position and attitude calibration device 300 calculates the position and attitude calibration values 391 based on the GPS observation information 293, the gyroscope measurement values 294, and the speedometer measurement values 295 obtained in S110.
The position and attitude calibration values 391 represent the three-dimensional coordinates and the three-dimensional attitude angles of the moving body measuring device 200 at each point in time while it travels through the target area.
<S130: Point cloud generation process>
Next, the three-dimensional point group generating unit 410 of the point cloud generating device 400 generates the three-dimensional point group 419a based on the distance/azimuth point group 291 obtained in S110 and the position and attitude calibration values 391 calculated in S120. Then, the point cloud generating unit 420 of the point cloud generating device 400 generates the point cloud 491 based on the three-dimensional point group 419a and the camera images 292 obtained in S110.
The three-dimensional point group 419a represents the three-dimensional coordinates of each point of the distance/azimuth point group 291. Each point of the three-dimensional point group 419a corresponds to a point of the distance/azimuth point group 291.
The three-dimensional point group generating unit 410 extracts, from the position and attitude calibration values 391, the position and attitude of the moving body measuring device 200 at the measurement time of each point of the distance/azimuth point group 291, and calculates as the three-dimensional coordinates of that point the coordinates of the location separated from the extracted position by the distance, and in the azimuth, that the point represents.
The point cloud 491 represents the three-dimensional coordinates and the color of each point of the three-dimensional point group 419a. Each point of the point cloud 491 corresponds to a point of the three-dimensional point group 419a and to a point of the distance/azimuth point group 291.
The point cloud generating unit 420 projects each point of the three-dimensional point group 419a onto the camera image 292 based on its three-dimensional coordinates, and takes the color of the pixel at the projected position as the color of that point.
However, the point cloud generating unit 420 need not color the point cloud 491. Alternatively, the point cloud 491 may carry luminance information (gray scale) corresponding to the reflection intensity of the observed laser light. For example, the point cloud generating unit 420 may set each point of the point cloud 491 to a whiter color the higher the laser reflection intensity, and to a blacker color the lower the reflection intensity.
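The coloring performed by the point cloud generating unit 420 can be sketched as follows. This is an illustrative pinhole-camera sketch, not the patent's implementation: the function name, the intrinsic parameters fx, fy, cx, cy, and the assumption that the points have already been transformed into camera coordinates are all assumptions, since the patent does not detail the camera model.

```python
import numpy as np

def colorize_points(points_cam, image, fx, fy, cx, cy):
    """Assign each 3-D point (already in camera coordinates) the color of
    the camera-image pixel it projects onto; points that fall outside the
    image or behind the camera keep a default gray (uncolored)."""
    h, w, _ = image.shape
    colors = np.full((len(points_cam), 3), 128, dtype=np.uint8)  # default gray
    for i, (x, y, z) in enumerate(points_cam):
        if z <= 0:  # behind the camera: this point was not photographed
            continue
        u = int(round(fx * x / z + cx))  # pinhole projection, image column
        v = int(round(fy * y / z + cy))  # pinhole projection, image row
        if 0 <= u < w and 0 <= v < h:
            colors[i] = image[v, u]
    return colors
```

The grayscale variant mentioned above would instead map each point's laser reflection intensity to a brightness value, with no camera image involved.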
<S140: Point cloud orthographic image generation process>
Next, the point cloud projecting unit 110 of the point cloud orthographic image generating device 100 generates the point cloud orthographic image 191 using the point cloud 491 generated in S130.
The point cloud orthographic image 191 represents an image of the target area observed vertically downward from above.
The point cloud projecting unit 110 orthographically projects each point of the point cloud 491 onto the horizontal plane corresponding to the latitude/longitude of the target area, and takes the resulting image as the point cloud orthographic image 191 of the target area.
However, the plane onto which each point of the point cloud 491 is orthographically projected is not limited to a horizontal plane; it may be a plane inclined with respect to the horizontal.
Furthermore, the projection of the point cloud 491 performed by the point cloud projecting unit 110 is not limited to an orthographic projection; it may be, for example, a central projection.
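The orthographic projection of S140 can be sketched as follows: the height coordinate is simply discarded, and each point paints the pixel containing its horizontal position. The function name, parameters, and orientation convention are assumptions for illustration; the patent does not prescribe a resolution or layout.

```python
import numpy as np

def ortho_project(points, colors, x_min, y_min, pixel_size, width, height):
    """Orthographically project colored 3-D points onto a horizontal plane:
    the vertical (z) coordinate is dropped, and each point paints the grid
    cell containing its (x, y) position.  Background stays white."""
    img = np.full((height, width, 3), 255, dtype=np.uint8)
    for (x, y, _z), color in zip(points, colors):
        col = int((x - x_min) / pixel_size)
        row = int((y - y_min) / pixel_size)
        if 0 <= col < width and 0 <= row < height:
            img[height - 1 - row, col] = color  # image row 0 = far edge
    return img
```

A central projection, by contrast, would divide the in-plane coordinates by the distance from a projection center rather than dropping z outright.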
An example of the point cloud orthographic image 191 generated by the point cloud orthographic image generation method (S110 to S140) is shown below.
Fig. 5 shows a road map of the area (target area) traveled by the moving body measuring device 200.
For example, suppose that in the distance/azimuth point group measurement process (S110) the moving body measuring device 200 travels through the area shown in Fig. 5 while measuring, and obtains the distance/azimuth point group 291, the camera images 292, the GPS observation information 293, the gyroscope measurement values 294, and the speedometer measurement values 295.
Fig. 6 shows the point cloud orthographic image 191 of the target area (Fig. 5).
In the point cloud orthographic image generation process (S140), the point cloud projecting unit 110 orthographically projects the point cloud 491 and obtains the point cloud orthographic image 191 shown in Fig. 6.
As shown in Fig. 6, the point cloud orthographic image 191 matches the road map shown in Fig. 5. The point cloud orthographic image 191 can represent the roads of the target area with a high accuracy corresponding to the measurement accuracy of the moving body measuring device 200 and the calibration accuracy of the position and attitude calibration device 300.
Figs. 7 and 8 are examples of bird's-eye images represented by the point cloud 491, each showing a different intersection enlarged.
In the point cloud orthographic image generation process (S140), by projecting the point cloud 491 onto a plane inclined with respect to the horizontal (or by rotating the point cloud orthographic image 191 about a horizontal axis through image processing and displaying the result), the bird's-eye images shown in Figs. 7 and 8 can be obtained.
As shown in Figs. 7 and 8, a bird's-eye image obtained by projecting the point cloud 491 can represent various features such as intersections, houses, parked automobiles, and pedestrian crossings. The positions, sizes, and so on of the features represented in the bird's-eye image are rendered with a high accuracy corresponding to the measurement accuracy of the moving body measuring device 200 and the calibration accuracy of the position and attitude calibration device 300.
By projecting the point cloud 491 onto a plane, the point cloud orthographic image generating device 100 can obtain, with high accuracy, images of the target area (the point cloud orthographic image 191, bird's-eye images, and the like) observed from angles at which no camera actually photographed it (vertically downward, obliquely downward, and so on).
In Embodiment 1, the following point cloud orthographic image generating device 100 has been described.
The point cloud orthographic image generating device 100 uses the laser point group (the distance/azimuth point group 291) obtained by the moving body measuring device 200 to detect with high accuracy features such as signs, white lines, road surface markings, sewer openings, curbs, utility poles, pillars, street lamps, electric wires, and walls.
Each point of the point cloud 491 carries three-dimensional position information (three-dimensional coordinates). Therefore, the point cloud orthographic image generating device 100 can arrange the point cloud 491 to create an image viewed from any direction. In particular, by arranging the point cloud 491 and observing it from directly above, the point cloud orthographic image generating device 100 can create a point cloud orthographic image 191 equivalent to an orthographic image made from aerial photographs. Moreover, compared with an orthographic image created from camera images photographed from a vehicle, the point cloud orthographic image 191 has less distortion, higher accuracy, and a wider angle of view.
The point cloud orthographic image 191 clearly shows features such as white lines, curbs, and wall surfaces. Therefore, the point cloud orthographic image 191 can be used in the creation of road maps. For example, by pasting the point cloud orthographic image 191 into the background of a CAD (Computer Aided Design) system and tracing with lines the features it represents, a road map of the current state can be created rapidly. Furthermore, by extracting the features from the point cloud orthographic image 191 through image processing, a road map can be created automatically.
Patent document 1 (JP 2007-218705 A) discloses a method in which, based on various measurement data obtained by a measuring cart, the position and attitude of the measuring cart are calculated (S101 of patent document 1) and a road surface shape model (three-dimensional point group) is generated (S106 of patent document 1). Patent document 1 also discloses a method of projecting the road surface shape model (three-dimensional point group) onto a camera image (S107 of patent document 1).
The moving body measuring device 200 corresponds to the measuring cart of patent document 1, the position and attitude calibration device 300 corresponds to the vehicle position and attitude (3-axis) computing unit of patent document 1, and the point cloud generating device 400 corresponds to the road surface shape model generating unit of patent document 1.
Embodiment 2
In Embodiment 2, a mode of generating a point cloud orthographic image 191 in which the road appears without being hidden by trees, tunnels, and the like is described.
In the following, mainly the items that differ from Embodiment 1 are described; the items whose description is omitted are the same as in Embodiment 1.
Fig. 9 is a configuration diagram of the point cloud orthographic image generating device 100 of Embodiment 2.
The configuration of the point cloud orthographic image generating device 100 of Embodiment 2 is described below with reference to Fig. 9.
The point cloud orthographic image generating device 100 (an example of a bird's-eye image generating device) includes a point cloud projecting unit 110, a specified-height point group extracting unit 120, a ground level specifying unit 130, a point cloud orthographic image display unit 140, a camera image display unit 150, and an image generating device storage unit 190.
The ground level specifying unit 130 specifies the ground level 139a using a CPU, based on the heights of the three-dimensional coordinates represented by the points of the point cloud 491 (an example of a three-dimensional point group) generated by the point cloud generating device 400.
The specified-height point group extracting unit 120 extracts from the point cloud 491, based on the three-dimensional coordinates represented by its points, the points whose heights are within a specified range.
Specifically, the specified-height point group extracting unit 120 extracts from the point cloud 491, based on the ground level 139a specified by the ground level specifying unit 130, the points whose height above the ground is at most a specified height.
In the following, the points extracted from the point cloud 491 by the specified-height point group extracting unit 120 are called the specified-height point group 129a.
The point cloud projecting unit 110 (an example of a projecting unit) uses a CPU to project each point of the specified-height point group 129a extracted from the point cloud 491 by the specified-height point group extracting unit 120 onto a plane, based on the three-dimensional coordinates that each point represents, and generates the point cloud orthographic image 191 (an example of a bird's-eye image).
The point cloud orthographic image display unit 140 (an example of a bird's-eye image display unit) displays on the display device 901 the point cloud orthographic image 191 generated by the point cloud projecting unit 110.
The camera image display unit 150 uses a CPU to identify the points projected into an image portion designated by the user within the point cloud orthographic image 191 displayed by the point cloud orthographic image display unit 140, and displays on the display device 901 the camera image 292 photographed from the measurement location at which the identified points were measured.
The image generating device storage unit 190 stores the camera images 292 obtained by the moving body measuring device 200 and the point cloud orthographic image 191 generated by the point cloud projecting unit 110.
Fig. 10 is a flowchart showing the point cloud orthographic image generation process (S140) of Embodiment 2.
The flow of the point cloud orthographic image generation process (S140) of Embodiment 2 is described below with reference to Fig. 10.
Each "~ unit" of the point cloud orthographic image generating device 100 executes the processing described below using a CPU.
<S141A: Ground level specifying process>
First, the ground level specifying unit 130 specifies the ground level 139a based on the heights of the three-dimensional coordinates represented by the points of the point cloud 491.
For example, to specify the ground level 139a of a certain area, the ground level specifying unit 130 extracts the lowest point among the points representing the latitude/longitude of that area, and takes the height represented by the extracted point as the ground level 139a of that area.
Details of the method of specifying the ground level 139a are described in Embodiment 4.
<S142A: Specified-height point group extraction process>
Next, the specified-height point group extracting unit 120 extracts from the point cloud 491, as the specified-height point group 129a, the points whose height relative to the ground level 139a specified in S141A is at most the specified height.
That is, the specified-height point group 129a is the point group obtained by removing from the point cloud 491 the points that are higher above the ground level 139a than the specified height.
For example, if the specified height is "50 cm", the specified-height point group extracting unit 120 extracts from the point cloud 491, as the specified-height point group 129a, the points whose three-dimensional coordinate height is at most "(ground level 139a) + 50 [cm]". When the ground level 139a is specified for each area, the specified-height point group extracting unit 120 extracts a specified-height point group 129a for each area.
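The extraction of S142A amounts to a simple height filter. A minimal sketch, assuming points are (x, y, z) tuples with z in the same units as the ground level; the helper name is hypothetical:

```python
def extract_height_band(points, ground_level, max_height):
    """Keep only the points at most max_height above the ground level
    (S142A), removing occluders such as tree canopies or tunnel ceilings
    so the road surface is visible in the projection."""
    return [(x, y, z) for (x, y, z) in points
            if z - ground_level <= max_height]
```

With a per-area ground level (Embodiment 4), this filter would simply be applied area by area with that area's own ground_level.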
<S143A: Three-dimensional point group projection process>
Next, the point cloud projecting unit 110 orthographically projects each point of the specified-height point group 129a extracted in S142A onto the horizontal plane, and generates the point cloud orthographic image 191.
However, the plane onto which each point of the specified-height point group 129a is orthographically projected is not limited to a horizontal plane, and the projection performed by the point cloud projecting unit 110 is not limited to an orthographic projection.
<S144A: Bird's-eye image display process>
Next, the point cloud orthographic image display unit 140 displays on the display device 901 the point cloud orthographic image 191 generated in S143A.
Here, the user designates with the mouse 903, the keyboard 902, or the like the image portion, within the point cloud orthographic image 191 displayed on the display device 901, for which the user wishes to check the camera image 292.
<S145A: Camera image display process>
Next, the camera image display unit 150 displays on the display device 901 the camera image 292 corresponding to the image portion designated by the user.
At this time, the camera image display unit 150 identifies the points of the point cloud 491 projected into the image portion designated by the user, selects the camera image 292 photographed from the measurement location at which the identified points (hereinafter called the identified points) were measured, and displays the selected camera image 292 on the display device 901.
The selected camera image 292 is the camera image 292 photographed at the time at which the identified point was measured. The time at which the identified point was measured means the time at which the point of the distance/azimuth point group 291 serving as the raw data of the identified point was measured.
Through the camera image 292 displayed on the display device 901, the user can check features that are difficult to identify in the point cloud orthographic image 191.
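Selecting the camera image for an identified point (S145A) thus reduces to a lookup keyed on the point's measurement time. A sketch under the assumption that each frame is stored as a (timestamp, image) pair; the data layout and function name are not specified in the patent:

```python
def find_nearest_frame(frames, point_time):
    """Return the camera frame captured closest in time to the moment the
    identified point was measured (the measurement time of its raw
    distance/azimuth point).  frames: iterable of (timestamp, image)."""
    return min(frames, key=lambda frame: abs(frame[0] - point_time))
```

For a long, time-sorted frame list, a binary search (e.g. `bisect`) over the timestamps would replace the linear scan.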
Examples of the point cloud orthographic image 191 generated by the point cloud orthographic image generation process (S141A to S143A) are shown below.
Fig. 11 is the point cloud orthographic image 191 of target area b (Fig. 6), and Fig. 12 is the point cloud orthographic image 191 of target area a (Fig. 6). In both Fig. 11 and Fig. 12, only the specified-height point group 129a at most 50 cm above the ground level 139a has been orthographically projected.
In the specified-height point group extraction process (S142A), the points higher than 50 cm above the ground level 139a are removed from the point cloud 491, and only the points at most 50 cm above the ground level 139a are extracted as the specified-height point group 129a.
Therefore, in Figs. 11 and 12 no road is hidden by occluders such as trees or tunnels, and white lines, road boundaries, and the like can be clearly identified.
In Embodiment 2, the following point cloud orthographic image generating device 100 has been described.
Aerial photographs, which have conventionally been used as road images, cannot show road portions that are hidden when observed from the vertical direction, such as those under trees or inside tunnels.
In contrast, the point cloud orthographic image generating device 100 generates the point cloud orthographic image 191 by restricting the point cloud 491 to the points at most a specified height above the road surface (ground level 139a) (the specified-height point group 129a). Occluders such as trees and tunnels are thereby removed, and the entire road can be shown.
Furthermore, the vehicle 202 of the moving body measuring device 200 travels while keeping a specified clearance (for example, about 5 m) from surrounding vehicles, thereby obtaining a distance/azimuth point group 291 that contains no measurement values of other vehicles. The point cloud orthographic image generating device 100 then uses the point cloud 491 generated from this distance/azimuth point group 291 to create a point cloud orthographic image 191 entirely free of traveling vehicles.
That is, the point cloud orthographic image generating device 100 can create a clean road image with no hidden portions.
Furthermore, when the user cannot determine the kind or content of a feature appearing in the point cloud orthographic image 191 (for example, a utility pole, a street lamp, or a sign), the point cloud orthographic image generating device 100 displays the camera image 292 linked with the point cloud orthographic image 191. For example, the point cloud orthographic image generating device 100 displays the point cloud orthographic image 191 and the camera image 292 linked on a CAD screen. At this time, based on the measurement time of each point, the point cloud orthographic image generating device 100 presents the camera image 292 photographed near that time. The user can thus determine whether a feature appearing in the point cloud orthographic image 191 is a utility pole, a street lamp, or a sign. When displaying the camera image 292, the point cloud orthographic image generating device 100 may also perform image processing on the camera image 292 to extract the content of signs, signboards, and the like (recorded characters, figures, and so on), and display the extracted information together with the camera image 292. The user can then confirm not only the distinction between utility poles and street lamps but also the content of signs, signboards, and the like.
Embodiment 3
In Embodiment 3, a mode of generating a point cloud orthographic image 191 in which standing objects are identified is described.
In the following, mainly the items that differ from Embodiments 1 and 2 are described; the items whose description is omitted are the same as in Embodiments 1 and 2.
Fig. 13 is a configuration diagram of the point cloud orthographic image generating device 100 of Embodiment 3.
The configuration of the point cloud orthographic image generating device 100 of Embodiment 3 is described below with reference to Fig. 13.
The point cloud orthographic image generating device 100 (an example of a bird's-eye image generating device) includes a point cloud projecting unit 110, a point cloud orthographic image display unit 140, a camera image display unit 150, a point density calculating unit 160, a standing object specifying unit 170, a standing object distinguishing unit 180, and an image generating device storage unit 190.
The point density calculating unit 160 uses a CPU to calculate the point density 169a of the point cloud orthographic image 191 generated by the point cloud projecting unit 110, for each of the areas into which the point cloud orthographic image 191 has been divided by a specified size.
The standing object specifying unit 170 uses a CPU to specify, based on the point densities 169a calculated by the point density calculating unit 160, the image portions of the point cloud orthographic image 191 that show standing objects.
In the following, the image portions specified by the standing object specifying unit 170 are called the standing object image portions 179a.
The standing object distinguishing unit 180 uses a CPU to generate a point cloud orthographic image 191 in which the standing object image portions 179a specified by the standing object specifying unit 170 are shown distinguished from the other image portions.
Fig. 14 is a flowchart of the point cloud orthographic image generation process (S140) of Embodiment 3.
The flow of the point cloud orthographic image generation process (S140) of Embodiment 3 is described below with reference to Fig. 14.
Each "~ unit" of the point cloud orthographic image generating device 100 executes the processing described below using a CPU.
<S141B: Three-dimensional point group projection process>
The point cloud projecting unit 110 orthographically projects the point cloud 491 onto the horizontal plane and generates the point cloud orthographic image 191.
In the following, the horizontal plane onto which the point cloud 491 is orthographically projected is called the "projection plane".
<S142B: Point density calculation process>
The point density calculating unit 160 divides the projection plane by a specified size and calculates the point density 169a of the points of the point cloud 491 for each area.
Each area is small; its size is, for example, about "30 cm × 30 cm" in the real world rather than a size in the image. One area may correspond, for example, to one pixel of the point cloud orthographic image 191.
Here, the "point density 169a" means the number of points of the point cloud 491 projected into a small area.
<S143B: Standing object specifying process>
The standing object specifying unit 170 specifies as standing object image portions 179a the small areas whose point density 169a calculated in S142B is at least a specified number.
Because the laser scanner 210 also measures the sides of the vehicle 202 in the height direction, features that have height, such as wall surfaces, utility poles, and street lamps (hereinafter called standing objects), are measured at multiple points. In contrast, a feature with no height, such as a road surface, is measured at only one point in the height direction. The point density 169a of a standing object is therefore larger than that of a road surface. Accordingly, the standing object specifying unit 170 specifies as standing object image portions 179a the small areas whose point density 169a is at least the specified number.
For example, the standing object specifying unit 170 specifies as standing object image portions 179a the small areas onto which ten or more points are projected.
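Steps S142B and S143B together can be sketched as counting projected points per grid cell and thresholding the counts. The default cell size and threshold follow the examples in the text (30 cm cells, ten points); the function name and data layout are assumptions:

```python
from collections import Counter

def standing_object_cells(points, cell_size=0.3, min_count=10):
    """Count how many points of the point cloud project into each small
    horizontal grid cell (S142B); cells hit by at least min_count points
    are flagged as standing objects such as poles or walls (S143B), since
    a vertical surface stacks many laser returns over one ground position."""
    density = Counter((int(x // cell_size), int(y // cell_size))
                      for (x, y, _z) in points)
    return {cell for cell, count in density.items() if count >= min_count}
```

The same `density` counter could also drive the color grading of S144B, mapping higher counts to brighter colors.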
<S144B: Standing object distinguishing process>
The standing object distinguishing unit 180 generates a point cloud orthographic image 191 in which the standing object image portions 179a specified in S143B are shown distinguished from the other image portions.
For example, the standing object distinguishing unit 180 colors the standing object image portions 179a a specified color.
Alternatively, for example, the standing object distinguishing unit 180 gives the standing object image portions 179a and the other image portions different color schemes (for instance, the standing object image portions 179a "red" and the other image portions "black").
Alternatively, for example, the standing object distinguishing unit 180 adds a specific mark to the standing object image portions 179a.
Furthermore, the standing object distinguishing unit 180 may grade the standing object image portions 179a by point density and give each density level a different color or mark. For example, the standing object distinguishing unit 180 may color them "white", "green", and "red" in descending order of point density.
<S145B: Bird's-eye image display process>
The point cloud orthographic image display unit 140 displays on the display device 901 the point cloud orthographic image 191 generated in S144B.
<S146B: Camera image display process>
As in S145A (Fig. 10), the camera image display unit 150 displays on the display device 901 the camera image 292 corresponding to the image portion designated by the user.
By calculating the point density of each small area, the point cloud orthographic image generating device 100 can identify the standing objects in the point cloud orthographic image 191. The user can thus know where standing objects such as utility poles are located.
The point cloud orthographic image generating device 100 may also include, as in Embodiment 2, the ground level specifying unit 130 and the specified-height point group extracting unit 120.
In that case, the specified-height point group extracting unit 120 extracts from the point cloud 491, as the specified-height point group 129a, the points whose height relative to the ground level 139a specified by the ground level specifying unit 130 is at least a specified height. That is, the specified-height point group 129a is the point group obtained by removing from the point cloud 491 the points whose height above the ground level 139a is less than the specified height.
The point cloud projecting unit 110 then uses the specified-height point group 129a extracted by the specified-height point group extracting unit 120 to generate the point cloud orthographic image 191.
For example, by having the specified-height point group extracting unit 120 remove, conversely to Embodiment 2, the points less than "50 cm" above the ground level 139a, the point cloud projecting unit 110 can generate a point cloud orthographic image 191 in which the road surface does not appear.
The point cloud orthographic image generating device 100 can thereby identify standing objects more correctly, and the user can distinguish standing objects more easily.
Examples of the point cloud orthographic image 191 in which only the specified-height point group 129a at least 50 cm above the ground level 139a has been orthographically projected are shown below.
Fig. 15 is the point cloud orthographic image 191 of target area b (Fig. 6), and Fig. 16 is a partially enlarged view of target area b.
Fig. 17 is the point cloud orthographic image 191 of target area a (Fig. 6), and Fig. 18 is a partially enlarged view of target area a.
Figs. 15 to 18 are point cloud orthographic images 191 in which only the specified-height point group 129a at least 50 cm above the ground level 139a has been orthographically projected; the road surface does not appear.
As shown in Fig. 16 (an enlargement of the dotted frame of Fig. 15) and Fig. 18 (an enlargement of the dotted frame of Fig. 17), the point cloud orthographic image 191 clearly represents features such as street lamps, trees, walls, electric wires, and utility poles.
In addition, the point cloud orthographic image 191 is color-coded by point density. For example, the color changes as "red → green → white" as the point density increases. Since the point density of a vertically standing feature such as a utility pole is high, such a feature is represented by green or white points.
By using such a point cloud orthographic image 191, locating standing objects becomes easy, and the working time spent locating standing objects can be reduced significantly compared with conventional manual work.
In addition, in Figure 18, can identify from the appearance of the wall of family, electric wire that electric pole stretches out.By having or not of electric wire, can distinguish electric pole and street lamp.In the case of distinguishing object, be that electric pole, street lamp or other setting arrange thing any, user makes point group orthograph show camera image 292 as generating apparatus 100, by confirming camera image 292, can correctly identify.
Embodiment 3 described the following point cloud ortho-image generating apparatus 100.
The point cloud ortho-image 191 represents a view from directly above. Consequently, three-dimensional objects (standing features) whose color is similar to that of the ground, such as utility poles, disappear from it, even though such three-dimensional objects are important targets in a road map.
Therefore, the point cloud ortho-image generating apparatus 100 computes the point density for each small region of the point cloud ortho-image 191 and renders each region with a brightness, color, shape, or the like chosen according to its point density.
Utility poles and walls are built essentially vertically. Therefore, in the point cloud ortho-image 191 obtained by projecting the point cloud 491 in the vertical direction, the point density rises at utility poles and wall surfaces, whereas the ground contributes only one point per vertical line and so has a low point density.
Thus, with a point cloud ortho-image 191 whose brightness, color, shape, etc. are differentiated by point density, the locations of three-dimensional objects such as utility poles can easily be identified and detected.
Furthermore, by generating the point cloud ortho-image 191 only from the points at or above a specified height, three-dimensional objects can be identified and detected even more easily.
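As a rough illustration (not the patent's implementation; the function name, cell size, and color thresholds are invented for this sketch), the per-cell point density computation and density-based coloring of Embodiment 3 could look like this in Python:

```python
import numpy as np

def density_image(points, cell=0.1, min_height=None):
    """Project a point cloud (N x 3 array of x, y, z) vertically onto an
    x-y grid and color each cell by point density (red -> green -> white)."""
    points = np.asarray(points, dtype=float)
    if min_height is not None:
        points = points[points[:, 2] >= min_height]  # keep only high points
    x, y = points[:, 0], points[:, 1]
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    counts = np.zeros((ix.max() + 1, iy.max() + 1), dtype=int)
    np.add.at(counts, (ix, iy), 1)  # point density per cell
    # illustrative color ramp: sparse = red, medium = green, dense = white
    img = np.zeros(counts.shape + (3,), dtype=np.uint8)
    img[counts > 0] = (255, 0, 0)
    img[counts > 3] = (0, 255, 0)
    img[counts > 10] = (255, 255, 255)
    return counts, img
```

A vertical structure such as a pole stacks many points into one cell, so that cell ends up white, while flat ground stays red; passing `min_height` corresponds to the restriction to points at or above a specified height.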
Embodiment 4
Embodiment 4 describes methods by which the ground height specifying unit 130 determines the ground height 139a.
<Example 1>
The ground height specifying unit 130 extracts the lowest point from the points of the point cloud 491 falling in a region of given latitude/longitude, and takes the height indicated by that point as the ground height 139a of the region.
Figure 19 illustrates the ground height specifying method of Embodiment 4 (Example 1). The upper part of Figure 19 is a plan view of a sloped road, and the lower part is a side view of the slope.
As shown in the upper part of Figure 19, the ground height specifying unit 130 divides the target region containing the slope into a latitude/longitude grid with cells of a predetermined size (for example, 100 m × 100 m) and, as shown in the lower part of Figure 19, extracts the lowest point P among the points in each cell. The ground height specifying unit 130 then takes the height of the three-dimensional coordinates of point P as the ground height 139a of that cell.
When the specified height point cloud extraction unit 120 extracts from the point cloud 491 the points at or below a specified height x (for example, 50 cm) above point P (the ground height 139a), the point cloud projection unit 110 can generate a point cloud ortho-image 191 that represents the slope.
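A minimal sketch of Example 1 (per-cell lowest point as the ground height 139a, then keeping only the points within the specified height x above it); the function name and default values are illustrative, not from the patent:

```python
import numpy as np

def extract_low_points(points, cell=100.0, x=0.5):
    """For each latitude/longitude grid cell, take the lowest z as the
    ground height 139a, then keep only the points at most x metres
    above that ground height (cf. the 50 cm example)."""
    pts = np.asarray(points, dtype=float)
    ix = ((pts[:, 0] - pts[:, 0].min()) / cell).astype(int)
    iy = ((pts[:, 1] - pts[:, 1].min()) / cell).astype(int)
    keep = np.zeros(len(pts), dtype=bool)
    for cx, cy in set(zip(ix.tolist(), iy.tolist())):
        in_cell = (ix == cx) & (iy == cy)
        ground = pts[in_cell, 2].min()           # lowest point P of the cell
        keep |= in_cell & (pts[:, 2] <= ground + x)
    return pts[keep]
```

Because the ground height is taken per cell, a sloped road is handled cell by cell rather than with one global ground height.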
<Example 2>
The ground height specifying unit 130 extracts from the point cloud 491 the points representing the curbs of the road and determines the ground height 139a from the three-dimensional coordinates of those points.
Figure 20 illustrates the ground height specifying method of Embodiment 4 (Example 2); it shows one cell after the target region has been divided into cells of the predetermined size.
First, the ground height specifying unit 130 identifies, from the three-dimensional coordinates of the points of the point cloud 491, the points representing the left and right curbs of the slope (the two curb portions).
Next, the ground height specifying unit 130 extracts the highest point A and the lowest point B from the points representing one curb, and the highest point C from the points representing the other curb. Point C may instead be the lowest point of that curb, or any single one of its points.
Then, from the three-dimensional coordinates of the three extracted points A, B and C, the ground height specifying unit 130 computes the three-dimensional equation of the plane through points A, B and C as the equation representing the grade of the slope (hereinafter called the road surface equation).
The road surface equation computed by the ground height specifying unit 130 expresses the ground height 139a of the slope for each latitude/longitude.
The specified height point cloud extraction unit 120 determines from each point's latitude/longitude the cell in which that point of the point cloud 491 lies, substitutes the point's latitude/longitude into the road surface equation of that cell to compute the ground height 139a, compares the computed ground height 139a with the point's height, and extracts the specified height point cloud 129a.
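The road surface equation of Example 2 is simply the plane through the three curb points; a sketch (illustrative names, with z treated as height over horizontal coordinates x, y):

```python
import numpy as np

def road_surface_equation(A, B, C):
    """Plane through the three curb points A, B, C:
    returns (a, b, c) such that height z = a*x + b*y + c."""
    A, B, C = map(np.asarray, (A, B, C))
    n = np.cross(B - A, C - A)              # plane normal
    a, b = -n[0] / n[2], -n[1] / n[2]
    c = A[2] - a * A[0] - b * A[1]
    return a, b, c

def ground_height(eq, x, y):
    """Ground height 139a at horizontal position (x, y)."""
    a, b, c = eq
    return a * x + b * y + c
```

Substituting a point's horizontal position into `ground_height` then yields the ground height 139a against which the specified height point cloud extraction unit 120 compares the point's height.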
Next, two methods of identifying the points representing the curbs (hereinafter, the curb point cloud) are described: (1) a method in which the user selects the three curb points A, B and C, and (2) a method that identifies the curb point cloud from discontinuities in the positions of the points.
First, method (1), in which the user selects the three curb points A, B and C, is described.
As in S107 (Figure 13) of patent document 1 (JP 2007-218705 A), the ground height specifying unit 130 projects the point cloud 491 onto the camera image 292, displays on the display device 901 the camera image 292 with the projected point cloud 491 superimposed on it, and has the user select the three points A, B and C from the displayed points of the point cloud 491.
Next, method (2), which identifies the curb point cloud from discontinuities in the positions of the points, is described.
Figure 21 illustrates the curb point cloud identification method of Embodiment 4 (Example 2).
The upper part of Figure 21 is a vertical cross-section showing the road and the curbs on its left and right; the horizontal axis is the horizontal (latitude/longitude) direction and the vertical axis is the height direction. The lower part of Figure 21 is an enlargement of the dashed frame in the upper part. Each circle in the figure represents one point of the point cloud 491; hereinafter each circle is called a "3D point". As the moving body measuring device 200 moves, the 3D points are measured in sweeps alternating from left to right and from right to left. A line connecting the row of 3D points obtained by one left-to-right or right-to-left sweep is called a "scan line". Figure 21 shows the 3D points of one left-to-right scan line as circles.
Figure 22 shows a screen displaying images of the point cloud 491 representing a road and the curbs on its left and right.
Figure 22 contains a main screen (the whole figure) and a sub-screen (upper left). The main screen shows an image representing the road plane with the point cloud 491 (an example of the point cloud ortho-image 191); the sub-screen shows an image of the vertical cross-section of the road edge, drawn with the points of the point cloud 491 extracted for the road-edge portion.
The image shown in the sub-screen of Figure 22 is real data corresponding to Figure 21.
The ground height specifying unit 130 computes, from several 3D points that are consecutive in measurement order, straight lines representing the local slope, and from the change in the slopes of the computed lines identifies the break point between road and curb as a 3D point representing the curb.
For example, in Figure 21 the ground height specifying unit 130 takes the I-th 3D point as a base point, computes line 1 from the (I-1)-th and (I-2)-th 3D points, and computes line 2 from the I-th, (I+1)-th and (I+2)-th 3D points. Hereinafter the x-th 3D point is written "point x". Line 1 passes through point I and through the midpoint of points I-1 and I-2; line 2 passes through point I and through the midpoint of points I+1 and I+2. Lines 1 and 2 may, however, also be computed from four or more consecutive 3D points (for example, points I-3 to I, or points I to I+3), or from two 3D points (points I-1 to I, or points I to I+1).
The ground height specifying unit 130 then identifies the curb 3D point from the difference (change) between the slopes of line 1 and line 2 and from the height differences among the 3D points forming line 2 (line 1 for a curb on the left side). For example, if the slope change is at least a prescribed amount and the height difference between point I and point I+2 is at most a set value corresponding to the height of a curb (20 cm), the ground height specifying unit 130 judges that point I-1 or point I+1 is a curb 3D point. Here point I+2 is, among the points I, I+1 and I+2 forming line 2, the 3D point with the largest height difference from point I. Since the curb of a road is usually rounded off, it is not point I itself but one of the 3D points just before or after it (point I-2, I-1, I+1 or I+2) that is chosen as the curb 3D point used in computing the ground height 139a; for example, the ground height specifying unit 130 takes point I-1 or point I+1 as the curb 3D point. Point I may, however, also be chosen as the curb 3D point.
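The slope-change test around base point I can be sketched as below; the thresholds and the 2D (position, height) representation of one scan line are assumptions for illustration, not the patent's exact criteria:

```python
import numpy as np

def is_curb(points, i, slope_change_min=0.5, curb_height_max=0.20):
    """Judge whether base point I of one scan line lies at a road/curb break.

    points: (N, 2) sequence of (horizontal position, height) along the line.
    Line 1 runs through point I and the midpoint of points I-2 and I-1;
    line 2 through point I and the midpoint of points I+1 and I+2.
    """
    p = np.asarray(points, dtype=float)
    I = p[i]
    m1 = (p[i - 2] + p[i - 1]) / 2.0             # midpoint behind the base point
    m2 = (p[i + 1] + p[i + 2]) / 2.0             # midpoint ahead of the base point
    slope1 = (I[1] - m1[1]) / (I[0] - m1[0])     # slope of line 1
    slope2 = (m2[1] - I[1]) / (m2[0] - I[0])     # slope of line 2
    step = abs(p[i + 2, 1] - I[1])               # height difference I to I+2
    return abs(slope2 - slope1) >= slope_change_min and step <= curb_height_max
```

A flat stretch gives nearly equal slopes and is rejected; a curb gives a sharp slope change with a small absolute step, matching the two conditions in the text.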
The ground height specifying unit 130 identifies the curb point cloud by applying the above method with each point of the point cloud 491 in turn as the base point.
Figure 23 is a flowchart of the curb point cloud identification method of Embodiment 4 (Example 2).
The processing flow of the curb point cloud identification method (Figure 21) is described below with reference to Figure 23.
First, the ground height specifying unit 130 reads the point cloud 491 (S210).
Next, the ground height specifying unit 130 selects scan lines one at a time and extracts the 3D points of the selected scan line from the point cloud 491. Hereinafter the 3D points on the extracted scan line are called the "scan point group" (S220).
Next, the ground height specifying unit 130 selects one 3D point from the extracted scan point group as base point I and, from the points consecutive with the selected base point I, computes the two lines 1 and 2 through the base point (S230).
Next, the ground height specifying unit 130 judges from the difference between the slopes of line 1 and line 2 and from the height differences among the 3D points forming line 2 (or line 1) whether base point I is a curb portion (S231).
If the ground height specifying unit 130 judges that base point I is a curb portion, it stores the curb 3D point (for example, point I-1 or point I+1) (S232).
The ground height specifying unit 130 repeats S230 to S232 for each point of the scan point group extracted in S220, identifying and storing the 3D points of the left and right curbs (S233).
Furthermore, for scan lines on which the left and right curb 3D points could not be identified in S230 to S233 (S240), the ground height specifying unit 130 identifies and stores the curb 3D points based on the laser irradiation angle and the heights of the points (S250).
Specifically, the ground height specifying unit 130 identifies the curb 3D points as follows.
First, the ground height specifying unit 130 extracts from the scan point group the 3D points whose laser irradiation angle is close to that of the curb 3D point identified on the preceding scan line. For example, if the curb 3D point identified on the preceding scan line is the n-th 3D point on that scan line, points n-3 to n+3 of the current scan line are extracted from the scan point group.
Next, the ground height specifying unit 130 stores as the curb 3D point the point one (or several) before (or after) the lowest of the extracted 3D points (points n-3 to n+3). At this time, the condition for a curb 3D point may additionally require that the difference in laser irradiation angle from the curb 3D point identified on the preceding scan line be at most a set value (for example, 1 degree).
The ground height specifying unit 130 repeats S220 to S250 for all scan lines (S260), and then groups the 3D points representing the left curb and the 3D points representing the right curb (S270).
Figure 24 shows the curb point cloud identified by the curb point cloud identification method of Embodiment 4 (Example 2).
As shown in Figure 24, the curb point cloud identified by the method of Embodiment 4 (Example 2) agrees with the road map of the target region shown in Figure 5.
<Example 3>
The ground height specifying unit 130 determines the ground height 139a from the height of the navigation reference point O of the moving body measuring device 200.
The navigation reference point is, as explained with Figure 2, the coordinate center of the moving body measuring device 200.
Two methods of determining the ground height 139a from the height of the navigation reference point O of the moving body measuring device 200 are described: (1) computing a 3D equation of the road surface, and (2) determining the ground height 139a for each measurement time.
Figure 25 illustrates the ground height specifying method of Embodiment 4 (Example 3 (1)).
Method (1), which determines the ground height 139a by computing a 3D equation of the road surface, is described below with reference to Figure 25.
Here, assume that the three-dimensional coordinates of the navigation reference point O and of the points of the point cloud 491 have been obtained for each of the times t0, t1 and t2.
Also assume that the height of the navigation reference point O above the ground, measured in advance, is 2000 mm.
From the heights of the navigation reference point O at times t0, t1 and t2, the ground height specifying unit 130 computes, as the 3D equation representing the road surface, the 3D equation of the plane 2000 mm below the plane through the navigation reference points O (or the plane closest to the navigation reference points O). The ground height specifying unit 130 then substitutes the latitude/longitude of each point of the point cloud 491 into the 3D equation of the road surface to compute the ground height 139a.
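The "plane closest to the navigation reference points O" variant can be sketched as a least-squares plane fit, lowered by the pre-measured 2000 mm sensor height; the function name and the least-squares choice are illustrative assumptions:

```python
import numpy as np

def road_surface_from_reference_points(ref_points, sensor_height=2.0):
    """Fit z = a*x + b*y + c through the navigation reference points O
    (least squares), then lower the plane by the pre-measured height of
    O above the ground (2000 mm) to obtain the road surface equation."""
    P = np.asarray(ref_points, dtype=float)
    A = np.c_[P[:, 0], P[:, 1], np.ones(len(P))]
    (a, b, c), *_ = np.linalg.lstsq(A, P[:, 2], rcond=None)
    return a, b, c - sensor_height          # road surface 2000 mm below O
```

Substituting a point's latitude/longitude into the returned plane then gives the ground height 139a at that point.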
Figure 26 illustrates the ground height specifying method of Embodiment 4 (Example 3 (2)).
Method (2), which determines the ground height 139a for each measurement time, is described below with reference to Figure 26.
Here, assume that the three-dimensional coordinates of the navigation reference point O and of the points of the point cloud 491 have been obtained for each time.
Also assume that the height of the navigation reference point O above the ground, measured in advance, is 2000 mm.
For each point of the point cloud 491, the ground height specifying unit 130 computes as the ground height 139a the height 2000 mm below the height of the navigation reference point O at the time the point was measured.
Alternatively, instead of the ground height specifying unit 130 computing the ground height 139a, the specified height point cloud extraction unit 120 may compute as a corrected separation reference height the height 1500 mm below the height of the navigation reference point O (a "specified height" of 500 mm above the ground) and extract from the point cloud 491, as the specified height point cloud 129a, the points below (or above) the corrected separation reference height.
With the methods described in Embodiment 4, the ground height specifying unit 130 can compute the ground height 139a accurately even when the road is sloped.
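The per-time method of Example 3 (2), including the alternative in which the extraction unit compares each point directly against a height 1500 mm below O, can be sketched as follows (data layout and names are illustrative):

```python
def extract_by_reference_height(points, ref_height, sensor_height=2.0,
                                spec_height=0.5):
    """Keep the points whose height is at most (ground height + specified
    height), where the ground height at a point's measurement time t is
    the navigation reference point height at t minus 2000 mm.

    points: iterable of (x, y, z, t); ref_height: mapping t -> height of O."""
    kept = []
    for x, y, z, t in points:
        # corrected separation reference height: 1500 mm below O
        threshold = ref_height[t] - sensor_height + spec_height
        if z <= threshold:
            kept.append((x, y, z, t))
    return kept
```

Because the threshold follows the reference point height at each measurement time, the extraction tracks a sloped road without any explicit road surface equation.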
Embodiment 5
The point cloud ortho-image 191 generated by the point cloud ortho-image generating apparatus 100 is useful, for example, in producing road maps.
Embodiment 5 describes a system for producing road maps.
Figure 27 is a configuration diagram of the map data generation system 801 of Embodiment 5.
The configuration of the map data generation system 801 of Embodiment 5 is described below with reference to Figure 27.
In addition to the configuration of the point cloud ortho-image generation system 800 described in the preceding embodiments, the map data generation system 801 has a CAD device 500.
Road administration has traditionally used city planning maps showing buildings and roads, road ledgers recording road occupancies such as utility poles, sewer openings and advertising towers, and road administration ledger drawings recording road edge stones, guardrails, signs and the like, and improved accuracy is desired for these city planning maps, road ledgers and road administration ledger drawings.
The point cloud ortho-image 191 represents the road with high accuracy, with occluding objects such as trees and tunnels removed (Embodiment 2), and can also represent standing features such as utility poles and street lamps with emphasis (Embodiment 3). It is therefore useful in producing city planning maps, road ledgers and road administration ledger drawings.
The CAD device 500 comprises a CAD unit 510 and a CAD storage unit 590, and generates map data 591 (for example, a city planning map or a road administration ledger drawing) using the point cloud ortho-image 191 generated by the point cloud ortho-image generating apparatus 100.
Using a CPU, the CAD unit 510 displays the point cloud ortho-image 191 and the camera image 292 on the display device 901 and generates the map data 591 in response to user operations.
The CAD storage unit 590 stores the map data 591.
The user operates the CAD device 500 with the keyboard 902 and mouse 903 to display the point cloud ortho-image 191 produced in Embodiment 2, draws a road map by tracing the road represented in the displayed point cloud ortho-image 191, and saves the drawn road map as map data 591.
The user further displays the point cloud ortho-image 191 produced in Embodiment 3, selects in turn the standing features represented in the displayed point cloud ortho-image 191, and displays the camera image 292 of each selected location. From the displayed camera image 292 the user determines the type of each standing feature, and saves as map data 591 a road map in which the position and type of every standing feature have been set.
Instead of relying on user selection, the CAD unit 510 may extract the road and the standing features by image processing.
In this way, by using the point cloud ortho-image 191 the user can produce city planning maps and road administration ledger drawings more simply than with the prior art.
The point cloud ortho-image generating apparatus 100, moving body measuring device 200, position and attitude calibrating device 300, point cloud generating device 400 and CAD device 500 described in the embodiments may each be an independent device, or may be combined into a single device.
Each device may stand alone without a network connection, or may communicate with the others, sending and receiving data over a wired or wireless LAN or the Internet.
Description of reference numerals
100 point group orthographs are as generating apparatus; 110 cloud Projection Divisions; 120 specified altitude point group extraction units; 129a specified altitude point group; 130 floor levels refer in particular to bonding part; 139a floor level; 140 point group orthographs are as display part; 150 camera image display parts; 160 dot density calculating parts; 169a dot density; 170 settings arrange thing and refer in particular to bonding part; 179a setting arranges thing display image portion and divides; 180 erect the thing difference portion that arranges; 190 video generation device storage parts; 191 point group orthograph pictures; 200 moving body measuring devices; 201 top boards; 202 vehicles; 210 laser scanners; 220 video cameras; 230 GPS computing machines; 240 gyroscopes; 250 velometers; 290 measuring device storage parts; 291 apart from orientation point group; 292 camera images; 293 GPS observation information; 294 gyroscope instrumentation values; 295 velometer instrumentation values; 300 position and attitude caliberating devices; 310 position and attitude demarcation portions; 390 caliberating device storage parts; 391 position and attitude calibration values; 400 cloud generating apparatus; 410 three-dimensional point group generating units; The three-dimensional point group of 419a; 420 cloud generating units; 490 cloud generating apparatus storage parts; 491 clouds; 500 CAD devices; 510 CAD portions; 590 CAD storage parts; 591 map datums; 800 point group orthographs are as generation system; 801 map datum generation systems; 901 display device; 902 keyboards; 903 mouses; 904 FDD; 905 CDD; 906 print apparatus; 907 scanner devices; 908 microphones; 909 loudspeakers; 911 CPU; 912 buses; 913 ROM; 914 RAM; 915 communication port; 920 disk sets; 921 OS; 922 window systems; 923 package; 924 file group.

Claims (5)

1. A bird's-eye image generating apparatus that generates a bird's-eye image of the ground using a three-dimensional point group representing the three-dimensional coordinates of places on the ground, characterized in that
the three-dimensional point group includes a plurality of three-dimensional points obtained by measuring, with a laser from a moving body moving on the ground, a standing feature erected in the height direction,
and the bird's-eye image generating apparatus comprises:
a three-dimensional point group projection unit that, based on the three-dimensional coordinates represented by the points of the three-dimensional point group, projects the points of the three-dimensional point group onto a plane using a CPU (central processing unit) to generate the bird's-eye image of the ground;
a point density computing unit that computes, using a CPU, the point density of the points of the three-dimensional point group projected onto the plane by the three-dimensional point group projection unit, for each of the regions into which the plane is divided with a predetermined size;
a standing feature specifying unit that, based on the point density computed by the point density computing unit, uses a CPU to specify from the bird's-eye image the image portions showing the standing feature; and
a standing feature differentiating unit that uses a CPU to generate a bird's-eye image in which the image portions specified by the standing feature specifying unit are differentiated from the other image portions.
2. The bird's-eye image generating apparatus according to claim 1, characterized in that
the bird's-eye image generating apparatus comprises a specified height point cloud extraction unit that, based on the three-dimensional coordinates represented by the points of the three-dimensional point group, uses a CPU to extract from the three-dimensional point group, as a specified height point cloud, the points representing heights in a specified range, and
the three-dimensional point group projection unit, based on the three-dimensional coordinates represented by the points of the specified height point cloud extracted from the three-dimensional point group by the specified height point cloud extraction unit, uses a CPU to project the points of the specified height point cloud onto a plane and generate the bird's-eye image.
3. The bird's-eye image generating apparatus according to claim 2, characterized in that the specified height point cloud extraction unit extracts the points at or above a specified height as the specified height point cloud.
4. The bird's-eye image generating apparatus according to claim 3, characterized in that the specified height point cloud extraction unit extracts, as the specified height point cloud, the points whose height above the ground is at or above the specified height.
5. A bird's-eye image generating method that generates a bird's-eye image of the ground using a three-dimensional point group representing the three-dimensional coordinates of places on the ground, characterized in that
the three-dimensional point group includes a plurality of three-dimensional points obtained by measuring, with a laser from a moving body moving on the ground, a standing feature erected in the height direction; a three-dimensional point group projection unit performs a three-dimensional point group projection process of projecting, based on the three-dimensional coordinates represented by the points of the three-dimensional point group, the points of the three-dimensional point group onto a plane with a CPU (central processing unit) to generate the bird's-eye image of the ground;
a point density computing unit performs a point density computing process of computing, using a CPU, the point density of the points of the three-dimensional point group projected onto the plane by the three-dimensional point group projection unit, for each of the regions into which the plane is divided with a predetermined size;
a standing feature specifying unit performs a standing feature specifying process of using a CPU, based on the point density computed by the point density computing unit, to specify from the bird's-eye image the image portions showing the standing feature; and
a standing feature differentiating unit performs a standing feature differentiating process of using a CPU to generate a bird's-eye image in which the image portions specified by the standing feature specifying unit are differentiated from the other image portions.
CN200980133675.0A 2008-08-29 2009-08-24 Bird's-eye image forming device, bird's-eye image forming method Expired - Fee Related CN102138163B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2008-221359 2008-08-29
JP2008221359 2008-08-29
PCT/JP2009/064700 WO2010024212A1 (en) 2008-08-29 2009-08-24 Bird's-eye image forming device, bird's-eye image forming method, and bird's-eye image forming program

Publications (2)

Publication Number Publication Date
CN102138163A CN102138163A (en) 2011-07-27
CN102138163B true CN102138163B (en) 2014-04-30

Family

ID=41721380

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200980133675.0A Expired - Fee Related CN102138163B (en) 2008-08-29 2009-08-24 Bird's-eye image forming device, bird's-eye image forming method

Country Status (6)

Country Link
US (1) US8665263B2 (en)
EP (1) EP2320382A4 (en)
JP (4) JP4832596B2 (en)
KR (2) KR101269981B1 (en)
CN (1) CN102138163B (en)
WO (1) WO2010024212A1 (en)

Families Citing this family (99)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011124271A1 (en) 2010-04-09 2011-10-13 Tomtom International B.V. Method of generating a route
CN101887596B (en) * 2010-06-01 2013-02-13 中国科学院自动化研究所 Three-dimensional model reconstruction method of tree point cloud data based on partition and automatic growth
US8565958B1 (en) * 2011-06-02 2013-10-22 Google Inc. Removing extraneous objects from maps
GB201114591D0 (en) 2011-08-23 2011-10-05 Tomtom Int Bv Methods of and apparatus for displaying map information
CN102445186B (en) * 2011-09-28 2013-06-05 中交第二公路勘察设计研究院有限公司 Method for generating road design surface information by laser radar scan
GB201116960D0 (en) * 2011-09-30 2011-11-16 Bae Systems Plc Monocular camera localisation using prior point clouds
JP4948689B1 (en) * 2011-10-06 2012-06-06 アジア航測株式会社 Laser ortho image generating apparatus and program thereof
US9047688B2 (en) 2011-10-21 2015-06-02 Here Global B.V. Depth cursor and depth measurement in images
US8553942B2 (en) 2011-10-21 2013-10-08 Navteq B.V. Reimaging based on depthmap information
US9116011B2 (en) 2011-10-21 2015-08-25 Here Global B.V. Three dimensional routing
US9024970B2 (en) 2011-12-30 2015-05-05 Here Global B.V. Path side image on map overlay
US9404764B2 (en) * 2011-12-30 2016-08-02 Here Global B.V. Path side imagery
US8731247B2 (en) * 2012-01-20 2014-05-20 Geodigital International Inc. Densifying and colorizing point cloud representation of physical surface using image data
KR101611891B1 (en) 2012-03-23 2016-04-14 미쓰비시덴키 가부시키가이샤 Cover pole unit
JP5947666B2 (en) * 2012-08-21 2016-07-06 アジア航測株式会社 Traveling road feature image generation method, traveling road feature image generation program, and traveling road feature image generation apparatus
KR101394425B1 (en) * 2012-11-23 2014-05-13 현대엠엔소프트 주식회사 Apparatus and method for map data maintenance
ES2967089T3 (en) * 2012-12-26 2024-04-26 Cambridge Mobile Telematics Inc Driver identification methods and systems
CN103149569B (en) * 2013-02-25 2014-12-10 昆山南邮智能科技有限公司 Method for recognizing high-voltage wire by laser radar based on wavelet transformation
WO2014170895A1 (en) * 2013-04-15 2014-10-23 Xsight Systems Ltd. Contaminant detection and bird risk management at airports
US9858798B2 (en) 2013-05-28 2018-01-02 Aai Corporation Cloud based command and control system integrating services across multiple platforms
JP6006179B2 (en) * 2013-06-20 2016-10-12 株式会社パスコ Data analysis apparatus, data analysis method, and program
US9449227B2 (en) * 2014-01-08 2016-09-20 Here Global B.V. Systems and methods for creating an aerial image
JP6389049B2 (en) * 2014-03-17 2018-09-12 アジア航測株式会社 Laser point group coloring method and laser point group coloring program
JP6359868B2 (en) * 2014-04-25 2018-07-18 東芝プラントシステム株式会社 3D data display device, 3D data display method, and 3D data display program
SG11201610454PA (en) * 2014-06-25 2017-02-27 Mitsubishi Electric Corp Device for creating construction gauge measurement diagram, device for creating construction gauge measurement diagram data, method for creating construction gauge measurement diagram, construction gauge measurement diagram, and construction gauge measurement diagram data
JP6432825B2 (en) * 2014-08-22 2018-12-05 株式会社Ihi Method and apparatus for aligning three-dimensional point cloud data and moving body system thereof
US9443312B2 (en) * 2014-08-29 2016-09-13 Leica Geosystems Ag Line parametric object estimation
US9693040B2 (en) 2014-09-10 2017-06-27 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
DE102014013677B4 (en) * 2014-09-10 2017-06-22 Faro Technologies, Inc. Method for optically scanning and measuring an environment with a handheld scanner and subdivided display
US9671221B2 (en) 2014-09-10 2017-06-06 Faro Technologies, Inc. Portable device for optically measuring three-dimensional coordinates
US9602811B2 (en) 2014-09-10 2017-03-21 Faro Technologies, Inc. Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
DE102014013678B3 (en) 2014-09-10 2015-12-03 Faro Technologies, Inc. Method for optically sensing and measuring an environment with a handheld scanner and gesture control
EP3624081A1 (en) * 2014-10-30 2020-03-18 Mitsubishi Electric Corporation In-vehicle device and autonomous vehicle including such an in-vehicle device
JP6435781B2 (en) * 2014-10-31 2018-12-12 株式会社豊田中央研究所 Self-position estimation apparatus and mobile body equipped with self-position estimation apparatus
US20160342861A1 (en) * 2015-05-21 2016-11-24 Mitsubishi Electric Research Laboratories, Inc. Method for Training Classifiers to Detect Objects Represented in Images of Target Environments
CN104864889B (en) * 2015-05-29 2018-05-29 山东鲁能智能技术有限公司 A kind of robot odometer correction system and method for view-based access control model
JP6589410B2 (en) * 2015-06-24 2019-10-16 株式会社豊田中央研究所 Map generating apparatus and program
US9830509B2 (en) * 2015-06-29 2017-11-28 Nokia Technologies Oy Method and apparatus for constructing a digital elevation model utilizing ground points captured by ground-based LiDAR
JP6451544B2 (en) * 2015-08-04 2019-01-16 日産自動車株式会社 Road boundary detection device, self-position estimation device, and road boundary detection method
CN105184852B (en) * 2015-08-04 2018-01-30 百度在线网络技术(北京)有限公司 Urban road recognition method and device based on laser point cloud
JP6561670B2 (en) * 2015-08-10 2019-08-21 日産自動車株式会社 Step detecting device and step detecting method
KR102360464B1 (en) * 2015-08-31 2022-02-09 현대오토에버 주식회사 Apparatus and method for acquisition of connecting road information
JP6062020B2 (en) * 2015-11-18 2017-01-18 三菱電機株式会社 Designated point projection method, program and computer
JP6869023B2 (en) * 2015-12-30 2021-05-12 ダッソー システムズDassault Systemes 3D to 2D reimaging for exploration
CN105719284B (en) 2016-01-18 2018-11-06 腾讯科技(深圳)有限公司 Data processing method, device and terminal
WO2017163640A1 (en) * 2016-03-22 2017-09-28 三菱電機株式会社 Measurement device
JPWO2017199333A1 (en) * 2016-05-17 2019-03-14 パイオニア株式会社 Information output device, terminal device, control method, program, and storage medium
CN106052697B (en) * 2016-05-24 2017-11-14 百度在线网络技术(北京)有限公司 Unmanned vehicle, unmanned vehicle localization method, device and system
CN106097444B (en) * 2016-05-30 2017-04-12 百度在线网络技术(北京)有限公司 Generation method and device of high-accuracy map
CN107742091B (en) * 2016-08-22 2019-01-29 腾讯科技(深圳)有限公司 Road shoulder extraction method and device
CN107871129B (en) * 2016-09-27 2019-05-10 北京百度网讯科技有限公司 Method and apparatus for processing point cloud data
CN109906608B (en) 2016-11-01 2021-03-12 松下电器(美国)知识产权公司 Display method and display device
CN108335337B (en) * 2017-01-20 2019-12-17 高德软件有限公司 Method and device for generating an orthoimage
CN106791797A (en) 2017-03-14 2017-05-31 京东方科技集团股份有限公司 Dual-view display method and apparatus
CN108732582B (en) * 2017-04-20 2020-07-10 百度在线网络技术(北京)有限公司 Vehicle positioning method and device
EP3460518B1 (en) * 2017-09-22 2024-03-13 Leica Geosystems AG Hybrid lidar-imaging device for aerial surveying
CN110073403A (en) * 2017-11-21 2019-07-30 深圳市大疆创新科技有限公司 Image output generation method, device, and unmanned aerial vehicle
KR101910751B1 (en) * 2017-11-22 2018-10-22 중앙대학교 산학협력단 Method and system for acquiring 3-D data considering the shape feature of element of construction structure
CN109902542B (en) * 2017-12-11 2023-11-21 财团法人车辆研究测试中心 Dynamic ground detection method of three-dimensional sensor
KR101998396B1 (en) * 2017-12-22 2019-07-09 한국기술교육대학교 산학협력단 Method for workspace modeling based on virtual wall using 3d scanner and system thereof
KR101998397B1 (en) * 2017-12-22 2019-07-09 한국기술교육대학교 산학협력단 Method for vertex optimization using depth image in workspace modeling and system thereof
CN108319895B (en) 2017-12-29 2021-09-17 百度在线网络技术(北京)有限公司 Method and device for identifying intersection in electronic map
US20200014909A1 (en) 2018-07-03 2020-01-09 Faro Technologies, Inc. Handheld three dimensional scanner with autofocus or autoaperture
JP7204772B2 (en) * 2018-10-23 2023-01-16 マクセル株式会社 head up display system
KR20200046437A (en) 2018-10-24 2020-05-07 삼성전자주식회사 Localization method based on images and map data and apparatus thereof
CN111174777A (en) * 2018-11-09 2020-05-19 阿里巴巴集团控股有限公司 Positioning method and device and electronic equipment
JP2020204602A (en) * 2018-11-20 2020-12-24 株式会社エムアールサポート Ortho-image creation system, ortho-image creation method, aerial survey markers used therefor, and road investigation method
JP7188798B2 (en) * 2018-11-29 2022-12-13 Necソリューションイノベータ株式会社 Coordinate calculation device, coordinate calculation method, and program
CN111238494B (en) * 2018-11-29 2022-07-19 财团法人工业技术研究院 Carrier, carrier positioning system and carrier positioning method
JP7004636B2 (en) * 2018-12-04 2022-01-21 三菱電機株式会社 Display data generator, display data generation method, and display data generation program
US11665372B2 (en) 2019-01-07 2023-05-30 Samsung Electronics Co., Ltd. Fast projection method in video-based point cloud compression codecs
JP7189029B2 (en) * 2019-01-08 2022-12-13 朝日航洋株式会社 Elevation drawing method
US11315317B2 (en) * 2019-01-30 2022-04-26 Baidu Usa Llc Point clouds ghosting effects detection system for autonomous driving vehicles
US10762690B1 (en) 2019-02-11 2020-09-01 Apple Inc. Simulated overhead perspective images with removal of obstructions
CN110111414B (en) * 2019-04-10 2023-01-06 北京建筑大学 Orthographic image generation method based on three-dimensional laser point cloud
CN111829531A (en) * 2019-04-15 2020-10-27 北京京东尚科信息技术有限公司 Two-dimensional map construction method and device, robot positioning system and storage medium
CN110108259A (en) * 2019-04-18 2019-08-09 中国科学院地理科学与资源研究所 Photo acquisition device and information measurement method for field ground features
CN109991984B (en) * 2019-04-22 2024-04-30 上海蔚来汽车有限公司 Method, apparatus and computer storage medium for generating high-definition map
WO2020255548A1 (en) * 2019-06-17 2020-12-24 株式会社エムアールサポート Orthoimage creation system, orthoimage creation method, ground control points for use in orthoimage creation method, and road surveying method
JP7441509B2 (en) 2019-08-23 2024-03-01 株式会社エムアールサポート Ortho-image creation method, ortho-image creation system, 3D model creation method, 3D model creation system, and markers used therefor
CN112634181A (en) * 2019-09-24 2021-04-09 北京百度网讯科技有限公司 Method and apparatus for detecting ground point cloud points
CN110766798B (en) * 2019-11-30 2023-02-14 中铁一局集团有限公司 Tunnel monitoring measurement result visualization method based on laser scanning data
US11788859B2 (en) * 2019-12-02 2023-10-17 Here Global B.V. Method, apparatus, and computer program product for road noise mapping
US11393489B2 (en) * 2019-12-02 2022-07-19 Here Global B.V. Method, apparatus, and computer program product for road noise mapping
KR102312892B1 (en) * 2019-12-20 2021-10-15 재단법인대구경북과학기술원 Apparatus and method for detecting road curb
CN111210488B (en) * 2019-12-31 2023-02-03 武汉中海庭数据技术有限公司 High-precision extraction system and method for road poles in laser point clouds
JP7122721B2 (en) * 2020-06-02 2022-08-22 株式会社Zmp OBJECT DETECTION SYSTEM, OBJECT DETECTION METHOD AND OBJECT DETECTION PROGRAM
CN111784834A (en) * 2020-06-24 2020-10-16 北京百度网讯科技有限公司 Point cloud map generation method and device and electronic equipment
JP2022016908A (en) 2020-07-13 2022-01-25 フォルシアクラリオン・エレクトロニクス株式会社 Overhead image generation device, overhead image generation system and automatic parking device
CN111815772B (en) * 2020-07-20 2023-06-23 云南财经大学 Plateau mountain land utilization method, system, storage medium and computer equipment
KR102543871B1 (en) * 2021-01-18 2023-06-20 네이버랩스 주식회사 Method and system for updating road information changes in map data
JP2022121049A (en) * 2021-02-08 2022-08-19 トヨタ自動車株式会社 Self-position estimation device
CN113320518B (en) * 2021-06-25 2022-06-03 东风汽车集团股份有限公司 Method and system for preventing vehicle from sliding after parking on ramp
WO2023067717A1 (en) * 2021-10-20 2023-04-27 日本電気株式会社 Facility inspection display device, information processing device, facility inspection display method, and non-transitory computer-readable medium
CN114510761B (en) * 2022-01-24 2024-05-17 湖南省第一测绘院 Method for eliminating road-surface elevation anomalies in a DSM
CN116109643B (en) * 2023-04-13 2023-08-04 深圳市明源云科技有限公司 Market layout data acquisition method, device and computer readable storage medium
CN116342858B (en) * 2023-05-29 2023-08-25 未来机器人(深圳)有限公司 Object detection method, device, electronic equipment and storage medium
CN117372273B (en) * 2023-10-26 2024-04-19 航天科工(北京)空间信息应用股份有限公司 Method, device, equipment and storage medium for generating orthographic image of unmanned aerial vehicle image
CN117994469A (en) * 2024-04-07 2024-05-07 国网浙江省电力有限公司宁波供电公司 Unmanned aerial vehicle-based power line panoramic image generation method and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1696606A (en) * 2004-05-14 2005-11-16 佳能株式会社 Information processing method and apparatus for finding position and orientation of targeted object

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10143659A (en) * 1996-11-06 1998-05-29 Komatsu Ltd Object detector
JP2003156330A (en) 2001-11-22 2003-05-30 Nec Corp Airborne topography-measuring apparatus and method
US7242460B2 (en) * 2003-04-18 2007-07-10 Sarnoff Corporation Method and apparatus for automatic registration and visualization of occluded targets using ladar data
US8294712B2 (en) * 2003-09-19 2012-10-23 The Boeing Company Scalable method for rapidly detecting potential ground vehicle under cover using visualization of total occlusion footprint in point cloud population
WO2005098793A1 (en) * 2004-03-31 2005-10-20 Pioneer Corporation Map creation device and navigation device
JP3966419B2 (en) * 2004-12-15 2007-08-29 三菱電機株式会社 Change area recognition apparatus and change recognition system
US20070065002A1 (en) * 2005-02-18 2007-03-22 Laurence Marzell Adaptive 3D image modelling system and apparatus and method therefor
JP2006323608A (en) * 2005-05-18 2006-11-30 Kozo Keikaku Engineering Inc Apparatus and method for creating model of group of three-dimensional structure and system for creating three-dimensional model
JP4619962B2 (en) 2006-02-15 2011-01-26 三菱電機株式会社 Road marking measurement system, white line model measurement system, and white line model measurement device
US7822266B2 (en) * 2006-06-02 2010-10-26 Carnegie Mellon University System and method for generating a terrain model for autonomous navigation in vegetation
CN101617197B (en) 2007-02-16 2011-06-22 三菱电机株式会社 Feature identification apparatus, measurement apparatus and measuring method
CA2725800A1 (en) * 2008-07-31 2010-02-04 Tele Atlas B.V. Method of displaying navigation data in 3d


Non-Patent Citations (2)

Title
JP Laid-Open No. 2003-156330A 2003.05.30
JP Laid-Open No. 2006-323608A 2006.11.30

Also Published As

Publication number Publication date
JP5808369B2 (en) 2015-11-10
KR101319471B1 (en) 2013-10-17
JP5058364B2 (en) 2012-10-24
KR20120132704A (en) 2012-12-07
US20110310091A2 (en) 2011-12-22
CN102138163A (en) 2011-07-27
US8665263B2 (en) 2014-03-04
EP2320382A4 (en) 2014-07-09
JP4832596B2 (en) 2011-12-07
EP2320382A1 (en) 2011-05-11
US20110164037A1 (en) 2011-07-07
JP2013225336A (en) 2013-10-31
KR101269981B1 (en) 2013-05-31
KR20110033859A (en) 2011-03-31
JP2011233165A (en) 2011-11-17
JPWO2010024212A1 (en) 2012-01-26
JP2012018170A (en) 2012-01-26
JP5319741B2 (en) 2013-10-16
WO2010024212A1 (en) 2010-03-04

Similar Documents

Publication Publication Date Title
CN102138163B (en) Bird's-eye image forming device, bird's-eye image forming method
CN111542860A (en) Sign and lane creation for high definition maps for autonomous vehicles
CN105448184B (en) Map road drawing method and device
CN110832348A (en) Point cloud data enrichment for high definition maps of autonomous vehicles
CN109791052A (en) For generate and using locating reference datum method and system
CN101681525A (en) Method of and apparatus for producing a multi-viewpoint panorama
CN110758243A (en) Method and system for displaying surrounding environment in vehicle driving process
JP5339953B2 (en) 3D map correction apparatus and 3D map correction program
CN112652065A (en) Three-dimensional community modeling method and device, computer equipment and storage medium
CN117576652B (en) Road object identification method and device, storage medium and electronic equipment
Zhao et al. Autonomous driving simulation for unmanned vehicles
JP2022039188A (en) Position attitude calculation method and position attitude calculation program
CN109490926B (en) Path planning method based on binocular camera and GNSS
CN116978010A (en) Image labeling method and device, storage medium and electronic equipment
CN114419180A (en) Method and device for reconstructing high-precision map and electronic equipment
CN116917936A (en) External parameter calibration method and device for binocular camera
JP2022513830A (en) How to detect and model an object on the surface of a road
KR102616437B1 (en) Method for calibration of lidar and IMU, and computer program recorded on record-medium for executing method therefor
JP7467722B2 (en) Feature Management System
KR102618951B1 (en) Method for visual mapping, and computer program recorded on record-medium for executing method therefor
KR102626574B1 (en) Method for calibration of camera and lidar, and computer program recorded on record-medium for executing method therefor
US20240144435A1 (en) Method for generating training data for transportation facilities and computer program recorded on record-medium for executing same
US20240144594A1 (en) Method for creating a map using aviation lidar and computer program recorded on record-medium for executing same
CN111414848B (en) Full-class 3D obstacle detection method, system and medium
CN117994744A (en) Image data processing method, image data processing device, storage medium and vehicle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20140430

Termination date: 20190824