US20210048312A1 - Electronic device for generating map data and operation method thereof - Google Patents
- Publication number
- US20210048312A1 (application US16/978,907)
- Authority
- US
- United States
- Prior art keywords
- electronic device
- image data
- resolution
- node
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
- G09B29/003—Maps
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3848—Data obtained from both position sensors and additional sensors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/38—Electronic maps specially adapted for navigation; Updating thereof
- G01C21/3804—Creation or updating of map data
- G01C21/3833—Creation or updating of map data characterised by the source of data
- G01C21/3837—Data obtained from a single source
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/01—Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/13—Receivers
- G01S19/14—Receivers specially adapted for specific applications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G06T17/05—Geographic models
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/207—Analysis of motion for motion estimation over a hierarchy of resolutions
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B29/00—Maps; Plans; Charts; Diagrams, e.g. route diagram
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
Definitions
- the disclosure relates to an electronic device for generating map data and an operating method thereof.
- An autonomous car refers to a vehicle that recognizes the surrounding environment and determines a driving situation to drive to a given destination under self-control, without intervention by a driver.
- autonomous cars have attracted attention as a personal transportation means that could reduce traffic accidents, increase transportation efficiency, save fuel, and enhance convenience by replacing the need to drive.
- Technical problems to be solved are not limited thereto. Other technical problems may exist.
- FIG. 1 is a view for describing an example of an operation of an electronic device according to an embodiment.
- FIG. 2 shows an example path graph including a plurality of nodes according to an embodiment.
- FIG. 3 shows an example of loop-closing according to an embodiment.
- FIG. 4 is a view for describing an example of generating map data according to an embodiment.
- FIG. 5 is a block diagram of an electronic device according to an embodiment.
- FIG. 6 is a block diagram of an electronic device according to an embodiment.
- FIG. 7 is a block diagram of a vehicle according to an embodiment.
- FIG. 8 is a flowchart of an operating method of an electronic device according to an embodiment.
- an operating method of an electronic device includes: obtaining image data of a first resolution and image data of a second resolution for each of a plurality of nodes generated while the electronic device moves; obtaining location information with respect to each of the generated nodes, by using the image data of the second resolution; generating and storing map data by matching the obtained location information with the image data of the first resolution for each node; and estimating a current location of the electronic device by using the generated map data and the image data of the first resolution.
- an electronic device includes: at least one sensing portion configured to obtain image data of a first resolution and image data of a second resolution for each of a plurality of nodes generated while the electronic device moves; a memory storing one or more instructions; and a processor configured to execute the one or more instructions stored in the memory to: obtain location information with respect to each of the generated nodes by using the image data of the second resolution, generate map data by matching the obtained location information with the image data of the first resolution for each node, store the generated map data in the memory, and estimate a current location of the electronic device by using the generated map data and the image data of the first resolution.
- a computer-readable recording medium has recorded thereon a program for executing the operating method of the electronic device on a computer.
- when a part is referred to as being “connected” to other parts, the part may be “directly connected” to the other parts or may be “electrically connected” to the other parts with other devices therebetween.
- when a part “includes” a certain element, unless specifically mentioned otherwise, the part may further include other elements rather than excluding them.
- the terms, such as “unit” or “module,” used in the specification should be understood as a unit that processes at least one function or operation and that may be embodied in a hardware manner, a software manner, or a combination of the hardware manner and the software manner.
- a vehicle 1 may include an electronic device 100 (hereinafter, the electronic device 100 ) for assisting in or controlling driving of the vehicle 1 .
- FIG. 1 is a view for describing an example of an operation of an electronic device according to an embodiment.
- the electronic device 100 included in the vehicle 1 may generate map data by recognizing a surrounding environment through a sensing portion 110 , while the vehicle 1 drives on the road.
- image data of different resolutions may be obtained for a plurality of nodes generated while the vehicle 1 moves.
- the plurality of nodes may be non-continually generated while the vehicle 1 moves.
- the image data obtained for each node may include a 3D point cloud, image, etc.
- the image data according to an embodiment may include a distribution chart indicating information sensed with respect to a two-dimensional or a three-dimensional space.
- the image data according to an embodiment is not limited to the example described above and may include various types of data indicating information collected about surrounding environmental conditions at a certain location.
- the node may correspond to a location of the vehicle 1 including the electronic device 100 when the image data is obtained.
- the electronic device 100 may generate a plurality of nodes according to a time interval or a distance interval. However, it is not limited thereto, and the electronic device 100 may non-continually generate a plurality of nodes. For example, when a location of the electronic device 100 at a temporal point t is node A, a location of the electronic device 100 at a temporal point t+1 may correspond to node B adjacent to the node A. A path through which the vehicle 1 including the electronic device 100 drives may be a set of continual nodes.
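The interval-based node generation described above can be sketched as follows. This is an illustrative Python sketch, not code from the patent; the function and parameter names (`generate_nodes`, `min_spacing`) are assumptions, and only the distance-interval case is shown.

```python
import math

def generate_nodes(trajectory, min_spacing):
    """Keep a pose sample as a node whenever the device has moved
    at least `min_spacing` meters since the last generated node."""
    nodes = []
    for x, y in trajectory:
        if not nodes:
            nodes.append((x, y))  # the first sample always becomes a node
            continue
        last_x, last_y = nodes[-1]
        if math.hypot(x - last_x, y - last_y) >= min_spacing:
            nodes.append((x, y))
    return nodes

# A straight 10 m drive sampled every 0.5 m yields a node every 2 m.
path = [(0.5 * i, 0.0) for i in range(21)]
nodes = generate_nodes(path, min_spacing=2.0)
```

A time-interval variant would simply compare timestamps instead of the travelled distance.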
- images of different resolutions including the surrounding environment of the vehicle 1 may be captured at the nodes.
- the electronic device 100 may generate the map data by using image data of different resolutions.
- the electronic device 100 may obtain pose information of the vehicle 1 by using image data of a high resolution and may obtain location information for a current node by using the pose information of the vehicle.
- the location information for the current node may be obtained by calculating a distance and a direction of movement of the vehicle from a location of a previous node, based on the pose information of the vehicle.
- the electronic device 100 may generate the map data by matching the location information of the current node with image data of a low resolution.
- the electronic device 100 may obtain location information of the electronic device 100 , the location information having high accuracy, by using image data of a high resolution, and may generate the map data including image data of a low resolution by using the location information.
- the electronic device 100 may estimate a current location of the electronic device 100 by using the generated map data and image data of a first resolution (for example, the image data of the low resolution). For example, the electronic device 100 may estimate the current location of the electronic device 100 by obtaining, from among the image data of the first resolution of the map data, image data which is most closely matched to the image data of the first resolution, which is obtained with respect to the current location.
- the electronic device 100 may determine a range for the current location of the electronic device 100 and estimate the current location by using map data corresponding to at least one node included in the determined range.
- the range for the current location of the electronic device 100 may include an area which may be estimated as the current location of the electronic device 100 .
- the electronic device 100 may use the map data corresponding to the range for the current location, in order to minimize the amount of calculations taken to compare image data of the map data with image data obtained at the current location.
- the range for the current location of the electronic device 100 may be determined based on at least one of information about a previous location of the electronic device 100 and global positioning system (GPS) information about the current location of the electronic device 100 .
- the electronic device 100 may determine the range which may include the current location based on the movement of the electronic device 100 , based on the previous location.
- the electronic device 100 may determine the range which may include the current location, based on the GPS information and an error bound of the GPS information.
- the disclosure is not limited to the example described above.
- the range for the current location of the electronic device 100 may be determined based on information collected using various methods with respect to the current location of the electronic device 100 .
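A minimal sketch of the range-limited matching described above, assuming map data stored as (location, low-resolution image) pairs, a GPS fix with an error bound to restrict the candidate nodes, and a simple sum-of-squared-differences image comparison. All names and the similarity metric are illustrative choices, not specified by the patent.

```python
def estimate_location(map_data, current_image, gps_fix, gps_error):
    """map_data: list of (location, image) pairs, where location is (x, y)
    and image is a flat tuple of low-resolution cell values.
    Only nodes within `gps_error` of the GPS fix are compared, which
    bounds the amount of image comparison needed."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2

    candidates = [(loc, img) for loc, img in map_data
                  if dist2(loc, gps_fix) <= gps_error ** 2]

    # Pick the candidate whose stored image differs least from the
    # image observed at the current (unknown) location.
    def image_error(entry):
        _, img = entry
        return sum((a - b) ** 2 for a, b in zip(img, current_image))

    best_loc, _ = min(candidates, key=image_error)
    return best_loc

m = [((0, 0), (1, 1, 0)), ((5, 0), (0, 1, 1)), ((50, 0), (0, 1, 1))]
loc = estimate_location(m, current_image=(0, 1, 1), gps_fix=(4, 0), gps_error=10)
```

Note that the distant node at (50, 0) holds an identical image but is never considered, because it lies outside the GPS error bound.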
- the pose information of the vehicle 1 may include 6-degree-of-freedom information.
- the 6-degree-of-freedom information may include information about a direction in which a vehicle moves and rotation of the vehicle.
- the 6-degree-of-freedom information may include at least one of x, y, z, roll, yaw, and pitch.
- the x, y, z values may include information about a direction (e.g., a vector value) in which the vehicle moves.
- the roll value may be an angle of rotation in a counter-clockwise direction based on an x-axis
- the yaw value may be an angle of rotation in a counter-clockwise direction based on a y-axis
- the pitch value may be an angle of rotation in a counter-clockwise direction based on a z-axis.
- the yaw value may indicate a movement direction of the vehicle 1 and the pitch value may indicate whether the vehicle 1 moves through a slope or a bump.
- the pose information of the vehicle 1 may be obtained based on the number of rotations of a wheel of the vehicle 1 and a direction of the rotation, which are measured through an odometry sensor 230 .
- however, the pose information measured through the odometry sensor 230 may have low accuracy, due to a slipping phenomenon generated between the wheel and the road surface.
- the electronic device 100 may obtain the pose information of the vehicle 1 , the pose information having high accuracy, by using image data of a second resolution and may obtain location information of the vehicle 1 based on the pose information.
- the electronic device 100 may obtain the pose information of the vehicle from the plurality of nodes, by using the image data of the second resolution. For example, the electronic device 100 may obtain a difference value between the pose information of each node, by using the image data of the second resolution, and based on the difference value of the pose information, may obtain pose information for each node, the pose information being optimized to have the least error.
- the electronic device 100 may obtain location information for a current node of the vehicle 1 , based on the pose information of the current node and a previous node and location information of the previous node.
- the pose information of the vehicle may include a direction in which the vehicle moves and a direction of the rotation.
- the electronic device 100 may obtain the location information for the current node by obtaining information about a direction and a distance of the movement from the previous node to the current node, based on the pose information of at least one of the current node and the previous node.
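The dead-reckoning step above, deriving the current node's location from the previous node's location plus the movement direction and distance taken from the pose information, can be sketched as follows. This is an assumed planar simplification in which the yaw value alone gives the heading; the full method uses 6-degree-of-freedom pose information.

```python
import math

def next_location(prev_xy, distance, heading_rad):
    """Advance from the previous node's location by the travelled
    distance along the heading (yaw) taken from the pose information."""
    x, y = prev_xy
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))

# Two successive node-to-node movements: 2 m along the x-axis,
# then 2 m after a 90-degree turn.
p1 = next_location((0.0, 0.0), 2.0, 0.0)
p2 = next_location(p1, 2.0, math.pi / 2)
```

Chaining this function over every node in a path graph yields a location for each node, given a known starting location.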
- the electronic device 100 may generate the map data by matching image data of a first resolution (for example, a low resolution) with respect to the current node to the obtained location information of the current node of the vehicle 1 .
- the electronic device 100 may obtain the image data from each node, through the sensing portion 110 including a radar sensor 226 , a lidar sensor 227 , an image sensor 228 , etc.
- the image data of the first resolution described above may be generated by a sensor using radio waves, for example, the radar sensor 226 .
- the image data of the second resolution described above may be generated by a sensor using a laser beam or light, for example, the lidar sensor 227 , the image sensor 228 , etc.
- the electronic device 100 may obtain the image data of the first resolution (for example, the low resolution) by using the radar sensor 226 and obtain the image data of the second resolution (for example, the high resolution) by using at least one of the lidar sensor 227 and the image sensor 228 .
- accordingly, with map data including the image data of the low resolution, estimation of a current location of a moving object is possible by using only less expensive equipment (for example, a radar sensor) that captures images of the low resolution.
- the image data of the second resolution (for example, the high resolution) described above is detailed enough for a difference in the pose information between adjacent nodes to be identified, and thus may be used to obtain the pose information, according to an embodiment.
- the electronic device 100 may obtain the location information according to an embodiment by using the image data of the second resolution obtained by using, for example, a single-channel lidar sensor having one beam.
- the electronic device 100 according to an embodiment may perform the operation according to an embodiment, without including an expensive lidar sensor having a plurality of beams.
- because the image data of the first resolution is an image generated by using radio waves, speed detection using a Doppler effect is possible with respect to an object in the image.
- a dynamic object having a speed is desirably excluded from image data, in generating the map data. That is, the electronic device 100 according to an embodiment may identify a dynamic object from objects in an image, based on speed information, and generate or modify and refine the map data by using the image data from which the dynamic object is excluded.
- the electronic device 100 may obtain speed information with respect to the image data of the first resolution.
- the speed information may include, for example, a speed value corresponding to each unit area of the image data.
- the electronic device 100 may identify the dynamic object included in the image data of the first resolution, based on the speed information with respect to the image data of the first resolution.
- the electronic device 100 may remove the dynamic object identified in the image data of the first resolution.
- the electronic device 100 may remove the identified dynamic object from the image data of the second resolution corresponding to the image data of the first resolution.
- the electronic device 100 may generate the map data based on the image data of the first resolution and the image data of the second resolution, from which the dynamic object is removed.
- the map data may include the image data of the first resolution including a static object.
- the electronic device 100 may modify and refine the map data based on the image data of the first resolution and the image data of the second resolution, from which the dynamic object is removed.
- to modify and refine the map data, the electronic device 100 may use only an area of the obtained image data of the first resolution, the area including the identified static object, rather than a total area thereof.
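The dynamic-object removal described above can be sketched on a per-cell basis, assuming the speed information is a grid of radial speed values aligned with the low-resolution image. The grid representation, the fixed threshold, and blanking cells to zero are all illustrative assumptions, not details from the patent.

```python
def remove_dynamic_cells(image, speeds, speed_threshold=0.5):
    """image and speeds are equally sized 2D grids; a cell whose
    measured speed magnitude exceeds the threshold is treated as part
    of a dynamic object and blanked out (set to 0), leaving only
    static content for map generation."""
    return [[0 if abs(v) > speed_threshold else cell
             for cell, v in zip(img_row, spd_row)]
            for img_row, spd_row in zip(image, speeds)]

img = [[5, 5, 0],
       [5, 9, 0]]
spd = [[0.0, 0.0, 0.0],
       [0.0, 4.2, 0.0]]   # the cell with value 9 belongs to a moving object
static_img = remove_dynamic_cells(img, spd)
```

The same mask could then be applied to the corresponding high-resolution image data, as the surrounding bullets describe.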
- the autonomous vehicle may generate and modify and refine map data about a surrounding environment by using various pieces of sensor information and estimate a current location of the vehicle on the map data.
- FIG. 1 illustrates that the electronic device 100 is included in the vehicle 1 .
- a movable device or robot may include the electronic device 100 .
- the electronic device 100 may generate the map data by using the image data of the first resolution including a distribution chart based on information sensed at a certain location. For example, the electronic device 100 may obtain an indoor temperature or dust distribution chart, or an indoor wireless signal strength distribution chart, as the image data of the first resolution, from each node, and may obtain location information based on the image data of the second resolution. The electronic device 100 may match the location information with the image data of the first resolution obtained from each node, to generate the map data including the indoor temperature or dust distribution chart, or the indoor wireless signal strength distribution chart.
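As a toy sketch of the distribution-chart case above, map data can be modeled as a per-node measurement (for example, a Wi-Fi received signal strength) stored under the node's estimated location, and queried by nearest node. The names, the dict representation, and the nearest-node lookup are illustrative assumptions.

```python
def build_distribution_map(nodes):
    """nodes: list of (location, measurement) pairs, e.g. an RSSI value
    in dBm sensed at each node. The 'map data' is the measurement
    matched to (stored under) the node's estimated location."""
    return {loc: value for loc, value in nodes}

def lookup(dist_map, query_xy):
    """Return the measurement stored at the node nearest the query point."""
    def d2(loc):
        return (loc[0] - query_xy[0]) ** 2 + (loc[1] - query_xy[1]) ** 2
    return dist_map[min(dist_map, key=d2)]

wifi_map = build_distribution_map([((0, 0), -40), ((10, 0), -70)])
rssi = lookup(wifi_map, (2, 1))
```

An indoor temperature or dust distribution chart would be built the same way, with a different measurement type per node.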
- FIG. 2 shows an example path graph including a plurality of nodes according to an embodiment.
- the electronic device 100 may generate a path graph as a set of at least two nodes and edges between the at least two nodes.
- the graph may be generated by indicating the plurality of nodes as dots and connecting the adjacent nodes via edges.
- the path graph p 20 may include an edge e 21 connecting a node node 21 and a node node 22 .
- Each of the nodes node 21 and node 22 may include pose information of the electronic device 100 according to an embodiment, and the edge e 21 may include a difference value between the pose information of the adjacent nodes.
- the electronic device 100 may obtain at least one of a difference value and a covariance between pose information of the adjacent nodes, as a value of the edge e 21 between the two nodes, based on the image data of the second resolution corresponding to each of the nodes node 21 and node 22 .
- the covariance may indicate the degree to which the values of the pose information of the two nodes change in a correlated manner.
- the pose information of the node node 22 may be obtained from the pose information of the node node 21 .
- the electronic device 100 may obtain the pose information with respect to at least one node connected to an edge, based on a value of an edge. For example, the electronic device 100 may obtain the pose information of the at least one node, at which the pose information has the least error, based on at least one edge value.
- the electronic device 100 may obtain at least one of the difference value and the covariance of the pose information, by comparing image data of the node node 21 with image data of the node node 22 .
- the pose information of the node node 21 may include a pre-obtained value or a pre-defined value based on a certain condition, according to pose information of a previous node adjacent to the node node 21 .
- the electronic device 100 may obtain the pose information of the node node 22 , based on the pose information of the node node 21 and the value of the edge e 21 .
- the electronic device 100 may obtain location information of each node, by using the pose information of each node. For example, based on information about a moving distance and a moving direction of the electronic device 100 , the information being included in the pose information, the location information of the current node may be obtained from the location information of the previous node of the electronic device 100 .
- FIG. 3 shows an example of loop-closing according to an embodiment.
- the electronic device 100 may correct the pose information of each node such that a sum of error values of edges included in a path graph is minimized.
- the electronic device 100 may use simultaneous localization and mapping (SLAM) technologies, in which a moving vehicle or robot measures its location while simultaneously writing a map of a surrounding environment.
- the electronic device 100 may perform loop-closing based on a relative location of two adjacent nodes, by using graph-based SLAM technologies.
- the electronic device 100 may generate a loop-closure edge connecting two nodes, by using a relative distance, a relative angle, etc., between two nodes, to derive a corrected resultant value.
- the electronic device 100 may move in a clockwise direction from a node node 31 to a node node 32 .
- the electronic device 100 may obtain optimized pose information having the least error, based on a value of at least one edge including an edge e 31 included in the path graph path 30 , according to the loop-closing correction method. For example, with the node node 31 and the node node 32 having the same location information as a pre-requisite condition, the optimized pose information of each node may be obtained.
- the electronic device 100 may obtain the pose information of each node, by using various methods of optimizing the pose information, in addition to the loop-closing correction method.
- the electronic device 100 may obtain the value of the at least one edge included in the path graph path 30 , by using the image data of the second resolution for each node of the path graph path 30 . Also, the electronic device 100 may obtain the pose information of at least one node included in the path graph path 30 , the node being configured to have the least error, based on the value of the at least one edge. The electronic device 100 may obtain location information of each node, based on the pose information of each node.
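The patent does not specify an optimization method, and graph-based SLAM systems typically solve a nonlinear least-squares problem over all edges. As one simple stand-in for the loop-closing correction described above, the drift accumulated by dead reckoning can be distributed linearly along the loop so that the first and last nodes end up with the same location:

```python
def loop_close(locations):
    """Given dead-reckoned node locations where the last node should
    coincide with the first (a detected loop closure), spread the
    accumulated drift linearly along the path so the loop closes
    exactly. A crude proxy for least-error pose optimization."""
    n = len(locations) - 1
    err_x = locations[-1][0] - locations[0][0]
    err_y = locations[-1][1] - locations[0][1]
    return [(x - err_x * i / n, y - err_y * i / n)
            for i, (x, y) in enumerate(locations)]

# A square loop whose dead reckoning drifted by (0.4, 0.0) on return.
raw = [(0, 0), (10, 0), (10, 10), (0, 10), (0.4, 0.0)]
corrected = loop_close(raw)
```

Each intermediate node receives a share of the correction proportional to how far along the loop it lies, which mirrors the intuition that error accumulates gradually along the edges.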
- FIG. 4 is a view for describing an example of generating map data according to an embodiment.
- the electronic device 100 may generate map data 40 including image data d 41 corresponding to a node node 41 and image data d 42 corresponding to a node node 42 .
- the image data d 41 and d 42 may be the image data of the first resolution (low resolution) described above.
- the electronic device 100 may generate the map data by storing image data of a first resolution and location information, corresponding to each node.
- the map data may be realized in the form of a 3D point cloud map, a 2D grid map, a 3D voxel map, etc., based on the image data of the first resolution and the location information, but is not limited thereto.
- the map data may be realized in the various forms (for example, a feature map, a semantic map, a dense map, a texture map, etc.) according to types of data included in a map when the map is generated.
- the electronic device 100 may generate the map data in the form of the 3D point cloud map, by using image data in a 3D point cloud form corresponding to each node, based on location information of each node of a corrected path graph.
- the image data may be, for example, the image data of the first resolution.
- the electronic device 100 may generate the map data by converting the image data in the 3D point cloud form corresponding to each node into a 3D voxel form.
- the electronic device 100 may generate the map data in a 2D grid form by using only a point cloud corresponding to each node or a ground surface of a road extracted from image data in an image form.
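A rough sketch of the 2D grid case: points at roughly ground height are projected onto integer grid cells. The ground-height filter, the cell size, and representing the grid as a set of occupied cell indices are illustrative assumptions, not the patent's method.

```python
def to_grid(points, cell_size, ground_z=0.2):
    """Project the 3D points at (or below) an assumed ground height
    onto a 2D grid: each occupied cell is keyed by its integer
    (column, row) index, discarding points above the ground."""
    grid = set()
    for x, y, z in points:
        if z <= ground_z:
            grid.add((int(x // cell_size), int(y // cell_size)))
    return grid

cloud = [(0.2, 0.3, 0.0),   # ground point, cell (0, 0)
         (0.4, 0.1, 0.1),   # ground point, same cell
         (1.6, 0.2, 0.0),   # ground point, cell (1, 0)
         (0.5, 0.5, 3.0)]   # elevated point, discarded
cells = to_grid(cloud, cell_size=1.0)
```

Applying this per node, with each node's location offsetting its points, would stitch the per-node grids into a single 2D grid map.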
- FIG. 5 is a block diagram of an electronic device according to an embodiment.
- the electronic device 100 may include the sensing portion 110 , a processor 120 , and a memory 130 .
- FIG. 5 illustrates only components of the electronic device 100 , the components being related to the present embodiment. Thus, it will be understood by one of ordinary skill in the art that other general-purpose components than the components illustrated in FIG. 5 may further be included.
- the sensing portion 110 may obtain a peripheral image including objects located around the vehicle 1 ( FIG. 1 ) driving on a road. Also, the sensing portion 110 according to an embodiment may obtain the peripheral image described above as image data of different resolutions.
- the sensing portion 110 may include a plurality of sensors configured to obtain the peripheral image.
- the sensing portion 110 may include a distance sensor, such as a lidar sensor and a radar sensor, and an image sensor, such as a camera.
- the lidar sensor of the sensing portion 110 may generate the image data of the second resolution (for example, the high resolution) described above, and the radar sensor may generate the image data of the first resolution (for example, the low resolution).
- the sensing portion 110 may include one or more actuators configured to correct locations and/or alignments of the plurality of sensors, and thus may sense an object located at each of a front direction, a rear direction, and side directions of the vehicle 1 .
- the sensing portion 110 may sense a shape of a peripheral object and a shape of a road by using the image sensor.
- the processor 120 may include at least one processor. Also, the processor 120 may execute one or more instructions stored in the memory 130 .
- the processor 120 may generate map data by using the image data of different resolutions. For example, the processor 120 may obtain location information of a plurality of nodes by using image data of a second resolution (for example, a high resolution). Also, the processor 120 may generate the map data by matching the location information of each node with image data of a first resolution (for example, a low resolution) of each node.
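The matching step can be pictured as a simple keyed store. The names and data shapes here are hypothetical: each node's location (obtained from the high-resolution pipeline) is stored together with that node's low-resolution image data.

```python
def build_map(locations, low_res_images):
    """Match per-node location info with per-node low-resolution image data.

    locations: {node_id: (x, y)} from the second-resolution pipeline.
    low_res_images: {node_id: descriptor} from the first-resolution sensor.
    """
    return {nid: {"location": locations[nid], "image": low_res_images[nid]}
            for nid in locations}
```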
- the processor 120 may obtain the location information of each node by using image data of a second resolution (for example, a high resolution).
- the processor 120 may obtain pose information of the electronic device 100 by using the image data of the second resolution (for example, the high resolution) and obtain the location information of each node by using the pose information of the electronic device 100 .
- the processor 120 may obtain at least one of a difference value and a covariance between pose information of a first node and a second node and, based on the obtained value, may obtain the location information of the second node from the location information of the first node.
- the processor 120 may obtain the at least one of the difference value and the covariance between the pose information described above by comparing the image data of the second resolution with respect to the first node and the second node.
- the pose information described above may include 6-degree-of-freedom information of the electronic device 100 .
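The "difference value" between two nodes' pose information can be sketched as a relative transform. For brevity this uses a 2D, 3-degree-of-freedom pose `(x, y, yaw)` rather than the full 6-degree-of-freedom pose; the function names and the simplification are illustrative assumptions, not the patent's formulation.

```python
import math

def relative_pose(p1, p2):
    """Pose of the second node expressed in the first node's frame (2D sketch)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    c, s = math.cos(-p1[2]), math.sin(-p1[2])
    return (c * dx - s * dy, s * dx + c * dy, p2[2] - p1[2])

def compose(p1, rel):
    """Recover the second node's pose from the first node's pose and the edge value."""
    c, s = math.cos(p1[2]), math.sin(p1[2])
    return (p1[0] + c * rel[0] - s * rel[1],
            p1[1] + s * rel[0] + c * rel[1],
            p1[2] + rel[2])
```

`compose(p1, relative_pose(p1, p2))` reproduces `p2`, which is exactly the idea of obtaining the second node's location from the first node's location plus the edge value.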
- the processor 120 may determine a range of a current location based on information about the current location obtained in various methods and may estimate the current location based on map data corresponding to at least one node included in the determined range.
- the range of the current location may be determined based on at least one of information about a previous location of the electronic device and GPS information about the current location of the electronic device.
- the map data corresponding to the at least one node included in the range of the current location may include at least one piece of image data of a first resolution corresponding to the at least one node.
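Restricting the search to nodes inside the determined range can be sketched as a radius filter around a rough location estimate (e.g. from GPS). The dictionary layout and the circular range are illustrative assumptions.

```python
def candidate_nodes(node_locations, center, radius):
    """Return ids of map nodes inside the search radius around the rough estimate.

    node_locations: {node_id: (x, y)}; center: rough (x, y) current location.
    """
    cx, cy = center
    return sorted(nid for nid, (x, y) in node_locations.items()
                  if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2)
```

Only the image data stored for these candidate nodes then needs to be compared against the data obtained at the current location, which keeps the matching cost bounded.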
- the processor 120 may identify a dynamic object in the image data of the first resolution, based on speed information with respect to the image data of the first resolution.
- the processor 120 may remove the identified dynamic object from at least one of the image data of the first resolution and the image data of the second resolution.
- the processor 120 may generate the map data by using the image data from which the dynamic object is removed.
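The speed-based removal can be sketched as a simple threshold on per-return speed. This assumes the first-resolution sensor (e.g. a radar) reports a radial speed with each return; the tuple layout and threshold value are hypothetical.

```python
def remove_dynamic(returns, speed_threshold=0.5):
    """Keep only returns whose measured speed is below the threshold.

    returns: list of (x, y, z, radial_speed); points moving faster than the
    threshold are treated as dynamic objects and dropped.
    """
    return [(x, y, z) for x, y, z, v in returns if abs(v) < speed_threshold]
```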
- the memory 130 may store one or more instructions performed by the processor 120 .
- the memory 130 may store various data and programs for driving and controlling the electronic device 100 under control of the processor 120 .
- the memory 130 may store signals or data that is input/output based on operations of the sensing portion 110 and the processor 120 .
- the memory 130 may store the map data generated by the processor 120 under control of the processor 120 .
- FIG. 6 is a block diagram of an electronic device according to an embodiment.
- the electronic device 100 may include the sensing portion 110 , the processor 120 , the memory 130 , an outputter 140 , an inputter 150 , and a communicator 160 .
- the electronic device 100 , the sensing portion 110 , the processor 120 , and the memory 130 illustrated in FIG. 6 may correspond to the electronic device 100 , the sensing portion 110 , the processor 120 , and the memory 130 of FIG. 5 , respectively.
- the sensing portion 110 may include a plurality of sensors configured to sense information about a surrounding environment in which the vehicle ( FIG. 1 ) is located and may include one or more actuators configured to correct locations and/or alignments of the sensors.
- the sensing portion 110 may include a GPS 224, an inertial measurement unit (IMU) 225, a radar sensor 226, a lidar sensor 227, an image sensor 228, and an odometry sensor 230.
- the sensing portion 110 may include at least one of a temperature/humidity sensor 232 , an infrared sensor 233 , an atmospheric sensor 235 , a proximity sensor 236 , and an RGB illuminance sensor 237 , but is not limited thereto.
- a function of each sensor may be intuitively inferred by one of ordinary skill in the art based on a name of the sensor, and thus, its detailed description is omitted.
- the sensing portion 110 may include a motion sensing portion 238 configured to sense a motion of the vehicle 1 ( FIG. 1 ).
- the motion sensing portion 238 may include a magnetic sensor 229 , an acceleration sensor 231 , and a gyroscope sensor 234 .
- the GPS 224 may include a sensor configured to estimate a geographical location of the vehicle 1 ( FIG. 1 ). That is, the GPS 224 may include a transceiver configured to estimate a location of the vehicle 1 ( FIG. 1 ) on the earth. According to an embodiment, a range of a current location of the vehicle 1 may be determined based on GPS information with respect to the current location of the vehicle 1 . The current location of the vehicle 1 may be estimated based on the map data obtained based on the determined range.
- the IMU 225 may be a combination of sensors configured to sense changes in the location and alignment of the vehicle 1 ( FIG. 1 ) based on inertial acceleration.
- the combination of sensors may include accelerometers and gyroscopes.
- the radar sensor 226 may include a sensor configured to sense objects in an environment in which the vehicle 1 ( FIG. 1 ) is located, by using wireless signals. Also, the radar sensor 226 may be configured to sense a speed and/or a direction of objects.
- the lidar sensor 227 may include a sensor configured to sense objects in an environment in which the vehicle 1 ( FIG. 1 ) is located, by using a laser beam.
- the lidar sensor 227 may include a laser light source and/or a laser scanner configured to emit a laser beam, and a sensor configured to sense reflection of the laser beam.
- the lidar sensor 227 may be configured to operate in a coherent (for example, using heterodyne sensing) or an incoherent sensing mode.
- the image sensor 228 may include a still camera or a video camera configured to record an environment outside the vehicle 1 ( FIG. 1 ).
- the image sensor 228 may include a plurality of cameras and the plurality of cameras may be arranged at various locations inside and outside of the vehicle 1 ( FIG. 1 ).
- the odometry sensor 230 may estimate the location of the vehicle 1 ( FIG. 1 ) and measure a moving distance. For example, the odometry sensor 230 may measure a value of a location change of the vehicle 1 ( FIG. 1 ) by using the number of rotations of a wheel of the vehicle 1 ( FIG. 1 ).
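The wheel-rotation update can be sketched for a differential-drive model. The patent only mentions wheel rotation counts; the two-wheel geometry, wheel radius, and track width here are illustrative assumptions.

```python
import math

def odometry_step(x, y, heading, rot_left, rot_right, wheel_radius, track_width):
    """Update a 2D pose from per-wheel rotation counts (differential-drive sketch)."""
    d_left = 2 * math.pi * wheel_radius * rot_left    # distance rolled by left wheel
    d_right = 2 * math.pi * wheel_radius * rot_right  # distance rolled by right wheel
    d = (d_left + d_right) / 2                 # distance travelled by the centre
    dtheta = (d_right - d_left) / track_width  # change in heading
    return (x + d * math.cos(heading + dtheta / 2),
            y + d * math.sin(heading + dtheta / 2),
            heading + dtheta)
```

Note that, as the surrounding text explains, wheel slip makes this estimate drift, which is why the pose is refined with the second-resolution image data.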
- the location of the electronic device 100 may be measured by using methods such as trilateration and triangulation, using sensors and communication devices, such as 3G, LTE, a global navigation satellite system (GNSS), a global system for mobile communication (GSM), LORAN-C, NELS, WLAN, Bluetooth, etc.
- a location of the electronic device 100 may be estimated by using sensors, such as indoor-GPS, Bluetooth, WLAN, VLC, active badge, GSM, RFID, visual tags, WIPS, WLAN, ultraviolet rays, magnetic sensors, etc.
- the method of measuring the location of the electronic device 100 is not limited to the examples described above. Other methods, in which location data of the electronic device 100 may be obtained, may also be used.
- the memory 130 may include a magnetic disk drive, an optical disk drive, and a flash memory. Alternatively, the memory 130 may include a portable USB data storage.
- the memory 130 may store system software configured to execute examples related to the disclosure.
- the system software configured to execute the examples related to the disclosure may be stored in a portable storage medium.
- the communicator 160 may include at least one antenna for wirelessly communicating with other devices.
- the communicator 160 may be used to wirelessly communicate with cellular networks or other wireless protocols and systems through Wi-Fi or Bluetooth.
- the communicator 160 controlled by the processor 120 may transmit and receive wireless signals.
- the processor 120 may execute a program stored in the memory 130 for the communicator 160 to transmit and receive wireless signals to and from the cellular network.
- the inputter 150 refers to a device for inputting data for controlling the vehicle 1 ( FIG. 1 ).
- the inputter 150 may include a key pad, a dome switch, a touch pad (a touch capacitance method, a pressure-resistive layer method, an infrared sensing method, a surface ultrasonic conductive method, an integral tension measuring method, a piezo effect method, etc.), a jog wheel, a jog switch, etc., but is not limited thereto.
- the inputter 150 may include a microphone, which may be configured to receive audio (for example, a voice command) from a passenger of the vehicle 1 ( FIG. 1 ).
- the outputter 140 may output an audio signal or a video signal.
- an output device 280 may include a display 281 and a sound outputter 282 .
- the display 281 may include at least one of a liquid crystal display, a thin-film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, a 3D display, and an electrophoretic display. According to a realized form of the outputter 140, the outputter 140 may include at least two displays 281.
- the sound outputter 282 may output audio data received from the communicator 160 or stored in the memory 130. Also, the sound outputter 282 may include a speaker, a buzzer, etc.
- the inputter 150 and the outputter 140 may include a network interface and may be realized as a touch screen.
- the processor 120 may execute programs stored in the memory 130 to generally control the sensing portion 110, the communicator 160, the inputter 150, the memory 130, and the outputter 140.
- FIG. 7 is a block diagram of a vehicle according to an embodiment.
- the vehicle 1 may include the electronic device 100 and a driving device 200 .
- FIG. 7 illustrates only components of the vehicle 1 , the components being related to the present embodiment. Thus, it will be understood by one of ordinary skill in the art that other general-purpose components than the components illustrated in FIG. 7 may further be included.
- the electronic device 100 may include the sensing portion 110 , the processor 120 , and the memory 130 .
- the sensing portion 110 , the processor 120 , and the memory 130 are described in detail in FIGS. 5 and 6 , and thus, their descriptions are omitted.
- the driving device 200 may include a brake unit 221 , a steering unit 222 , and a throttle 223 .
- the steering unit 222 may be a combination of mechanisms configured to adjust a direction of the vehicle 1 .
- the throttle 223 may be a combination of mechanisms configured to control the speed of the vehicle 1 by controlling the operating speed of the engine/motor 211. Also, the throttle 223 may adjust the amount of a fuel-air mixture gas introduced into the engine/motor 211 by adjusting the throttle opening amount, and may control power and driving force by adjusting the throttle opening amount.
- the brake unit 221 may be a combination of mechanisms configured to decelerate the vehicle 1 .
- the brake unit 221 may use friction to reduce a speed of a wheel/tire 214 .
- FIG. 8 is a flowchart of an operating method of an electronic device according to an embodiment.
- the electronic device 100 may obtain image data of a first resolution and image data of a second resolution for each of a plurality of nodes generated while the electronic device 100 moves.
- the electronic device 100 may obtain image data of different resolutions at each of the nodes, by using different sensors.
- the electronic device 100 may obtain location information with respect to each node based on the image data of the second resolution.
- the electronic device 100 may compare the image data of the second resolution with respect to a first node and a second node to obtain, as a value of an edge between the two nodes, at least one of a difference value and a covariance between pose information of the first node and the second node.
- the first node and the second node may be a previous node and a current node of the electronic device 100 , respectively.
- the covariance may indicate the degree to which the values of the pose information of the two nodes change in a correlated manner.
- the pose information of the second node may be obtained from the pose information of the first node.
- the electronic device 100 may obtain the pose information of the second node based on the value of the edge and the pose information of the first node.
- the electronic device 100 may obtain optimized pose information of at least one node, according to a loop-closing correction method, based on the value of the edge.
- the electronic device 100 may obtain location information of the current node based on the pose information of each node.
- the electronic device 100 may obtain location information having high accuracy by using image data of a high resolution, through which feature values of the image data may be distinctly compared.
- the electronic device 100 may match and store the location information for each node obtained in operation S820 and the image data of the first resolution.
- the electronic device 100 may generate the map data by matching and storing the location information for each node and the image data of the first resolution.
- the electronic device 100 may estimate the current location of the electronic device 100 by using the map data generated in operation S840 and the image data of the first resolution.
- the image data of the first resolution may be image data obtained at the current location of the electronic device 100 .
- the electronic device 100 may estimate the current location of the electronic device 100 by comparing the image data of the first resolution obtained at the current location with the image data of the first resolution included in the map data. For example, the electronic device 100 may determine, from the image data of the first resolution included in the map data, image data of the first resolution most closely matched to the image data of the first resolution obtained at the current location.
- the electronic device 100 may estimate the location information corresponding to the determined image data of the first resolution as the current location of the electronic device 100 .
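The best-match search can be sketched as a nearest-neighbour lookup over stored descriptors. The use of a sum-of-squared-differences measure and flat list descriptors is an illustrative assumption, not the patent's matching criterion.

```python
def estimate_location(query_descriptor, map_entries):
    """Return the stored location whose first-resolution descriptor best matches.

    map_entries: list of (location, descriptor) pairs taken from the map data.
    """
    def ssd(a, b):  # sum of squared differences between two descriptors
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(map_entries, key=lambda e: ssd(e[1], query_descriptor))[0]
```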
- the device may include a processor, a memory storing program data to be executed by the processor, a permanent storage such as a disk drive, a communication port for handling communication with external devices, and user interface devices, such as touch panels, keys, buttons, etc.
- Any methods implemented as software modules or algorithms may be stored as program instructions or computer-readable codes executable by a processor on a computer-readable recording media.
- the computer-readable recording media may include magnetic storage media (for example, read-only memory (ROM), random-access memory (RAM), floppy disks, hard disks, etc.) and optical reading media (for example, CD-ROMs, digital versatile disc (DVD), etc.).
- the computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion. The media can be read by the computer, stored in the memory, and executed by the processor.
- the present embodiment may be described in terms of functional block components and various processing steps. Such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions.
- the embodiment may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices.
- Where the elements of the present embodiment are implemented using software programming or software elements, the embodiment may be implemented with any programming or scripting language such as C, C++, Java, assembler, or the like, with the various algorithms being implemented with any combination of data structures, objects, processes, routines, or other programming elements.
- Functional aspects may be implemented in algorithms that execute on one or more processors.
- the present embodiment could employ any number of conventional techniques for electronics configuration, signal processing and/or control, data processing and the like.
- the words “mechanism,” “element,” and “component” are used broadly and are not limited to mechanical or physical embodiments. The meaning of these words can include software routines in conjunction with processors, etc.
Description
- The disclosure relates to an electronic device for generating map data and an operating method thereof.
- Recently, automobiles have been rapidly getting smarter due to integration of information and communication technologies and the automobile industry. By getting smarter, automobiles are not just mechanical devices, rather, they have evolved into smart cars. In particular, as a key technology of smart cars, autonomous driving has drawn attention.
- An autonomous car refers to a vehicle that recognizes the surrounding environment and determines the driving situation to drive to a given destination under its own control, without intervention by a driver. In recent times, autonomous cars have attracted attention as a personal transportation means that could reduce traffic accidents, increase transportation efficiency, save fuel, and enhance convenience by replacing the human driver.
- For such autonomous driving of cars, various technologies are required, such as technologies for recognizing surrounding environmental conditions such as traffic lanes, peripheral vehicles, pedestrians, etc., technologies for determining driving situations, and technologies for control of steering, acceleration/deceleration, etc. Particularly, among these technologies, technologies for accurately determining the surrounding environment of a vehicle are highly important. That is, it is necessary to generate a map containing a small error range and to precisely determine the surrounding environment of the vehicle on the generated map.
- Under such conditions, for autonomous driving of various moving bodies including vehicles, the need for technologies for generating and using a map providing high reliability with respect to actual road situations has arisen.
- Provided are a method and an electronic device for generating map data. Provided is a computer-readable recording medium having recorded thereon a program for executing the method on a computer. Technical problems to be solved are not limited thereto. Other technical problems may exist.
- FIG. 1 is a view for describing an example of an operation of an electronic device according to an embodiment.
- FIG. 2 shows an example path graph including a plurality of nodes according to an embodiment.
- FIG. 3 shows an example of loop-closing according to an embodiment.
- FIG. 4 is a view for describing an example of generating map data according to an embodiment.
- FIG. 5 is a block diagram of an electronic device according to an embodiment.
- FIG. 6 is a block diagram of an electronic device according to an embodiment.
- FIG. 7 is a block diagram of a vehicle according to an embodiment.
- FIG. 8 is a flowchart of an operating method of an electronic device according to an embodiment.
- According to an aspect of the disclosure, an operating method of an electronic device includes: obtaining image data of a first resolution and image data of a second resolution for each of a plurality of nodes generated while the electronic device moves; obtaining location information with respect to each of the generated nodes, by using the image data of the second resolution; generating and storing map data by matching the obtained location information with the image data of the first resolution for each node; and estimating a current location of the electronic device by using the generated map data and the image data of the first resolution.
- According to another aspect of the disclosure, an electronic device includes: at least one sensing portion configured to obtain image data of a first resolution and image data of a second resolution for each of a plurality of nodes generated while the electronic device moves; a memory storing one or more instructions; and a processor configured to execute the one or more instructions stored in the memory to: obtain location information with respect to each of the generated nodes by using the image data of the second resolution, generate map data by matching the obtained location information with the image data of the first resolution for each node, store the generated map data in the memory, and estimate a current location of the electronic device by using the generated map data and the image data of the first resolution.
- According to another aspect of the disclosure, a computer-readable recording medium has recorded thereon a program for executing the operating method of the electronic device on a computer.
- Hereinafter, embodiments of the disclosure will be described in detail with reference to the accompanying drawings so that one of ordinary skill in the art may easily execute the disclosure. However, the disclosure may have different forms and should not be construed as being limited to the embodiments described herein. Also, in the drawings, parts not related to the descriptions are omitted for clarity, and throughout the specification, like reference numerals are used for like elements.
- Throughout the specification, when a part is referred to as being "connected" to other parts, the part may be "directly connected" to the other parts or may be "electrically connected" to the other parts with other devices therebetween. When a part "includes" a certain element, unless specifically mentioned otherwise, this does not exclude other elements, and the part may further include other elements. Also, terms such as "unit" or "module" used in the specification should be understood as a unit that processes at least one function or operation and that may be embodied in a hardware manner, a software manner, or a combination of the hardware manner and the software manner.
- Hereinafter, the disclosure will be described in detail with reference to the accompanying drawings.
- In this specification, a vehicle 1 may include an electronic device 100 (hereinafter, the electronic device 100) for assisting in or controlling driving of the vehicle 1.
-
FIG. 1 is a view for describing an example of an operation of an electronic device according to an embodiment. - Referring to
FIG. 1 , theelectronic device 100 included in the vehicle 1 may generate map data by recognizing a surrounding environment through asensing portion 110, while the vehicle 1 drives on the road. - According to an embodiment, image data of different resolutions may be obtained for a plurality of nodes generated while the vehicle 1 moves. The plurality of nodes may be non-continually generated while the vehicle 1 moves. According to an embodiment, the image data obtained for each node may include a 3D point cloud, image, etc. Also, the image data according to an embodiment may include a distribution chart indicating information sensed with respect to a two-dimensional or a three-dimensional space. However, the image data according to an embodiment is not limited to the example described above and may include various types of data indicating information collected about surrounding environmental conditions at a certain location.
- The node according to an embodiment may correspond to a location of the vehicle 1 of the
electronic device 100 when the image data is obtained. - According to an embodiment, the
electronic device 100 may generate a plurality of nodes according to a time interval or a distance interval. However, it is not limited thereto, and theelectronic device 100 may non-continually generate a plurality of nodes. For example, when a location of theelectronic device 100 at a temporal point t is node A, a location of theelectronic device 100 at a temporal point t+1 may correspond to node B adjacent to the node A. A path through which the vehicle 1 including theelectronic device 100 drives may be a set of continual nodes. - According to an embodiment, when the vehicle 1 including the
electronic device 100 moves, images of different resolutions including the surrounding environment of the vehicle 1 may be captured at the nodes. - The
electronic device 100 may generate the map data by using image data of different resolutions. According to an embodiment, theelectronic device 100 may obtain pose information of the vehicle 1 by using image data of a high resolution and may obtain location information for a current node by using the pose information of the vehicle. For example, the location information for the current node may be obtained by calculating a distance and a direction of movement of the vehicle from a location of a previous node, based on the pose information of the vehicle. Theelectronic device 100 may generate the map data by matching the location information of the current node with image data of a low resolution. - When generating the map data, it may be difficult to distinctly compare features of two images, by using only the image data of the low resolution, and thus, the
electronic device 100 may have difficulty accurately obtaining location information corresponding to the image data of the low resolution. However, according to an embodiment, theelectronic device 100 may obtain location information of theelectronic device 100, the location information having high accuracy, by using image data of a high resolution, and may generate the map data including image data of a low resolution by using the location information. - According to an embodiment, the
electronic device 100 may estimate a current location of theelectronic device 100 by using the generated map data and image data of a first resolution (for example, the image data of the low resolution). For example, theelectronic device 100 may estimate the current location of theelectronic device 100 by obtaining, from among the image data of the first resolution of the map data, image data which is most closely matched to the image data of the first resolution, which is obtained with respect to the current location. - According to an embodiment, the
electronic device 100 may determine a range for the current location of theelectronic device 100 and estimate the current location by using map data corresponding to at least one node included in the determined range. The range for the current location of theelectronic device 100 may include an area which may be estimated as the current location of theelectronic device 100. Theelectronic device 100 may use the map data corresponding to the range for the current location, in order to minimize the amount of calculations taken to compare image data of the map data with image data obtained at the current location. - According to an embodiment, the range for the current location of the
electronic device 100 may be determined based on at least one of information about a previous location of theelectronic device 100 and global positioning system (GPS) information about the current location of theelectronic device 100. For example, theelectronic device 100 may determine the range which may include the current location based on the movement of theelectronic device 100, based on the previous location. Also, theelectronic device 100 may determine the range which may include the current location, based on the GPS information and an error bound of the GPS information. The disclosure is not limited to the example described above. The range for the current location of theelectronic device 100 may be determined based on information collected using various methods with respect to the current location of theelectronic device 100. The pose information of the vehicle 1 according to an embodiment may include 6-degree-of-freedom information. The 6-degree-of-freedom information may include information about a direction in which a vehicle moves and rotation of the vehicle. For example, the 6-degree-of-freedom information may include at least one of x, y, z, roll, yaw, and pitch. The x, y, z values may include information about a direction (eg. a vector value) in which the vehicle moves. The roll value may be an angle of rotation in a counter-clockwise direction based on an x-axis, the yaw value may be an angle of rotation in a counter-clockwise direction based on a y-axis, and the pitch value may be an angle of rotation in a counter-clockwise direction based on a z-axis. The yaw value may indicate a movement direction of the vehicle 1 and the pitch value may indicate whether the vehicle 1 moves through a slope or a bump. - The pose information of the vehicle 1 may be obtained based on the number of rotations of a wheel of the vehicle 1 and a direction of the rotation, which are measured through an
odometry sensor 230. However, the pose information measured through the odometry sensor 230 may have low accuracy, due to slipping between the wheel and the ground surface. Thus, the electronic device 100 according to an embodiment may obtain highly accurate pose information of the vehicle 1 by using image data of a second resolution and may obtain location information of the vehicle 1 based on the pose information. - According to an embodiment, the
electronic device 100 may obtain the pose information of the vehicle from the plurality of nodes, by using the image data of the second resolution. For example, the electronic device 100 may obtain a difference value between the pose information of each pair of adjacent nodes by using the image data of the second resolution and, based on the difference values of the pose information, may obtain pose information for each node, the pose information being optimized to have the least error. - Also, the
electronic device 100 may obtain location information for a current node of the vehicle 1, based on the pose information of the current node and a previous node and the location information of the previous node. The pose information of the vehicle may include a direction in which the vehicle moves and a direction of the rotation. For example, the electronic device 100 may obtain the location information for the current node by obtaining information about a direction and a distance of the movement from the previous node to the current node, based on the pose information of at least one of the current node and the previous node. - Also, the
electronic device 100 may generate the map data by matching image data of a first resolution (for example, a low resolution) with respect to the current node to the obtained location information of the current node of the vehicle 1. - The
electronic device 100 according to an embodiment may obtain the image data from each node, through the sensing portion 110 including a radar sensor 226, a lidar sensor 227, an image sensor 228, etc. The image data of the first resolution described above may be generated by a sensor using radio waves, for example, the radar sensor 226. Also, the image data of the second resolution described above may be generated by a sensor using a laser beam or light, for example, the lidar sensor 227, the image sensor 228, etc. - For example, the
electronic device 100 may obtain the image data of the first resolution (for example, the low resolution) by using the radar sensor 226 and obtain the image data of the second resolution (for example, the high resolution) by using at least one of the lidar sensor 227 and the image sensor 228. - Thus, according to an embodiment, because the map data includes the image data of the low resolution, the current location of a moving object may be estimated accurately by using only less expensive equipment that captures low-resolution images (for example, a radar sensor), without expensive equipment that captures high-resolution images (for example, a lidar sensor).
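As an illustration of the location estimation described above, the following sketch matches a low-resolution image captured at the current location against the low-resolution images stored in the map data, optionally restricted to a candidate range of nodes. All names, the toy image representation, and the sum-of-absolute-differences metric are illustrative assumptions, not the patented method itself.

```python
# Hypothetical sketch: estimating a current location by matching a
# low-resolution image against low-resolution images stored in map data.
# Images are modeled as flat tuples of intensity values.

def image_distance(a, b):
    """Sum of absolute differences between two equally sized images."""
    return sum(abs(x - y) for x, y in zip(a, b))

def estimate_location(current_image, map_data, candidate_nodes=None):
    """Return the stored location whose image best matches current_image.

    map_data: dict mapping node id -> (location, low_res_image).
    candidate_nodes: optional iterable restricting the search to a range
    around the expected current location (e.g., from GPS or a previous fix),
    which reduces the amount of comparison work.
    """
    nodes = candidate_nodes if candidate_nodes is not None else map_data.keys()
    best = min(nodes, key=lambda n: image_distance(current_image, map_data[n][1]))
    return map_data[best][0]

map_data = {
    "n1": ((0.0, 0.0), (9, 9, 1, 1)),
    "n2": ((5.0, 0.0), (1, 1, 9, 9)),
}
print(estimate_location((1, 2, 9, 8), map_data))  # → (5.0, 0.0)
```

Restricting `candidate_nodes` to nodes near a GPS fix mirrors the range-limited search described above.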
- Also, according to an embodiment, as long as the image data of the second resolution (for example, the high resolution) described above allows a difference of the pose information between adjacent nodes to be identified, the image data of the second resolution may be used to obtain the pose information. Thus, the
electronic device 100 may obtain the location information according to an embodiment by using image data of the second resolution obtained, for example, with a first-channel lidar sensor having one beam. The electronic device 100 according to an embodiment may therefore perform the operation according to an embodiment without including an expensive lidar sensor having a plurality of beams. - Also, when the image data of the first resolution is generated by using radio waves, the speed of an object in the image may be detected by using the Doppler effect. A dynamic object having a speed is desirably excluded from the image data when generating the map data. That is, the
electronic device 100 according to an embodiment may identify a dynamic object from among the objects in an image, based on speed information, and generate or modify and refine the map data by using the image data from which the dynamic object is excluded. - For example, the
electronic device 100 may obtain speed information with respect to the image data of the first resolution. The speed information may include, for example, a speed value corresponding to each unit area of the image data. The electronic device 100 may identify the dynamic object included in the image data of the first resolution, based on the speed information with respect to the image data of the first resolution. The electronic device 100 may remove the dynamic object identified in the image data of the first resolution. Also, the electronic device 100 may remove the identified dynamic object from the image data of the second resolution corresponding to the image data of the first resolution. - Also, the
electronic device 100 may generate the map data based on the image data of the first resolution and the image data of the second resolution, from which the dynamic object is removed. Thus, the map data according to an embodiment may include the image data of the first resolution including a static object. - Also, when the
electronic device 100 modifies and refines the map data, the electronic device 100 may do so based on the image data of the first resolution and the image data of the second resolution, from which the dynamic object is removed. To modify and refine the map data easily, the electronic device 100 may use only an area of the obtained image data of the first resolution that includes the identified static object, rather than the total area thereof. - When an autonomous vehicle drives on a road, the autonomous vehicle may generate, modify, and refine map data about the surrounding environment by using various pieces of sensor information and estimate its current location on the map data. Here, the more precise the map data the vehicle contains, the more accurately the location of the vehicle may be estimated on the map data.
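The dynamic-object exclusion described in the preceding paragraphs may be sketched as follows, under the assumption that each unit area (cell) of the first-resolution radar image carries a Doppler speed value; the threshold, the `empty` fill value, and the function name are illustrative assumptions.

```python
# Hypothetical sketch of dynamic-object masking: each cell of a
# low-resolution radar image carries a Doppler speed estimate, and cells
# moving faster than a threshold are cleared before map building.

def remove_dynamic_cells(image, speeds, threshold=0.5, empty=0):
    """Return a copy of image with fast-moving (dynamic) cells blanked out.

    image, speeds: equally sized lists; speeds[i] is the measured speed of
    whatever occupies image[i].
    """
    return [empty if abs(v) > threshold else px for px, v in zip(image, speeds)]

static_only = remove_dynamic_cells([7, 8, 9, 6], [0.0, 3.2, 0.1, 0.0])
print(static_only)  # → [7, 0, 9, 6]; the cell with speed 3.2 is cleared
```

The same mask could then be applied to the corresponding second-resolution image before the map data is generated or modified and refined.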
- Also,
FIG. 1 illustrates that the electronic device 100 is included in the vehicle 1. However, it is not limited thereto. According to an embodiment, a movable device or robot (not shown) may include the electronic device 100. - Also, the
electronic device 100 according to an embodiment may generate the map data by using image data of the first resolution including a distribution chart based on information sensed at a certain location. For example, the electronic device 100 may obtain an indoor temperature or dust distribution chart, or an indoor wireless signal strength distribution chart, as the image data of the first resolution, from each node, and may obtain location information based on the image data of the second resolution. The electronic device 100 may match the location information with the image data of the first resolution obtained from each node, to generate the map data including the indoor temperature or dust distribution chart, or the indoor wireless signal strength distribution chart. -
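A distribution-chart map of the kind just described (for example, indoor wireless signal strength) could be assembled by matching each node's measurement with its location; the following sketch averages the measurements falling in each grid cell. The cell size and all names are illustrative assumptions.

```python
# Hypothetical sketch: building a distribution-chart map (e.g. indoor Wi-Fi
# signal strength in dBm) by matching per-node measurements with node
# locations, quantized into a coarse grid.

from collections import defaultdict

def build_distribution_map(samples, cell=1.0):
    """samples: iterable of ((x, y), value) pairs, one per node.

    Returns a dict mapping a grid cell to the mean of the values measured
    in it, i.e. a coarse distribution chart usable as first-resolution
    map data.
    """
    sums = defaultdict(lambda: [0.0, 0])
    for (x, y), v in samples:
        key = (int(x // cell), int(y // cell))
        sums[key][0] += v
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

chart = build_distribution_map([((0.2, 0.3), -40), ((0.8, 0.1), -50), ((2.5, 0.0), -70)])
print(chart)  # → {(0, 0): -45.0, (2, 0): -70.0}
```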
FIG. 2 shows an example path graph including a plurality of nodes according to an embodiment. - Referring to
FIG. 2, the electronic device 100 may generate a path graph as a set of at least two nodes and edges between the at least two nodes. The graph may be generated by indicating the plurality of nodes as dots and connecting the adjacent nodes via edges. For example, the path graph p20 may include an edge e21 connecting a node 21 and a node 22. - Each of the
nodes 21 and 22 may include pose information of the electronic device 100 according to an embodiment, and the edge e21 may include a difference value between the pose information of the adjacent nodes. The electronic device 100 according to an embodiment may obtain at least one of a difference value and a covariance between the pose information of the adjacent nodes, as a value of the edge e21 between the two nodes, based on the image data of the second resolution corresponding to each of the nodes 21 and 22. The covariance may indicate a degree to which the values of the pose information of the two nodes change in a correlated manner. According to an embodiment, based on at least one of the difference value and the covariance, the pose information of the node 22 may be obtained from the pose information of the node 21. - For example, the
electronic device 100 may obtain the pose information with respect to at least one node connected to an edge, based on the value of the edge. For example, the electronic device 100 may obtain, based on at least one edge value, the pose information of the at least one node at which the pose information has the least error. - For example, the
electronic device 100 may obtain at least one of the difference value and the covariance of the pose information, by comparing the image data of the node 21 with the image data of the node 22. The pose information of the node 21 may include a pre-obtained value based on the pose information of a previous node adjacent to the node 21, or a value pre-defined based on a certain condition. Thus, according to an embodiment, the electronic device 100 may obtain the pose information of the node 22, based on the pose information of the node 21 and the value of the edge e21. - Also, the
electronic device 100 may obtain location information of each node, by using the pose information of each node. For example, based on information about a moving distance and a moving direction of the electronic device 100, the information being included in the pose information, the location information of the current node may be obtained from the location information of the previous node of the electronic device 100. -
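The step of deriving the current node's location from the previous node's location plus a moving direction and distance taken from the pose information can be illustrated with a minimal planar (2-D) sketch; the function and variable names are illustrative assumptions.

```python
# Hypothetical planar sketch of the dead-reckoning step above: the current
# node's location follows from the previous node's location advanced along
# the heading by the moving distance. 2-D only, for brevity.

import math

def next_location(prev_xy, heading_rad, distance):
    """Advance from the previous node's location along the given heading."""
    x, y = prev_xy
    return (x + distance * math.cos(heading_rad),
            y + distance * math.sin(heading_rad))

loc = next_location((0.0, 0.0), math.pi / 2, 2.0)  # move 2 m heading "north"
print(round(loc[0], 6), round(loc[1], 6))  # → 0.0 2.0
```

Chaining this over consecutive nodes yields a location for every node from a single anchored starting location.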
FIG. 3 shows an example of loop-closing according to an embodiment. - According to an embodiment, the
electronic device 100 may correct the pose information of each node such that a sum of error values of the edges included in a path graph is minimized. - According to an embodiment, the
electronic device 100 may use simultaneous localization and mapping (SLAM) technologies, in which a moving vehicle or robot measures its location while simultaneously building a map of its surrounding environment. The electronic device 100 may perform loop-closing based on a relative location of two adjacent nodes, by using graph-based SLAM technologies. The electronic device 100 may generate a loop-closure edge connecting two nodes, by using a relative distance, a relative angle, etc., between the two nodes, to derive a corrected resultant value. - Referring to a
path graph 30 of FIG. 3, the electronic device 100 may move in a clockwise direction from a node 31 to a node 32. When the node 31 and the node 32 are located at the same location, the electronic device 100 may obtain optimized pose information having the least error, based on a value of at least one edge, including an edge e31, included in the path graph 30, according to the loop-closing correction method. For example, with the pre-requisite condition that the node 31 and the node 32 have the same location information, the optimized pose information of each node may be obtained. - However, it is not limited to the example described above. The
electronic device 100 may obtain the pose information of each node by using various methods of optimizing the pose information, in addition to the loop-closing correction method. - According to an embodiment, the
electronic device 100 may obtain the value of the at least one edge included in the path graph 30, by using the image data of the second resolution for each node of the path graph 30. Also, the electronic device 100 may obtain the pose information of at least one node included in the path graph 30, the pose information being optimized to have the least error, based on the value of the at least one edge. The electronic device 100 may obtain the location information of each node, based on the pose information of each node. -
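As a toy illustration of correcting node poses so that the sum of edge errors is minimized, the following sketch runs plain gradient descent on one-dimensional poses constrained by odometry edges and one loop-closure edge. A real graph-based SLAM solver optimizes full 6-degree-of-freedom poses with weighted (covariance-based) errors; all names and numbers here are illustrative assumptions.

```python
# Hypothetical 1-D illustration of graph-based pose optimization: poses are
# adjusted so the squared error of all edge constraints (including a
# loop-closure edge) is minimized. Gradient descent stands in for a solver.

def optimize(poses, edges, iters=2000, lr=0.05):
    """edges: list of (i, j, d) meaning poses[j] - poses[i] should equal d.
    poses[0] is held fixed as the anchor node."""
    poses = list(poses)
    for _ in range(iters):
        grad = [0.0] * len(poses)
        for i, j, d in edges:
            err = (poses[j] - poses[i]) - d
            grad[j] += 2 * err   # d/d poses[j] of err**2
            grad[i] -= 2 * err   # d/d poses[i] of err**2
        for k in range(1, len(poses)):   # keep node 0 fixed
            poses[k] -= lr * grad[k]
    return poses

# Odometry says each step moves +1.0, but a loop-closure edge says node 3
# sits only 2.7 away from node 0; the error is spread over all edges.
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (0, 3, 2.7)]
poses = optimize([0.0, 1.0, 2.0, 3.0], edges)
print([round(p, 3) for p in poses])
```

The least-squares solution distributes the 0.3 discrepancy evenly, pulling each node slightly back (to about 0.925, 1.85, and 2.775).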
FIG. 4 is a view for describing an example of generating map data according to an embodiment. - Referring to
FIG. 4, for example, the electronic device 100 may generate map data 40 including image data d41 corresponding to a node 41 and image data d42 corresponding to a node 42. The image data d41 and d42 may be the image data of the first resolution (low resolution) described above. According to an embodiment, the electronic device 100 may generate the map data by storing the image data of the first resolution and the location information corresponding to each node. - According to an embodiment, the map data may be realized in the form of a 3D point cloud map, a 2D grid map, a 3D voxel map, etc., based on the image data of the first resolution and the location information, but is not limited thereto. Also, according to an embodiment, the map data may be realized in various forms (for example, a feature map, a semantic map, a dense map, a texture map, etc.) according to the types of data included in the map when the map is generated.
- For example, the
electronic device 100 may generate the map data in the form of the 3D point cloud map, by using image data in a 3D point cloud form corresponding to each node, based on the location information of each node of a corrected path graph. The image data may be, for example, the image data of the first resolution. Also, the electronic device 100 may generate the map data by converting the image data in the 3D point cloud form corresponding to each node into a 3D voxel form. Also, the electronic device 100 may generate the map data in a 2D grid form by using only a point cloud corresponding to each node, or a ground surface of a road extracted from image data in an image form. -
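The conversion from per-node point clouds to a 2-D grid map can be sketched as follows, assuming the points have already been placed in world coordinates via each node's location; the cell size and all names are illustrative assumptions.

```python
# Hypothetical sketch of building a 2-D grid map from point clouds: each
# world-frame point is quantized into an occupied grid cell, ignoring height.

def point_cloud_to_grid(points, cell=0.5):
    """points: iterable of (x, y, z) world-frame points.

    Returns the set of occupied (col, row) cells - a minimal 2-D grid map.
    A 3-D voxel map would keep the z coordinate as a third index instead.
    """
    return {(int(x // cell), int(y // cell)) for x, y, _z in points}

cloud = [(0.1, 0.1, 0.0), (0.2, 0.3, 1.2), (1.6, 0.1, 0.0)]
print(sorted(point_cloud_to_grid(cloud)))  # → [(0, 0), (3, 0)]
```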
FIG. 5 is a block diagram of an electronic device according to an embodiment. - According to an embodiment, the
electronic device 100 may include the sensing portion 110, a processor 120, and a memory 130. FIG. 5 illustrates only the components of the electronic device 100 that are related to the present embodiment. Thus, it will be understood by one of ordinary skill in the art that general-purpose components other than the components illustrated in FIG. 5 may further be included. - According to an embodiment, the
sensing portion 110 may obtain a peripheral image including objects located around the vehicle 1 (FIG. 1) driving on a road. Also, the sensing portion 110 according to an embodiment may obtain the peripheral image described above as image data of different resolutions. - The
sensing portion 110 may include a plurality of sensors configured to obtain the peripheral image. For example, the sensing portion 110 may include a distance sensor, such as a lidar sensor or a radar sensor, and an image sensor, such as a camera. - According to an embodiment, the lidar sensor of the
sensing portion 110 may generate the image data of the second resolution (for example, the high resolution) described above, and the radar sensor may generate the image data of the first resolution (for example, the low resolution). - Also, the
sensing portion 110 may include one or more actuators configured to correct the locations and/or alignments of the plurality of sensors, and thus may sense an object located in each of the front, rear, and side directions of the vehicle 1. - Also, the
sensing portion 110 may sense a shape of a peripheral object and a shape of a road by using the image sensor. - According to an embodiment, the
processor 120 may include at least one processor. Also, the processor 120 may execute one or more instructions stored in the memory 130. - According to an embodiment, the
processor 120 may generate map data by using the image data of different resolutions. For example, the processor 120 may obtain location information of a plurality of nodes by using image data of a second resolution (for example, a high resolution). Also, the processor 120 may generate the map data by matching the location information of each node with image data of a first resolution (for example, a low resolution) of each node. - Also, the
processor 120 may obtain the location information of each node by using the image data of the second resolution (for example, the high resolution). For example, the processor 120 may obtain pose information of the electronic device 100 by using the image data of the second resolution (for example, the high resolution) and obtain the location information of each node by using the pose information of the electronic device 100. - Also, the
processor 120 may obtain at least one of a difference value and a covariance between pose information of a first node and a second node, and based on the obtained at least one, may obtain the location information of the second node from the location information of the first node. - Also, the
processor 120 may obtain the at least one of the difference value and the covariance between the pose information described above by comparing the image data of the second resolution with respect to the first node and the second node. The pose information described above may include 6-degree-of-freedom information of the electronic device 100. - Also, the
processor 120 may determine a range of a current location based on information about the current location obtained by various methods and may estimate the current location based on map data corresponding to at least one node included in the determined range. For example, the range of the current location may be determined based on at least one of information about a previous location of the electronic device and GPS information about the current location of the electronic device. Also, the map data corresponding to the at least one node included in the range of the current location may include at least one piece of image data of a first resolution corresponding to the at least one node. - Also, the
processor 120 may identify a dynamic object in the image data of the first resolution, based on speed information with respect to the image data of the first resolution. The processor 120 may remove the identified dynamic object from at least one of the image data of the first resolution and the image data of the second resolution. Thus, the processor 120 may generate the map data by using the image data from which the dynamic object is removed. - The
memory 130 according to an embodiment may store one or more instructions performed by the processor 120. For example, the memory 130 may store various data and programs for driving and controlling the electronic device 100 under control of the processor 120. Also, the memory 130 may store signals or data that are input/output based on operations of the sensing portion 110 and the processor 120. - The
memory 130 may store the map data generated by the processor 120 under control of the processor 120. -
FIG. 6 is a block diagram of an electronic device according to an embodiment. - The
electronic device 100 may include the sensing portion 110, the processor 120, the memory 130, an outputter 140, an inputter 150, and a communicator 160. The electronic device 100, the sensing portion 110, the processor 120, and the memory 130 illustrated in FIG. 6 may correspond to the electronic device 100, the sensing portion 110, the processor 120, and the memory 130 of FIG. 5, respectively. - The
sensing portion 110 may include a plurality of sensors configured to sense information about a surrounding environment in which the vehicle 1 (FIG. 1) is located and may include one or more actuators configured to correct the locations and/or alignments of the sensors. For example, the sensing portion 110 may include a GPS 224, an inertial measurement unit (IMU) 225, a radar sensor 226, a lidar sensor 227, an image sensor 228, and an odometry sensor 230. Also, the sensing portion 110 may include at least one of a temperature/humidity sensor 232, an infrared sensor 233, an atmospheric sensor 235, a proximity sensor 236, and an RGB illuminance sensor 237, but is not limited thereto. A function of each sensor may be intuitively inferred by one of ordinary skill in the art based on the name of the sensor, and thus, its detailed description is omitted. - Also, the
sensing portion 110 may include a motion sensing portion 238 configured to sense a motion of the vehicle 1 (FIG. 1). The motion sensing portion 238 may include a magnetic sensor 229, an acceleration sensor 231, and a gyroscope sensor 234. - The
GPS 224 may include a sensor configured to estimate a geographical location of the vehicle 1 (FIG. 1). That is, the GPS 224 may include a transceiver configured to estimate a location of the vehicle 1 (FIG. 1) on the Earth. According to an embodiment, a range of a current location of the vehicle 1 may be determined based on GPS information with respect to the current location of the vehicle 1. The current location of the vehicle 1 may be estimated based on the map data obtained based on the determined range. - The
IMU 225 may be a combination of sensors configured to sense changes of a location and an alignment of the vehicle 1 (FIG. 1) based on inertial acceleration. For example, the combination of sensors may include accelerometers and gyroscopes. - The
radar sensor 226 may include a sensor configured to sense objects in the environment in which the vehicle 1 (FIG. 1) is located, by using wireless signals. Also, the radar sensor 226 may be configured to sense a speed and/or a direction of the objects. - The
lidar sensor 227 may include a sensor configured to sense objects in the environment in which the vehicle 1 (FIG. 1) is located, by using a laser beam. In more detail, the lidar sensor 227 may include a laser light source and/or a laser scanner configured to emit a laser beam, and a sensor configured to sense reflection of the laser beam. The lidar sensor 227 may be configured to operate in a coherent (for example, using heterodyne sensing) or an incoherent sensing mode. - The
image sensor 228 may include a still camera or a video camera configured to record an environment outside the vehicle 1 (FIG. 1). For example, the image sensor 228 may include a plurality of cameras, and the plurality of cameras may be arranged at various locations inside and outside of the vehicle 1 (FIG. 1). - The
odometry sensor 230 may estimate the location of the vehicle 1 (FIG. 1) and measure a moving distance. For example, the odometry sensor 230 may measure a value of a location change of the vehicle 1 (FIG. 1) by using the number of rotations of a wheel of the vehicle 1 (FIG. 1). - Also, the location of the
electronic device 100 may be measured by using methods such as trilateration and triangulation, using sensors and communication devices such as 3G, LTE, a global navigation satellite system (GNSS), a global system for mobile communication (GSM), LORAN-C, NELS, WLAN, Bluetooth, etc. - Also, when the
electronic device 100 is in an indoor environment, the location of the electronic device 100 may be estimated by using sensors and technologies such as indoor GPS, Bluetooth, WLAN, VLC, active badges, GSM, RFID, visual tags, WIPS, ultraviolet rays, magnetic sensors, etc. - The
method of measuring the location of the electronic device 100 according to an embodiment is not limited to the examples described above. Other methods, through which location data of the electronic device 100 may be obtained, may also be used. - The
memory 130 may include a magnetic disk drive, an optical disk drive, and a flash memory. Alternatively, the memory 130 may include a portable USB data storage. The memory 130 may store system software configured to execute examples related to the disclosure. The system software configured to execute the examples related to the disclosure may be stored in a portable storage medium. - The
communicator 160 may include at least one antenna for wirelessly communicating with other devices. For example, the communicator 160 may be used to communicate wirelessly with a cellular network, or through other wireless protocols and systems such as Wi-Fi or Bluetooth. The communicator 160 controlled by the processor 120 may transmit and receive wireless signals. For example, the processor 120 may execute a program included in the memory 130 for the communicator 160 to transmit and receive wireless signals to and from the cellular network. - The
inputter 150 refers to a device for inputting data for controlling the vehicle 1 (FIG. 1). For example, the inputter 150 may include a key pad, a dome switch, a touch pad (using a capacitive method, a pressure-resistive method, an infrared sensing method, a surface ultrasonic conduction method, an integral tension measurement method, a piezo-effect method, etc.), a jog wheel, a jog switch, etc., but is not limited thereto. Also, the inputter 150 may include a microphone, which may be configured to receive audio (for example, a voice command) from a passenger of the vehicle 1 (FIG. 1). - The
outputter 140 may output an audio signal or a video signal, and an output device 280 may include a display 281 and a sound outputter 282. - The
display 281 may include at least one of a liquid crystal display, a thin-film transistor-liquid crystal display, an organic light-emitting diode display, a flexible display, a 3D display, and an electrophoretic display. Depending on the realized form of the outputter 140, the outputter 140 may include at least two displays 281. - The
sound outputter 282 may output audio data received from the communicator 160 or stored in the memory 130. Also, the sound outputter 282 may include a speaker, a buzzer, etc. - The
inputter 150 and the outputter 140 may include a network interface and may be realized as a touch screen. - The
processor 120 may execute programs stored in the memory 130 to generally control the sensing portion 110, the communicator 160, the inputter 150, the memory 130, and the outputter 140. -
FIG. 7 is a block diagram of a vehicle according to an embodiment. - According to an embodiment, the vehicle 1 may include the
electronic device 100 and a driving device 200. FIG. 7 illustrates only the components of the vehicle 1 that are related to the present embodiment. Thus, it will be understood by one of ordinary skill in the art that general-purpose components other than the components illustrated in FIG. 7 may further be included. - The
electronic device 100 may include the sensing portion 110, the processor 120, and the memory 130. - The
sensing portion 110, the processor 120, and the memory 130 are described in detail with reference to FIGS. 5 and 6, and thus, their descriptions are omitted. - The
driving device 200 may include a brake unit 221, a steering unit 222, and a throttle 223. - The
steering unit 222 may be a combination of mechanisms configured to adjust a direction of the vehicle 1. - The
throttle 223 may be a combination of mechanisms configured to control a speed of the vehicle 1 by controlling an operating speed of an engine/motor 211. Also, the throttle 223 may adjust the amount of the air-fuel mixture supplied to the engine/motor 211 by adjusting the opening amount of the throttle and may control the power and the driving force by adjusting the opening amount of the throttle. - The
brake unit 221 may be a combination of mechanisms configured to decelerate the vehicle 1. For example, the brake unit 221 may use friction to reduce a speed of a wheel/tire 214. -
FIG. 8 is a flowchart of an operating method of an electronic device according to an embodiment. - Referring to
FIG. 8, in operation 810, the electronic device 100 may obtain image data of a first resolution and image data of a second resolution for each of a plurality of nodes generated while the electronic device 100 moves. The electronic device 100 according to an embodiment may obtain the image data of the different resolutions at each of the nodes, by using different sensors. - In
operation 820, the electronic device 100 may obtain location information with respect to each node based on the image data of the second resolution. - The
electronic device 100 according to an embodiment may compare the image data of the second resolution with respect to a first node and a second node, to obtain at least one of a difference value and a covariance between the pose information of the first node and the second node as a value of an edge between the two nodes. According to an embodiment, the first node and the second node may be a previous node and a current node of the electronic device 100, respectively. The covariance may indicate a degree to which the values of the pose information of the two nodes change in a correlated manner. According to an embodiment, based on at least one of the difference value and the covariance, the pose information of the second node may be obtained from the pose information of the first node. - Also, the
electronic device 100 may obtain the pose information of the second node based on the value of the edge and the pose information of the first node. Alternatively, the electronic device 100 may obtain optimized pose information of at least one node, according to a loop-closing correction method, based on the value of the edge. The electronic device 100 may obtain the location information of the current node based on the pose information of each node. - The
electronic device 100 according to an embodiment may obtain location information having high accuracy by using image data of a high resolution, through which feature values of the image data may be distinctly compared. - In
operation 830, the electronic device 100 may match and store the location information for each node obtained in operation 820 and the image data of the first resolution. The electronic device 100 may generate the map data by matching and storing the location information for each node and the image data of the first resolution. - In
operation 840, the electronic device 100 may estimate the current location of the electronic device 100 by using the map data generated in operation 830 and image data of the first resolution. The image data of the first resolution here may be image data obtained at the current location of the electronic device 100. According to an embodiment, the electronic device 100 may estimate the current location of the electronic device 100 by comparing the image data of the first resolution obtained at the current location with the image data of the first resolution included in the map data. For example, the electronic device 100 may determine, from the image data of the first resolution included in the map data, the image data of the first resolution most closely matched to the image data of the first resolution obtained at the current location. The electronic device 100 may estimate the location information corresponding to the determined image data of the first resolution as the current location of the electronic device 100. - The device according to the embodiments described herein may include a processor, a memory for storing program data to be executed by the processor, a permanent storage such as a disk drive, a communication port for handling communications with external devices, and user interface devices, such as touch panels, keys, buttons, etc. Any methods implemented as software modules or algorithms may be stored as program instructions or computer-readable code executable by a processor on computer-readable recording media. Here, the computer-readable recording media may include magnetic storage media (for example, read-only memory (ROM), random-access memory (RAM), floppy disks, hard disks, etc.) and optical reading media (for example, CD-ROMs, digital versatile discs (DVD), etc.). The computer-readable recording medium can also be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
The media can be read by the computer, stored in the memory, and executed by the processor.
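The mapping and localization flow of operations 820-840 can be sketched as follows. This is a minimal illustrative sketch only: the `Node` structure, the feature-vector "descriptor" standing in for first-resolution image data, and the L2 nearest-match criterion are assumptions for illustration, not the patent's actual implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Node:
    """A map node: location information paired with first-resolution image data."""
    location: Tuple[float, float]   # (x, y) location obtained for this node
    descriptor: List[float]         # feature vector standing in for the image data

def build_map(samples: List[Tuple[Tuple[float, float], List[float]]]) -> List[Node]:
    """Operation 830: match and store per-node location info with image data."""
    return [Node(location=loc, descriptor=desc) for loc, desc in samples]

def estimate_location(map_data: List[Node], query: List[float]) -> Tuple[float, float]:
    """Operation 840: compare the query image data against each node's stored
    image data and return the location of the most closely matched node."""
    def distance(node: Node) -> float:
        # L2 distance between descriptors; smaller means a closer match.
        return sum((a - b) ** 2 for a, b in zip(node.descriptor, query)) ** 0.5
    best = min(map_data, key=distance)
    return best.location

# Usage: two nodes; the query descriptor most closely matches the second node,
# so its stored location is returned as the current-location estimate.
nodes = build_map([((0.0, 0.0), [1.0, 0.0]), ((5.0, 2.0), [0.0, 1.0])])
print(estimate_location(nodes, [0.1, 0.9]))  # → (5.0, 2.0)
```

In practice the matching step would use robust image descriptors rather than raw vectors, but the structure (store location-image pairs, then localize by best match) follows the description above.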
- The present embodiment may be described in terms of functional block components and various processing steps. Such functional blocks may be realized by any number of hardware and/or software components configured to perform the specified functions. For example, the embodiment may employ various integrated circuit components, e.g., memory elements, processing elements, logic elements, look-up tables, and the like, which may carry out a variety of functions under the control of one or more microprocessors or other control devices. Similarly, where the elements of the present embodiment are implemented using software programming or software elements, the embodiment may be implemented with any programming or scripting language such as C, C++, Java, assembler, or the like, with the various algorithms being implemented with any combination of data structures, objects, processes, routines, or other programming elements. Functional aspects may be implemented in algorithms that execute on one or more processors. Furthermore, the present embodiment could employ any number of conventional techniques for electronics configuration, signal processing and/or control, data processing, and the like. The words "mechanism," "element," and "component" are used broadly and are not limited to mechanical or physical embodiments. The meaning of these words can include software routines in conjunction with processors, etc.
Claims (15)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020180039342A KR102561263B1 (en) | 2018-04-04 | 2018-04-04 | Electronic apparatus and operating method for generating a map data |
KR10-2018-0039342 | 2018-04-04 | ||
PCT/KR2019/002599 WO2019194424A1 (en) | 2018-04-04 | 2019-03-06 | Electronic device for generating map data and operation method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210048312A1 true US20210048312A1 (en) | 2021-02-18 |
US11859997B2 US11859997B2 (en) | 2024-01-02 |
Family
ID=68100842
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/978,907 Active 2041-02-18 US11859997B2 (en) | 2018-04-04 | 2019-03-06 | Electronic device for generating map data and operation method thereof |
Country Status (3)
Country | Link |
---|---|
US (1) | US11859997B2 (en) |
KR (1) | KR102561263B1 (en) |
WO (1) | WO2019194424A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102297836B1 (en) * | 2019-11-07 | 2021-09-03 | 네이버랩스 주식회사 | Method for determining intensity information of point cloud data on virtual environment based on deep learning and electronic device for executing the same |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120300979A1 (en) * | 2011-05-27 | 2012-11-29 | Qualcomm Incorporated | Planar mapping and tracking for mobile devices |
US9230366B1 (en) * | 2013-12-20 | 2016-01-05 | Google Inc. | Identification of dynamic objects based on depth data |
GB2533295A (en) * | 2014-12-15 | 2016-06-22 | The Chancellor Masters And Scholars Of The Univ Of Oxford | Localising portable apparatus |
US20170307746A1 (en) * | 2016-04-22 | 2017-10-26 | Mohsen Rohani | Systems and methods for radar-based localization |
DE102017126925A1 (en) * | 2016-11-17 | 2018-05-17 | GM Global Technology Operations LLC | Automated co-pilot control for autonomous vehicles |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
NO20082337L (en) | 2008-05-22 | 2009-11-23 | Modulprodukter As | Method of producing road maps and use of the same, as well as road map system |
JP5589900B2 (en) | 2011-03-03 | 2014-09-17 | 株式会社豊田中央研究所 | Local map generation device, global map generation device, and program |
US9562778B2 (en) | 2011-06-03 | 2017-02-07 | Robert Bosch Gmbh | Combined radar and GPS localization system |
KR101700764B1 (en) * | 2011-06-29 | 2017-01-31 | 엘지전자 주식회사 | Method for Autonomous Movement and Apparatus Thereof |
JP5947666B2 (en) * | 2012-08-21 | 2016-07-06 | アジア航測株式会社 | Traveling road feature image generation method, traveling road feature image generation program, and traveling road feature image generation apparatus |
KR20150144125A (en) | 2014-06-16 | 2015-12-24 | 현대모비스 주식회사 | Safe driving guiding system and method thereof |
KR20160002178A (en) | 2014-06-30 | 2016-01-07 | 현대자동차주식회사 | Apparatus and method for self-localization of vehicle |
KR101625486B1 (en) | 2014-11-14 | 2016-05-30 | 재단법인대구경북과학기술원 | Map-based positioning system and method thereof |
KR101764222B1 (en) | 2015-12-22 | 2017-08-03 | 재단법인대구경북과학기술원 | System and method for high precise positioning |
2018
- 2018-04-04 KR KR1020180039342A patent/KR102561263B1/en active IP Right Grant
2019
- 2019-03-06 US US16/978,907 patent/US11859997B2/en active Active
- 2019-03-06 WO PCT/KR2019/002599 patent/WO2019194424A1/en active Application Filing
Non-Patent Citations (2)
Title |
---|
Badino et al.; Real-Time Topometric Localization; 2012 IEEE International Conference on Robotics and Automation RiverCentre, Saint Paul, Minnesota, USA May 14-18, 2012 (Year: 2012) * |
Vivacqua et al.; A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application; Sensors (Basel, Switzerland); vol. 17, 10 2359; Oct. 16, 2017 (Year: 2017) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210004015A1 (en) * | 2019-07-05 | 2021-01-07 | Lg Electronics Inc. | Moving robot and control method thereof |
US11674809B2 (en) | 2019-07-05 | 2023-06-13 | Lg Electronics Inc. | Moving robot and control method thereof |
US11774976B2 (en) * | 2019-07-05 | 2023-10-03 | Lg Electronics Inc. | Moving robot and control method thereof |
US11700989B2 (en) | 2019-07-11 | 2023-07-18 | Lg Electronics Inc. | Mobile robot using artificial intelligence and controlling method thereof |
US11774982B2 (en) | 2019-07-11 | 2023-10-03 | Lg Electronics Inc. | Moving robot and control method thereof |
US11189049B1 (en) * | 2020-10-16 | 2021-11-30 | Ford Global Technologies, Llc | Vehicle neural network perception and localization |
Also Published As
Publication number | Publication date |
---|---|
KR102561263B1 (en) | 2023-07-28 |
KR20190115982A (en) | 2019-10-14 |
WO2019194424A1 (en) | 2019-10-10 |
US11859997B2 (en) | 2024-01-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11859997B2 (en) | Electronic device for generating map data and operation method thereof | |
US11157001B2 (en) | Device and method for assisting with driving of vehicle | |
US10788830B2 (en) | Systems and methods for determining a vehicle position | |
US11860640B2 (en) | Signal processing device and signal processing method, program, and mobile body | |
US10539664B2 (en) | Distance sensor, and calibration method performed by device and system including the distance sensor | |
US11183056B2 (en) | Electronic device for generating map data and operating method therefor | |
US9273971B2 (en) | Apparatus and method for detecting traffic lane using wireless communication | |
CN110390240B (en) | Lane post-processing in an autonomous vehicle | |
US10549750B2 (en) | Moving body | |
CN111356902A (en) | RADAR assisted visual inertial ranging initialization | |
US11745765B2 (en) | Electronic device and method for assisting with driving of vehicle | |
US20170308093A1 (en) | Automatic driving control system of mobile object | |
JP6901386B2 (en) | Gradient Estimator, Gradient Estimator, Program and Control System | |
US20240053475A1 (en) | Method, apparatus, and system for vibration measurement for sensor bracket and movable device | |
US11514681B2 (en) | System and method to facilitate calibration of sensors in a vehicle | |
JP2016080460A (en) | Moving body | |
KR20180104496A (en) | Apparatus and method for assisting driving of a vehicle | |
US20220205804A1 (en) | Vehicle localisation | |
CN115221937A (en) | Sensor information fusion method and apparatus, and recording medium | |
US11704827B2 (en) | Electronic apparatus and method for assisting with driving of vehicle | |
KR20180124713A (en) | An electronic device and method thereof for estimating shape of road | |
JP7337617B2 (en) | Estimation device, estimation method and program | |
WO2019053986A1 (en) | Self-position estimation device, self-position estimation method, program, and mobile body device | |
CN111026107A (en) | Method and system for determining the position of a movable object | |
JP2014044093A (en) | Speed estimation device, and speed estimation method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHOI, MIDEUM;PARK, CHANGSOO;NA, INHAK;SIGNING DATES FROM 20200716 TO 20200831;REEL/FRAME:053712/0001 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |