GB2588637A - Method of determining cells in a multiple resolution grid - Google Patents


Info

Publication number
GB2588637A
GB2588637A
Authority
GB
United Kingdom
Prior art keywords
cell
defining
point
cells
multiple resolution
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1915734.6A
Other versions
GB201915734D0 (en)
Inventor
Govindachar Suresh
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mercedes Benz Group AG
Original Assignee
Daimler AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler AG filed Critical Daimler AG
Priority to GB1915734.6A
Publication of GB201915734D0
Priority to GB2001295.1A
Publication of GB2588637A
Legal status: Withdrawn

Classifications

    • G06T7/70 Determining position or orientation of objects or cameras (G Physics; G06 Computing; G06T Image data processing or generation; G06T7/00 Image analysis)
    • G06T7/11 Region-based segmentation (G06T7/10 Segmentation; Edge detection)
    • G06T7/187 Segmentation; Edge detection involving region growing; involving region merging; involving connected component labelling
    • G06F16/51 Indexing; Data structures therefor; Storage structures (G06F16/50 Information retrieval of still image data)
    • G06T2207/20016 Hierarchical, coarse-to-fine, multiscale or multiresolution image processing; Pyramid transform (G06T2207/20 Special algorithmic details)
    • G06T2207/20021 Dividing image into blocks, subimages or windows

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The application relates to applications which process data associated with some or all of the cells that tile a multi-dimensional region, e.g. sensor fusion applications that receive data from radar, laser and other sensors, as may be used in autonomous vehicles or robots. The present disclosure relates to a method of determining cells in a multiple resolution grid and comprises receiving data with a multiple resolution grid, wherein the multiple resolution grid is a combination of a plurality of cells of different sizes, wherein each cell among the plurality of cells is tiled by one or more unit cells. Further, the method comprises identifying, for each cell, a defining cell and a defining point, wherein the defining cell is a unit cell among the one or more unit cells that tile the cell and the defining point is in relation to a predefined point in the defining cell. Furthermore, the method comprises determining, by the processing unit, a cell identification number for the plurality of cells based on at least one of the defining cell, the defining point and a relative offset between the defining point of the defining cell and the defining point of the cell in the multiple resolution grid. It is then possible, given a location, to determine the cell identification number of the cell that contains that location and, conversely, given a cell identification number, to determine the location of the defining point of that cell.

Description

FORM 2 THE PATENTS ACT, 1970 (39 of 1970) The Patents Rules, 2003
COMPLETE SPECIFICATION
(See section 10 and rule 13)
TITLE OF THE INVENTION
METHOD OF DETERMINING CELLS IN A MULTIPLE RESOLUTION GRID
Name and address of the applicant: a) Name: Daimler AG b) Nationality: Germany c) Address: 70372 Stuttgart, Germany.
[0001] PREAMBLE TO THE DESCRIPTION:
[0002] The following specification particularly describes the invention and the manner in which it is to be performed:
[0003] DESCRIPTION OF THE INVENTION:
[0004] Technical field
[0005] The present disclosure relates to applications which process data that is associated with some or all of the cells that tile a multi-dimensional region; an example of such an application would be sensor fusion applications (in autonomous vehicles and robots) that receive data from radar, laser and other sensors. A tiling of a region by cells is an arrangement of the cells such that there are no overlapping cells (a "concatenation of cells") and such that the resulting arrangement covers the region; when the cells are of identical size, we have a uniform grid, and when the cells are of different sizes, we have a multiple resolution grid. More particularly, but not specifically, the present disclosure relates to a method for determining cells in a multiple resolution grid. For ease of exposition, this document bases its exposition on a two-dimensional image processing application.
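The tiling property described above can be checked mechanically. The sketch below is illustrative Python (the function name and the representation of cells as (x, y, width, height) tuples in unit-cell coordinates are not from the specification): it verifies that a set of axis-aligned cells of varying size covers a rectangular region with no overlaps.

```python
# Illustrative check that a set of cells forms a tiling of a region:
# no two cells may overlap, and together they must cover every unit square.
def tiles_region(cells, width, height):
    """cells: list of (x, y, w, h) tuples in unit-cell coordinates."""
    covered = set()
    for (x, y, w, h) in cells:
        for dx in range(w):
            for dy in range(h):
                p = (x + dx, y + dy)
                if p in covered:        # overlapping cells: not a tiling
                    return False
                covered.add(p)
    return len(covered) == width * height   # must cover the whole region

# A 4 x 4 region tiled as a multiple resolution grid: one 2x2 cell
# plus twelve 1x1 unit cells around it.
multi_res = [(0, 0, 2, 2)] + [(x, y, 1, 1)
                              for x in range(4) for y in range(4)
                              if not (x < 2 and y < 2)]
print(tiles_region(multi_res, 4, 4))  # True
```

Replacing any of the unit cells with one that overlaps the 2x2 cell would make the check fail.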
[0006] Background of the disclosure
[0007] Generally, image processing techniques are computationally intensive for images of high resolution. Therefore, the image is divided into a grid of cells and processed. The cells in the gridded image do not all contain the same amount of information, nor information of the same criticality to the application. For example, objects present at a faraway distance in an image may not be of interest. Therefore, not all regions in the image need to be processed with the same precision. Hence, there is the motivation to work with cells of varying sizes to process the image.
[0008] WO2018077641A1 discloses an image having uniform grid cells, combining a plurality of grids with uniform grid cells to obtain multiresolution grids. However, the conventional art described therein creates multiple grids, with each grid containing uniformly sized cells and with cells of different grids being of different sizes - which is in contrast to a single grid with cells of varying size, i.e., a grid having multiple resolutions, termed a "multiple resolution grid". The issue of numbering and locating cells has a rather straightforward solution in the use of multiresolution grids but exposes a need in the use of a multiple resolution grid.
[0009] The present disclosure is directed to overcome one or more limitations stated above or any other limitations associated with the prior art.
[0010] The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
[0011] SUMMARY OF THE DISCLOSURE
[0012] In an embodiment, the present disclosure relates to a method of determining cells in a multiple resolution grid. The method includes receiving an image with a multiple resolution grid, wherein the multiple resolution grid is a concatenation of cells of varying sizes, together with a unit cell which allows describing any cell of the grid as a concatenation of the unit cell. For ease of exposition, it is non-restrictively supposed that the unit cell is the largest among the cells that can provide such a description. Further, the method includes identifying, for each type of cell of the multiple resolution grid viewed as a group of concatenated unit cells, a defining unit cell and a defining point for that cell, wherein the defining unit cell is one of the unit cells within the cell and the defining point is in relation to a predefined location in the defining unit cell. For ease of exposition, at times, this document might shorten "defining unit cell" to just "defining cell". The method includes determining a cell identification number for the group of concatenated cells of a multiple resolution grid based on at least one of the defining unit cell, the defining point and a relative offset between the defining point of the defining unit cell and the defining point of the cell. Finally, the method includes translating from any location in the multiple resolution grid to the identification number of the cell the location is contained in and, conversely, translating from any cell identification number to the location of the defining point of that cell.
[0013] In an embodiment, the present disclosure discloses a processing unit comprising a processor and a memory, communicatively coupled to the processor, storing processor-executable instructions which, on execution, cause the processor to receive data with a multiple resolution grid, wherein the multiple resolution grid is a concatenation of cells of varying sizes, together with a unit cell which allows describing any cell of the grid as a concatenation of the unit cell. For ease of exposition, it is non-restrictively supposed that the unit cell is the largest among the cells that can provide such a description. Further, the processor is configured to identify, for each type of cell of the multiple resolution grid viewed as a group of concatenated unit cells, a defining unit cell and a defining point for that cell, wherein the defining unit cell is one of the unit cells within the cell and the defining point is in relation to a predefined location in the defining unit cell. The processor is also configured to determine a cell identification number for the group of concatenated cells of a multiple resolution grid based on at least one of the defining unit cell, the defining point and a relative offset between the defining point of the defining unit cell and the defining point of the cell. Finally, the processor is configured to translate from any location in the multiple resolution grid to the identification number of the cell the location is contained in and, conversely, to translate from any cell identification number to the location of the defining point of that cell.
[0014] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
[0015] BRIEF DESCRIPTION OF DRAWINGS
[0016] The novel features and characteristics of the disclosure are set forth in the appended claims. The disclosure itself, however, as well as a preferred mode of use, further objectives and advantages thereof, will best be understood by reference to the following detailed description of an illustrative embodiment when read in conjunction with the accompanying figures. One or more embodiments are now described, by way of example only, with reference to the accompanying figures, wherein like reference numerals represent like elements and in which:
[0017] Figure 1 shows an exemplary environment for determining cells in a multiple resolution grid in accordance with embodiments of the present disclosure;
[0018] Figure 2 shows an exemplary flow chart illustrating method steps for determining cells in a multiple resolution grid in accordance with an embodiment of the present disclosure.
[0019] Figure 3 shows an exemplary image represented by the pixel values in accordance with an embodiment of the present disclosure.
[0020] Figure 4A shows an exemplary multiple resolution grid with a plurality of cells in accordance with one embodiment of the present disclosure.
[0021] Figure 4B shows an exemplary uniform grid of a multiple resolution grid with a plurality of cells in accordance with one embodiment of the present disclosure.
[0022] Figure 4C shows an exemplary unit cell in accordance with one embodiment of the present disclosure.
[0023] Figure 5 shows an exemplary multiple resolution grid of the received image in accordance with one embodiment of the present disclosure.
[0024] Figure 6 shows an exemplary uniform grid of the received image in accordance with one embodiment of the present disclosure.
[0025] Figure 7 shows an exemplary multiple resolution grid in accordance with one embodiment of the present disclosure.
[0026] Figure 8 shows an exemplary uniform grid in accordance with one embodiment of the present disclosure.
[0027] Figure 9A shows an exemplary corner technique to identify the defining cell for a cell among the plurality of cells in accordance with one embodiment of the present disclosure.
[0028] Figure 9B shows an exemplary center technique to identify the defining cell for a cell among the plurality of cells in accordance with one embodiment of the present disclosure.
[0029] Figure 9C shows an exemplary centre technique to identify the defining point for a cell in accordance with one embodiment of the present disclosure.
[0030] Figure 9D shows an exemplary corner technique to identify the defining point for a cell in accordance with one embodiment of the present disclosure.
[0031] Figure 10 shows an exemplary determination of the cell identification number in accordance with one embodiment of the present disclosure.
[0032] It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
[0033] DETAILED DESCRIPTION
[0034] In the present document, the word "exemplary" is used herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
[0035] While the disclosure is susceptible to various modifications and alternative forms, a specific embodiment thereof has been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the particular forms disclosed; on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
[0036] The terms "comprises", "includes", "comprising", "including", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by "comprises... a" or "including... a" does not, without more constraints, preclude the existence of other elements or additional elements in the system or apparatus.
[0037] Embodiments of the present disclosure relate to a method for determining cells in a multiple resolution grid. The method comprises receiving an image (101) with a multiple resolution grid, where the multiple resolution grid is a combination of a plurality of cells of different sizes, where each cell among the plurality of cells is tiled by one or more unit cells. Further, the method comprises identifying, for each cell, a defining cell and a defining point, where the defining cell is a unit cell among the one or more unit cells and the defining point is in relation to a predefined point in the defining cell. Finally, the method comprises determining a cell identification number for the plurality of cells based on at least one of the defining cell, the defining point and a relative offset between the defining point of the defining cell and the defining point of the cell in the multiple resolution grid.
Thereby, given a spatial location in the multiple resolution grid, the cell identification number of the cell that the location is in is determined and, conversely, given the cell identification number, the spatial location of the defining point of the cell is determined.
[0038] Figure 1 is indicative of an exemplary environment for determining cells in a multiple resolution grid in accordance with one embodiment of the present disclosure. The environment includes an image processing unit (102) used for receiving an image (101) from the one or more sensors for determining the cells in a multiple resolution grid. For example, applications such as robotics and autonomous vehicles include one or more sensors to provide data about the surroundings of an ego-object (for example a robot, an automobile and the like). The sensing mechanism of the one or more sensors (for example a camera or LIDAR) involves rays (for example, rays of visible light impinging on a camera's sensor and rays of laser beams emitted by the sensor). The rays diverge as the distance from the ego-object increases. Therefore, the rays are densely spaced closer to the ego-object and sparsely spaced as the distance from the ego-object increases.
[0039] The image (101) is represented as a matrix of dimension (m x n), where 'm' indicates the number of rows in the matrix and 'n' indicates the number of columns in the matrix. Every value in the matrix of (m x n) represents a pixel value of the image (101). As shown in Figure 3, a part of the image (101) is represented by the corresponding pixel values. If the image (101) has a resolution of 1080 x 1020, then the matrix representing the image (101) will have the dimension 1080 x 1020 (m x n) and a total of 1,101,600 pixel values in the matrix. Further, the image (101) received from the one or more sensors is divided into a grid comprising a plurality of cells, where the cells in the vicinity of the ego-object are of a small size and the cells have a larger size as the distance from the ego-object increases. A cell is a pixel or a group of pixels in the image (101). For example, a small cell may consist of 4 pixels and an exemplary large cell may consist of 16 pixels.
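The matrix view described above can be illustrated with a short sketch (the helper name and the toy 4 x 4 image are invented for the example and are not from the specification): an m x n pixel matrix is cut into square cells of a chosen edge length.

```python
# Illustrative: cut an m x n pixel matrix into square cells of a given size.
def split_into_cells(image, cell):
    """image: list of m rows of n pixel values; cell: cell edge in pixels."""
    m, n = len(image), len(image[0])
    cells = []
    for top in range(0, m, cell):
        for left in range(0, n, cell):
            cells.append([row[left:left + cell]
                          for row in image[top:top + cell]])
    return cells

image = [[r * 4 + c for c in range(4)] for r in range(4)]  # toy 4 x 4 "image"
small = split_into_cells(image, 2)   # four cells of 4 pixels each
print(len(small))   # 4
print(small[0])     # [[0, 1], [4, 5]]
```

With a cell edge of 4, the same helper would return the whole 4 x 4 image as a single 16-pixel cell.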
[0040] So, in the received image (101), the size of the cells increases as the distance from the ego-object increases. The received image (101) may be divided into a plurality of cells of different sizes based on the distance from the ego-object, thereby forming a multiple resolution grid. As shown in Figure 5, an image (101) is divided into a multiple resolution grid based on the distance from the ego-object (indicated by the black circle placed at the center of the grid). Each cell among the plurality of cells in the grid, comprising a pixel or a group of pixels, is denoted by a square box as shown in Figure 5. Each cell in the multiple resolution grid can be viewed as being tiled by a unit cell, and the unit cell of the multiple resolution grid is the largest cell that can tile each of the plurality of cells of the multiple resolution grid, and may be of any dimension (m x n). For example, the Figure 5 of the image (101) shows the cells in the vicinity of the ego-object as being tiled by a single unit cell, and the cells in the regions farther away from the ego-object as being a 2x2 tiling of the unit cell.
[0041] In an embodiment, the image processing unit (102) divides the image (101) so that it is tiled by the unit cell. As shown in Figure 6, an exemplary uniform grid is generated by tiling or concatenating the unit cell.
[0042] In an embodiment, the dimension of the unit cell may be set to a predefined value, for example (2 x 3). As shown in Figure 4A, the multiple resolution grid includes cells of varying size. Figure 4B shows the underlying uniform grid corresponding to the multiple resolution grid of the Figure 4A. Further, the Figure 4C shows the unit cell used for tiling the uniform grid of the Figure 4B and the multiple resolution grid of Figure 4A. The uniform grid as shown in Figure 4B corresponding to the multiple resolution grid as shown in Figure 4A comprises 6 unit cells. The multiple resolution grid, as shown in Figure 4A, comprises 3 cells, where the cell 0 and the cell 1 are each tiled by a 2 x 1 tiling of the unit cell and the cell 2 is tiled by a 1 x 2 tiling of the unit cell.
[0043] In an embodiment, the distance from the ego-object at which to concatenate the group of unit cells is set to a predefined value, for example 80 m.
[0044] The multiple resolution grid is numbered by the image processing unit (102) where the cell identification number is assigned to each cell among the plurality of cells in the multiple resolution grid. The cell identification number for each cell of the multiple resolution grid is determined based on at least one of the defining cell of the tile of unit cells that constitute the cell, the defining point of the cell and a relative offset between the defining point of the cell and the defining point of the defining unit cell of the tile of unit cells that constitute the cell.
[0045] The multiple resolution grid with cell identification numbers (103) may be used by other processors to further convert the cell identification number to the corresponding spatial location and vice versa.
[0046] In an embodiment, the multiple resolution grid with cell identification numbers (103) may be used to determine a spatial location of the defining point of the cell for a given cell identification number, based on the uniform grid underlying the multiple resolution grid and the relative offset from the defining point of the defining cell to the defining point of the cell. Furthermore, given a spatial location in the multiple resolution grid, the corresponding cell identification number is identified based on the underlying uniform grid and the defining cell.
[0047] Figure 2 shows an exemplary flow chart illustrating method steps in accordance with some embodiments of the present disclosure.
[0048] As illustrated in Figure 2, the method 200 may comprise one or more steps in accordance with some embodiments of the present disclosure. The method 200 may be described in the general context of computer executable instructions.
Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions or implement particular abstract data types.
[0049] The order in which the method 200 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
[0050] As shown in Figure 2, at the step 201, the image processing unit (102) receives an image (101) with a multiple resolution grid. The multiple resolution grid is a combination of a plurality of cells of different sizes, with the unit cell being the largest cell that can tile any of the cells of the multiple resolution grid. As shown in Figure 7, a multiple resolution grid is received with the ego-object at the centre of the image (101), denoted by a black circle. In the vicinity of the ego-object the multiple resolution grid has the unit cells in the image (101). As shown in Figure 7, the vicinity is considered to be a region of 8 unit cells x 8 unit cells around the ego-object; we will use the symbol "near" to denote this 8; and the entire region of interest is 16 unit cells x 16 unit cells around the ego-object; we will use the symbol "far" to denote this 16. As the distance from the ego-object increases, a group of 4 unit cells (2 unit cells in the x-direction and 2 unit cells in the y-direction) is concatenated in the multiple resolution grid of the image (101) as shown in Figure 7.
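Under the stated values near = 8 and far = 16, the layout of Figure 7 can be sketched as follows (illustrative Python; the representation of cells as (i, j, size) triples in unit-cell coordinates is an assumption, not from the specification):

```python
# Sketch of the Figure 7 layout: unit cells inside the central near x near
# window, 2 x 2 blocks of unit cells outside it.
def build_cells(near=8, far=16):
    m = (far - near) // 2            # width of the large-cell margin
    cells, seen = [], set()
    for j in range(far):
        for i in range(far):
            if m <= i < m + near and m <= j < m + near:
                cells.append((i, j, 1))        # near window: a unit cell
            else:
                ii, jj = i & ~1, j & ~1        # lower-left of the 2x2 block
                if (ii, jj) not in seen:
                    seen.add((ii, jj))
                    cells.append((ii, jj, 2))  # one large cell per block
    return cells

cells = build_cells()
print(len(cells))  # 112: 64 unit cells plus 48 concatenated 2x2 cells
```

The count checks out: the near window holds 8 x 8 = 64 unit cells, and the remaining 256 - 64 = 192 unit cells group into 48 blocks of 4.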
[0051] In an embodiment, given the multiple resolution grid, the uniform grid of the image (101) is generated by identifying the unit cell and dividing the image (101) into integer multiples of the unit cell as shown in Figure 8. The dimension of the unit cell in the uniform grid may be set to a predefined value.
[0052] Referring back to Figure 2, at the step 202, the image processing unit (102) identifies, for each cell among the plurality of cells, a defining cell among the one or more unit cells that tile the cell and a defining point for the cell from a predefined point in the defining cell. Further, identifying the defining cell for each of the cells in the grid comprises at least one of a corner technique and a centre technique. Furthermore, identifying the defining point for each of the cells in the multiple resolution grid comprises at least one of a corner technique and a centre technique.
[0053] As shown in Figure 7, for the multiple resolution grid in the vicinity of the ego-object the grid has unit cells, therefore the unit cells themselves are the defining cells. Further, in the multiple resolution grid, as the distance from the ego-object increases the cells are larger and are a group of 4 unit cells; therefore, using the corner technique, any one unit cell among the four unit cells is identified as the defining cell for the larger cells. For example, a lower left unit cell is identified as the defining cell for each of the cells formed by concatenating the 4 unit cells as shown in Figure 9A. In an embodiment, given a multiple resolution grid, using the centre technique the unit cell in the middle of a large cell tiled by the unit cells is chosen as the defining unit cell of the large 3x3 cell as shown in Figure 9B.
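The two techniques for choosing a defining unit cell can be sketched with a pair of illustrative helpers (the names and the parameter k, the block size in unit cells, are not from the specification):

```python
# Illustrative helpers for picking a defining unit cell within a k x k block.
def defining_cell_corner(i, j, k=2):
    """Lower-left unit cell of the k x k block containing unit cell (i, j)."""
    return (i - i % k, j - j % k)

def defining_cell_centre(i, j, k=3):
    """Middle unit cell of the k x k block (k odd) containing (i, j)."""
    return (i - i % k + k // 2, j - j % k + k // 2)

print(defining_cell_corner(5, 4))   # (4, 4): lower-left of the 2x2 block
print(defining_cell_centre(5, 4))   # (4, 4): middle of the 3x3 block
```

For the 2 x 2 blocks of Figure 7, the corner helper with k = 2 reproduces the lower-left choice of Figure 9A; the centre helper with k = 3 matches the 3x3 case of Figure 9B.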
[0054] The defining point in each cell of the multiple resolution grid using the centre technique is chosen as the centre of the cell as shown in Figure 9C. Further, as shown in Figure 9D, the defining point in each cell using the corner technique is chosen as one of the four corners of the cell.
[0055] Referring back to Figure 2, at the step 203, the image processing unit (102) determines the cell identification number for each cell of the multiple resolution grid. The cell identification number is determined based on at least one of the defining cell, the defining point and the relative offset between the defining point of the defining cell and the defining point of the cell in a multiple resolution grid. Further, determining the cell identification number comprises assigning a cell identification number in an increasing order along at least one of the x-axis, y-axis and z-axis. In an embodiment, the relative offset is computed as the difference between the spatial location of the defining point of the defining cell and the defining point of the cell. The relative offset is the difference in the spatial distance along the coordinate axes, e.g., along the x-axis, y-axis and z-axis, between the defining point of the defining cell and the defining point of the cell.
[0056] As shown in Figure 10, based on the corner technique that results in identifying the lower left unit cell among the group of 4 unit cells in a multiple resolution grid as the defining cell, the cell identification number for each cell in the multiple resolution grid is determined. For example, consider the cell numbered "0" as shown in Figure 10: this cell is tiled by 4 unit cells (0, 1, 16, 17) from the underlying uniform grid. Based on the corner technique, the lower left unit cell with cell identification number "0" is used as the cell identification number of the cell as shown in Figure 10.
[0057] In an embodiment, for a given cell identification number a spatial location of the defining point of the cell is determined using at least one of a corner technique or the centre technique.
[0058] For example, let "S" in metres denote the length and breadth of the multiple resolution grid of Figure 7. Further, let "S" be 1.6 m for the multiple resolution grid in Figure 7. The dimension of the uniform grid underlying the multiple resolution grid as shown in Figure 8 is (16 x 16), therefore m = 16 and n = 16. Let "i" and "j" denote the column and the row number of the cell identification number in the underlying uniform grid. The rows and the columns in the grid are numbered from zero to (m-1) and (n-1) respectively, i.e. from 0 to 15. Figure 10 shows the cells numbered using the lower left unit cell as the defining cell. In Figure 10, for example, if the cell identification number is "66", the corresponding column number is "i = 2" and the row number is "j = 4". Further, let "x" and "y" denote the spatial location in the multiple resolution grid corresponding to the cell identification number. Next, let the centre of the cell be the defining point. We now see that the relative offset is 0.5 (0.05 m) in each direction x and y, as follows: the spatial dimension of the unit cell is 1 x 1 (0.1 m x 0.1 m), therefore the distance covered from the centre of the defining cell to the centre of the cell along the x-direction or y-direction is 0.5 (0.05 m), therefore the relative offset is 0.5 in each direction x and y. Given a cell identification number, the corresponding column number "i" and the row number "j" are determined and, using the corner technique for cell identification and the centre technique for the defining point, the spatial location in the multiple resolution grid is determined using the equations given below: defining point of cell = defining point of defining cell + vector relative offset (x, y) = ((i + 0.5, j + 0.5) + vector relative offset) * (S/N) x = ((i + 0.5) + relative offset) * (S/N) (1) y = ((j + 0.5) + relative offset) * (S/N) (2) Therefore, for the cell identification number "66" the spatial location is "x = 0.3 m" and "y = 0.5 m".
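This worked example can be reproduced in a short sketch (illustrative Python; S, N, near and far take the example's values, and the zero relative offset for unit cells inside the near window is an assumption, since the worked example only covers the large-cell case):

```python
# Sketch of the id-to-location conversion: corner technique for the defining
# cell, centre technique for the defining point.
def id_to_location(cell_id, S=1.6, N=16, near=8, far=16):
    i, j = cell_id % far, cell_id // far      # column, row in the uniform grid
    m = (far - near) // 2
    large = i < m or i >= m + near or j < m or j >= m + near
    offset = 0.5 if large else 0.0            # assumed 0 inside the near window
    return ((i + 0.5 + offset) * S / N, (j + 0.5 + offset) * S / N)

x, y = id_to_location(66)
print(round(x, 3), round(y, 3))  # 0.3 0.5
```

Cell 66 decodes to column i = 2, row j = 4; since i < 4 it lies in a large cell, so the 0.5 offset applies and the centre comes out at (0.3 m, 0.5 m), matching the specification's figures.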
[0059] Further, let the defining cell be identified using a corner technique and the defining point be identified using the corner technique as well; the spatial location in the multiple resolution grid is then determined using the equations given below: x = i * (S/N) (3) y = j * (S/N) (4) Therefore, under such a scheme, for the cell identification number "66" the spatial location is "x = 0.2 m" and "y = 0.4 m".
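The corner/corner variant of equations (3) and (4) admits an equally short sketch (illustrative Python, under the same S = 1.6 m and N = 16 values as the example):

```python
# Sketch of equations (3) and (4): defining cell and defining point both by
# the corner technique, so no relative offset is needed.
def id_to_corner(cell_id, S=1.6, N=16, far=16):
    i, j = cell_id % far, cell_id // far   # column, row in the uniform grid
    return (i * S / N, j * S / N)          # x = i*(S/N), y = j*(S/N)

x, y = id_to_corner(66)
print(round(x, 3), round(y, 3))  # 0.2 0.4
```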
[0060] In an embodiment, for a given spatial location in the multiple resolution grid the corresponding cell identification number in the multiple resolution grid is identified based on the underlying uniform grid and the defining cell using at least one of the corner technique or the centre technique.
[0061] Let the defining cell be identified using the corner technique (for example, the lower left unit cell of a cell in the multiple resolution grid) and the defining point be identified using the centre technique. For a given spatial location, first the column and the row number of the cell identification number in the underlying uniform grid are determined using the equations given below: i = floor(x * (N/S)) (5); j = floor(y * (N/S)) (6). [0062] Let the spatial location be "x = 0.15" and "y = 0.25"; the column number "i = 1" and the row number "j = 2" are then determined. Next, the lower left unit cell of the containing cell is determined by the equations below, where "far" denotes the dimension of the underlying uniform grid and "near" the dimension of its central fine region: ii = i; jj = j; m = (far - near)/2; if ((i < m) || (i >= (m + near)) || (j < m) || (j >= (m + near))) { ii = i & 0xFFFFFFFE; jj = j & 0xFFFFFFFE }; cell identification number = (jj * far) + ii. [0063] Using the above equations, the cell identification number corresponding to the location "x = 0.15" and "y = 0.25" comes out to be 32.
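The containing-cell computation of paragraphs [0061]-[0063] can be sketched as runnable code. The values far = 16 and near = 8 are assumptions consistent with the worked example (the grid of Figure 10 with a central 8x8 fine region); the bit mask 0xFFFFFFFE clears the low bit of an index, which lands on the lower-left unit cell of a 2x2 cell:

```python
import math

def location_to_cell_id(x, y, S=1.6, far=16, near=8):
    """Corner technique for the defining cell, centre technique for the
    defining point. far: dimension of the underlying uniform grid;
    near: dimension of its central fine region (assumed values)."""
    i = math.floor(x * (far / S))  # column in the uniform grid
    j = math.floor(y * (far / S))  # row in the uniform grid
    ii, jj = i, j
    m = (far - near) // 2
    # Outside the central near x near block the cells are 2x2 unit
    # cells, so clear the low bit to reach the defining unit cell.
    if i < m or i >= m + near or j < m or j >= m + near:
        ii = i & 0xFFFFFFFE
        jj = j & 0xFFFFFFFE
    return jj * far + ii

print(location_to_cell_id(0.15, 0.25))  # → 32
```

For x = 0.15, y = 0.25 this gives i = 1, j = 2; both indices lie outside the central region (m = 4), so ii = 0, jj = 2 and the cell identification number is 2 * 16 + 0 = 32, matching paragraph [0063].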
[0064] Further, let the defining point be identified using the centre technique and the defining cell be identified using the corner technique. For a given spatial location, the column and the row number of the cell identification number in a uniform grid are determined using the equations given below: i = int(x * (N/S) + 0.5) (7); j = int(y * (N/S) + 0.5) (8). [0065] In an embodiment, the method performs determining the cell identification number for a spatial location in the image (101) overlaid with the multiple resolution grid, and also performs determining the spatial location for the cell identification number within a multiple resolution grid over the image (101).
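The rounding variant of paragraph [0064] can be sketched as follows; the function name is illustrative, and the defaults again assume the 1.6 m, 16-cell grid of the running example:

```python
def location_to_indices_rounded(x, y, S=1.6, N=16):
    """Nearest-index variant of paragraph [0064]:
    i = int(x * (N/S) + 0.5), j = int(y * (N/S) + 0.5)."""
    i = int(x * (N / S) + 0.5)  # column index after rounding
    j = int(y * (N / S) + 0.5)  # row index after rounding
    return i, j

print(location_to_indices_rounded(0.15, 0.25))  # → (2, 3)
```

Unlike the floor of equations (5) and (6), adding 0.5 before truncation rounds to the nearest index, so the same location x = 0.15, y = 0.25 maps to indices (2, 3) rather than (1, 2).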
[0066] The method may determine the cell identification number along the x-axis, y-axis and z-axis. Further, the centre and corner techniques used to determine the cell identification number are computationally efficient.
[0067] The described operations may be implemented as a method, system or article of manufacture using standard programming and/or engineering techniques to produce software, firmware, hardware, or any combination thereof. The described operations may be implemented as code maintained in a "non-transitory computer readable medium", where a processor may read and execute the code from the computer readable medium. The processor is at least one of a microprocessor, an FPGA, an ASIC processor, etc. capable of processing and executing the queries. A non-transitory computer readable medium may comprise media such as magnetic storage media (e.g., hard disk drives, floppy disks, tape, etc.), optical storage (CD-ROMs, DVDs, optical disks, etc.), volatile and non-volatile memory devices (e.g., EEPROMs, ROMs, PROMs, RAMs, DRAMs, SRAMs, Flash memory, firmware, programmable logic, etc.), etc. Further, non-transitory computer-readable media comprise all computer-readable media except for transitory signals. The code implementing the described operations may further be implemented in hardware logic (e.g., an integrated circuit chip, Programmable Gate Array (PGA), Application Specific Integrated Circuit (ASIC), etc.).
[0068] Still further, the code implementing the described operations may be implemented in "transmission signals", where transmission signals may propagate through space or through a transmission medium, such as an optical fiber, copper wire, etc. The transmission signals in which the code or logic is encoded may further comprise a wireless signal, satellite transmission, radio waves, infrared signals, Bluetooth, etc. The transmission signals in which the code or logic is encoded are capable of being transmitted by a transmitting station and received by a receiving station, where the code or logic encoded in the transmission signal may be decoded and stored in hardware or a non-transitory computer readable medium at the receiving and transmitting stations or devices. An "article of manufacture" comprises a non-transitory computer readable medium, hardware logic, and/or transmission signals in which code may be implemented. A device in which the code implementing the described embodiments of operations is encoded may comprise a computer readable medium or hardware logic. Of course, those skilled in the art will recognize that many modifications may be made to this configuration without departing from the scope of the invention, and that the article of manufacture may comprise any suitable information-bearing medium known in the art.
[0069] The terms "an embodiment", "embodiment", "embodiments", "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
[0070] The terms "including", "comprising", "having" and variations thereof mean "including but not limited to", unless expressly specified otherwise.
[0071] The enumerated listing of items does not imply that any or all of the items are mutually exclusive, unless expressly specified otherwise.
[0072] The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise. A description of an embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
[0073] When a single device or article is described herein, it will be readily apparent that more than one device/article (whether or not they cooperate) may be used in place of a single device/article. Similarly, where more than one device or article is described herein (whether or not they cooperate), it will be readily apparent that a single device/article may be used in place of the more than one device or article or a different number of devices/articles may be used instead of the shown number of devices or programs. The functionality and/or the features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
[0074] The illustrated operations of Figure 2 show certain events occurring in a certain order. In alternative embodiments, certain operations may be performed in a different order, modified or removed. Moreover, steps may be added to the above described logic and still conform to the described embodiments. Further, operations described herein may occur sequentially or certain operations may be processed in parallel. Yet further, operations may be performed by a single processing unit or by distributed processing units.
[0075] Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments of the invention is intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
[0076] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true scope being indicated by the following claims.

Claims (10)

1. A method of determining cells in a multiple resolution grid, the method comprising: receiving, by a processing unit (102), data (101) in a multiple resolution grid, wherein the multiple resolution grid is a combination of a plurality of cells of different sizes, wherein each cell among the plurality of cells is tiled by one or more unit cells; identifying, by the processing unit (102), for each cell, a defining cell, and a defining point, wherein the defining cell is a unit cell among the one or more unit cells that tile the cell and the defining point is in relation to a predefined point in the defining cell; and determining, by the processing unit (102), a cell identification number for the plurality of cells based on at least one of the defining cell, the defining point and a relative offset between the defining point of the defining cell and a defining point of the cell in the multiple resolution grid.
2. The method as claimed in claim 1, wherein identifying the defining cell comprises at least one of a corner technique and a centre technique.
3. The method as claimed in claim 1, wherein identifying the defining point for the defining cell comprises at least one of a corner technique and a centre technique.
4. The method as claimed in claim 1, wherein, given a location, determining the cell identification number of the cell containing that location comprises translating the spatial location in the cell to the defining point of the cell, and further to the defining point of the defining cell and hence to the cell identification number of the defining cell in an underlying uniform grid.
5. The method as claimed in claim 1, wherein, for a given cell identification number, a spatial location of the defining point of the cell is determined based on the cell identification number of the underlying uniform grid, thereby determining the spatial location of the defining point of the defining cell and hence the location of the cell, from the relative position between the defining point of the cell and the defining point of the defining cell.
6. A processing unit (102) comprising: a processor; and a memory, communicatively coupled to the processor, storing processor executable instructions, which, on execution, cause the processor to: receive data (101) with a multiple resolution grid, wherein the multiple resolution grid is a combination of a plurality of cells of different sizes, wherein each cell among the plurality of cells is tiled by one or more unit cells; identify, for each cell, a defining cell, and a defining point, wherein the defining cell is a unit cell among the one or more unit cells that tile the cell and the defining point is in relation to a predefined point in the defining cell; and determine a cell identification number for the plurality of cells based on at least one of the defining cell, the defining point and a relative offset between the defining point of the defining cell and a defining point of the cell in the multiple resolution grid.
7. The processing unit (102) as claimed in claim 6, wherein the processor is configured to identify the defining cell using at least one of the corner technique and the centre technique.
8. The processing unit (102) as claimed in claim 6, wherein the processor is configured to determine the cell identification number of the cell that contains a given location by translating the spatial location in the cell to the defining point of the cell and further to the defining point of the defining cell and hence to the cell identification number of the defining cell in an underlying uniform grid.
9. The processing unit (102) as claimed in claim 6, wherein the processor is configured to determine, for a given cell identification number, a spatial location of the defining point of the cell based on the cell identification number of the underlying uniform grid, thereby determining the spatial location of the defining point of the defining cell and hence the location of the cell, from the relative position between the defining point of the cell and the defining point of the defining cell.
10. A method of determining cells in a multiple resolution grid, the method comprising: receiving, by a processing unit (102), data (101) with a multiple resolution grid, wherein the multiple resolution grid is a combination of a plurality of cells of different sizes, wherein each cell among the plurality of cells is tiled by one or more unit cells; identifying, by the processing unit (102), for each cell, a defining cell, and a defining point, wherein the defining cell is a unit cell among the one or more unit cells that tile the cell and the defining point is in relation to a predefined point in the defining cell; and determining, by the processing unit (102), a cell identification number for the plurality of cells based on at least one of the defining cell, the defining point and a relative offset between the defining point of the defining cell and a defining point of the cell in the multiple resolution grid, wherein determining the cell identification number of the cell containing a given location comprises translating the spatial location in the cell to the defining point of the cell, and further to the defining point of the defining cell and hence to the cell identification number of the defining cell in an underlying uniform grid, wherein, given the cell identification number, a spatial location of the defining point of the cell is determined based on the cell identification number of the underlying uniform grid, thereby determining the spatial location of the defining point of the defining cell and hence the location of the cell, from the relative position between the defining point of the cell and the defining point of the defining cell.
GB1915734.6A 2019-10-30 2019-10-30 Method of determining cells in a multiple resolution grid Withdrawn GB2588637A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1915734.6A GB2588637A (en) 2019-10-30 2019-10-30 Method of determining cells in a multiple resolution grid
GB2001295.1A GB2588696A (en) 2019-10-30 2020-01-30 Method of determining cells in a multiple resolution grid

Publications (2)

Publication Number Publication Date
GB201915734D0 GB201915734D0 (en) 2019-12-11
GB2588637A true GB2588637A (en) 2021-05-05

Country Status (1)

Country Link
GB (2) GB2588637A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018077641A1 (en) 2016-10-28 2018-05-03 Valeo Schalter Und Sensoren Gmbh Determining a trajectory with a multi-resolution grid
US10410531B2 (en) * 2014-11-05 2019-09-10 Sierra Nevada Corporation Systems and methods for generating improved environmental displays for vehicles

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011076197A (en) * 2009-09-29 2011-04-14 Seiko Epson Corp Apparatus and method for processing image, and computer program
EP3343445A1 (en) * 2016-12-28 2018-07-04 Thomson Licensing Method and apparatus for encoding and decoding lists of pixels
CN108830929A (en) * 2018-05-21 2018-11-16 东南大学 Multi-resolution Terrain pyramid model generation method and system based on database

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10410531B2 (en) * 2014-11-05 2019-09-10 Sierra Nevada Corporation Systems and methods for generating improved environmental displays for vehicles
WO2018077641A1 (en) 2016-10-28 2018-05-03 Valeo Schalter Und Sensoren Gmbh Determining a trajectory with a multi-resolution grid

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
GANESAN SASHIKUMAAR ET AL: "An Object Oriented Parallel Finite Element Scheme for Computations of PDEs: Design and Implementation", 2016 IEEE 23RD INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING WORKSHOPS (HIPCW), IEEE, 19 December 2016 (2016-12-19), pages 106 - 115, XP033055992, DOI: 10.1109/HIPCW.2016.023 *
HARTMANN D ET AL: "An adaptive multilevel multigrid formulation for Cartesian hierarchical grid methods", COMPUTERS AND FLUIDS, PERGAMON PRESS, NEW YORK, NY, GB, vol. 37, no. 9, 1 October 2008 (2008-10-01), pages 1103 - 1125, XP023610565, ISSN: 0045-7930, [retrieved on 20071215], DOI: 10.1016/J.COMPFLUID.2007.06.007 *



Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)