JP4877105B2 - Vehicle 3D shape model data creation method - Google Patents

Vehicle 3D shape model data creation method

Info

Publication number
JP4877105B2
Authority
JP
Japan
Prior art keywords
coordinate value
value data
dimensional
vehicle
outer surface
Legal status
Active
Application number
JP2007176439A
Other languages
Japanese (ja)
Other versions
JP2009014500A (en)
Inventor
武 倉田
俊介 山本
朋幸 橋本
哲治 長尾
Original Assignee
マツダ株式会社 (Mazda Motor Corporation)
Priority date: 2007-07-04
Filing date: 2007-07-04
Publication date: 2009-01-22 (JP2009014500A); 2012-02-15 (JP4877105B2)
Application filed by マツダ株式会社 (Mazda Motor Corporation)
Priority to JP2007176439A
Publication of JP2009014500A
Application granted
Publication of JP4877105B2


Description

  The present invention relates to a method for creating three-dimensional shape model data of a vehicle composed of a plurality of parts.

  To carry out various studies of a product, it is desirable to efficiently obtain three-dimensional shape model data of the entire product, including the parts that constitute it.

  For example, to perform vehicle CAE analysis (collision, NVH, body strength and rigidity, units, aerodynamics, and so on) and shape benchmarking, three-dimensional shape model data is needed not only for the vehicle exterior but also for the parts that make up the vehicle. A method for obtaining such three-dimensional shape model data efficiently is therefore required.

  A known method of creating three-dimensional shape model data of a vehicle is to affix target marks to the measurement object, photograph it from multiple directions with a non-contact optical three-dimensional digitizer, calculate three-dimensional coordinate value data of the target marks from the images obtained from each direction, and create three-dimensional shape model data from that coordinate value data. A technique for obtaining the three-dimensional coordinate value data of target marks affixed to a measurement object is described in Patent Document 1, for example.

Patent Document 1: JP 2002-2566 A

  According to the conventional method, three-dimensional shape model data can be created individually for units such as the vehicle exterior alone or a single component part. However, these sets of three-dimensional shape model data are independent of one another, so they do not capture the relationship of each part to the vehicle as a whole, such as where each part is located in the complete vehicle, or the relationships between parts. Even if three-dimensional shape model data of the vehicle exterior and of individual parts are created separately, a highly reliable analysis is not possible unless there is something to correlate those data sets.

  It is an object of the present invention to provide a method for creating, quickly and accurately, three-dimensional shape model data in which the vehicle outer shape and each component part are associated with one another.

A method for creating three-dimensional shape model data of a vehicle according to one aspect of the present invention includes the following steps.
(A1) A plurality of target marks are affixed to the outer surface of the vehicle.
(A2) The outer surface of the vehicle bearing the plurality of target marks affixed in step A1 is imaged from a plurality of positions by a first imaging means, and whole three-dimensional coordinate value data of the plurality of target marks is acquired by image processing.
(A3) The outer surface shape of the vehicle is imaged from a plurality of positions by a second imaging means, and whole three-dimensional shape model data composed of three-dimensional coordinate values of the vehicle outer surface associated with the whole three-dimensional coordinate value data is created by image processing.
(B1) A lid of the vehicle is removed, and a plurality of target marks are affixed to the outer surface of the part exposed thereby.
(B2) A range including the outer surface of the part to which the plurality of target marks were affixed in step B1 and the vehicle outer surface around the part is imaged by the first imaging means, and partial three-dimensional coordinate value data of the plurality of target marks is acquired by image processing.
(B3) The coordinate values, in the partial three-dimensional coordinate value data, of at least three target marks affixed in step A1 to the vehicle outer surface around the part are matched with the coordinate values of those target marks in the whole three-dimensional coordinate value data, whereby the partial three-dimensional coordinate value data is coordinate-transformed into the same coordinate system as the whole three-dimensional coordinate value data.
(B4) The outer shape of the part is imaged by the second imaging means, and partial three-dimensional shape model data composed of three-dimensional coordinate values of the part outer surface associated with the partial three-dimensional coordinate value data coordinate-transformed in step B3 is created by image processing.
(C1) The part is removed from the vehicle, and a plurality of target marks are additionally affixed to the outer surfaces other than the outer surface to which target marks were affixed in step B1.
(C2) The entire outer surface of the part, which bears target marks over its whole outer surface as a result of step C1, is imaged from a plurality of positions by the first imaging means, and component three-dimensional coordinate value data of the plurality of target marks is acquired by image processing.
(C3) The coordinate values, in the component three-dimensional coordinate value data, of at least three target marks affixed to the outer surface of the part in step B1 are matched with the coordinate values of those target marks in the partial three-dimensional coordinate value data coordinate-transformed in step B3, whereby the component three-dimensional coordinate value data is coordinate-transformed into the same coordinate system as the coordinate-transformed partial three-dimensional coordinate value data.
(C4) The entire outer shape of the part is imaged from a plurality of positions by the second imaging means, and component three-dimensional shape model data composed of three-dimensional coordinate values of the entire outer surface of the part associated with the component three-dimensional coordinate value data coordinate-transformed in step C3 is created by image processing.

  According to the above series of steps, the three-dimensional shape model data in which the outer shape of the vehicle and each component are associated with each other can be efficiently created.

  According to the present invention, three-dimensional shape model data in which the outer shape of a vehicle and each part are associated with each other can be created at high speed and with high accuracy.

  DESCRIPTION OF EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the drawings.

<System configuration>
FIG. 1 is a diagram illustrating a configuration of a system that serves as a platform for a method of creating three-dimensional shape model data according to the present embodiment.

  Reference numeral 10 denotes the vehicle to be measured in the present embodiment. Reference numerals 20 and 30 denote measuring means. Specifically, reference numeral 20 denotes a coordinate measuring camera as the first imaging means. This coordinate measuring camera is a so-called three-dimensional digitizer; for example, the CogniTens Optigo 11 is suitable, although a general digital single-lens reflex camera can also be used. Reference numeral 30 denotes a three-dimensional measurement camera as the second imaging means. Compared with the coordinate measuring camera 20, the three-dimensional measurement camera 30 is equipped with three CCD cameras 31, 32, and 33, can obtain three pieces of two-dimensional image data in a single shot, and is characterized by its capability for high-speed photography. As such a three-dimensional measurement camera, for example, the Optigo 200 manufactured by CogniTens is suitable. Both the coordinate measuring camera 20 and the three-dimensional measurement camera 30 are portable, and the vehicle 10 to be measured can be photographed by hand from all angles. Alternatively, the three-dimensional measurement camera 30 may be supported by a carriage 34 with an articulated arm, as illustrated.

  Each of the coordinate measurement camera 20 and the three-dimensional measurement camera 30 is connected to a personal computer 40 as image processing means, and image data obtained by photographing can be transferred to the personal computer 40.

  FIG. 2 is a block diagram showing a hardware configuration of the personal computer 40.

  As shown in the figure, the personal computer 40 includes a CPU 41 that controls its operation, a RAM 42 that provides a work area for the CPU 41 and functions as a main storage device, and a ROM 43 that stores a boot program and the like.

  The VRAM 44 is a memory into which image data to be displayed is loaded; by loading image data here, the image can be displayed on the monitor 45.

  The HDD 46 is a hard disk device, in which an operating system (OS) and image processing software for generating later-described three-dimensional coordinate value data and three-dimensional shape model data are installed. The HDD 46 also has an area for storing the generated 3D coordinate value data and 3D shape model data.

  Reference numerals 47 and 48 denote a keyboard and a mouse as input devices, respectively. Reference numeral 49 denotes an interface (I / F) for connecting to the coordinate measuring camera 20 and the three-dimensional measuring camera 30.

  Hereinafter, an example of a method for creating the three-dimensional shape model data of the vehicle 10 to be measured will be described in detail using the system configuration described above.

<Step A1. Attaching the target mark to the outer surface of the vehicle>
First, as shown in FIG. 3, a plurality of target marks T having an optically reflective surface are affixed over the entire outer surface of the vehicle 10. The target marks T are fixed so that they do not shift or peel off until the series of steps described below is completed. For a magnetic surface such as the body, magnet target marks are convenient. The target marks T are affixed almost uniformly over the entire outer surface of the vehicle 10; however, care must be taken that they do not line up in a straight line, so that their positions can be recognized.

  An auto bar and a scale bar (not shown) are arranged around the vehicle 10. The auto bar is, for example, a set of five targets arranged in a cross shape and is used to determine the orientation of the coordinate measuring camera 20 or the three-dimensional measurement camera 30. The scale bar has a plurality of targets with known spacing and is used as a reference for the three-dimensional coordinate system.
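  As an aside on how the scale bar functions: an image-based reconstruction is determined only up to an overall scale, and the known spacing between the scale-bar targets fixes that scale. The following is a minimal sketch under an assumed uniform-scale model; the function name and the values are illustrative and not part of the patent.

```python
import numpy as np

def apply_scale_bar(points, idx_a, idx_b, known_length):
    """Rescale reconstructed coordinates so that the two scale-bar targets
    (rows idx_a and idx_b of points) end up at their known separation.
    Assumes the reconstruction is correct up to one uniform scale factor."""
    measured = np.linalg.norm(points[idx_a] - points[idx_b])
    return points * (known_length / measured)

# Example: targets 0 and 1 are the scale-bar ends, known to be 1000 mm apart.
pts = np.array([[0.00, 0.00, 0.00],
                [0.52, 0.00, 0.00],
                [0.10, 0.30, 0.20]])
pts_mm = apply_scale_bar(pts, 0, 1, 1000.0)
```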

<Step A2. Acquisition of overall 3D coordinate value data>
The outer surface of the vehicle 10 bearing the plurality of target marks T affixed in step A1 is imaged from a plurality of positions by the coordinate measuring camera 20. The photographer photographs the vehicle 10 from all around while changing position. In this way, two-dimensional image data from at least two viewpoints is obtained for each target mark T.

  The plurality of two-dimensional image data obtained by this photographing is transferred to the personal computer 40. The personal computer 40 performs image processing with its image processing software using a known digital photogrammetry algorithm and generates three-dimensional coordinate value data (alignment data) of the target marks T. As shown in FIG. 4, this three-dimensional coordinate value data is text data representing the XYZ coordinate values of the target marks T. Because it relates to the entire outer surface of the vehicle 10, it is hereinafter referred to as "whole three-dimensional coordinate value data".
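  To give a concrete idea of the kind of computation a digital photogrammetry algorithm performs at this step, the sketch below triangulates a single target mark observed in two calibrated views using linear (DLT-style) triangulation. This is only an illustration under simplifying assumptions (known 3x4 projection matrices and ideal pixel observations); the patent does not specify the algorithm used by the image processing software.

```python
import numpy as np

def triangulate_dlt(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of one target mark seen in two views.

    P1, P2   : 3x4 camera projection matrices (assumed known for each shot).
    uv1, uv2 : (u, v) pixel coordinates of the same target mark in each image.
    Returns the 3D point in the common coordinate frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0 in the least-squares sense via SVD.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

  In practice each target mark is observed from more than two positions, and all observations can be stacked into A in the same way; the orientation of each shot is established with the help of the auto bar and scale bar described above.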

<Step A3. Creation of overall 3D shape model data>
The whole three-dimensional coordinate value data obtained in step A2 is imported into the three-dimensional measurement camera 30, and the outer shape of the vehicle 10 is then imaged from a plurality of positions with the three-dimensional measurement camera 30 (see FIG. 5). Each shot is framed so that a plurality of target marks T always fall within the photographing range.

  A plurality of two-dimensional image data obtained by photographing is transferred to the personal computer 40. The personal computer 40 performs image processing using image processing software, and converts the captured two-dimensional image data into three-dimensional shape model data (point cloud data).

  The personal computer 40 further places this three-dimensional shape model data in the coordinate space of the whole three-dimensional coordinate value data. Specifically, as shown in FIG. 6, target marks are detected from the three-dimensional shape model data of each shot (61 in FIG. 6) and pattern-matched against the whole three-dimensional coordinate value data. The three-dimensional shape model data is then coordinate-transformed (positioned) according to the result.
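  The patent does not describe how this pattern matching is implemented. One simple possibility, sketched below purely as an illustration, exploits the fact that distances between target marks are invariant under rigid motion: a triangle of marks detected in the shot is matched against a triangle of marks in the whole three-dimensional coordinate value data with the same side lengths (which is also why the marks should not lie on a straight line, as noted in step A1). The function and variable names are ours.

```python
import itertools
import numpy as np

def match_marks_by_triangle(shot_marks, whole_marks, tol=1.0):
    """Return candidate (shot_index, whole_index) correspondences for one shot.

    shot_marks  : (n, 3) target marks detected in the shot, in the shot's own frame.
    whole_marks : (m, 3) target marks from the whole 3D coordinate value data.
    tol         : allowed difference between corresponding side lengths.
    """
    def side_lengths(a, b, c):
        return np.array([np.linalg.norm(a - b),
                         np.linalg.norm(b - c),
                         np.linalg.norm(c - a)])

    for si, sj, sk in itertools.combinations(range(len(shot_marks)), 3):
        s = side_lengths(shot_marks[si], shot_marks[sj], shot_marks[sk])
        for wi, wj, wk in itertools.permutations(range(len(whole_marks)), 3):
            w = side_lengths(whole_marks[wi], whole_marks[wj], whole_marks[wk])
            # Ordered comparison preserves the vertex-to-vertex assignment.
            if np.all(np.abs(s - w) < tol):
                return [(si, wi), (sj, wj), (sk, wk)]
    return []
```

  Once at least three correspondences are established, the rigid transform that positions the shot's point cloud in the coordinate space of the whole three-dimensional coordinate value data can be estimated as sketched for step B3 below, and any further matched marks can serve to verify the result.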

  Each time a shot is taken, the conversion into three-dimensional shape model data and the placement of that data in the coordinate space described above are repeated. In this way, three-dimensional shape model data of the entire outer surface of the vehicle 10, composed of three-dimensional coordinate values of the vehicle outer surface associated with the whole three-dimensional coordinate value data, is created. Hereinafter, this three-dimensional shape model data is referred to as "whole three-dimensional shape model data".

<Step B1. Attaching target marks to the external surface of parts>
Next, as shown in FIG. 7, the bonnet 11, which is a lid of the vehicle 10, is removed. A plurality of target marks T are then affixed to the exposed parts, specifically inside the engine room 12 and on the upper part of the engine body 13. Note that the front doors, rear doors, and trunk lid (lift gate), which are the other lids of the vehicle 10, may also be removed in this step.

<Step B2. Acquisition of partial 3D coordinate value data>
The parts to which the plurality of target marks T were affixed in step B1 are imaged from a plurality of positions by the coordinate measuring camera 20. At this time, the photographing covers not only the parts to which target marks T were affixed in step B1 but also the region 14 of the vehicle outer surface around the engine room 12, to which target marks T1 to T4 were affixed in step A1.

  The plurality of two-dimensional image data obtained by this photographing is transferred to the personal computer 40. As in step A2, the personal computer 40 performs image processing with its image processing software and generates three-dimensional coordinate value data of the target marks T. This three-dimensional coordinate value data is text data representing the XYZ coordinate values of the target marks. Because it relates to the assembled group of parts (the parts to which target marks were affixed) exposed by removing the lid of the vehicle 10 in step B1, it is referred to here as "partial three-dimensional coordinate value data" or "part-group three-dimensional coordinate value data".

<Step B3. Coordinate transformation of partial 3D coordinate value data>
As shown in FIG. 8, the partial three-dimensional coordinate value data obtained in step B2 and the whole three-dimensional coordinate value data obtained in step A2 are expressed in different coordinate systems. Processing to bring the two coordinate systems into agreement is therefore performed.

  Specifically, the coordinate values, in the partial three-dimensional coordinate value data, of at least three target marks (for example, target marks T1 to T3) affixed to the vehicle outer surface in the vicinity of the engine room 12 and photographed in step B2 are matched with the coordinate values of those same target marks in the whole three-dimensional coordinate value data, and the partial three-dimensional coordinate value data is thereby coordinate-transformed into the same coordinate system as the whole three-dimensional coordinate value data. This processing can be executed as a function of the image processing software on the personal computer 40. For example, the user selects the coordinate value data of the three target marks T1 to T3 in both the partial three-dimensional coordinate value data and the whole three-dimensional coordinate value data on the personal computer 40 and instructs the coordinate transformation processing. The personal computer 40 then brings the coordinate values of the three target marks T1 to T3 in the partial three-dimensional coordinate value data into agreement with their coordinate values in the whole three-dimensional coordinate value data, and the other coordinate values of the partial three-dimensional coordinate value data are displaced accordingly.

  Through this coordinate transformation, the whole three-dimensional coordinate value data of the outer shape of the vehicle 10 and the partial three-dimensional coordinate value data inside the engine room 12 come to share a common coordinate reference.
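  Matching three corresponding target marks and carrying the remaining coordinate values along amounts to estimating a rigid transformation (a rotation and a translation) between the two coordinate systems. The patent does not name the algorithm used by the image processing software; the sketch below shows one standard least-squares solution via SVD (the Kabsch/Horn method), with made-up coordinates for T1 to T3.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src points onto dst points.

    src, dst : (N, 3) arrays of corresponding target-mark coordinates, N >= 3,
               e.g. T1-T3 in the partial data (src) and in the whole data (dst).
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                                   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))                # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def transform_points(points, R, t):
    """Apply the estimated transform to all remaining coordinate value data."""
    return points @ R.T + t

# Illustrative use for step B3: T1-T3 as measured in the partial data (src)
# and as already known in the whole data (dst); the values are invented.
partial_T123 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.5]])
whole_T123   = np.array([[2.0, 1.0, 0.3], [2.0, 2.0, 0.3], [1.0, 1.0, 0.8]])
R, t = rigid_transform(partial_T123, whole_T123)
# transform_points() would then be applied to every coordinate value in the
# partial three-dimensional coordinate value data.
```

  The same estimation is reused in step C3, with the coordinate-transformed partial data taking the role of the reference.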

<Step B4. Creation of partial 3D shape model data>
The partial three-dimensional coordinate value data coordinate-transformed in step B3 is imported into the three-dimensional measurement camera 30, and the inside of the engine room 12 and the upper part of the engine body 13 are then imaged from a plurality of positions with the three-dimensional measurement camera 30 on the basis of the imported coordinate value data (see FIG. 9).

  The plurality of two-dimensional image data obtained by this photographing is transferred to the personal computer 40. The personal computer 40 performs image processing with its image processing software and converts the captured two-dimensional image data of the engine room 12 and the upper part of the engine body 13 into three-dimensional shape model data. Because this is the three-dimensional shape model data of the assembled group of parts exposed by removing the lid of the vehicle 10, it is referred to as "partial three-dimensional shape model data" or "part-group three-dimensional shape model data".

  Since the partial three-dimensional shape model data obtained here is based on the partial three-dimensional coordinate value data coordinate-converted in step B3, it has the same coordinate system as the whole three-dimensional shape model data.

<Step C1. Attaching target marks to the other outer surfaces of the component>
Next, as shown in FIG. 10, the engine body 13 is removed from the vehicle 10. Target marks T were affixed to the upper part of the engine body 13 in step B1. Here, a plurality of target marks T are additionally affixed to the outer surfaces other than the outer surface to which target marks were affixed in step B1.

<Step C2. Acquisition of component 3D coordinate value data>
The coordinate measuring camera 20 images the entire outer surface of the engine body 13, which now carries a plurality of target marks T over its whole outer surface as a result of step C1.

  The plurality of two-dimensional image data obtained by this photographing is transferred to the personal computer 40. The personal computer 40 performs image processing with its image processing software in the same manner as in steps A2 and B2 and generates three-dimensional coordinate value data of the target marks. This three-dimensional coordinate value data is text data representing the XYZ coordinate values of the target marks. Because it relates to the component (the engine body 13) to which target marks T were affixed in step C1, it is referred to here as "component three-dimensional coordinate value data".

<Step C3. Coordinate transformation of component 3D coordinate value data>
The coordinate system of the component three-dimensional coordinate value data obtained in step C2 differs from the coordinate system of the partial three-dimensional coordinate value data coordinate-transformed in step B3 (that is, the coordinate system of the whole three-dimensional coordinate value data obtained in step A2). Processing is therefore performed to transform the component three-dimensional coordinate value data obtained in step C2 into the coordinate system of the partial three-dimensional coordinate value data coordinate-transformed in step B3.

  Specifically, the coordinate values, in the component three-dimensional coordinate value data, of at least three target marks T affixed to the upper part of the engine body 13 in step B1 are matched with the coordinate values of those same target marks in the partial three-dimensional coordinate value data coordinate-transformed in step B3, and the component three-dimensional coordinate value data is thereby transformed into the same coordinate system as the partial three-dimensional coordinate value data. Like step B3, this processing can be executed as a function of the image processing software on the personal computer 40. For example, the user selects the coordinate value data of three target marks in both the component three-dimensional coordinate value data and the coordinate-transformed partial three-dimensional coordinate value data on the personal computer 40 and instructs the coordinate transformation processing. The personal computer 40 then brings the coordinate values of the three target marks in the component three-dimensional coordinate value data into agreement with their coordinate values in the partial three-dimensional coordinate value data, and the other coordinate values of the component three-dimensional coordinate value data are displaced accordingly.

  Through this coordinate transformation, the component three-dimensional coordinate value data of the engine body 13 is aligned with the partial three-dimensional coordinate value data of the engine room 12 and, in turn, with the whole three-dimensional coordinate value data of the outer shape of the vehicle 10.
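  Because the partial data was already placed in the whole-vehicle coordinate system in step B3, the single transform estimated in step C3 carries the component data directly into that system. Equivalently, the overall placement can be viewed as the composition of a component-to-partial transform with the partial-to-whole transform of step B3, i.e. a product of 4x4 homogeneous matrices. A minimal sketch of that composition follows; the helper name and the placeholder rotations and translations are ours.

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a rotation R (3x3) and translation t (3,) into a 4x4 matrix."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# (R_b3, t_b3): partial -> whole, as estimated in step B3.
# (R_c3, t_c3): component -> partial, as would be estimated against the raw
# partial data. Both could come from a fit such as rigid_transform() above.
R_b3, t_b3 = np.eye(3), np.array([2.0, 1.0, 0.3])   # placeholder values
R_c3, t_c3 = np.eye(3), np.array([0.5, 0.0, 0.7])   # placeholder values

T_partial_to_whole = to_homogeneous(R_b3, t_b3)
T_component_to_partial = to_homogeneous(R_c3, t_c3)
# Composition: component coordinates go straight into the whole-vehicle system.
T_component_to_whole = T_partial_to_whole @ T_component_to_partial
```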

<Step C4. Creation of component 3D shape model data>
The component three-dimensional coordinate value data coordinate-transformed in step C3 is imported into the three-dimensional measurement camera 30, and the outer surface shape of the engine body 13 is then imaged from a plurality of positions with the three-dimensional measurement camera 30 on the basis of the imported coordinate value data (see FIG. 12).

  A plurality of two-dimensional image data obtained by this photographing is transferred to the personal computer 40. The personal computer 40 performs image processing using image processing software, and converts the captured two-dimensional image data relating to the entire outer surface of the engine body 13 into component three-dimensional shape model data.

  Since the component three-dimensional shape model data obtained here is based on the component three-dimensional coordinate value data of the engine body 13 coordinate-transformed in step C3, it has the same coordinate system as the partial three-dimensional shape model data and the whole three-dimensional shape model data.

  Through the series of steps described above, the whole three-dimensional shape model data representing the outer surface of the vehicle 10, the partial three-dimensional shape model data representing the inside of the engine room 12 with the bonnet 11 removed, and the component three-dimensional shape model data representing the engine body 13 are all obtained in the same coordinate system.

  Then, by repeating steps B1 onward for each of the other parts constituting the vehicle, such as the interior parts (instrument panel, seats, and so on) that are exposed when a door is removed, the tires, and the undercover, component three-dimensional shape model data sharing the common coordinate system can be obtained for those parts as well.

  As described above, according to the method of creating three-dimensional shape model data of the present embodiment, the whole three-dimensional shape model data, the partial three-dimensional shape model data, and the component three-dimensional shape model data share a common coordinate system, and the sets of model data are thereby associated with one another. As a result, the positional relationship of each part within the entire vehicle can easily be grasped, and work such as effective analysis becomes possible. For example, using CAD software capable of processing these three-dimensional shape model data, vehicle CAE analysis (collision, NVH, body strength and rigidity, units, aerodynamics, and so on) and shape benchmarking can be performed.

  It is also possible to combine the whole three-dimensional shape model data, the partial three-dimensional shape model data, and the component three-dimensional shape model data whose coordinate systems have been unified as described above, and thereby create combined three-dimensional shape model data of the entire vehicle 10 including each component.
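  Since all three data sets share one coordinate system, producing the combined model can be as simple as concatenating their point clouds, optionally with a label recording where each point came from. A minimal sketch with made-up arrays:

```python
import numpy as np

# (N, 3) arrays of XYZ points, all already in the common vehicle coordinate system.
whole_model = np.random.rand(1000, 3) * 4.0   # vehicle outer surface (placeholder)
partial_model = np.random.rand(400, 3)        # engine-room part group (placeholder)
component_model = np.random.rand(200, 3)      # engine body (placeholder)

combined_points = np.vstack([whole_model, partial_model, component_model])
combined_labels = np.concatenate([
    np.zeros(len(whole_model), dtype=int),    # 0 = vehicle outer surface
    np.ones(len(partial_model), dtype=int),   # 1 = part group (engine room)
    np.full(len(component_model), 2),         # 2 = engine body (component)
])
```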

  Although not described in detail, each lid, such as the bonnet, is also measured with the coordinate measuring camera 20 and the three-dimensional measurement camera 30, with target marks affixed to its inner surface, and its three-dimensional coordinate value data and three-dimensional shape model data are obtained in the same way.

  The present invention is not limited to the types of vehicles described above, and can be applied to various types of vehicles.

Brief Description of the Drawings

FIG. 1 is a diagram showing the configuration of a system serving as the platform for the method of creating three-dimensional shape model data in the embodiment.
FIG. 2 is a block diagram showing the hardware configuration of the personal computer in the embodiment.
FIG. 3 is a diagram explaining the affixing of target marks to the vehicle outer surface and its photographing in the embodiment.
FIG. 4 is a diagram explaining the generation of the whole three-dimensional coordinate value data of the target marks by the personal computer in the embodiment.
FIG. 5 is a diagram explaining the import of the whole three-dimensional coordinate value data into the three-dimensional measurement camera and the photographing of the vehicle outer surface by the three-dimensional measurement camera in the embodiment.
FIG. 6 is a diagram explaining the placement of the three-dimensional shape model data in the coordinate space of the whole three-dimensional coordinate value data in the embodiment.
FIG. 7 is a diagram explaining the removal of the bonnet, the affixing of target marks in the engine room, and the photographing in the embodiment.
FIG. 8 is a diagram explaining the coordinate transformation of the partial three-dimensional coordinate value data in the embodiment.
FIG. 9 is a diagram explaining the photographing of the engine room and the upper part of the engine body by the three-dimensional measurement camera in the embodiment.
FIG. 10 is a diagram explaining the removal of the engine body, the affixing of target marks to the engine body, and the photographing in the embodiment.
FIG. 11 is a diagram explaining the coordinate transformation of the component three-dimensional coordinate value data in the embodiment.
FIG. 12 is a diagram explaining the photographing of the engine body by the three-dimensional measurement camera in the embodiment.

Explanation of symbols

10: Vehicle, 20: Coordinate measuring camera, 30: Three-dimensional measurement camera, 31, 32, 33: CCD cameras, 34: Carriage, 40: Personal computer

Claims (2)

  1. A method for creating three-dimensional shape model data of a vehicle, the method comprising the steps of:
    (A1) affixing a plurality of target marks to the outer surface of the vehicle;
    (A2) imaging the outer surface of the vehicle bearing the plurality of target marks affixed in step A1 from a plurality of positions with a first imaging means, and acquiring, by image processing, whole three-dimensional coordinate value data of the plurality of target marks;
    (A3) imaging the outer surface shape of the vehicle from a plurality of positions with a second imaging means, and creating, by image processing, whole three-dimensional shape model data composed of three-dimensional coordinate values of the vehicle outer surface associated with the whole three-dimensional coordinate value data;
    (B1) removing a lid of the vehicle and affixing a plurality of target marks to the outer surface of the part exposed thereby;
    (B2) imaging, with the first imaging means, a range including the outer surface of the part to which the plurality of target marks were affixed in step B1 and the vehicle outer surface around the part, and acquiring, by image processing, partial three-dimensional coordinate value data of the plurality of target marks;
    (B3) matching the coordinate values, in the partial three-dimensional coordinate value data, of at least three target marks affixed in step A1 to the vehicle outer surface around the part with the coordinate values of those target marks in the whole three-dimensional coordinate value data, thereby coordinate-transforming the partial three-dimensional coordinate value data into the same coordinate system as the whole three-dimensional coordinate value data;
    (B4) imaging the outer shape of the part with the second imaging means and creating, by image processing, partial three-dimensional shape model data composed of three-dimensional coordinate values of the part outer surface associated with the partial three-dimensional coordinate value data coordinate-transformed in step B3;
    (C1) removing the part from the vehicle and additionally affixing a plurality of target marks to the outer surfaces other than the outer surface to which target marks were affixed in step B1;
    (C2) imaging, from a plurality of positions with the first imaging means, the entire outer surface of the part, which bears target marks over its whole outer surface as a result of step C1, and acquiring, by image processing, component three-dimensional coordinate value data of the plurality of target marks;
    (C3) matching the coordinate values, in the component three-dimensional coordinate value data, of at least three target marks affixed to the outer surface of the part in step B1 with the coordinate values of those target marks in the partial three-dimensional coordinate value data coordinate-transformed in step B3, thereby coordinate-transforming the component three-dimensional coordinate value data into the same coordinate system as the coordinate-transformed partial three-dimensional coordinate value data; and
    (C4) imaging the entire outer shape of the part from a plurality of positions with the second imaging means and creating, by image processing, component three-dimensional shape model data composed of three-dimensional coordinate values of the entire outer surface of the part associated with the component three-dimensional coordinate value data coordinate-transformed in step C3.
  2. The method according to claim 1, wherein the lids of the vehicle are a door and a bonnet, and the parts are a part in the vehicle compartment that is exposed when the door is removed and a part in the engine room that is exposed when the bonnet is removed.

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007176439A JP4877105B2 (en) 2007-07-04 2007-07-04 Vehicle 3D shape model data creation method

Publications (2)

Publication Number Publication Date
JP2009014500A JP2009014500A (en) 2009-01-22
JP4877105B2 2012-02-15

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04102178A (en) * 1990-08-22 1992-04-03 Hitachi Commun Syst Inc Object model input device
JP2002002566A (en) * 2000-06-20 2002-01-09 Daihatsu Motor Co Ltd Method of measuring assembly accuracy of fitting member by three-dimensional digitizer
GB0022444D0 (en) * 2000-09-13 2000-11-01 Bae Systems Plc Positioning system and method
JP4599515B2 * 2005-05-27 2010-12-15 Konica Minolta Sensing, Inc. Method and apparatus for aligning three-dimensional shape data


Legal Events

Code Title (effective date)
A621 Written request for application examination (2010-03-15)
RD03 Notification of appointment of power of attorney (2010-10-01)
RD04 Notification of resignation of power of attorney (2010-11-02)
TRDD Decision of grant or rejection written
A977 Report on retrieval (2011-10-26)
A01 Written decision to grant a patent or to grant a registration (utility model) (2011-11-01)
A61 First payment of annual fees (during grant procedure) (2011-11-14)
R150 Certificate of patent or registration of utility model
FPAY Renewal fee payment (payment until 2014-12-09; year of fee payment: 3)