WO2022202926A1 - Accuracy correction device and augmented reality image display device - Google Patents

Accuracy correction device and augmented reality image display device

Info

Publication number
WO2022202926A1
WO2022202926A1 (PCT/JP2022/013665, JP2022013665W)
Authority
WO
WIPO (PCT)
Prior art keywords
distance
unit
augmented reality
image
input
Prior art date
Application number
PCT/JP2022/013665
Other languages
English (en)
Japanese (ja)
Inventor
徹哉 土居
友典 川上
Original Assignee
株式会社デンソー (DENSO CORPORATION)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー (DENSO CORPORATION)
Publication of WO2022202926A1

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S 13/93 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S 13/931 - Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/02 - Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S 7/40 - Means for monitoring or calibrating
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras

Definitions

  • The present disclosure relates to an accuracy correction device for an augmented reality image display device, and to an augmented reality image display device.
  • Augmented reality image display devices are known that virtually extend the world in front of the user by displaying virtual objects overlaid on images captured by a camera.
  • In such devices, the display position of an object within the augmented reality image is set in advance based on the distance from a specific position in the physical space.
  • That is, a specific position recognized in the captured space is set as specific coordinates, and a position a predetermined distance away from those specific coordinates is set as the reference point for object display.
  • One aspect of the present disclosure aims to suppress the deterioration in object display position accuracy caused by individual differences among augmented reality image display devices that display a virtual object overlaid on an image captured by a camera.
  • An accuracy correction device according to the present disclosure is provided in an augmented reality image display device.
  • This augmented reality image display device recognizes the space of the image captured by the imaging unit, generates an augmented reality image by placing an object in the image using, as a reference point, coordinates a predetermined distance away from specific coordinates in that space, and displays the image on a display unit.
  • The accuracy correction device of the present disclosure includes an input unit and a correction value setting unit.
  • The input unit inputs distance information representing the actual distance between two specific points in the real space.
  • The correction value setting unit sets a correction value for the predetermined distance based on the distance information input from the input unit and the reference distance between the two points as recognized by the augmented reality image display device.
  • In this way, a correction value reflecting the individual characteristics of the device can be set.
  • The augmented reality image display device then uses the set correction value to correct the predetermined distance used for setting the reference point, so that the object can be displayed at the proper position where it should originally appear in the augmented reality image.
  • According to the accuracy correction device of the present disclosure, it is therefore possible to improve the display position accuracy of the object in the augmented reality image display device.
  • An augmented reality image display device according to another aspect of the present disclosure includes an imaging unit, a space recognition unit, an image generation unit, and a display unit.
  • The imaging unit captures images of the surroundings, and the space recognition unit recognizes the space of the images captured by the imaging unit.
  • The image generation unit generates an augmented reality image by arranging an object in the video, using as a reference point coordinates that are a predetermined distance from specific coordinates in the space recognized by the space recognition unit, and the display unit displays the augmented reality image generated by the image generation unit.
  • Like the accuracy correction device described above, the augmented reality image display device of the present disclosure includes an input unit and a correction value setting unit. The image generation unit corrects the predetermined distance based on the correction value set by the correction value setting unit, and sets the reference point that serves as the display position of the object.
  • According to the augmented reality image display device of the present disclosure, the object can be displayed at the appropriate position where it should originally appear in the augmented reality image, improving the display position accuracy of the object.
  • FIG. 1 is a block diagram showing the configuration of an augmented reality image display device.
  • FIG. 4 is an explanatory diagram showing an example of specific coordinates and reference lines specified in the space around the vehicle.
  • FIG. 10 is an explanatory diagram showing a method of setting a reference point, which is the display position of an object.
  • FIG. 4 is a flowchart showing accuracy correction processing executed by an accuracy correction unit.
  • FIG. 9 is a flowchart showing accuracy correction processing of the second embodiment.
  • The augmented reality image display device (hereafter, AR device) 5 of this embodiment is used in an in-vehicle sensor adjustment support system that assists the adjustment work of in-vehicle sensors required for collision damage mitigation braking, and provides the worker 3 with the installation positions of the adjustment equipment.
  • The AR device 5 of the present embodiment is a so-called goggle-type display device that is worn on the head of the worker 3, who is the user, and displays an augmented reality image (hereafter, AR image) in front of the worker 3.
  • AR is an abbreviation for "Augmented Reality".
  • The in-vehicle sensor adjustment support system includes a server 201, a personal computer (PC) 202, a relay device 203, and the AR device 5.
  • The server 201 and the PC 202 are connected to a communication network 200.
  • The relay device 203 relays wireless data communication and is capable of wireless data communication with the communication network 200.
  • The PC 202 and the AR device 5 have a wireless communication function and can wirelessly exchange data with the relay device 203. Therefore, the AR device 5, the server 201, and the PC 202 can communicate data with one another via the relay device 203 and the communication network 200.
  • The server 201 stores vehicle type information for each vehicle type, including that of the vehicle 1.
  • The server 201 also stores object data representing an installation position object for each type of adjustment equipment.
  • The installation position object is an image indicating the installation position of the adjustment equipment.
  • The server 201 further stores various work-related information (for example, work procedures) related to the adjustment work for each type of adjustment equipment.
  • The PC 202 can download and save the vehicle type information, object data, and work-related information from the server 201. It may also be possible to upload vehicle type information, installation position objects, and work-related information from the PC 202 to the server 201.
  • The AR device 5 can download and acquire the vehicle type information, installation position objects, and work-related information from the server 201 or the PC 202.
  • The AR device 5 displays in real time the video 7 captured by a camera 32 (see FIG. 2) serving as the imaging unit.
  • The video 7 shows the space in front of the worker 3.
  • The worker 3 places the vehicle 1 on the floor surface 2 of a predetermined work area and performs adjustment work on the various in-vehicle sensors provided on the vehicle 1.
  • This adjustment work is performed by installing the adjustment equipment corresponding to the in-vehicle sensor to be adjusted at a predetermined position outside the vehicle 1 and then using that equipment.
  • The in-vehicle sensor adjustment support system of this embodiment is used when performing this adjustment work. That is, the system provides the worker 3 with the installation position of the adjustment equipment via the AR device 5 so that the worker 3 can install the equipment easily and efficiently.
  • Specifically, the AR device 5 displays the video 7 and, in response to an operation by the worker 3, additionally displays an installation position object indicating the position in the video 7 where the adjustment equipment should be installed.
  • The AR device 5 has a vision-based augmented reality function. Using this function, it displays the AR image on the display unit 34 by superimposing the installation position object on the position, in the actual video 7 captured by the camera 32, where the adjustment equipment should be installed.
  • The vehicle 1 of the present embodiment is provided with a front camera 16, a millimeter wave radar sensor 17, a right rear radar (not shown), a left rear radar (not shown), and the like as in-vehicle sensors for collision damage mitigation.
  • The front camera 16 is provided at the front of the passenger compartment of the vehicle 1 and photographs the area ahead of the vehicle 1. Images captured by the front camera 16 are used in various driving support systems mounted on the vehicle 1, such as a collision prevention support system and a lane departure prevention support system.
  • The millimeter wave radar sensor 17 is provided, for example, behind the front bumper of the vehicle 1.
  • The millimeter wave radar sensor 17 radiates radio waves in the millimeter wave band and receives the reflected waves returned by a target in front of the vehicle 1, thereby detecting the position of, and distance to, the target.
  • The detection results of the millimeter wave radar sensor 17 are used in various driving support systems mounted on the vehicle 1, such as a collision prevention support system and an obstacle warning system.
  • The right rear radar and the left rear radar basically have the same configuration as the millimeter wave radar sensor 17.
  • The right rear radar detects the position of, and distance to, a target at the right rear of the vehicle 1, and the left rear radar detects the position of, and distance to, a target at the left rear of the vehicle 1.
  • These radars are used in various driving support systems mounted on the vehicle 1, such as a blind spot monitor system.
  • When adjusting the optical axis of the front camera 16, the worker 3 installs three target plates at predetermined positions in front of the vehicle 1. The three target plates are then photographed by the front camera 16, and the optical axis of the front camera 16 is adjusted appropriately based on the captured images of the three target plates.
  • For the millimeter wave radar sensor 17 and the left and right rear radars, it is necessary to appropriately adjust the angle axis (that is, the radiation angle of the radar wave) of each of these sensors.
  • A reflector is used to adjust the angle axis.
  • When adjusting the angle axis of the millimeter wave radar sensor 17, the worker 3 installs a reflector at a predetermined position in front of the vehicle 1. The millimeter wave radar sensor 17 then emits a radar wave, receives the wave reflected by the reflector, and the angle axis is adjusted appropriately based on the received reflection. When adjusting the angle axes of the left and right rear radars, a reflector is likewise installed at a predetermined position for each rear radar.
  • To properly adjust the optical axis of the front camera 16, each of the three target plates must be installed correctly at its predetermined position. Likewise, to properly adjust the angle axes of the millimeter wave radar sensor 17 and the left and right rear radars, the reflectors must be installed correctly at the predetermined positions for each of these sensors.
  • The AR device 5 is configured to provide the worker 3 with the installation positions of the three target plates and the reflectors when these adjustment operations are performed. For example, when the angle axis of the millimeter wave radar sensor 17 is adjusted, the AR device 5 displays the installation position object at the position, in the video 7 captured by the camera 32, where the reflector should be installed, as shown in the figure.
  • The AR device 5 includes a space recognition sensor 31, a camera 32, a self-position recognition sensor 33, and a display unit 34. Each of these units is configured in the same manner as the AR device described in Patent Document 1.
  • The space recognition sensor 31 acquires information representing the various objects existing in front of the AR device 5; it detects the space in front of the AR device 5 three-dimensionally and outputs spatial information as the detection result.
  • The camera 32 captures video of the area in front of the AR device 5 at a specific frame rate and outputs it as captured data. The self-position recognition sensor 33 detects the position of the AR device 5 itself in three-dimensional space, more specifically its orientation (angle), for example, and outputs self-position information as the detection result.
  • The display unit 34 is configured to display images, and shows the video represented by the captured data output from the camera 32, that is, the video 7. The display unit 34 also displays various objects, such as the installation position object 112 shown in the figure.
  • The display unit 34 is mounted on the AR device 5 so that the worker 3, wearing the AR device 5 properly on the head, can view the video 7 including the various objects.
  • The AR device 5 further includes a space recognition unit 35, an image recognition unit 36, a self-position recognition unit 37, a wireless communication unit 38, an AR image generation unit 39, and an accuracy correction unit 50.
  • The space recognition unit 35 periodically acquires the spatial information detected by the space recognition sensor 31 and analyzes it in real time. Based on the analysis result, it recognizes the space in front of the AR device 5 three-dimensionally in real time.
  • The image recognition unit 36 recognizes specific images in the captured video 7 based on the captured data output from the camera 32.
  • Specifically, feature points are extracted from the video 7, and various images can be recognized based on the extracted feature points.
  • Images recognizable by the image recognition unit 36 include, for example, the emblems, license plates, and specific markers provided at the front and rear ends of various vehicles.
  • The vehicle 1 of this embodiment is provided with a front emblem 11 and a front license plate 13 at the front end, and a rear emblem 12 and a rear license plate 14 at the rear end.
  • The image recognition unit 36 can individually recognize the emblems 11 and 12 and the license plates 13 and 14 of the vehicle 1.
  • Information on the various images recognizable by the image recognition unit 36 may be stored in the memory of the AR device 5 in advance, or may be downloaded from the server 201 or the PC 202.
  • With its tracking function, the image recognition unit 36 can continue to recognize the position of a once-recognized image even if the position or angle of the camera 32 changes and the captured video changes accordingly.
  • The self-position recognition unit 37 recognizes the orientation (angle) of the AR device 5 based on the self-position information output from the self-position recognition sensor 33.
  • The space recognition unit 35, the image recognition unit 36, and the self-position recognition unit 37 output the space recognition information, image recognition information, and self-position recognition information, which are their respective recognition results, to the AR image generation unit 39.
  • The wireless communication unit 38 performs wireless communication with the server 201 or the PC 202 via the relay device 203, and is connected to the AR image generation unit 39.
  • The AR image generation unit 39 causes the display unit 34 to display the video 7 based on the captured data input from the image recognition unit 36.
  • The AR image generation unit 39 also acquires the vehicle type information, object data, work-related information, and so on of the vehicle to be adjusted from the server 201 or the PC 202 via the wireless communication unit 38.
  • Based on the information obtained via the wireless communication unit 38 and the information input from the space recognition unit 35, the image recognition unit 36, and the self-position recognition unit 37, the AR image generation unit 39 calculates the installation position of the adjustment equipment and displays the installation position object in the video 7.
  • The AR image generation unit 39 can also cause the display unit 34 to display the work-related information in response to a user operation.
  • The AR device 5 is equipped with a computer including a CPU and memory.
  • The memory is a semiconductor storage device such as ROM, RAM, or flash memory.
  • The memory stores programs for causing the computer to function as the space recognition unit 35, the image recognition unit 36, the self-position recognition unit 37, and the AR image generation unit 39.
  • The functions of these units are realized by the CPU executing the programs stored in the memory. However, some or all of these functions may be implemented with one or more pieces of hardware instead of software processing.
  • To display the installation position object in the video 7 shown on the display unit 34, the AR image generation unit 39 calculates the installation position of the adjustment equipment.
  • The procedure for calculating the installation position is described below, taking as an example the calculation of the reflector installation position for adjusting the radiation angle (angle axis) of the radar wave of the millimeter wave radar sensor 17. Since this procedure is described in detail in Patent Document 1, a detailed description is omitted here.
  • First, the AR image generation unit 39 three-dimensionally recognizes the space in front of the camera 32 based on the spatial information detected by the space recognition sensor 31 and the self-position information output from the self-position recognition sensor 33, and recognizes the floor surface 2 in that space. Once the floor surface 2 is recognized, the relative positional relationship between the recognized floor surface 2 and the position of the AR device 5 in the three-dimensional coordinate space is calculated.
  • Next, as the worker 3 moves around the vehicle 1, the AR image generation unit 39 acquires the positions of the front emblem 11 and the rear emblem 12 recognized by the image recognition unit 36 as three-dimensional coordinates.
  • The AR image generation unit 39 then drops perpendiculars 21 and 22 from the positions of the emblems 11 and 12 to the floor surface 2, finds the intersections 23 and 24 between the perpendiculars 21 and 22 and the floor surface 2, and calculates the line passing through the intersections 23 and 24 as the center line 25 of the vehicle 1.
  • FIG. 3 shows an example of the video 7 with the perpendiculars 21 and 22 and the center line 25 superimposed.
  • The center line 25 and the intersections 23 and 24 calculated in this way are used to set the installation position of the adjustment equipment.
  • For example, to adjust the millimeter wave radar sensor 17 provided at the front of the vehicle, the three-dimensional coordinates of the intersection 23 at the front of the vehicle are used as the specific coordinates that serve as the reference for the installation position.
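The center-line construction described above (perpendiculars 21 and 22 dropped from the emblem positions onto the floor surface 2, their foot points 23 and 24, and the line through those points as center line 25) can be sketched as follows. This is an illustrative reconstruction, not code from the patent; the point/normal plane representation and the function names are assumptions.

```python
import numpy as np

def foot_of_perpendicular(point, plane_point, plane_normal):
    """Foot of the perpendicular dropped from `point` onto the floor plane
    (i.e. intersection 23 or 24 of perpendicular 21 or 22 with floor surface 2)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    p = np.asarray(point, dtype=float)
    return p - np.dot(p - np.asarray(plane_point, dtype=float), n) * n

def vehicle_center_line(front_emblem, rear_emblem, plane_point, plane_normal):
    """Intersections 23 and 24, plus the unit direction of center line 25."""
    p23 = foot_of_perpendicular(front_emblem, plane_point, plane_normal)
    p24 = foot_of_perpendicular(rear_emblem, plane_point, plane_normal)
    direction = (p23 - p24) / np.linalg.norm(p23 - p24)
    return p23, p24, direction
```

For instance, with the floor taken as the plane z = 0, an emblem at (0, 2, 1) projects straight down to (0, 2, 0).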
  • The AR image generation unit 39 reads, from the vehicle type information of the vehicle 1 acquired from the server 201 or the PC 202, distance data representing the distance from the specific coordinates, as installation data for the reflector, which is the adjustment equipment for the millimeter wave radar sensor 17.
  • This distance data and the specific coordinates are set in advance for each vehicle type and each in-vehicle sensor to be adjusted, and the AR image generation unit 39 determines the installation position of the adjustment equipment based on this information.
  • The AR image generation unit 39 then displays the image of the reflector obtained from the object data as the installation position object 112 at the set reference point P1, as indicated by the two-dot chain line in the figure.
  • The reference point P1 for displaying the installation position object 112 is set as three-dimensional coordinates a predetermined distance along the center line 25 from the specific coordinates, which are the three-dimensional coordinates of the intersection 23. The relative positional relationship between the reference point P1 and the vehicle 1 is therefore constant. Consequently, when the worker 3 turns or moves, the display position, angle, size, and so on of the installation position object 112 in the video 7 change along with that movement.
  • As described above, the installation position of the adjustment equipment used for adjusting the in-vehicle sensors mounted on the vehicle 1 is set based on the reference point P1, which is a predetermined distance away from the specific coordinates identified from the feature points in the video captured by the camera 32.
  • If the distance recognized by the AR device 5 contains an error, however, the reference point P1 may deviate from the normal position P0, which is the correct position the predetermined distance away from the specific coordinates. Since this deviation results from individual differences among AR devices 5, it cannot be eliminated by correcting the distance data stored in the server 201 or the PC 202.
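Setting the reference point P1 a predetermined distance along the center line 25 from the specific coordinates (intersection 23) reduces to a single vector step. A minimal sketch, with function and parameter names of my own choosing:

```python
import numpy as np

def reference_point(specific_coords, center_line_dir, predetermined_distance):
    """Reference point P1: the point `predetermined_distance` along the
    center line 25 from the specific coordinates (intersection 23)."""
    d = np.asarray(center_line_dir, dtype=float)
    d = d / np.linalg.norm(d)
    return np.asarray(specific_coords, dtype=float) + predetermined_distance * d
```

If the recognized distance scale of a particular device is off, this is exactly the step where the error appears: the same stored distance lands P1 at the wrong real-world spot, which is what the correction below compensates for.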
  • Therefore, the AR device 5 is provided with the accuracy correction unit 50 shown in the figure.
  • The accuracy correction unit 50 corresponds to an example of the accuracy correction device of the present disclosure and, in this embodiment, includes a display control unit 52, an input unit 54, and a correction value setting unit 56.
  • The display control unit 52 causes a verification object representing the reference distance to be displayed in the video 7 that the AR image generation unit 39 shows on the display unit 34.
  • The display position of the verification object is set, within the space recognized by the AR image generation unit 39, at a position where the worker can easily measure the length of the verification object with a tape measure or the like.
  • For example, the display position of the verification object may be set at a position a predetermined distance away from the worker 3 so that the worker 3 can measure its length simply by reaching out.
  • Alternatively, the surface of a target recognized by the AR image generation unit 39, such as the body of the vehicle 1 or the floor surface 2 in the video 7, may be set as the display position.
  • The verification object displayed at that position may be any object that allows the worker 3 to measure the reference distance while viewing the video 7.
  • For example, a linear object whose length equals the reference distance may be displayed as the verification object.
  • The input unit 54 allows the worker 3 to input information, such as the measured length of the verification object displayed on the display unit 34, by voice input, key operation, or the worker 3's own movement.
  • Input by the worker 3's own movement is realized, for example, by the worker 3 moving a finger or the like in front of the camera 32 and the image recognition unit 36 recognizing that movement as input information.
  • In this case, the input unit 54 receives the recognition result from the image recognition unit 36.
  • Through the input unit 54, the worker 3 can measure the length of the verification object in the real space while viewing the video 7, and input the measured distance to the accuracy correction unit 50.
  • The correction value setting unit 56 recognizes the measured distance input via the input unit 54 as the actual distance obtained by actually measuring the reference distance, and sets a correction value (distance correction value) for the distance recognized by the AR device 5 based on the recognized actual distance and the reference distance.
  • This distance correction value is used to correct the distance data that set the installation positions of the various adjustment equipment, such as the reflector and the three target plates described above. It is therefore calculated as a correction coefficient that can serve as a common correction value for the various distance data acquired from the server 201 or the PC 202: multiplying each piece of distance data by the coefficient corrects all of the distance data appropriately.
  • By this correction, the distance data is corrected from the predetermined distance L1 to the predetermined distance L0.
  • As a result, the reference point that defines the display position of the installation position object 112 is set to the reference point P0, which is separated from the pre-correction reference point P1 by the correction distance dL, so that the installation position object 112 can be displayed at the proper installation position of the reflector.
  • The distance correction value need not be a correction coefficient; a correction distance for the distance data may instead be set for each verification object.
  • When the accuracy correction process starts, the CPU first executes processing as the display control unit 52 in S110, causing the AR image generation unit 39 to display the reference distance verification object in the video 7 currently being displayed.
  • The worker 3 then measures the length of the verification object in the real space with a tape measure or the like while viewing the video 7 on the display unit 34, and inputs the measurement result through the input unit 54.
  • The CPU waits in S120 for the measurement result of the reference distance to be input from the input unit 54; when it is input, the process proceeds to S130.
  • In S130, the measurement result input from the input unit 54 is recognized as the actual distance of the reference distance. In the subsequent S140, the CPU executes processing as the correction value setting unit 56, calculating the distance correction value based on the actual distance and the reference distance.
  • This distance correction value is a correction coefficient for correcting the distance recognized by the AR device 5, and is calculated, for example, by dividing the actual distance by the reference distance according to the following equation:
  • Correction coefficient = actual distance / reference distance
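The S140/S150 steps, computing the correction coefficient from the measured and reference distances and then applying it multiplicatively to every stored installation distance, can be sketched as follows. The dictionary layout and names are illustrative assumptions, not part of the patent.

```python
def distance_correction_coefficient(actual_distance, reference_distance):
    """S140: correction coefficient = actual distance / reference distance."""
    return actual_distance / reference_distance

def correct_distance_data(distance_data, coefficient):
    """S150: multiply each stored installation distance (per vehicle type and
    per in-vehicle sensor) by the common correction coefficient."""
    return {sensor: d * coefficient for sensor, d in distance_data.items()}

# If the 1.000 m verification object measures 1.020 m in the real space,
# every stored installation distance is scaled by 1.02.
k = distance_correction_coefficient(1.020, 1.000)
corrected = correct_distance_data({"millimeter_wave_radar": 5.000}, k)
```

Because the coefficient is a pure ratio, the same value corrects the reflector distance and the three target-plate distances alike, which is why the patent describes it as a common correction value.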
  • After calculating the distance correction value, the CPU proceeds to S150.
  • In S150, the AR image generation unit 39 corrects the distance data for installing the adjustment equipment, which has been acquired from the server 201 or the PC 202 for each vehicle type and each in-vehicle sensor and stored in the memory, and the accuracy correction process ends.
  • In S150, the AR image generation unit 39 may itself correct the various distance data stored in the memory.
  • Alternatively, the various distance data may be read from the AR image generation unit 39, multiplied by the correction coefficient, and the corrected distance data returned to the AR image generation unit 39.
  • As described above, the AR device 5 of the present embodiment is provided with the accuracy correction unit 50 as the accuracy correction device of the present disclosure.
  • The accuracy correction unit 50 displays the reference distance verification object in the video 7 shown on the display unit 34, has the worker 3 measure the length of the verification object in the real space, and calculates the distance correction value from the actual distance thus measured and the reference distance.
  • The distance data used when the AR image generation unit 39 displays the installation position object of the adjustment equipment in the video 7 can then be corrected with this distance correction value.
  • As a result, the installation position of the adjustment equipment can be presented accurately to the worker 3 through the installation position object 112 when the various in-vehicle sensors mounted on the vehicle are adjusted.
  • According to the present embodiment, the influence of display errors that occur in each individual AR device 5 can therefore be suppressed, and the display accuracy of the installation position object can be improved.
  • In the first embodiment, a verification object of the reference distance is displayed in the image 7, the worker 3 measures the length of that object, and the distance correction value is calculated based on the measured distance.
  • In the second embodiment, by contrast, the worker 3 is made to designate two points separated by the reference distance in the real space, and the AR image generation unit 39 is made to recognize the distance between the two points.
  • A distance correction value is then calculated from the reference distance obtained in this way and the actual distance, that is, the distance between the two points in the real space.
  • Specifically, the worker 3 uses a tape measure or the like to measure a fixed distance in the real space, and designates the two points at both ends of that distance via the input unit 54 by an air-tap operation or the like.
  • The input unit 54 inputs, as a point designated by the worker 3, the position at which the image recognition unit 36 recognizes the worker's air-tap action.
  • The two points can thus be specified by having the image recognition unit 36 recognize the positions of the two points, that is, their three-dimensional coordinates. Alternatively, the worker 3 may, for example, photograph a long object of a fixed length and have the image recognition unit 36 recognize the two points at both ends of that object.
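Once the two designated points are available as three-dimensional coordinates, the distance the device recognizes between them is a plain Euclidean distance. A minimal sketch (the function name and coordinate values are illustrative, not from the disclosure):

```python
import math

def recognized_distance(point_a, point_b):
    """Euclidean distance between two 3-D points (x, y, z),
    e.g. the coordinates recognized for two air-tapped points."""
    return math.dist(point_a, point_b)

# Two points recognized at both ends of a 1 m span, expressed in the
# AR device's coordinate system (metres).
print(recognized_distance((0.0, 0.0, 2.0), (1.0, 0.0, 2.0)))  # 1.0
```
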
  • In S220, it is determined whether or not the worker 3 has designated two points separated by the fixed distance, and the process waits until the two points are designated.
  • The determination in S220 is made, for example, by determining whether or not the image recognition unit 36 has recognized two points.
  • Alternatively, the two points recognized by the image recognition unit 36 may be displayed identifiably in the image 7 by the AR image generation unit 39, and it may be determined that the two points have been input when the AR image generation unit 39 displays the two points.
  • When the two points have been designated, the process proceeds to S230, and the distance between the two points recognized by the image recognition unit 36 is acquired from the AR image generation unit 39. This distance is the distance between the two points as recognized by the AR device 5, and it is acquired in S230 as the reference distance.
  • In the following S240, a distance correction value is calculated based on the reference distance acquired in S230 and the actual distance between the two points designated by the worker 3 in the real space. Note that the process of S240 functions as the correction value setting unit 56, like S140 of the first embodiment.
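The disclosure does not fix the exact form of the distance correction value; one plausible formulation, consistent with the correction-coefficient multiplication described earlier, is the ratio of the actual distance to the device-recognized reference distance. The sketch below is written under that assumption (all names are illustrative):

```python
def distance_correction_value(reference_distance, actual_distance):
    """Multiplicative correction value: scaling a device-recognized
    distance by this value yields the real-space distance.
    (One plausible formulation; the disclosure leaves the exact
    form of the correction value open.)"""
    return actual_distance / reference_distance

# The AR device recognized the two designated points as 0.98 m apart,
# while the worker's tape measure shows they are actually 1.00 m apart.
value = distance_correction_value(0.98, 1.00)
print(value * 0.98)  # the recognized distance, corrected back toward 1.00
```
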
  • After the distance correction value is calculated, the process proceeds to S250, in which the AR image generation unit 39 corrects the distance data for installing the adjustment equipment acquired from the server 201 or the PC 202, and the accuracy correction process ends.
  • As described above, in the second embodiment, the worker 3 is made to input two points separated by the reference distance in the real space, and the AR image generation unit 39 is made to recognize the distance between the two points. Since the reference distance recognized in this way includes an error specific to the AR device 5, a distance correction value that cancels the error is calculated based on the reference distance and the actual distance in the real space.
  • The accuracy correction unit 50 of the second embodiment can therefore obtain the same effects as the first embodiment.
  • The third embodiment has the same basic configuration as the first embodiment, so only the differences are described below.
  • The same reference numerals as in the first embodiment indicate the same configurations; refer to the preceding description for these.
  • In the first embodiment, a verification object of the reference distance is displayed in the image 7, the worker 3 measures the length of that object, and the distance correction value is calculated based on the measured distance.
  • In the third embodiment, by contrast, the AR image generation unit 39 displays, in the image 7, a verification object of the reference distance together with numerical information indicating the length of the reference distance. Note that this process corresponds to an example of the display control unit of the present disclosure.
  • Then, while viewing the image 7, the worker 3 measures the length of the verification object in the real space using a tape measure or the like, and inputs, via the input unit 54, the difference between the actual distance obtained by the measurement and the reference distance indicated by the numerical information displayed in the image 7, as error information of the reference distance.
  • In S330, the distance correction value is calculated based on the reference distance error information input from the input unit 54, and the process proceeds to S340. Note that the process of S330 corresponds to an example of the correction value setting unit of the present disclosure.
  • In S340, the AR image generation unit 39 corrects the distance data for installing the adjustment equipment acquired from the server 201 or the PC 202 in the same procedure as in S150 and S250 described above, and the accuracy correction process ends.
  • As described above, in the third embodiment, the verification object of the reference distance and numerical information representing its length are displayed in the image 7, the worker 3 measures the length of the verification object, and the difference between the actual distance obtained by the measurement and the reference distance is input. This difference is the difference between the reference distance recognized by the AR device 5 and the actual distance in the real space, that is, a distance recognition error unique to the AR device 5, and the distance correction value is calculated based on it.
  • The accuracy correction unit 50 of the third embodiment can therefore obtain the same effects as the first embodiment.
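In this third variant, only the error (actual distance minus reference distance) is entered by the worker. Under one plausible assumption (a ratio-style correction value, which the disclosure does not mandate), a correction value can be derived from the entered error and the displayed reference distance; the sketch below uses that assumption and illustrative names throughout:

```python
def correction_value_from_error(reference_distance, error):
    """Derive a multiplicative correction value from the error entered
    by the worker, where error = actual_distance - reference_distance.
    (Assumes a ratio-style correction value; the disclosure leaves the
    exact form of the correction value open.)"""
    actual_distance = reference_distance + error
    return actual_distance / reference_distance

# The image shows a reference distance of 1.00 m, but the worker's tape
# measure reads 1.02 m, so the entered error is +0.02 m.
value = correction_value_from_error(1.00, 0.02)
print(value)
```
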
  • In the above embodiments, the input unit 54 has been described as inputting information entered by the worker 3 through voice input, key operation, a movement of the worker 3 himself, or the like.
  • However, the input unit 54 may be further configured to input information transmitted from the external device 58 shown in FIG.
  • In this case, the external device 58 may be a mobile terminal such as a smartphone or a tablet terminal, or a fixed terminal such as a PC.
  • In the above embodiments, the AR device 5 has been described as being of a goggle type, but, for example, a mobile terminal such as a tablet terminal equipped with a camera and a display unit may be provided with the functions of the AR device.
  • In the above embodiments, the accuracy correction unit 50 has been described as being incorporated into the AR device 5 as one of its functions.
  • However, the accuracy correction unit 50 may be configured as an accuracy correction device separate from the AR device 5 and connected to the AR device 5 so as to correct display errors unique to the AR device 5.
  • In the above embodiments, the AR device 5 has been described as being used to present the installation position of the adjustment equipment used for adjustment work on the on-vehicle sensors mounted on a vehicle.
  • However, the accuracy correction device of the present disclosure can be applied, in the same manner as in the above embodiments, to any AR device configured to place an object at a reference point a predetermined distance away from specific coordinates in the space captured by the camera.
  • In the above embodiments, the functions of the display control unit 52 and the correction value setting unit 56 in the accuracy correction unit 50 have been described as being realized by the CPU of the computer constituting the AR device 5 or the accuracy correction unit 50 executing a program.
  • However, these functions of the accuracy correction unit 50 may be realized by a combination of a computer program and dedicated hardware logic circuits, or by one or more dedicated hardware logic circuits alone.
  • A plurality of functions possessed by one component in the above embodiments may be realized by a plurality of components, and a function possessed by one component may be realized by a plurality of components. Likewise, a plurality of functions possessed by a plurality of components may be realized by one component, and a function realized by a plurality of components may be realized by one component. Part of the configuration of the above embodiments may be omitted. Moreover, at least part of the configuration of one of the above embodiments may be added to, or substituted for, the configuration of another of the above embodiments.
  • Besides the accuracy correction device and augmented reality image display device described above, the present disclosure can also be realized in various forms, such as a system including the accuracy correction device or the augmented reality image display device as a component, a program for causing a computer to function as the accuracy correction device or the augmented reality image display device, a non-transitory tangible recording medium such as a semiconductor memory storing this program, an accuracy correction method, or an augmented reality image display method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Image Analysis (AREA)

Abstract

This accuracy correction device 50 is provided in an augmented reality image display device 5 and comprises an input unit 54 and a correction value setting unit 56. The augmented reality image display device recognizes a space in a captured video and generates an augmented reality image by placing an object in the video using, as a reference point, coordinates separated by a prescribed distance from specific coordinates in the space. The input unit 54 is configured to input distance information indicating the actual distance between two specific points in real space, and the correction value setting unit sets a correction value for the prescribed distance based on the distance information input from the input unit and a reference distance between the two points as recognized by the augmented reality image display device.
PCT/JP2022/013665 2021-03-26 2022-03-23 Accuracy correction device and augmented reality image display device WO2022202926A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021053811A JP2022150965A (ja) 2021-03-26 2021-03-26 Accuracy correction device and augmented reality image display device
JP2021-053811 2021-03-26

Publications (1)

Publication Number Publication Date
WO2022202926A1 true WO2022202926A1 (fr) 2022-09-29

Family

ID=83395663

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/013665 WO2022202926A1 (fr) 2021-03-26 2022-03-23 Accuracy correction device and augmented reality image display device

Country Status (2)

Country Link
JP (1) JP2022150965A (fr)
WO (1) WO2022202926A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019121322A (ja) * 2018-01-11 2019-07-22 株式会社デンソー 設置位置情報提供装置及び設置位置情報提供方法
JP2020008917A (ja) * 2018-07-03 2020-01-16 株式会社Eidea 拡張現実表示システム、拡張現実表示方法、及び、拡張現実表示用コンピュータプログラム


Also Published As

Publication number Publication date
JP2022150965A (ja) 2022-10-07

Similar Documents

Publication Publication Date Title
CN109690623B (zh) 用于识别场景中的相机的姿势的系统和方法
JP4763250B2 (ja) 物体検出装置
JP6458439B2 (ja) 車載カメラ較正装置、画像生成装置、車載カメラ較正方法、画像生成方法
EP3070675B1 (fr) Processeur d'image, dispositif de prise de vue, programme, système de commande d'appareil et appareil
US20200264011A1 (en) Drift calibration method and device for inertial measurement unit, and unmanned aerial vehicle
JP4809019B2 (ja) 車両用障害物検出装置
US9738223B2 (en) Dynamic guideline overlay with image cropping
US11482003B2 (en) Installation position information providing apparatus and installation position information providing method
US7697029B2 (en) Image display apparatus and method
US8594378B2 (en) 3D object detecting apparatus and 3D object detecting method
US20080170122A1 (en) Image processor, driving assistance system, and out-of-position detecting method
KR20160056129A (ko) 주변차량의 위치정보 보정 시스템 및 방법
US20130322697A1 (en) Speed Calculation of a Moving Object based on Image Data
US20200090517A1 (en) Parking space detection apparatus
JPWO2017145541A1 (ja) 移動体
JPWO2016113875A1 (ja) 課金位置評価用情報提供システム
JP6669182B2 (ja) 乗員監視装置
EP1662440A1 (fr) Procède pour la détermination de la position d'un objet en utilisant une image numérique
JP2018084503A (ja) 距離測定装置
KR20160050439A (ko) 차량용 후방 카메라의 영상 보정 방법
WO2022202926A1 (fr) Accuracy correction device and augmented reality image display device
KR102121287B1 (ko) 카메라 시스템 및 카메라 시스템의 제어 방법
JP2010122045A (ja) Pcsセンサの電波軸調整装置及びその方法
US20200096606A1 (en) Vehicle inspection system and vehicle inspection method
JPWO2017042995A1 (ja) 車載用ステレオカメラ装置、およびその補正方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22775708

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22775708

Country of ref document: EP

Kind code of ref document: A1