WO2022202926A1 - Accuracy correction device and augmented reality image display device - Google Patents

Accuracy correction device and augmented reality image display device Download PDF

Info

Publication number
WO2022202926A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
unit
augmented reality
image
input
Prior art date
Application number
PCT/JP2022/013665
Other languages
French (fr)
Japanese (ja)
Inventor
徹哉 土居
友典 川上
Original Assignee
株式会社デンソー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社デンソー filed Critical 株式会社デンソー
Publication of WO2022202926A1 publication Critical patent/WO2022202926A1/en

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras

Definitions

  • The present disclosure relates to an accuracy correction device for an augmented reality image display device, and to an augmented reality image display device.
  • Augmented reality image display devices are known that virtually extend the world in front of the user by displaying virtual objects overlaid on images captured by a camera.
  • In such devices, the display position of an object within the augmented reality image is set in advance based on the distance from a specific position in the physical space.
  • That is, a specific position recognized in the captured space is set as specific coordinates, and a position a predetermined distance away from those specific coordinates is set as the reference point for object display.
  • One aspect of the present disclosure aims to suppress the deterioration in display position accuracy of an object, caused by individual differences among devices, in an augmented reality image display device that displays a virtual object overlaid on an image captured by a camera.
  • An accuracy correction device is provided in an augmented reality image display device.
  • This augmented reality image display device recognizes the space of the image captured by the imaging unit, generates an augmented reality image by placing an object in the image using, as a reference point, coordinates a predetermined distance away from specific coordinates in that space, and displays it on a display unit.
  • The accuracy correction device of the present disclosure includes an input unit and a correction value setting unit.
  • The input unit receives distance information representing the actual distance between two specific points in the real space.
  • The correction value setting unit sets a correction value for the predetermined distance based on the distance information input from the input unit and the reference distance between the two points as recognized by the augmented reality image display device.
  • In this way, a correction value reflecting the individual difference of the device can be set.
  • The augmented reality image display device uses the set correction value to correct the predetermined distance used for setting the reference point, and can thereby display the object at the proper position where it should originally be displayed in the augmented reality image.
  • According to the accuracy correction device of the present disclosure, it is possible to improve the display position accuracy of the object in the augmented reality image display device.
  • In another aspect of the present disclosure, an augmented reality image display device includes an imaging unit, a space recognition unit, an image generation unit, and a display unit.
  • The imaging unit captures images of the surroundings, and the space recognition unit recognizes the space of the images captured by the imaging unit.
  • The image generation unit generates an augmented reality image by arranging an object in the video, using as a reference point coordinates a predetermined distance from the specific coordinates in the space recognized by the space recognition unit.
  • The display unit displays the augmented reality image generated by the image generation unit.
  • The augmented reality image display device of the present disclosure further includes an input unit and a correction value setting unit, similar to the accuracy correction device described above. The image generation unit corrects the predetermined distance based on the correction value set by the correction value setting unit, and sets the reference point as the display position of the object.
  • According to the augmented reality image display device of the present disclosure, it is possible to display an object at the appropriate position where it should originally appear in the augmented reality image, improving the display position accuracy of the object.
  • FIG. 1 is a block diagram showing the configuration of an augmented reality image display device
  • FIG. 4 is an explanatory diagram showing an example of specific coordinates and reference lines specified in the space around the vehicle
  • FIG. 10 is an explanatory diagram showing a method of setting a reference point, which is the display position of an object
  • FIG. 4 is a flowchart showing accuracy correction processing executed by an accuracy correction unit
  • FIG. 9 is a flowchart showing accuracy correction processing of the second embodiment
  • The augmented reality image display device (hereinafter referred to as the AR machine) 5 of this embodiment is used in an in-vehicle sensor adjustment support system that assists the adjustment work of in-vehicle sensors required for collision damage mitigation braking, and serves to provide the worker 3 with the installation positions of the adjustment equipment.
  • The AR machine 5 of the present embodiment is a so-called goggle-type display device that is worn on the head of the worker 3, the user, and displays an augmented reality image (hereinafter referred to as an AR image) in front of the worker 3.
  • AR is an abbreviation for "Augmented Reality".
  • The vehicle-mounted sensor adjustment support system includes a server 201, a personal computer (PC) 202, a relay device 203, and the AR machine 5.
  • The server 201 and the PC 202 are connected to the communication network 200.
  • the relay device 203 relays wireless data communication, and is capable of wireless data communication with the communication network 200 .
  • the PC 202 and the AR device 5 have a wireless communication function, and can wirelessly communicate data with the relay device 203 . Therefore, the AR machine 5, the server 201 and the PC 202 can perform data communication with each other via the relay device 203 or the communication network 200.
  • the server 201 stores vehicle type information for each vehicle including vehicle 1.
  • the server 201 stores object data indicating an installation position object for each type of adjustment equipment.
  • the installation position object is an image indicating the installation position of the adjustment facility.
  • the server 201 also stores various work-related information (for example, work procedures) related to adjustment work for each type of adjustment equipment.
  • the PC 202 can download and save vehicle type information, object data, and work-related information from the server 201 . It may be possible to upload vehicle type information, installation position objects, and work-related information from the PC 202 to the server 201 .
  • the AR machine 5 can download and acquire vehicle type information, installation position objects, and work-related information from the server 201 or PC 202 .
  • the AR machine 5 displays in real time an image 7 captured by a camera 32 (see FIG. 2) as a capturing unit.
  • the video 7 is a video of the space in front of the worker 3.
  • the worker 3 puts the vehicle 1 on the floor surface 2 of the predetermined work area and performs adjustment work for various in-vehicle sensors provided on the vehicle 1 .
  • This adjustment work is performed by installing an adjustment facility corresponding to the vehicle-mounted sensor to be adjusted at a predetermined position outside the vehicle 1 and using the adjustment facility.
  • the in-vehicle sensor adjustment support system of this embodiment is used when performing this adjustment work. That is, the in-vehicle sensor adjustment support system provides the worker 3 with the installation position of the adjustment facility via the AR machine 5 so that the worker 3 can easily and efficiently install the adjustment facility.
  • The AR machine 5 displays the image 7 and, in response to an operation by the worker 3, additionally displays an installation position object at the position in the image 7 where the adjustment equipment should be installed.
  • The AR machine 5 has a vision-based augmented reality function. By means of this function, the AR image is displayed on the display unit 34 by superimposing the installation position object on the position in the actual image 7 captured by the camera 32 where the adjustment equipment should be installed.
  • The vehicle 1 of the present embodiment is provided with a front camera 16, a millimeter wave radar sensor 17, a right rear radar (not shown), a left rear radar (not shown), and the like as in-vehicle sensors for collision damage reduction.
  • the front camera 16 is provided in front of the passenger compartment of the vehicle 1 and photographs the front of the vehicle 1 . Images captured by the front camera 16 are used in various driving support systems such as a collision prevention support system and a lane departure prevention support system mounted on the vehicle 1, for example.
  • the millimeter wave radar sensor 17 is provided behind the front bumper of the vehicle 1, for example.
  • the millimeter wave radar sensor 17 radiates radio waves in the millimeter wave band, and receives the reflected waves of the radiated radio waves that are reflected by a target in front of the vehicle 1, thereby detecting the position and distance of the target.
  • the results of detection by the millimeter wave radar sensor 17 are used in various driving support systems such as a collision prevention support system and an obstacle warning system mounted on the vehicle 1, for example.
  • the right rear radar and left rear radar basically have the same configuration as the millimeter wave radar sensor 17 .
  • the right rear radar detects the position and distance of a target existing on the right rear of the vehicle 1
  • the left rear radar detects the position and distance of a target existing on the left rear of the vehicle 1 .
  • the right rear radar and the left rear radar are used for various driving support systems such as a blind spot monitor system mounted on the vehicle 1, for example.
  • When adjusting the optical axis of the front camera 16, the worker 3 installs three target plates at predetermined positions in front of the vehicle 1. The three target plates are then photographed by the front camera 16, and the optical axis of the front camera 16 is adjusted appropriately based on the captured images of the three target plates.
  • For the millimeter wave radar sensor 17 and the left and right rear radars, the angle axis (that is, the radiation angle of the radar wave) of each of these sensors must be adjusted appropriately.
  • a reflector is used for adjustment of the angle axis.
  • the worker 3 installs a reflector at a predetermined position in front of the vehicle 1 when adjusting the angle axis of the millimeter wave radar sensor 17 . Then, the millimeter wave radar sensor 17 emits a radar wave, receives a reflected wave of the radar wave reflected by a reflector, and appropriately adjusts the angle axis based on the received reflected wave. Also, when adjusting the angle axes of the left and right rear radars, the angle axes are adjusted in the same manner by installing a reflector at a predetermined position for each rear radar.
  • To adjust the optical axis of the front camera 16 appropriately, each of the three target plates must be properly installed at its predetermined position. Likewise, in order to properly adjust the angle axes of the millimeter wave radar sensor 17 and the left and right rear radars, reflectors must be properly installed at the predetermined positions for each of these sensors.
  • The AR machine 5 is configured to provide the worker 3 with the installation positions of the three target plates and the reflectors when these adjustment operations are performed. For example, when adjusting the angle axis of the millimeter wave radar sensor 17, the AR machine 5 displays the installation position object at the position in the image 7 captured by the camera 32 where the reflector should be installed, as shown in FIG.
  • The AR machine 5 includes a space recognition sensor 31, a camera 32, a self-position recognition sensor 33, and a display unit 34. Each of these units is configured in the same manner as in the AR machine described in Patent Document 1.
  • The space recognition sensor 31 is a sensor that acquires information indicating the various objects existing in front of the AR machine 5; it detects the space in front of the AR machine 5 three-dimensionally and outputs spatial information as the detection result.
  • the camera 32 captures an image in front of the AR device 5 at a specific frame rate and outputs it as captured data. Further, the self-position recognition sensor 33 detects the position of the AR machine 5 itself in the three-dimensional space, more specifically, for example, the orientation (angle), and outputs self-position information as the detection result.
  • the display unit 34 is configured to be able to display an image, and displays the video indicated by the photographed data output from the camera 32, that is, the video 7. Also, the display unit 34 displays various objects such as the installation position object 112 shown in FIG.
  • the display unit 34 is mounted on the AR machine 5 so that the operator 3 who wears the AR machine 5 appropriately on his head can visually recognize the image 7 including various objects.
  • the AR device 5 further includes a space recognition unit 35, an image recognition unit 36, a self-position recognition unit 37, a wireless communication unit 38, an AR image generation unit 39, and an accuracy correction unit 50.
  • the space recognition unit 35 periodically acquires space information detected by the space recognition sensor 31 and analyzes the space information in real time. Then, based on the analysis result, the space in front of the AR device 5 is three-dimensionally recognized in real time.
  • the image recognition unit 36 recognizes a specific image in the captured image 7 based on the captured data output from the camera 32 .
  • feature points are extracted from the image 7, and various images can be recognized based on the extracted feature points.
  • Images that can be recognized by the image recognition unit 36 include, for example, emblems, license plates, and specific markers provided at the front and rear ends of various vehicles.
  • the vehicle 1 of this embodiment is provided with a front emblem 11 and a front number plate 13 at the front end, and a rear emblem 12 and a rear number plate 14 at the rear end.
  • the image recognition unit 36 can individually recognize the emblems 11 and 12 and the license plates 13 and 14 of the vehicle 1 .
  • Information on various images that can be recognized by the image recognition unit 36 may be stored in the memory of the AR machine 5 in advance, or may be downloaded from the server 201 or the PC 202 .
  • the image recognition unit 36 can continue to recognize the position of the once recognized image by the tracking function, even if the position or angle of the camera 32 changes and the captured image changes.
  • the self-position recognition unit 37 recognizes the direction (angle) of the AR device 5 based on the self-position information output from the self-position recognition sensor 33 .
  • The space recognition unit 35, the image recognition unit 36, and the self-position recognition unit 37 output the space recognition information, the image recognition information, and the self-position recognition information, which are their respective recognition results, to the AR image generation unit 39.
  • the wireless communication unit 38 is for performing wireless communication with the server 201 or the PC 202 via the relay device 203, and is connected to the AR image generation unit 39.
  • the AR image generation unit 39 causes the display unit 34 to display the image 7 based on the shooting data input from the image recognition unit 36 .
  • the AR image generation unit 39 also acquires vehicle type information, object data, work-related information, etc. of the vehicle to be adjusted from the server 201 or the PC 202 via the wireless communication unit 38 .
  • The AR image generation unit 39 calculates the installation position of the adjustment equipment based on the information obtained via the wireless communication unit 38 and the information input from the space recognition unit 35, the image recognition unit 36, and the self-position recognition unit 37, and displays the installation position object in the image 7.
  • the AR image generation unit 39 can also cause the display unit 34 to display work-related information in accordance with a user's operation.
  • the AR machine 5 is equipped with a computer including a CPU and memory.
  • the memory is a semiconductor storage device such as ROM, RAM, and flash memory.
  • the memory stores a program for causing the computer to function as a space recognition section 35, an image recognition section 36, a self-position recognition section 37, and an AR image generation section 39.
  • The functions of the space recognition unit 35, the image recognition unit 36, the self-position recognition unit 37, and the AR image generation unit 39 described above are realized by the CPU executing the programs stored in the memory. However, some or all of these functions may be implemented using one or more pieces of hardware instead of by software processing.
  • When displaying the installation position object in the image 7 on the display unit 34, the AR image generation unit 39 calculates the installation position of the adjustment equipment.
  • the procedure for calculating the installation position will be described by taking the case of calculating the installation position of the reflector in order to adjust the radiation angle (angular axis) of the radar wave of the millimeter wave radar sensor 17 as an example. Note that the procedure described below is described in detail in Patent Document 1, so a detailed description thereof will be omitted here.
  • The AR image generator 39 three-dimensionally recognizes the space in front of the camera 32 based on the spatial information detected by the space recognition sensor 31 and the self-position information output from the self-position recognition sensor 33, and recognizes the floor surface 2 in that space. When the floor surface 2 is recognized, the relative positional relationship between the recognized floor surface 2 and the position of the AR machine 5 in the three-dimensional coordinate space is calculated.
  • the AR image generation unit 39 acquires the positions of the front emblem 11 and the rear emblem 12 recognized by the image recognition unit 36 as three-dimensional coordinates as the worker 3 moves around the vehicle 1 .
  • The AR image generator 39 draws perpendiculars 21, 22 from the positions of the emblems 11, 12 to the floor surface 2, finds the intersections 23, 24 between the perpendiculars 21, 22 and the floor surface 2, and calculates the line passing through the intersections 23 and 24 as the center line 25 of the vehicle 1.
  • FIG. 3 shows an example of the image 7 in which the perpendicular lines 21, 22 and the center line 25 are superimposed.
  • the center line 25 and intersections 23 and 24 calculated in this way are used to set the installation position of the adjustment equipment.
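The geometric construction above can be sketched as follows. This is an illustrative reading only, assuming for simplicity that the recognized floor surface 2 coincides with the z = 0 plane of the device's coordinate frame; the function and variable names are hypothetical, not part of the disclosure.

```python
import numpy as np

def center_line_from_emblems(front_emblem, rear_emblem):
    """Drop perpendiculars from the emblem positions to the floor plane
    (assumed here to be z = 0) and derive the vehicle center line 25."""
    pf = np.asarray(front_emblem, dtype=float)
    pr = np.asarray(rear_emblem, dtype=float)
    # Feet of the perpendiculars 21 and 22: intersections 23 and 24.
    p23 = np.array([pf[0], pf[1], 0.0])
    p24 = np.array([pr[0], pr[1], 0.0])
    # Unit vector along the line through the two intersections.
    direction = (p23 - p24) / np.linalg.norm(p23 - p24)
    return p23, p24, direction
```

In the actual device the floor plane is whatever plane the space recognition yields, so zeroing the height component stands in for a general point-to-plane projection.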
  • The three-dimensional coordinates of the intersection 23 in front of the vehicle are used as the specific coordinates that serve as a reference for the installation position.
  • The AR image generation unit 39 reads, from the vehicle type information of the vehicle 1 acquired from the server 201 or the PC 202, distance data representing the distance from the specific coordinates as the installation data for the reflector, which is the adjustment equipment for the millimeter wave radar sensor 17.
  • This distance data and specific coordinates are set in advance for each vehicle type and on-vehicle sensor to be adjusted, and the AR image generation unit 39 determines the installation position of the adjustment equipment based on this information.
  • the AR image generator 39 displays the image of the reflector obtained from the object data as the installation position object 112 at the set reference point P1, as indicated by the two-dot chain line in FIG.
  • The reference point P1 for displaying the installation position object 112 is set as a three-dimensional coordinate a predetermined distance along the center line 25 from the specific coordinates, which are the three-dimensional coordinates of the intersection 23. The relative positional relationship between the reference point P1 and the vehicle 1 is therefore constant. Accordingly, when the worker 3 turns or moves, the display position, angle, size, and so on of the installation position object 112 in the image 7 change along with the change in direction and position.
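Setting the reference point P1 then reduces to offsetting the specific coordinates by the predetermined distance along the center line. A minimal sketch of this construction, assuming 3-D coordinate arrays and a unit direction vector along the center line (the names are hypothetical):

```python
import numpy as np

def reference_point(specific_coords, direction, predetermined_distance):
    """Reference point P1: the point a predetermined distance away from the
    specific coordinates (intersection 23) along the center line 25."""
    p = np.asarray(specific_coords, dtype=float)
    d = np.asarray(direction, dtype=float)  # unit vector along center line 25
    return p + predetermined_distance * d
```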
  • In this way, the installation position of the adjustment equipment used for the adjustment work of the in-vehicle sensors mounted on the vehicle 1 is set based on the reference point P1, which is a predetermined distance away from the specific coordinates identified from the feature points in the image captured by the camera 32.
  • However, the reference point P1 may deviate from the normal position P0, which is a predetermined distance away from the specific coordinates. Since this deviation is due to individual differences among AR machines 5, it cannot be eliminated even by correcting the distance data stored in the server 201 or the PC 202.
  • the AR machine 5 is provided with the accuracy correction unit 50 shown in FIG.
  • the accuracy correction unit 50 corresponds to an example of the accuracy correction device of the present disclosure, and includes a display control unit 52, an input unit 54, and a correction value setting unit 56 in this embodiment.
  • the display control unit 52 causes the video 7 displayed on the display unit 34 by the AR image generation unit 39 to display a verification object representing the reference distance.
  • The display position of the verification object is set, within the space recognized by the AR image generation unit 39, at a position where the worker 3 can easily measure the length of the verification object using a tape measure or the like.
  • the display position of the verification object may be set at a position a predetermined distance away from the worker 3 so that the length can be measured by the worker 3 just by reaching out.
  • Alternatively, the surface of a target recognized by the AR image generation unit 39, such as the body of the vehicle 1 or the floor surface 2 in the image 7, may be set as the display position.
  • the verification object displayed at the display position may be any object that allows the worker 3 to measure the reference distance while viewing the image 7.
  • a linear object whose length is the reference distance may be displayed as a verification object.
  • The input unit 54 allows the worker 3 to input information, such as the length of the verification object displayed on the display unit 34, by voice input, key operation, or movement of the worker 3 himself.
  • the input by the movement of the worker 3 himself/herself is realized, for example, by the worker 3 moving a finger or the like in front of the camera 32 and the image recognition unit 36 recognizing the movement as input information.
  • the input unit 54 inputs the recognition result by the image recognition unit 36 .
  • Thus, the worker 3 can measure the length of the verification object in the real space while viewing the image 7, and input the measured distance to the accuracy correction unit 50.
  • The correction value setting unit 56 recognizes the measured distance input via the input unit 54 as the actual distance obtained by actually measuring the reference distance, and sets a correction value (distance correction value) for the distance recognized by the AR machine 5 based on the recognized actual distance and the reference distance.
  • This distance correction value is for correcting the distance data used to set the installation positions of the various adjustment equipment, such as the reflector and the three target plates described above. It is therefore calculated as a correction coefficient that can serve as a common correction value for the various distance data acquired from the server 201 or the PC 202: multiplying each piece of distance data by the coefficient corrects it appropriately.
  • When the correction coefficient is applied, the distance data is corrected from the predetermined distance L1 to the predetermined distance L0.
  • As a result, the reference point that defines the display position of the installation position object 112 is set to the reference point P0, which is separated from the pre-correction reference point P1 by the correction distance dL, and the installation position object 112 can be displayed at the regular installation position of the reflector.
  • the distance correction value does not necessarily have to be a correction coefficient, and a correction distance for distance data may be set for each verification object.
  • When the accuracy correction process starts, the CPU first executes, in S110, the processing as the display control unit 52, causing the AR image generation unit 39 to display the reference distance verification object in the image 7 currently being displayed.
  • The worker 3 then measures the length of the verification object in the real space using a tape measure or the like while viewing the image 7 displayed on the display unit 34, and inputs the measurement result through the input unit 54.
  • the CPU waits for the input of the measurement result of the reference distance from the input unit 54 in S120, and when the measurement result is input, the process proceeds to S130.
  • In S130, the measurement result input from the input unit 54 is recognized as the actual distance of the reference distance, and in the subsequent S140, the processing as the correction value setting unit 56 is executed to calculate the distance correction value based on the actual distance and the reference distance.
  • this distance correction value is a correction coefficient for correcting the distance recognized by the AR device 5, and is calculated by dividing the actual distance by the reference distance according to the following equation, for example.
  • Correction coefficient = actual distance / reference distance
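As a numeric illustration of S140 and the subsequent correction of stored distance data, with hypothetical values in millimeters (the function names are for illustration and are not from the disclosure):

```python
def distance_correction_coefficient(actual_distance, reference_distance):
    # S140: coefficient for correcting distances recognized by the AR machine
    return actual_distance / reference_distance

def apply_correction(distance_data, coefficient):
    # S150: multiply each stored installation distance by the common coefficient
    return {name: d * coefficient for name, d in distance_data.items()}

# Example: the worker measures 2020 mm against a 2000 mm reference distance.
c = distance_correction_coefficient(2020.0, 2000.0)      # 1.01
corrected = apply_correction({"reflector": 1000.0}, c)   # {'reflector': 1010.0}
```

Because the coefficient is common to all distance data, a single measurement of the verification object corrects every installation distance acquired from the server 201 or the PC 202.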
  • After calculating the distance correction value, the CPU proceeds to S150.
  • In S150, the AR image generation unit 39 corrects the distance data for installation of the adjustment equipment, which is acquired from the server 201 or the PC 202 for each vehicle type and each in-vehicle sensor and stored in the memory, and the accuracy correction process ends.
  • the AR image generation unit 39 may correct various distance data stored in the memory.
  • Alternatively, the various distance data may be read from the AR image generation unit 39, corrected by multiplying by the correction coefficient, and then returned to the AR image generation unit 39.
  • the AR machine 5 of the present embodiment is provided with the accuracy correction unit 50 as the accuracy correction device of the present disclosure.
  • The accuracy correction unit 50 displays the verification object of the reference distance in the image 7 displayed on the display unit 34, has the worker 3 measure the length of the verification object in the real space, and calculates a distance correction value from the actual distance obtained as the measurement result and the reference distance.
  • As a result, the distance data used when the AR image generation unit 39 displays the installation position object of the adjustment equipment in the image 7 can be corrected using the distance correction value.
  • the installation position of the adjustment facility can be accurately presented to the worker 3 using the installation position object 112 when performing the adjustment work of various on-vehicle sensors mounted on the vehicle.
  • According to the present embodiment, it is possible to suppress the influence of the display errors that occur for each individual AR machine 5 and to improve the display accuracy of the installation position object.
  • In the first embodiment, a reference distance verification object is displayed in the image 7 currently being displayed, the worker 3 measures the length of that object, and the distance correction value is calculated based on the measured distance.
  • In the second embodiment, the worker 3 is made to specify two points separated by the reference distance in the real space, and the AR image generation unit 39 is made to recognize the distance between those two points.
  • a distance correction value is calculated from the reference distance thus obtained and the actual distance, which is the reference distance in the physical space.
  • specifically, the worker 3 uses a tape measure or the like to measure a certain distance in the real space and designates the two points at both ends of that distance via the input unit 54, by air tapping or the like.
  • when the image recognition unit 36 recognizes the air-tapping action by the worker 3, the input unit 54 inputs that position as the point designated by the worker 3.
  • the two points can thus be specified by having the image recognition unit 36 recognize their positions, that is, their three-dimensional coordinates. Alternatively, the worker 3 may, for example, photograph a long object of a known length and have the image recognition unit 36 recognize the two points at both ends of that object.
  • in S220, it is determined whether or not the worker 3 has designated two points separated by a certain distance, and the process waits until two points are designated.
  • the determination in S220 is made, for example, by determining whether or not two points have been recognized by the image recognition unit 36.
  • alternatively, the two points recognized by the image recognition unit 36 may be displayed identifiably in the image 7, and it may be determined that two points have been input when the AR image generation unit 39 displays the two points.
  • when two points have been designated, the process proceeds to S230, and the distance between the two points recognized by the image recognition unit 36 is acquired from the AR image generation unit 39. This distance is the distance between the two points as recognized by the AR machine 5, and in S230 it is acquired as the reference distance.
  • in S240, a distance correction value is calculated based on the reference distance acquired in S230 and the actual distance between the two points designated by the worker 3 in the physical space. Note that the process of S240 functions as the correction value setting unit 56, like S140 of the first embodiment.
  • the process then proceeds to S250, in which the AR image generation unit 39 corrects the distance data for installation of the adjustment equipment acquired from the server 201 or the PC 202, and the accuracy correction process ends.
  • as described above, in the second embodiment, the worker 3 is made to input two points separated by the reference distance in the real space, and the AR image generation unit 39 is made to recognize the distance between the two points. Since the recognized reference distance includes an error specific to the AR machine 5, a distance correction value for cancelling that error is calculated based on the reference distance and the actual distance in the real space.
  • the accuracy correction unit 50 of the second embodiment can also obtain the same effects as those of the first embodiment.
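The second embodiment's flow (S220 through S240) can be sketched as follows. The function names and the ratio-style correction value are illustrative assumptions, not the disclosure's prescribed implementation.

```python
import math

Point = tuple[float, float, float]  # 3-D coordinates recognized by the image recognition unit


def recognized_distance(p1: Point, p2: Point) -> float:
    """Distance between the two designated points as the AR machine sees them (cf. S230)."""
    return math.dist(p1, p2)


def distance_correction_value(actual_distance: float, reference_distance: float) -> float:
    """Correction value intended to cancel the AR-machine-specific error (cf. S240)."""
    return actual_distance / reference_distance


# Example: the worker measures 2.000 m with a tape measure and air-taps the two
# endpoints; the AR machine recognizes them at these (hypothetical) coordinates:
reference = recognized_distance((0.0, 0.0, 0.0), (1.96, 0.0, 0.0))
correction = distance_correction_value(2.000, reference)
```

Here the device under-recognizes the span (1.96 m instead of 2.00 m), so the resulting correction value is slightly greater than 1, stretching subsequently placed installation distances back toward the real-space scale.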
  • the third embodiment has the same basic configuration as the first embodiment, so the differences will be described below.
  • the same reference numerals as in the first embodiment denote the same components; refer to the preceding description for them.
  • in the first embodiment, a verification object at the reference distance is displayed in the image 7 currently being displayed, the worker 3 measures the length of that object, and the distance correction value is calculated based on the measured distance.
  • in the third embodiment, the AR image generation unit 39 displays, in the image 7 currently being displayed, a verification object at the reference distance together with numerical information indicating the length of the reference distance. Note that this process corresponds to an example of the display control unit of the present disclosure.
  • the worker 3 measures the length of the verification object in the real space with a tape measure or the like while viewing the image 7, and then inputs, via the input unit 54, the difference between the actual distance obtained by that measurement and the reference distance indicated by the numerical information displayed in the image 7, as error information for the reference distance.
  • in S330, the distance correction value is calculated based on the reference-distance error information input from the input unit 54, and the process proceeds to S340. Note that the processing of S330 corresponds to an example of the correction value setting unit of the present disclosure.
  • in S340, the AR image generation unit 39 corrects the distance data for installation of the adjustment equipment acquired from the server 201 or the PC 202 in the same procedure as in S150 and S250 described above, and the accuracy correction process ends.
  • as described above, in the third embodiment, the verification object at the reference distance and numerical information representing its length are displayed in the image 7 currently being displayed, the worker 3 is instructed to measure the length of the verification object, and the difference between the actual distance obtained by the measurement and the reference distance is input. This difference is the difference between the reference distance recognized by the AR machine 5 and the actual distance in the real space, that is, a distance recognition error unique to the AR machine 5, and the distance correction value is calculated so as to cancel this error.
  • the accuracy correction unit 50 of the third embodiment can also obtain the same effects as those of the first embodiment.
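In the third embodiment the worker inputs only the error, i.e. the difference between the measured actual distance and the displayed reference distance. A minimal sketch of turning that input into a correction value, under the same illustrative multiplicative assumption as above (names hypothetical):

```python
def correction_from_error(reference_distance: float, error: float) -> float:
    """Correction value derived from the worker-input error (cf. S330).

    The error is assumed here to be (actual - reference), so the actual
    distance is reference + error and the multiplicative correction value
    is their ratio."""
    actual = reference_distance + error
    return actual / reference_distance


# Example: the image 7 shows a verification object labelled 1.500 m, the worker
# measures 1.530 m in the real space and inputs the +0.030 m difference.
correction = correction_from_error(1.500, 0.030)
```

The sign convention for the error is an assumption; a device that over-renders distances would produce a negative difference and a correction value below 1.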
  • in the above embodiments, the input unit 54 has been described as inputting information entered by the worker 3 through voice input, key operation, movements of the worker 3, or the like.
  • however, the input unit 54 may be further configured to input information transmitted from the external device 58 shown in FIG.
  • the external device 58 may be a mobile terminal such as a smartphone or a tablet terminal, or a fixed terminal such as a PC.
  • in the above embodiments, the AR machine 5 was described as a goggle-type device, but, for example, a mobile terminal such as a tablet terminal equipped with a camera and a display unit may instead be provided with the functions of the AR machine.
  • in the above embodiments, the accuracy correction unit 50 has been described as being incorporated in the AR machine 5 as one of its functions.
  • however, the accuracy correction unit 50 may be configured as an accuracy correction device separate from the AR machine 5 and connected to the AR machine 5 so as to correct display errors unique to the AR machine 5.
  • in the above embodiments, the AR machine 5 was described as being used to present the installation position of the adjustment equipment used for adjustment work on an in-vehicle sensor installed in a vehicle.
  • however, the accuracy correction device of the present disclosure can be applied, in the same manner as in the above embodiments, to any AR machine configured to place an object at a reference point a predetermined distance away from specific coordinates in the space captured by the camera.
  • in the above embodiments, the functions of the display control unit 52 and the correction value setting unit 56 in the accuracy correction unit 50 were described as being realized by the CPU of the computer constituting the AR machine 5 or the accuracy correction unit 50 executing a program.
  • however, these functions of the accuracy correction unit 50 may be realized by a combination of a computer program and dedicated hardware logic circuits, or by one or more dedicated hardware logic circuits alone.
  • a plurality of functions possessed by one component in the above embodiments may be realized by a plurality of components, or one function possessed by one component may be realized by a plurality of components. Likewise, a plurality of functions possessed by a plurality of components may be realized by a single component, or one function realized by a plurality of components may be realized by a single component. Part of the configuration of the above embodiments may be omitted. At least part of the configuration of one of the above embodiments may also be added to, or substituted for, the configuration of another of the above embodiments.
  • the present disclosure can also be realized in various forms other than the above, such as a system having the accuracy correction device or the augmented reality image display device as a component, a program for causing a computer to function as the accuracy correction device or the augmented reality image display device, a non-transitory tangible recording medium such as a semiconductor memory storing this program, an accuracy correction method, or an augmented reality image display method.


Abstract

This accuracy correction device 50 is provided in an augmented reality image display device 5 and comprises an input unit 54 and a correction value setting unit 56. The augmented reality image display device recognizes the space of a captured video and generates an augmented reality image by placing an object in the video, using as a reference point coordinates spaced apart by a prescribed distance from specific coordinates in the space. The input unit 54 is configured to input distance information indicating the actual distance between two specified points in the real space, and the correction value setting unit sets a correction value for the prescribed distance on the basis of the distance information input from the input unit and the reference distance between the two points as recognized by the augmented reality image display device.

Description

Accuracy correction device and augmented reality image display device

Cross-reference to related applications
This international application claims priority based on Japanese Patent Application No. 2021-053811 filed with the Japan Patent Office on March 26, 2021, and the entire contents of Japanese Patent Application No. 2021-053811 are incorporated into this international application by reference.
The present disclosure relates to an accuracy correction device for an augmented reality image display device, and to an augmented reality image display device.
Augmented reality image display devices are known that virtually extend the world in front of the user by displaying virtual objects overlaid on video captured by a camera.
In such a device, as described in Patent Document 1, the display position of an object within the augmented reality image is set in advance based on the distance from a specific position in the real space.
Then, when the augmented reality image is actually displayed, a specific position recognized in the captured space is taken as specific coordinates, a position a predetermined distance away from those specific coordinates is taken as a reference point for object display, and the object is displayed at that reference point.
JP 2019-121322 A
However, as a result of the inventors' detailed studies, a problem was found with the device described in Patent Document 1: owing to variations in the characteristics of individual devices, that is, individual differences, the display position of an object in the augmented reality image deviates from the desired position defined in the real space.
One aspect of the present disclosure aims to suppress the deterioration in object display position accuracy caused by such individual differences in an augmented reality image display device that displays virtual objects overlaid on video captured by a camera.
An accuracy correction device according to a first aspect of the present disclosure is provided in an augmented reality image display device. This augmented reality image display device recognizes the space of the video captured by an imaging unit and, taking as a reference point coordinates that are a predetermined distance away from specific coordinates in that space, places an object in the video, thereby generating an augmented reality image and displaying it on a display unit.
The accuracy correction device of the present disclosure includes an input unit and a correction value setting unit. The input unit inputs distance information representing the actual distance between two specific points in the real space, and the correction value setting unit sets a correction value for the predetermined distance based on the distance information input from the input unit and the reference distance between the two points as recognized by the augmented reality image display device.
Therefore, according to the accuracy correction device of the present disclosure, when the reference distance between two points recognized by the augmented reality image display device deviates from the correct distance obtained from the distance information, a correction value for correcting that deviation can be set.
The augmented reality image display device can then use the set correction value to correct the predetermined distance used for setting the reference point, and can thereby display the object at the proper position where it should originally appear in the augmented reality image.
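As an illustrative sketch of this paragraph, the reference point could be set by scaling the predetermined distance with the correction value and stepping along a direction from the specific coordinates. The coordinate handling, the direction vector, and the multiplicative use of the correction value are assumptions made here; the disclosure does not fix a formula.

```python
import math

Vec3 = tuple[float, float, float]


def reference_point(specific: Vec3, direction: Vec3,
                    predetermined_distance: float,
                    correction_value: float) -> Vec3:
    """Place the object's reference point the corrected distance away from the
    specific coordinates, along the given direction."""
    norm = math.sqrt(sum(c * c for c in direction))
    unit = tuple(c / norm for c in direction)
    corrected = predetermined_distance * correction_value
    return tuple(s + u * corrected for s, u in zip(specific, unit))


# Example: an object nominally 3.0 m straight ahead of the specific coordinates,
# with a correction value of 0.98 set for this particular device.
pt = reference_point((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 3.0, 0.98)
```

With a correction value below 1, the object is placed slightly nearer than the nominal 3.0 m, compensating for a device that over-estimates distances.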
Thus, the accuracy correction device of the present disclosure can improve the display position accuracy of objects in the augmented reality image display device.
Next, an augmented reality image display device according to a second aspect of the present disclosure includes an imaging unit, a space recognition unit, an image generation unit, and a display unit.
The imaging unit captures video of the surroundings, and the space recognition unit recognizes the space of the video captured by the imaging unit. The image generation unit generates an augmented reality image by placing an object in the video, taking as a reference point coordinates a predetermined distance away from specific coordinates in the space recognized by the space recognition unit, and the display unit displays the augmented reality image generated by the image generation unit.
The augmented reality image display device of the present disclosure also includes an input unit and a correction value setting unit, like the accuracy correction device described above. The image generation unit corrects the predetermined distance based on the correction value set by the correction value setting unit, and sets the reference point that serves as the display position of the object.
Therefore, according to the augmented reality image display device of the present disclosure, objects can be displayed at the proper positions where they should originally appear in the augmented reality image, and the display position accuracy of objects can be improved.
FIG. 1 is an explanatory diagram showing the overall configuration of the in-vehicle sensor adjustment support system of the first embodiment.
FIG. 2 is a block diagram showing the configuration of the augmented reality image display device.
FIG. 3 is an explanatory diagram showing an example of the specific coordinates and reference lines identified in the space around the vehicle.
FIG. 4 is an explanatory diagram showing a method of setting the reference point that serves as the display position of an object.
FIG. 5 is a flowchart showing the accuracy correction process executed by the accuracy correction unit.
FIG. 6 is a flowchart showing the accuracy correction process of the second embodiment.
FIG. 7 is a flowchart showing the accuracy correction process of the third embodiment.
Hereinafter, embodiments of the present disclosure will be described with reference to the drawings.
[First embodiment]
(1) Overall Configuration
As shown in FIG. 1, the augmented reality image display device (hereinafter, AR machine) 5 of this embodiment is used in an in-vehicle sensor adjustment support system that assists adjustment work on the in-vehicle sensors required for collision damage mitigation braking, to provide the worker 3 with the installation positions of the adjustment equipment.
For this reason, the AR machine 5 of the present embodiment is configured as a so-called goggle-type display device that is worn on the head of the worker 3, its user, and displays an augmented reality image (hereinafter, AR image) in front of the worker 3's eyes. Note that "AR" is an abbreviation of "Augmented Reality".
The basic configuration of the in-vehicle sensor adjustment support system is the same as that of the system described in Patent Document 1. The system therefore includes a server 201, a personal computer (hereinafter, PC) 202, a relay device 203, and the AR machine 5.
Of these, the server 201 and the PC 202 are connected to a communication network 200. The relay device 203 relays wireless data communication and can exchange data wirelessly with the communication network 200. The PC 202 and the AR machine 5 each have a wireless communication function and can exchange data wirelessly with the relay device 203. The AR machine 5, the server 201, and the PC 202 can therefore communicate data with one another via the relay device 203 or the communication network 200.
The server 201 stores vehicle type information for each of various vehicles, including the vehicle 1. The server 201 also stores, for each type of adjustment equipment, object data representing an installation position object, which is an image indicating the installation position of that adjustment equipment. In addition, the server 201 stores, for each type of adjustment equipment, various work-related information (for example, work procedures) concerning the adjustment work.
The PC 202 can download vehicle type information, object data, and work-related information from the server 201 and store them. It may also be possible to upload vehicle type information, installation position objects, and work-related information from the PC 202 to the server 201.
The AR machine 5 can download and acquire vehicle type information, installation position objects, and work-related information from the server 201 or the PC 202. The AR machine 5 displays, in real time, the video 7 captured by the camera 32 (see FIG. 2) serving as the imaging unit. When the AR machine 5 is properly worn by the worker 3, the video 7 shows the space in front of the worker 3.
The worker 3 brings the vehicle 1 onto the floor surface 2 of a predetermined work area and performs adjustment work on the various in-vehicle sensors provided in the vehicle 1. This adjustment work is performed by installing the adjustment equipment corresponding to the in-vehicle sensor to be adjusted at a predetermined position outside the vehicle 1 and then using that equipment.
The in-vehicle sensor adjustment support system of this embodiment is used when performing this adjustment work. That is, the system provides the worker 3, via the AR machine 5, with the installation positions of the adjustment equipment so that the worker 3 can install the equipment easily and efficiently.
Specifically, the AR machine 5 displays the video 7 and, in response to an operation by the worker 3, further displays, at the position in the video 7 where the adjustment equipment should be installed, an installation position object indicating that installation position. In other words, the AR machine 5 has a vision-based augmented reality function and, using that function, displays an AR image on the display unit 34 by superimposing the installation position object on the actual video 7 captured by the camera 32, at the position where the adjustment equipment should be installed.
The vehicle 1 of this embodiment is provided with a front camera 16, a millimeter wave radar sensor 17, a right rear radar (not shown), a left rear radar (not shown), and the like, as in-vehicle sensors for collision damage mitigation.
The front camera 16 is provided at the front of the passenger compartment of the vehicle 1 and photographs the area in front of the vehicle 1. The images captured by the front camera 16 are used by various driving support systems mounted on the vehicle 1, such as a collision prevention support system and a lane departure prevention support system.
The millimeter wave radar sensor 17 is provided, for example, behind the front bumper of the vehicle 1. It radiates radio waves in the millimeter wave band and detects the position and distance of a target by receiving the reflected waves produced when the radiated waves strike targets in front of the vehicle 1. The detection results of the millimeter wave radar sensor 17 are used by various driving support systems mounted on the vehicle 1, such as a collision prevention support system and an obstacle warning system.
The right rear radar and the left rear radar basically have the same configuration as the millimeter wave radar sensor 17. The right rear radar detects the position and distance of targets present to the right rear of the vehicle 1, and the left rear radar detects the position and distance of targets present to the left rear of the vehicle 1. The right rear radar and the left rear radar are used by various driving support systems mounted on the vehicle 1, such as a blind spot monitor system.
For the front camera 16 to capture the appropriate range in front of the vehicle 1, its optical axis must be properly adjusted. The adjustment equipment used to adjust the optical axis is, for example, a set of three target plates.
When adjusting the optical axis of the front camera 16, the worker 3 installs the three target plates at their respective predetermined positions in front of the vehicle 1. The front camera 16 then photographs the three target plates, and its optical axis is properly adjusted based on the captured images of the three target plates.
Also, for the millimeter wave radar sensor 17 and the left and right rear radars to detect targets accurately, the angular axis of each of these sensors (that is, the radiation angle of the radar waves) must be properly adjusted. The adjustment equipment used to adjust the angular axis is a reflector.
When adjusting the angular axis of the millimeter wave radar sensor 17, the worker 3 installs a reflector at a predetermined position in front of the vehicle 1. The millimeter wave radar sensor 17 then radiates a radar wave and receives the wave reflected back by the reflector, and the angular axis is properly adjusted based on the received reflected wave. Likewise, when adjusting the angular axes of the left and right rear radars, a reflector is installed at a predetermined position for each rear radar and the angular axis is adjusted by the same procedure.
To properly adjust the optical axis of the front camera 16, each of the three target plates must be correctly installed at its predetermined position. Similarly, to properly adjust the angular axes of the millimeter wave radar sensor 17 and the left and right rear radars, a reflector must be correctly installed at the predetermined position for each of these sensors.
The AR machine 5 is configured to provide the worker 3 with the installation positions of the three target plates and the reflectors when these adjustment operations are performed. For example, when the angular axis of the millimeter wave radar sensor 17 is adjusted, the AR machine 5 displays the reflector's installation position object 112 at the position where the reflector should be installed in the video 7 captured by the camera 32, as shown in FIG. 1.
(2) Configuration of the AR Machine
As shown in FIG. 2, the AR machine 5 includes a space recognition sensor 31, a camera 32, a self-position recognition sensor 33, and a display unit 34. Each of these units is configured in the same manner as in the AR machine described in Patent Document 1.
That is, the space recognition sensor 31 is a sensor that acquires information indicating the various objects existing in front of the AR machine 5; it detects the space in front of the AR machine 5 three-dimensionally and outputs the detection result as spatial information.
The camera 32 captures video of the area in front of the AR machine 5 at a specific frame rate and outputs it as captured data. The self-position recognition sensor 33 detects the position of the AR machine 5 itself in three-dimensional space, more specifically its orientation (angle), for example, and outputs the detection result as self-position information.
The display unit 34 is configured to display images and shows the video represented by the captured data output from the camera 32, that is, the video 7. Based on display information from the AR image generation unit 39, the display unit 34 also displays various objects, such as the installation position object 112 shown in FIG. 1, in addition to the captured video 7. The display unit 34 is mounted on the AR machine 5 so that the worker 3, wearing the AR machine 5 properly on the head, can view the video 7 including the various objects.
The AR machine 5 further includes a space recognition unit 35, an image recognition unit 36, a self-position recognition unit 37, a wireless communication unit 38, an AR image generation unit 39, and an accuracy correction unit 50.
The space recognition unit 35 periodically acquires the spatial information detected by the space recognition sensor 31 and analyzes it in real time. Based on the analysis results, it recognizes the space in front of the AR machine 5 three-dimensionally in real time.
The image recognition unit 36 recognizes specific images within the captured video 7 based on the captured data output from the camera 32. In this embodiment, feature points are extracted from the video 7, and various images can be recognized based on the extracted feature points. The images recognizable by the image recognition unit 36 include, for example, the emblems and license plates provided at the front and rear ends of various vehicles, as well as specific markers.
 As shown in FIG. 3, the vehicle 1 of this embodiment has a front emblem 11 and a front license plate 13 at its front end, and a rear emblem 12 and a rear license plate 14 at its rear end.
 Accordingly, in addition to recognizing the vehicle 1 itself, the image recognition unit 36 can individually recognize the emblems 11 and 12 and the license plates 13 and 14 of the vehicle 1. Information on the various images recognizable by the image recognition unit 36 may be stored in advance in the memory of the AR device 5, or may be downloadable from the server 201 or the PC 202.
 The image recognition unit 36 also has a tracking function, so that even when the position or angle of the camera 32 changes and the captured video changes accordingly, it can continuously track the position of an image it has once recognized.
 The self-position recognition unit 37 recognizes the orientation (angle) of the AR device 5 based on the self-position information output from the self-position recognition sensor 33.
 The space recognition unit 35, the image recognition unit 36, and the self-position recognition unit 37 then output their respective recognition results, namely space recognition information, image recognition information, and self-position recognition information, to the AR image generation unit 39.
 The wireless communication unit 38 performs wireless communication with the server 201 or the PC 202 via the relay device 203, and is connected to the AR image generation unit 39.
 The AR image generation unit 39 causes the display unit 34 to display the video 7 based on the image data input from the image recognition unit 36. The AR image generation unit 39 also acquires, via the wireless communication unit 38, the vehicle model information, object data, work-related information, and the like for the vehicle to be adjusted from the server 201 or the PC 202.
 Based on the information acquired via the wireless communication unit 38 and the information input from the space recognition unit 35, the image recognition unit 36, and the self-position recognition unit 37, the AR image generation unit 39 then calculates the installation position of the adjustment equipment and displays an installation position object in the video 7. The AR image generation unit 39 can also cause the display unit 34 to display the work-related information in response to a user operation.
 The AR device 5 includes a computer having a CPU and memory. The memory is a semiconductor storage device such as a ROM, a RAM, or a flash memory, and stores a program for causing the computer to function as the space recognition unit 35, the image recognition unit 36, the self-position recognition unit 37, and the AR image generation unit 39.
 Accordingly, the functions of the space recognition unit 35, the image recognition unit 36, the self-position recognition unit 37, and the AR image generation unit 39 described above are realized by the CPU executing the program stored in the memory. However, at least some or all of these functions may be realized using one or more pieces of hardware rather than by software processing.
 (3) Installation position calculation operation of the AR image generation unit
 As described above, the AR image generation unit 39 calculates the installation position of the adjustment equipment in order to display the installation position object in the video 7 displayed on the display unit 34.
 This installation position calculation procedure will be described taking as an example the case of calculating the installation position of a reflector for adjusting the radiation angle (angular axis) of the radar waves of the millimeter-wave radar sensor 17. Since the procedure described below is explained in detail in Patent Document 1, a detailed description is omitted here.
 The AR image generation unit 39 three-dimensionally recognizes the space in front of the camera 32 based on the spatial information detected by the space recognition sensor 31 and the self-position information output from the self-position recognition sensor 33, and recognizes the floor surface 2 within that space. When the floor surface 2 is recognized, the relative positional relationship between the recognized floor surface 2 and the position of the AR device 5 in the three-dimensional coordinate space is calculated.
 Next, as the worker 3 moves around the vehicle 1, the AR image generation unit 39 acquires the positions of the front emblem 11 and the rear emblem 12 recognized by the image recognition unit 36 as three-dimensional coordinates.
 The AR image generation unit 39 then generates perpendiculars 21 and 22 from the positions of the emblems 11 and 12 to the floor surface 2, finds the intersections 23 and 24 of the perpendiculars 21 and 22 with the floor surface 2, and calculates the line passing through the intersections 23 and 24 as the centerline 25 of the vehicle 1. FIG. 3 shows an example of the video 7 with the perpendiculars 21 and 22 and the centerline 25 superimposed.
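 As a rough illustration of this geometry (this sketch is not from the patent itself; the coordinates, the floor plane, and the point names keyed to the reference numerals above are all invented), the centerline can be derived by projecting the two emblem coordinates perpendicularly onto the recognized floor plane:

```python
import numpy as np

def project_to_floor(point, floor_origin, floor_normal):
    """Drop a perpendicular from a 3-D point onto the floor plane."""
    n = floor_normal / np.linalg.norm(floor_normal)
    return point - np.dot(point - floor_origin, n) * n

# Hypothetical emblem positions recognized as 3-D coordinates (metres).
front_emblem = np.array([0.0, 0.8, 0.0])   # front emblem 11
rear_emblem = np.array([4.5, 0.9, 0.1])    # rear emblem 12

floor_origin = np.zeros(3)
floor_normal = np.array([0.0, 1.0, 0.0])   # floor surface 2 is y = 0 here

p23 = project_to_floor(front_emblem, floor_origin, floor_normal)  # intersection 23
p24 = project_to_floor(rear_emblem, floor_origin, floor_normal)   # intersection 24

# Centerline 25: the line through the two intersections, as a unit direction.
direction = (p23 - p24) / np.linalg.norm(p23 - p24)
```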
 The centerline 25 and the intersections 23 and 24 calculated in this way are used to set the installation position of the adjustment equipment. Specifically, for example, when calculating the installation position of the reflector, which is the adjustment equipment for the millimeter-wave radar sensor 17, the three-dimensional coordinates of the intersection 23 in front of the vehicle are used as the specific coordinates serving as the reference for the installation position.
 The AR image generation unit 39 then reads, from the vehicle model information of the vehicle 1 acquired from the server 201 or the PC 202, distance data representing the distance from the specific coordinates, as installation data for the reflector serving as the adjustment equipment for the millimeter-wave radar sensor 17.
 The distance data and the specific coordinates are set in advance for each vehicle model and each on-board sensor to be adjusted, and the AR image generation unit 39 determines the installation position of the adjustment equipment based on this information.
 Specifically, when setting the installation position of the reflector for the millimeter-wave radar sensor 17, as shown in FIG. 4, the coordinates located a predetermined distance L1 along the centerline 25 from the specific coordinates, that is, the intersection 23 in front of the vehicle, are set as a reference point P1. The AR image generation unit 39 then displays the image of the reflector obtained from the object data as the installation position object 112 at the set reference point P1, as indicated by the two-dot chain line in FIG. 4.
 Since the reference point P1 at which the installation position object 112 is displayed is set as a three-dimensional coordinate located a predetermined distance along the centerline 25 from the specific coordinates (the three-dimensional coordinates of the intersection 23), the relative positional relationship between the reference point P1 and the vehicle 1 is constant. Accordingly, when the worker 3 turns or moves, the display position, angle, size, and the like of the installation position object 112 in the video 7 change to follow that movement.
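 Under the same kind of hypothetical coordinates (all values below are invented for illustration), placing the reference point is then a simple offset along the centerline, where the distance L1 comes from the model-specific distance data:

```python
import numpy as np

# Hypothetical values: intersection 23 (the specific coordinates) and the
# unit direction of centerline 25 pointing forward from the vehicle.
p23 = np.array([0.0, 0.0, 0.0])
forward = np.array([-1.0, 0.0, 0.0])

L1 = 3.0  # predetermined distance from the model-specific distance data (m)

# Reference point P1 at which the installation position object 112 is shown.
p1 = p23 + L1 * forward
```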
 (4) Configuration of the accuracy correction unit
 As described above, the installation position of the adjustment equipment used in the adjustment work for the on-board sensors mounted on the vehicle 1 is set based on the reference point P1, which is located a predetermined distance away from the specific coordinates set based on feature points in the video captured by the camera 32.
 However, in the real space around the vehicle 1, this reference point P1 may deviate from the correct position P0 located the predetermined distance away from the specific coordinates. Since this deviation is due to individual differences among AR devices 5, it cannot be eliminated by correcting the distance data stored in the server 201 or the PC 202.
 Therefore, in this embodiment, the AR device 5 is provided with the accuracy correction unit 50 shown in FIG. 2 so that the deviation of the reference point P1 can be eliminated for each individual AR device 5.
 The accuracy correction unit 50 corresponds to an example of the accuracy correction device of the present disclosure, and in this embodiment includes a display control unit 52, an input unit 54, and a correction value setting unit 56.
 The display control unit 52 causes a verification object representing a reference distance to be displayed in the video 7 that the AR image generation unit 39 displays on the display unit 34. The display position of the verification object is set, within the space recognized by the AR image generation unit 39, at a position where the worker can easily measure the length of the verification object using a tape measure or the like.
 For example, the display position of the verification object may be set at a position a predetermined distance away from the worker 3 so that the worker 3 can measure its length simply by reaching out. Alternatively, the surface of a target recognized by the AR image generation unit 39, such as the body of the vehicle 1 or the floor surface 2 in the video 7, may be used as the display position.
 The verification object displayed at that position may be any object that allows the worker 3 to measure the reference distance while viewing the video 7; for example, a straight-line object whose length equals the reference distance may be displayed as the verification object.
 Next, the input unit 54 allows the worker 3 to input information, such as the length of the verification object displayed on the display unit 34, by voice input, key operation, movement of the worker 3 himself or herself, or the like.
 Input by the worker 3's own movement is realized, for example, by the worker 3 moving a finger or the like in front of the camera 32 and the image recognition unit 36 recognizing that movement as input information; in this case, the input unit 54 receives the recognition result from the image recognition unit 36.
 Accordingly, when the verification object for the reference distance is displayed on the display unit 34, the worker 3 can measure the length of the verification object in the real space while viewing the video 7, and input the measurement result to the accuracy correction unit 50 as the actual distance.
 The correction value setting unit 56 recognizes the measured distance input via the input unit 54 as the actual distance obtained by actually measuring the reference distance, and calculates a correction value (distance correction value) for the distances recognized by the AR device 5 based on the recognized actual distance and the reference distance.
 This distance correction value is used to correct the distance data used to set the installation positions of the various pieces of adjustment equipment, such as the reflector and the three target boards described above. For this reason, the distance correction value is calculated as a correction coefficient that, when multiplied with the distance data, can appropriately correct the various distance data, so that a single common correction value can be applied to the various distance data acquired from the server 201 or the PC 202.
 Accordingly, for example, when the AR image generation unit 39 displays the reflector installation position object 112 shown in FIG. 4, the distance data is corrected from the predetermined distance L1 to the predetermined distance L0.
 Through this correction, the reference point defining the display position of the installation position object 112 is set to the reference point P0, located a correction distance dL away from the pre-correction reference point P1, so that the installation position object 112 can be displayed at the correct installation position of the reflector, as shown by the solid line in FIG. 4.
 The distance correction value does not necessarily have to be a correction coefficient; instead, a correction distance for the distance data may be set for each verification object.
 (5) Accuracy correction processing
 In the accuracy correction unit 50, the functions of the display control unit 52 and the correction value setting unit 56 are realized by a computer provided in the AR device 5, or a computer provided in the accuracy correction unit 50, executing an accuracy correction program.
 The accuracy correction processing executed by the computer to realize the functions of the display control unit 52 and the correction value setting unit 56 will now be described. This accuracy correction processing is carried out when, while the AR device 5 is displaying the video 7 on the display unit 34, the worker 3 inputs a calibration command for the AR device 5 via the input unit 54, and the CPU executes the accuracy correction program stored in the memory in accordance with that command.
 As shown in FIG. 5, when the accuracy correction processing is started, the CPU first executes, in S110, processing as the display control unit 52, causing the AR image generation unit 39 to display the verification object for the reference distance in the video 7 currently being displayed.
 Once the verification object is displayed in S110, the worker 3 measures the length of the verification object in the real space using a tape measure or the like while viewing the video 7 displayed on the display unit 34, and inputs the measurement result via the input unit 54.
 After executing the processing of S110, the CPU therefore waits in S120 for the measurement result of the reference distance to be input from the input unit 54, and when the measurement result is input, proceeds to S130.
 In S130, the measurement result input from the input unit 54 is recognized as the actual distance corresponding to the reference distance, and in the subsequent S140, the CPU executes processing as the correction value setting unit 56, calculating the distance correction value based on the actual distance and the reference distance.
 This distance correction value is a correction coefficient for correcting the distances recognized by the AR device 5, and is calculated, for example, by dividing the actual distance by the reference distance according to the following equation:
 Correction coefficient = actual distance / reference distance
 When the distance correction value is calculated in S140, the CPU proceeds to S150. In S150, the CPU corrects the distance data for installing the adjustment equipment that the AR image generation unit 39 acquired from the server 201 or the PC 202 for each vehicle model and each on-board sensor and stored in the memory, and ends the accuracy correction processing.
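 The computation in S140 and S150 can be sketched as follows (a hedged illustration, not the patent's actual implementation; the measured values and the distance data are invented):

```python
def distance_correction_coefficient(actual, reference):
    """S140: correction coefficient = actual distance / reference distance."""
    return actual / reference

def correct_distance_data(distance_data, coefficient):
    """S150: multiply every stored distance datum by the common coefficient."""
    return {k: v * coefficient for k, v in distance_data.items()}

# The AR device displayed a 1.000 m verification object, but the worker
# measured 0.970 m in the real space.
coeff = distance_correction_coefficient(actual=0.970, reference=1.000)
corrected = correct_distance_data({"reflector": 3.0}, coeff)
```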
 In S150, the correction coefficient may be output to the AR image generation unit 39 as the distance correction value, thereby causing the AR image generation unit 39 to correct the various distance data stored in the memory.
 Alternatively, in S150, the various distance data may be read from the AR image generation unit 39, corrected by multiplying the read distance data by the correction coefficient, and the corrected distance data then returned to the AR image generation unit 39.
 (6) Effects
 As described above, the AR device 5 of this embodiment is provided with the accuracy correction unit 50 as the accuracy correction device of the present disclosure. By displaying the verification object for the reference distance in the video 7 displayed on the display unit 34, the accuracy correction unit 50 has the worker 3 measure the length of the verification object in the real space, and calculates the distance correction value from the measured actual distance and the reference distance.
 As a result, according to the accuracy correction unit 50 of this embodiment, the distance data used when the AR image generation unit 39 displays the installation position object of the adjustment equipment in the video 7 can be corrected using the distance correction value.
 Therefore, according to this embodiment, when adjusting the various on-board sensors mounted on a vehicle, the installation position of the adjustment equipment can be accurately presented to the worker 3 using the installation position object 112. This embodiment thus suppresses the influence of display errors that arise in each individual AR device 5 and improves the display accuracy of the installation position object.
 [Second embodiment]
 Since the basic configuration of the second embodiment is the same as that of the first embodiment, only the differences will be described below. In the following description, the same reference numerals as in the first embodiment denote the same components, and the preceding description applies to them.
 In the first embodiment, the accuracy correction processing displays the verification object for the reference distance in the video 7 currently being displayed, has the worker 3 measure its length, and calculates the distance correction value based on the measured actual distance and the reference distance.
 In the second embodiment, in contrast, the worker 3 is asked to designate two points in the real space separated by the reference distance, the AR image generation unit 39 is made to recognize the distance between the two points, and the distance correction value is calculated from that recognized reference distance and the actual distance, that is, the reference distance in the real space.
 The accuracy correction processing executed by the accuracy correction unit 50 in this embodiment will therefore be described with reference to the flowchart shown in FIG. 6.
 As shown in FIG. 6, in the accuracy correction processing of this embodiment, first, in S210, input guidance is displayed in the video 7 instructing the worker 3 to designate two points separated by a fixed distance, and the processing proceeds to S220.
 Following the input guidance displayed in S210, the worker 3 measures a fixed reference distance in the real space using a tape measure or the like, and designates the two points at its ends via the input unit 54 by air tapping or the like. In this case, when the image recognition unit 36 recognizes the worker 3's air-tap motion, the input unit 54 inputs that position as a point designated by the worker 3.
 The two points may be designated in any way that allows the image recognition unit 36 to recognize their positions, that is, their three-dimensional coordinates; for example, the worker 3 may input the two points by placing markers at the ends of the fixed distance. Alternatively, the worker 3 may, for example, have a long object of fixed length photographed and have the image recognition unit 36 recognize the two points at its ends.
 Next, in S220, the processing waits for the two points to be designated by determining whether two points separated by the fixed distance have been designated by the worker 3. The determination in S220 is made, for example, by determining whether the image recognition unit 36 has recognized the two points.
 Alternatively, for example, when the image recognition unit 36 recognizes the two points, the AR image generation unit 39 may display the recognized two points identifiably in the video 7, and in S220 the two points may be determined to have been input when the AR image generation unit 39 displays them.
 When it is determined in S220 that the two points have been designated, the processing proceeds to S230, and the distance between the two points recognized by the image recognition unit 36 is acquired from the AR image generation unit 39. This distance is the distance between the two points as recognized by the AR device 5, and is acquired in S230 as the reference distance.
 In the following S240, the distance correction value is calculated based on the reference distance acquired in S230 and the actual distance between the two points designated by the worker 3 in the real space. The processing of S240 functions as the correction value setting unit 56, as S140 does in the first embodiment.
 When the distance correction value is calculated in S240, the processing proceeds to S250. In S250, in the same procedure as S150 of the first embodiment, the distance data for installing the adjustment equipment that the AR image generation unit 39 acquired from the server 201 or the PC 202 is corrected, and the accuracy correction processing ends.
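 The second embodiment's S230 and S240 can be sketched as follows (the coordinates are invented; the idea is that the device-recognized points carry the device's own scale error while the worker's layout defines the true distance):

```python
import math

def recognized_distance(p, q):
    """S230: distance between the two designated points as the AR device sees them."""
    return math.dist(p, q)

# The worker laid out exactly 1.000 m in the real space (the actual distance),
# but the device recognizes the two air-tapped points slightly further apart.
device_p = (0.00, 0.0, 0.0)
device_q = (1.02, 0.0, 0.0)  # device-recognized coordinates (invented)

reference = recognized_distance(device_p, device_q)  # device-side reference distance
actual = 1.000

# S240: the correction value takes the same form as in the first embodiment.
coefficient = actual / reference
```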
 As described above, in the second embodiment, the worker 3 inputs two points in the real space separated by the reference distance, and the AR image generation unit 39 is made to recognize the distance between them. Since the recognized reference distance includes an error specific to the individual AR device 5, the distance correction value for eliminating that error is calculated based on the recognized reference distance and the actual distance in the real space.
 Accordingly, the accuracy correction unit 50 of the second embodiment can achieve the same effects as the first embodiment.
 [Third embodiment]
 Like the second embodiment, the third embodiment has the same basic configuration as the first embodiment, so only the differences will be described below. In the following description, the same reference numerals as in the first embodiment denote the same components, and the preceding description applies to them.
 In the first embodiment, the accuracy correction processing displays the verification object for the reference distance in the video 7 currently being displayed, has the worker 3 measure its length, and calculates the distance correction value based on the measured actual distance and the reference distance.
 In the third embodiment, in contrast, numerical information indicating the length of the reference distance is displayed in the currently displayed video 7 in addition to the verification object for the reference distance, and the worker 3 is asked to input, as error information, the difference between the actual distance obtained by measuring the length of the verification object and the reference distance. The distance correction value is then calculated based on the input error information.
 The accuracy correction processing executed by the accuracy correction unit 50 in this embodiment will therefore be described with reference to the flowchart shown in FIG. 7.
 As shown in FIG. 7, in the accuracy correction processing of this embodiment, first, in S310, the AR image generation unit 39 is caused to display, in the video 7 currently being displayed, the verification object for the reference distance and numerical information indicating the length of the reference distance. This processing corresponds to an example of the display control unit of the present disclosure.
 When the verification object and the numerical information on the reference distance are displayed in S310, the worker 3 measures the length of the verification object in the real space using a tape measure or the like while viewing the video 7. The worker 3 then inputs, via the input unit 54, the difference between the measured actual distance and the reference distance obtained from the numerical information displayed in the video 7 as error information on the reference distance.
 Accordingly, in the following S320, the processing waits for the error information to be input by determining whether the error information on the reference distance has been input from the input unit 54, and when the error information is input, proceeds to S330.
 In S330, the distance correction value is calculated based on the error information on the reference distance input from the input unit 54, and the processing proceeds to S340. The processing of S330 corresponds to an example of the correction value setting unit of the present disclosure.
 In S340, in the same procedure as S150 and S250 described above, the distance data for installing the adjustment equipment that the AR image generation unit 39 acquired from the server 201 or the PC 202 is corrected, and the accuracy correction processing ends.
 As described above, in the third embodiment, the verification object for the reference distance and numerical information representing its length are displayed in the currently displayed image 7, and the worker 3 measures the length of the verification object and inputs the difference between the measured actual distance and the reference distance. This difference is the difference between the reference distance recognized by the AR device 5 and the actual distance in real space, that is, a distance recognition error unique to the AR device 5; in this embodiment, the distance correction value is therefore calculated from this error.
 Accordingly, the accuracy correction unit 50 of the third embodiment can obtain the same effects as the first embodiment.
 [Other Modifications]
 Although embodiments of the present disclosure have been described above, the present disclosure is not limited to the above-described embodiments and can be implemented with various modifications.
 a) In the above embodiments, the input unit 54 was described as receiving information that the worker 3 enters by voice input, key operation, the worker 3's own movements, or the like. Alternatively, the input unit 54 may further be configured to receive information transmitted from the external device 58 shown in FIG. 2. The external device 58 may be a mobile terminal such as a smartphone or a tablet, or a fixed terminal such as a PC.
 b) In the above embodiments, the AR device 5 was described as being of a goggle type; however, the functions of the AR device may instead be provided in a mobile terminal, such as a tablet equipped with a camera and a display unit.
 c) In the above embodiments, the accuracy correction unit 50 was described as being incorporated into the AR device 5 as one of its functions. However, the accuracy correction unit 50 may be configured as an accuracy correction device separate from the AR device 5 that, when connected to the AR device 5, corrects display errors unique to the AR device 5.
 d) In the above embodiments, the AR device 5 was described as being used to present the installation location of the adjustment equipment used for adjusting on-board sensors mounted on a vehicle. However, the accuracy correction device of the present disclosure can be applied in the same manner as in the above embodiments to any AR device configured to place an object at a reference point a predetermined distance away from specific coordinates in the space captured by the camera.
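The placement described in d) above — an object set at a reference point a predetermined distance from specific coordinates — can be sketched as simple vector arithmetic. The following is a hypothetical illustration, not from the patent; the function name, the 2-D coordinates, and the multiplicative correction factor are assumptions.

```python
# Hypothetical sketch: set the reference point a corrected predetermined
# distance away from the specific coordinates, along a given direction.
import math

def set_reference_point(specific, direction, predetermined_distance, correction):
    """Return the point `predetermined_distance * correction` away from
    `specific` along the unit vector derived from `direction` (2-D)."""
    norm = math.hypot(direction[0], direction[1])
    unit = (direction[0] / norm, direction[1] / norm)
    d = predetermined_distance * correction
    return (specific[0] + unit[0] * d, specific[1] + unit[1] * d)

# With an assumed correction factor of 0.98, a nominal 5.0 m offset
# along the x-axis becomes a 4.9 m offset.
pt = set_reference_point((0.0, 0.0), (1.0, 0.0), 5.0, 0.98)
```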
 e) In the above embodiments, the functions of the display control unit 52 and the correction value setting unit 56 in the accuracy correction unit 50 were described as being realized by the CPU of the computer constituting the AR device 5 or the accuracy correction unit 50 executing a program. However, these functions of the accuracy correction unit 50 may instead be realized by a combination of a computer program and dedicated hardware logic circuits, or by one or more dedicated hardware logic circuits alone.
 f) A plurality of functions of one component in the above embodiments may be realized by a plurality of components, and one function of one component may be realized by a plurality of components. Conversely, a plurality of functions of a plurality of components may be realized by one component, and one function realized by a plurality of components may be realized by one component. Part of the configuration of the above embodiments may be omitted, and at least part of the configuration of one embodiment may be added to, or substituted for, the configuration of another embodiment.
 g) Besides the accuracy correction device or the augmented reality image display device themselves, the present disclosure can also be realized in various other forms: a system including the accuracy correction device or the augmented reality image display device as a component, a program for causing a computer to function as the accuracy correction device or the augmented reality image display device, a non-transitory tangible recording medium such as a semiconductor memory storing the program, an accuracy correction method, or an augmented reality image display method.

Claims (7)

  1.  An accuracy correction device,
     wherein the accuracy correction device is provided in an augmented reality image display device (5),
     the augmented reality image display device being a device that recognizes the space of an image captured by a capturing unit (32), and generates an augmented reality image to be displayed on a display unit (34) by placing an object in the image using, as a reference point, coordinates separated by a predetermined distance from specific coordinates in the space,
     the accuracy correction device comprising:
     an input unit (54) configured to input distance information representing an actual distance between two specific points in real space; and
     a correction value setting unit (56) configured to set a correction value for the predetermined distance based on the distance information input from the input unit and a reference distance between the two points recognized by the augmented reality image display device.
  2.  The accuracy correction device according to claim 1, comprising:
     a display control unit (52) configured to display, on the augmented reality image displayed on the display unit, a verification object with which the reference distance can be measured in real space,
     wherein the correction value setting unit is configured to acquire the distance information input from the input unit as an actual distance obtained by a user measuring the reference distance using the verification object displayed on the display unit, and to set the correction value based on the actual distance and the reference distance.
  3.  The accuracy correction device according to claim 1,
     wherein the input unit is configured so that, as the distance information, two points separated by a certain distance in real space can be specified, and
     the correction value setting unit is configured to cause the augmented reality image display device to recognize the distance between the two points specified via the input unit, acquire it as the reference distance, and set the correction value based on the reference distance and the actual distance between the two points in real space.
  4.  The accuracy correction device according to claim 1, comprising:
     a display control unit (52) configured to display, on the augmented reality image displayed on the display unit, a verification object with which the reference distance can be measured in real space, together with numerical information representing the reference distance,
     wherein the correction value setting unit is configured to:
     acquire the distance information input from the input unit as error information representing a difference between an actual distance obtained by a user measuring the reference distance using the verification object and the reference distance specified by the numerical information; and
     set the correction value based on the error information.
  5.  The accuracy correction device according to any one of claims 1 to 4,
     wherein the input unit is configured to input the distance information according to information acquired from an external device (58).
  6.  An augmented reality image display device comprising:
     a capturing unit (32) configured to capture an image of the surroundings;
     a space recognition unit (35) configured to recognize the space of the image captured by the capturing unit;
     an image generation unit (39) configured to generate an augmented reality image by placing an object in the image using, as a reference point, coordinates separated by a predetermined distance from specific coordinates in the space recognized by the space recognition unit;
     a display unit (34) configured to display the augmented reality image generated by the image generation unit;
     an input unit (54) configured to input distance information representing an actual distance between two specific points in real space; and
     a correction value setting unit (56) configured to set a correction value for the predetermined distance based on the distance information input from the input unit and a reference distance between the two points recognized by the image generation unit in the space,
     wherein the image generation unit is configured to set the reference point by correcting the predetermined distance based on the correction value.
  7.  The augmented reality image display device according to claim 6,
     wherein the augmented reality image display device is goggles wearable by a user.
PCT/JP2022/013665 2021-03-26 2022-03-23 Accuracy correction device and augmented reality image display device WO2022202926A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021053811A JP2022150965A (en) 2021-03-26 2021-03-26 Accuracy correction device and augmented reality image display device
JP2021-053811 2021-03-26

Publications (1)

Publication Number Publication Date
WO2022202926A1

Family

ID=83395663

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/013665 WO2022202926A1 (en) 2021-03-26 2022-03-23 Accuracy correction device and augmented reality image display device

Country Status (2)

Country Link
JP (1) JP2022150965A (en)
WO (1) WO2022202926A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019121322A (en) * 2018-01-11 2019-07-22 株式会社デンソー Installation position information providing device and installation position information providing method
JP2020008917A (en) * 2018-07-03 2020-01-16 株式会社Eidea Augmented reality display system, augmented reality display method, and computer program for augmented reality display


Also Published As

Publication number Publication date
JP2022150965A (en) 2022-10-07

Similar Documents

Publication Publication Date Title
CN109690623B (en) System and method for recognizing pose of camera in scene
JP4763250B2 (en) Object detection device
JP6458439B2 (en) On-vehicle camera calibration device, image generation device, on-vehicle camera calibration method, and image generation method
EP3070675B1 (en) Image processor for correcting deviation of a coordinate in a photographed image at appropriate timing
US20200264011A1 (en) Drift calibration method and device for inertial measurement unit, and unmanned aerial vehicle
JP4809019B2 (en) Obstacle detection device for vehicle
US9738223B2 (en) Dynamic guideline overlay with image cropping
US11482003B2 (en) Installation position information providing apparatus and installation position information providing method
US7697029B2 (en) Image display apparatus and method
US8594378B2 (en) 3D object detecting apparatus and 3D object detecting method
US20080170122A1 (en) Image processor, driving assistance system, and out-of-position detecting method
KR20160056129A (en) System and method for correcting position information of surrounding vehicle
US20130322697A1 (en) Speed Calculation of a Moving Object based on Image Data
US20200090517A1 (en) Parking space detection apparatus
JPWO2017145541A1 (en) Moving body
JPWO2016113875A1 (en) Information provision system for billing location evaluation
JP6669182B2 (en) Occupant monitoring device
EP1662440A1 (en) Method for determining the position of an object from a digital image
JP2018084503A (en) Distance measurement device
KR20160050439A (en) Method for adjusting output video of rear camera for vehicles
WO2022202926A1 (en) Accuracy correction device and augmented reality image display device
KR102121287B1 (en) Camera system and controlling method of Camera system
JP2010122045A (en) Pcs sensor radio wave axis adjustment apparatus and its method
US20200096606A1 (en) Vehicle inspection system and vehicle inspection method
JPWO2017042995A1 (en) In-vehicle stereo camera device and correction method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22775708

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22775708

Country of ref document: EP

Kind code of ref document: A1