WO2019003492A1 - Control device, flying body, and control program - Google Patents
- Publication number: WO2019003492A1 (application PCT/JP2018/006001)
- Authority: WIPO (PCT)
Classifications
- G05D1/10: Control of position, course, altitude or attitude of land, water, air or space vehicles; simultaneous control of position or course in three dimensions
- G05D1/0094: Control of position, course, altitude or attitude involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
- B64C13/18: Control systems for actuating flying-control surfaces; initiating means actuated automatically, e.g. using automatic pilot
- B64D47/02: Arrangements or adaptations of signal or lighting devices
- B64U10/14: UAVs; flying platforms with four distinct rotor axes, e.g. quadcopters
- G03B21/00: Projectors or projection-type viewers; accessories therefor
- G09F21/06: Mobile visual advertising by aeroplanes, airships, balloons, or kites
- B64U2101/30: UAVs specially adapted for imaging, photography or videography
Definitions
- One aspect of the present invention relates to a control device, a flying body, and a control program.
- Patent Document 1 discloses a technique for presenting information using a projector mounted on an unmanned aerial vehicle.
- One embodiment of the present invention aims to suitably project an image onto a target position.
- A control device according to aspect 1 of the present invention is a control device that controls a moving body provided with a projection unit, and includes: a captured image acquisition unit that acquires a captured image including a projection image projected by the projection unit; and a projection control unit that refers to the captured image and controls at least one of the position of the moving body, the attitude of the moving body, and the direction of the projection unit.
- In the control device according to aspect 2 of the present invention, the projection control unit may extract, from the captured image, the projection position of the projection image and a target projection position, and control the projection position of the moving body so that it becomes the target projection position.
- In the control device according to aspect 3 of the present invention, the projection control unit may determine a correction amount of the projection position by calculating the difference between the current projection position of the projection image and the target projection position.
- In the control device according to aspect 4 of the present invention, the projection control unit may determine the correction amount by extracting, from the captured image, a first feature point indicating the current projection position of the projection image and a second feature point indicating the target projection position, and calculating the difference in position between the first feature point and the second feature point.
- According to this configuration, by calculating the difference between the first feature point at the current projection position and the second feature point at the target projection position to obtain the correction amount, a control device that projects the image more accurately can be realized.
- The control device according to aspect 5 of the present invention may control the projection position projected by the moving body with reference to a plurality of captured images captured at different times by the same imaging device.
- According to this configuration, the correction amount of the projection position can be calculated from the change of the projection image over time.
- The control device according to aspect 6 of the present invention may control the projection position projected by the moving body by controlling the position or attitude of the moving body.
- According to this configuration, the control device can control the projection position by controlling the position or attitude of the moving body.
- The control device according to aspect 7 of the present invention may control the projection position projected by the moving body by controlling the projection direction of the projection unit.
- According to this configuration, the control device can control the projection position by controlling the projection direction of the projection unit.
- In the control device according to aspect 8 of the present invention, the captured image acquisition unit may acquire, as the captured image, an image captured by an imaging device provided in the moving body.
- According to this configuration, the control device can control the projection position using a captured image taken by an imaging device mounted on the moving body.
- In the control device according to aspect 9 of the present invention, the captured image acquisition unit may acquire, as the captured image, an image captured by an imaging device provided outside the moving body.
- According to this configuration, the control device can control the projection position using a captured image taken by an imaging device provided outside the moving body.
- The control device according to aspect 10 of the present invention may control a flying body as the moving body.
- In the control device according to aspect 11 of the present invention, the image projected by the projection unit may include information for assisting work by a worker.
- A flying body according to aspect 12 of the present invention is a flying body including a projection unit, a captured image acquisition unit that acquires a captured image including a projection image projected by the projection unit, and a projection control unit that refers to the captured image and controls at least one of the position of the flying body, the attitude of the flying body, and the direction of the projection unit.
- According to this configuration, a flying body that projects an image onto a target position can be realized by controlling at least one of the position of the flying body, the attitude of the flying body, and the direction of the projection unit.
- A control program according to aspect 13 of the present invention is a control program for causing a computer to function as the control device according to one aspect of the present invention, the control program causing the computer to function as the captured image acquisition unit and the projection control unit.
- FIG. 1 is a view showing an outline of an unmanned aerial vehicle 1 (moving body, flying body) according to Embodiment 1 of the present invention.
- Note that the moving body is not limited to the unmanned aerial vehicle 1; depending on the application, it may also be a non-flying moving body such as a self-propelled robot.
- When projection is performed using a flying body such as the unmanned aerial vehicle 1, it is generally difficult to project onto a target position. However, the control device 10 according to the present embodiment makes it possible to project an image onto a target position even with a flying body.
- The unmanned aerial vehicle 1 includes blades 2a to 2d, a camera 7 (imaging device), and a projector 3 (projection unit).
- Motors for driving the blades 2a to 2d are present inside the unmanned aerial vehicle 1 and control the flight of the unmanned aerial vehicle 1.
- the projector 3 projects the image 8.
- The image 8 may include information for assisting work by a worker.
- In the present embodiment, the information for assisting work is, for example, an image that helps a worker pick an object smoothly in a warehouse.
- By accurately projecting the image 8 onto the designated position, the projector 3 can precisely present the position of an object in the warehouse to the worker via the image 8.
- the projector 3 may project information including an instruction to another apparatus, a work robot, and the like, as well as an instruction to a human worker.
- the information projected by the projector 3 may be configured to include an instruction for the other device or the other work robot as optical information.
- The information projected by the projector 3 is not particularly limited as long as it is information using light, in other words, information transmitted by light. The wavelength of the light used is also not particularly limited; for example, the light may be outside the visible region, such as infrared or ultraviolet light. However, when instructions to a human worker are included in the information projected by the projector 3, the light preferably includes the visible region.
- the camera 7 is an optical detection element, and performs imaging in a visible light region, for example.
- However, the wavelength range in which the camera 7 captures images is not particularly limited; as long as the camera 7 can detect the light that the projector 3 uses to project information, it may detect light outside the visible range.
- FIG. 2 is a block diagram showing the configuration of the unmanned aerial vehicle 1.
- The unmanned aerial vehicle 1 includes blades 2a to 2d, a projector 3 (projection unit), motors 4a to 4d, a control device 10, a projection information acquisition unit 14, and an imaging device 18.
- the control device 10 includes a projection information storage unit 13, a captured image acquisition unit 16, and a projection control unit 6.
- the projection control unit 6 includes a captured image processing unit 11, a correction amount calculation unit 17, a flying object control unit 12, and a projector control unit 25.
- The unmanned aerial vehicle 1 uses the projection information acquisition unit 14 to acquire projection information, which is information about the image 8 to be projected, from an external device or the like.
- As one example, the projection information also includes projection position designation information for designating the projection position of the image 8.
- However, this does not limit the present embodiment; the projection information acquisition unit 14 may acquire the projection position designation information separately from another device or the like.
- the projection position specification information is information for specifying a projection position on which the projection image is to be projected.
- the control device 10 controls the position and the projection direction of the unmanned aerial vehicle 1 with reference to the projection position designation information.
- the projection information may be information that changes with time according to the situation of the unmanned aerial vehicle 1.
- the projection information acquisition unit 14 supplies the acquired projection information to the projection information storage unit 13.
- the projection information storage unit 13 stores projection information.
- the captured image processing unit 11 supplies the projection information acquired from the projection information storage unit 13 to the projector 3 (projection unit).
- the projector 3 projects the image 8.
- the camera 7 acquires an image including a projected image of the image 8 as a captured image.
- Hereinafter, the term "captured image" refers to an image captured by the camera 7.
- Note that, depending on the imaging direction of the camera 7 and the projection direction of the projector 3, the captured image may not contain the projection image of the image projected by the projector 3 at all.
- the imaging device 18 supplies a captured image to the captured image acquisition unit 16 of the control device 10.
- the captured image acquisition unit 16 supplies the captured image acquired from the camera 7 to the captured image processing unit 11 of the projection control unit 6.
- the captured image processing unit 11 refers to the projection information and extracts the current projection position and the target projection position of the image 8 from the captured image.
- the target projection position is a projection position designated by the projection position designation information. Specific extraction processing of the projection position by the captured image processing unit 11 will be described later.
- In one example of the control, the unmanned aerial vehicle 1 is first moved to an approximate position with reference to a map of the warehouse, and the target projection position is then extracted from the captured image. Under such control, the target projection position is not included in the captured image before the vehicle moves to the approximate position, and is included in the captured image after the move.
- the captured image processing unit 11 supplies the current projection position and the target projection position extracted from the captured image to the correction amount calculation unit 17.
- the correction amount calculation unit 17 calculates the difference between the current projection position and the target projection position, and calculates the correction amount of the projection position. Specific calculation processing of the correction amount will be described later.
- the “difference” between the current projection position and the target projection position indicates the magnitude and direction of the shift for moving the current projection position to the target projection position.
- The correction amount calculation unit 17 supplies the calculated correction amount to the flying body control unit 12 and the projector control unit 25.
- The flying body control unit 12 refers to the correction amount and controls the motors 4a to 4d, thereby controlling the drive of the blades 2a to 2d and hence the position or attitude of the unmanned aerial vehicle 1.
- The projector control unit 25 controls the projection position of the image 8 by controlling the projection direction of the projector 3. Note that controlling the "attitude" of a moving body means pointing the moving body in the target projection direction; this also includes pointing a non-flying, self-propelled robot in the target projection direction.
- As described above, the projection direction of the projector 3 may be controlled by changing only the projection direction of the projector 3 itself, without moving the position or attitude of the unmanned aerial vehicle 1. Conversely, the position or attitude of the unmanned aerial vehicle 1 may be controlled without changing the projection direction of the projector 3 relative to the unmanned aerial vehicle 1.
- In a configuration in which the relative projection direction of the projector 3 itself is not changed, the projector control unit 25 is not an essential component of the projection control unit 6.
- The flying body control unit 12 and the projector control unit 25 determine how to control the unmanned aerial vehicle 1 and the projector 3 in consideration of the performance of the unmanned aerial vehicle, the projection position, and the ease of holding the projection attitude.
- FIG. 3 is a flowchart showing the flow of the projection position correction process.
- Step S004: First, the projector 3 provided in the unmanned aerial vehicle 1 projects the image 8 based on the projection information supplied from the captured image processing unit 11.
- Step S006: The camera 7 acquires an image including the projection image of the image 8 projected in step S004 as a captured image. The camera 7 supplies the captured image to the captured image acquisition unit 16.
- Step S008: The captured image acquisition unit 16 acquires the captured image from the camera 7 and supplies it to the captured image processing unit 11.
- Step S010: The captured image processing unit 11 extracts the current projection position and the target projection position from the captured image and supplies them to the correction amount calculation unit 17. The specific process by which the captured image processing unit 11 extracts the projection positions will be described later.
- Step S012: The correction amount calculation unit 17 calculates the difference between the current projection position and the target projection position.
- Step S014: The correction amount calculation unit 17 determines whether the difference between the current projection position and the target projection position is within a predetermined value. If it is, the correction amount calculation unit 17 determines that projection onto the target projection position is complete, and the projection position correction process ends. If the difference is larger than the predetermined value, the process proceeds to step S016. The predetermined value may be set according to the application in which the control device 10 is used; as one example, the correction amount calculation unit 17 may calculate the correction amount using one pixel in the captured image as the predetermined value.
- Step S016: The correction amount calculation unit 17 calculates a correction amount of the projection position so as to minimize the difference between the current projection position and the target projection position. Note that "minimizing" the difference does not necessarily require a perfect match between the current projection position and the target projection position: if the projection can be performed such that the difference falls within the predetermined value described for step S014, the projection control unit 6 judges that the projection has been performed with sufficient accuracy.
- Step S018: The correction amount calculation unit 17 supplies the correction amount obtained in step S016 to the flying body control unit 12 and the projector control unit 25.
- Step S020: The flying body control unit 12 refers to the correction amount supplied from the correction amount calculation unit 17 and controls the motors 4a to 4d so that the projection position approaches the target projection position.
- Alternatively, or in addition, the projector control unit 25 refers to the correction amount supplied from the correction amount calculation unit 17 and controls the projection direction of the projector 3 so that the projection position approaches the target projection position.
- As described above, if the difference between the current projection position and the target projection position is within the predetermined value in step S014, the projection control unit 6 determines that projection onto the target projection position is complete and ends the projection position correction process. If the difference is larger than the predetermined value, the projection control unit 6 performs steps S016 to S020 in order and then returns to step S004 to repeat the process.
- The number of times steps S004 to S020 are repeated is not particularly limited; they can be repeated until the difference between the current projection position and the target projection position falls within the predetermined value. Furthermore, even if the position of the unmanned aerial vehicle 1 shifts after a correction, the position can be corrected again by repeating steps S004 to S020.
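Read as a whole, steps S004 to S020 form a simple measure-and-correct feedback loop. The following is a minimal sketch of that loop, not the patent's implementation: the positions, the tolerance, and the effect of one correction step are simulated stand-ins for what the captured image processing unit 11, the flying body control unit 12, and the projector control unit 25 would provide on real hardware.

```python
import numpy as np

TOLERANCE = 1.0  # the "predetermined value": here, one pixel in the captured image

# Simulated quantities; on the vehicle these would come from the captured image
# processing unit 11 (positions) and the flight/projector controllers (motion).
target = np.array([120.0, 80.0])   # target projection position (pixels)
current = np.array([100.0, 50.0])  # current projection position (pixels)
gain = 0.8                         # fraction of the commanded correction realized

while True:                                # S004: project the image 8
    diff = target - current                # S010/S012: extract positions, take the difference
    if np.linalg.norm(diff) <= TOLERANCE:  # S014: within the predetermined value?
        print("projection complete at", current)
        break
    current = current + gain * diff        # S016-S020: apply the correction amount
```

Because the loop re-measures after every move, it also recovers automatically when the vehicle drifts after a correction, as noted above.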
- In the flow described above, the camera 7 acquires a single captured image in each pass, and the correction amount calculation unit 17 calculates the correction amount with reference to that single captured image. However, a plurality of captured images may instead be acquired at different times.
- In that case, the correction amount calculation unit 17 can calculate the correction amount by extracting the change over time of the projection positions of the projection images in a plurality of captured images captured at different times by the same camera 7.
- For example, the correction amount calculation unit 17 can refer to a plurality of captured images captured while the projection direction and the projection position are being changed, and calculate the direction and magnitude of the correction by analyzing the relative positional relationship between the projection position of the projection image and the target projection position in those images, together with its change over time. In other words, by tentatively rotating or translating the unmanned aerial vehicle 1 and extracting the direction and magnitude of the resulting shift of the projection position, the direction and magnitude of the correction can be calculated.
- FIG. 4 is a diagram for explaining Example 1 of projection position correction by the projection control unit 6.
- In Example 1, the projector 3 projects a single point as the image.
- The camera 7 acquires a captured image including the current projection position 20 of that point.
- The captured image processing unit 11 extracts, from the captured image supplied from the camera 7, the current projection position 20 as a first feature point and the target projection position 21 as a second feature point, and also extracts the projection marks 22a to 22d as feature points.
- Here, the projection marks are one or more landmarks referred to in order to determine the value of the projection position 20 in the world coordinate system.
- A projection mark is, for example, placed near the target projection position.
- The correction amount calculation unit 17 determines the specific values of the actual coordinates (x, y, z) of the current projection position 20 of the point using the coordinates of the projection marks 22a to 22d.
- The specific values of the actual coordinates (x', y', z') of the target projection position 21 are determined in advance by the projection control unit 6 with reference to the projection position designation information. After determining the specific values of (x, y, z), the correction amount calculation unit 17 calculates the correction amount according to, for example, the method of Calculation Example 1 or 2 described later.
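The patent does not prescribe how the world coordinates of the projection position are recovered from the marks 22a to 22d. One plausible method, assuming the marks and the projected point lie on a common plane (such as the warehouse floor), is to fit a pixel-to-plane homography to the marks and apply it to the projected point. The pixel and world coordinates below are invented for illustration.

```python
import numpy as np

def fit_homography(px, world):
    """Fit (by DLT) a 3x3 homography mapping pixel coords to planar world coords.
    px, world: (N, 2) arrays of N >= 4 correspondences, e.g. the marks 22a-22d,
    whose world positions are assumed to be known in advance."""
    rows = []
    for (u, v), (x, y) in zip(px, world):
        rows.append([u, v, 1, 0, 0, 0, -x * u, -x * v, -x])
        rows.append([0, 0, 0, u, v, 1, -y * u, -y * v, -y])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return vt[-1].reshape(3, 3)  # right null vector = homography, up to scale

def pixel_to_world(H, u, v):
    x, y, w = H @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])  # world (x, y) on the reference plane

# Marks 22a-22d: positions in the captured image and known world positions (metres).
marks_px = np.array([[210, 310], [420, 305], [425, 520], [205, 515]])
marks_world = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

H = fit_homography(marks_px, marks_world)
print(pixel_to_world(H, 320, 410))  # world coords of the current projection point 20
```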
- In the calculation examples below, the projection position is corrected by translational or rotational motion of the unmanned aerial vehicle 1, but this does not limit the present embodiment.
- The method of projection position correction is not limited to these examples.
- For example, the correction amount calculation unit 17 may calculate the correction amount by measuring it directly from image processing or from the measurement result of a 3D sensor.
- The camera coordinate system is a coordinate system whose origin is the position of the CCD or lens of the camera 7 and whose Z axis is the optical axis direction of the camera 7, with the X and Y axes chosen as either a right-handed or a left-handed system.
- The projector coordinate system is a coordinate system whose origin is the position of the mirror or light source of the projector 3 and whose Z axis is the optical axis direction of the projector 3, with the X and Y axes chosen likewise.
- The world coordinate system is a coordinate system fixed in the space in which the unmanned aerial vehicle 1 moves; in the present embodiment, it is a three-dimensional coordinate system fixed in the warehouse in which the unmanned aerial vehicle 1 moves.
- The transformation from camera coordinates to projector coordinates is expressed by the matrices R_pc and T_pc, where R_pc represents the rotation and T_pc the translation.
- Likewise, the transformation from the world coordinate system to the camera coordinate system is expressed by R_cw and T_cw, where R_cw represents the rotation and T_cw the translation.
- Using the world coordinates (x_w, y_w, z_w) and Equation 1 below, the correction amount calculation unit 17 can convert to projector coordinates (x_p, y_p, z_p).
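Equation 1 itself is not reproduced in this text. A plausible reconstruction from the definitions above, chaining the world-to-camera transform (R_cw, T_cw) with the camera-to-projector transform (R_pc, T_pc), is:

$$
\begin{pmatrix} x_p \\ y_p \\ z_p \end{pmatrix}
= R_{pc}\left( R_{cw}\begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix} + T_{cw} \right) + T_{pc}
\qquad \text{(Equation 1, reconstructed)}
$$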
- Using Equation 1, the correction amount calculation unit 17 converts the coordinates (x', y', z') of the target projection position and the coordinates (x, y, z) of the current projection position in the world coordinate system into the coordinates (x_g, y_g, z_g) of the target projection position and the coordinates (x_a, y_a, z_a) of the current projection position in the projector coordinate system.
- The correction amount calculation unit 17 then determines the transformation matrices R* and T* by substituting the coordinates (x_g, y_g, z_g) of the target projection position and the coordinates (x_a, y_a, z_a) of the current projection position in the projector coordinate system into Equation 2 below, where R* represents the rotation and T* the translation.
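Equation 2 is likewise missing from this text. A reconstruction consistent with the surrounding description is the rigid transform that carries the current projection position onto the target projection position in projector coordinates:

$$
\begin{pmatrix} x_g \\ y_g \\ z_g \end{pmatrix}
= R^{*}\begin{pmatrix} x_a \\ y_a \\ z_a \end{pmatrix} + T^{*}
\qquad \text{(Equation 2, reconstructed)}
$$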
- In Calculation Example 1, the correction amount ΔT calculated by the correction amount calculation unit 17 is supplied to the flying body control unit 12, and the flying body control unit 12 translates the unmanned aerial vehicle by the amount indicated by ΔT, so that projection onto the target projection position can be performed.
- In Calculation Example 2, the coordinates (x', y', z') of the target projection position and the coordinates (x, y, z) of the current projection position in the world coordinate system are converted into the coordinates (x_g, y_g, z_g) of the target projection position and the coordinates (x_a, y_a, z_a) of the current projection position in the projector coordinate system, and are substituted into Equation 2 in the same manner as in Calculation Example 1. Equations 3 and 4 can then be used.
- The correction amount ΔR calculated by the correction amount calculation unit 17 is supplied to the flying body control unit 12, and the flying body control unit 12 rotates the unmanned aerial vehicle by the amount indicated by ΔR, so that projection onto the target projection position can be performed.
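For a single projected point, the two correction amounts can be sketched directly: ΔT as the straight difference of the two positions, and ΔR as a rotation about the projector origin that turns the current projection ray onto the target ray. Since Equations 3 and 4 are not reproduced in this text, the axis-angle (Rodrigues) construction below is an assumption, not the patent's formula.

```python
import numpy as np

def translation_correction(p_current, p_target):
    """Calculation Example 1: Delta-T that moves the current point onto the target."""
    return np.asarray(p_target) - np.asarray(p_current)

def rotation_correction(p_current, p_target):
    """Calculation Example 2 (sketch): Delta-R as the rotation about the projector
    origin turning the current projection ray onto the target ray (Rodrigues)."""
    a = p_current / np.linalg.norm(p_current)  # current ray direction
    b = p_target / np.linalg.norm(p_target)    # target ray direction
    v = np.cross(a, b)
    s, c = np.linalg.norm(v), np.dot(a, b)
    if s < 1e-12:                              # rays already aligned
        return np.eye(3)
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + vx + vx @ vx * ((1 - c) / s**2)

# Current and target projection positions in the projector coordinate system.
p_a = np.array([0.2, -0.1, 2.0])  # (x_a, y_a, z_a)
p_g = np.array([0.5,  0.3, 2.0])  # (x_g, y_g, z_g)
print(translation_correction(p_a, p_g))  # Delta-T
print(rotation_correction(p_a, p_g))     # Delta-R, a 3x3 rotation matrix
```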
- FIG. 5 is a diagram for explaining Example 2 of projection position correction.
- FIG. 5(a) shows an image 24 projected by the projector 3.
- FIG. 5(b) is an example of a captured image captured by the camera 7, which includes the current projection position 20 and the target projection position 21.
- The camera 7 acquires a captured image including the current projection position 20.
- The captured image processing unit 11 extracts, from the captured image supplied from the camera 7, the feature points 23a to 23d (first feature points) at the current projection position 20 of the image 24 and the projection marks (second feature points) 22a to 22d.
- The conversion from camera coordinates to projector coordinates may be performed using Equation 1 described above.
- FIG. 6 is a block diagram showing a schematic configuration of a projection system 5 according to a second embodiment.
- the projection system 5 includes the unmanned aerial vehicle 1 and a server 60.
- In the present embodiment, the unmanned aerial vehicle 1 includes blades 2a to 2d, a projector 3, motors 4a to 4d, a control device 10, a projection information acquisition unit 14, and a correction amount reception unit 9.
- The control device 10 includes a projection information storage unit 13, a captured image processing unit 11a, a flying body control unit 12, and a projector control unit 25.
- Note that the projector 3 is not an essential component of the control device 10.
- The server 60 includes a projection control unit 6a, a camera 7, a captured image acquisition unit 16, and a correction amount transmission unit 19.
- The projection control unit 6a includes a captured image processing unit 11b and a correction amount calculation unit 17.
- In the present embodiment, the camera 7 provided in the server 60 captures the image projected by the projector 3 provided in the unmanned aerial vehicle 1, the server 60 performs the projection position extraction and correction amount calculation processes described in Embodiment 1, and the calculated correction amount is transmitted from the server 60 to the unmanned aerial vehicle 1.
- FIG. 7 is a sequence diagram showing a flow of projection position correction processing.
- Steps S004 to S020 may be repeated multiple times until the difference between the target projection position and the current projection position is within a certain range.
- Step S004: As in Embodiment 1, the projector 3 provided in the unmanned aerial vehicle 1 projects the image 8.
- Step S006: The camera 7 provided in the server 60 acquires an image including the projection image of the image 8 projected in step S004 as a captured image.
- the camera 7 supplies the captured image to the captured image acquisition unit 16.
- Step S008: The captured image acquisition unit 16 included in the server 60 acquires the captured image from the camera 7.
- the captured image acquisition unit 16 supplies the acquired captured image to the captured image processing unit 11.
- Step S010: The captured image processing unit 11 extracts the current projection position and the target projection position from the captured image.
- the captured image processing unit 11 supplies the current projection position and the target projection position to the correction amount calculation unit 17.
- the specific extraction process of the projection position by the captured image processing unit 11 is the same as that of the first embodiment.
- Step S012: The correction amount calculation unit 17 calculates the difference between the current projection position and the target projection position.
- Step S016: The correction amount calculation unit 17 calculates a correction amount of the projection position so as to minimize the difference between the current projection position and the target projection position.
- the specific content of step S016 is the same as that of the first embodiment.
- Step S018: The correction amount calculation unit 17 sends the correction amount obtained in step S016 to the correction amount transmission unit 19.
- the correction amount transmission unit 19 transmits the correction amount to the correction amount reception unit 9 of the unmanned aerial vehicle 1.
- the correction amount reception unit 9 receives the correction amount from the correction amount transmission unit 19, and supplies the correction amount to the aircraft control unit 12 and the projector control unit 25.
- Step S020: The flying body control unit 12 refers to the correction amount and controls the motors 4a to 4d so that the projection position approaches the target projection position.
- Alternatively, or in addition, the projector control unit 25 refers to the correction amount and controls the projection direction of the projector 3 so as to bring the projection position close to the target projection position.
- The control of the unmanned aerial vehicle 1 and the projector 3 is the same as in Embodiment 1.
- In the present embodiment, the camera 7 is provided separately from the unmanned aerial vehicle 1; more specifically, the camera 7 is provided on the server 60 side. It is therefore also possible to provide one server 60 for each area and have that single server 60 perform projection position correction for a plurality of unmanned aerial vehicles in the area. Furthermore, since the projection position extraction process and the correction amount calculation process are performed in the server 60, the control device 10 can be realized with a relatively simple configuration.
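The patent leaves the transport between the correction amount transmission unit 19 and the correction amount reception unit 9 unspecified. The sketch below uses UDP with a JSON payload purely for illustration; the address, port, and message fields are invented.

```python
import json
import socket

UAV_ADDR = ("192.168.1.50", 9000)  # invented address/port for the unmanned aerial vehicle

def send_correction(delta_t, delta_r):
    """Server 60 side: a stand-in for the correction amount transmission unit 19."""
    payload = json.dumps({"dT": list(delta_t), "dR": [list(row) for row in delta_r]})
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload.encode(), UAV_ADDR)

def receive_correction():
    """UAV side: a stand-in for the correction amount reception unit 9. The result
    would be forwarded to the flying body control unit 12 and projector control unit 25."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", UAV_ADDR[1]))
    data, _ = sock.recvfrom(4096)
    msg = json.loads(data.decode())
    return msg["dT"], msg["dR"]
```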
- The control blocks of the control device 10 may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software.
- In the latter case, the control device 10 includes a computer that executes the instructions of a program, that is, software that implements each function.
- the computer includes, for example, one or more processors, and a computer readable recording medium storing the program.
- the processor reads the program from the recording medium and executes the program to achieve the object of the present invention.
- As the processor, for example, a CPU (Central Processing Unit) can be used.
- As the recording medium, a "non-transitory tangible medium" can be used, for example a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit.
- A RAM (Random Access Memory) into which the program is loaded may further be provided.
- The program may be supplied to the computer via any transmission medium (such as a communication network or broadcast waves) capable of transmitting the program.
- Note that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
Abstract
The present invention addresses the problem of suitably projecting an image onto a target position. A control device (10) controls a moving body (1) provided with a projection unit (3). The control device (10) includes: a captured image acquisition unit (16) that acquires a captured image including a projected image (8); and a projection control unit (6) that refers to the captured image and controls at least one of the position of the moving body (1), the attitude of the moving body (1), and the direction of the projection unit (3).
Description
One aspect of the present invention relates to a control device, a flying body, and a control program.
BACKGROUND ART
Conventionally, a technique for projecting an image with a moving body provided with a projection unit is known. Patent Document 1 discloses a technique for presenting information using a projector mounted on an unmanned aerial vehicle.
However, with the conventional techniques described above, it has not been easy to project an image onto a target position with a projection unit provided on a moving body.
Therefore, one embodiment of the present invention aims to suitably project an image onto a target position.
In order to solve the above problem, a control device according to aspect 1 of the present invention is a control device that controls a moving body provided with a projection unit, and includes: a captured image acquisition unit that acquires a captured image including a projection image projected by the projection unit; and a projection control unit that refers to the captured image and controls at least one of the position of the moving body, the attitude of the moving body, and the direction of the projection unit.
According to this configuration, a control device that suitably projects an image onto a target position can be realized.
In the control device according to aspect 2 of the present invention, the projection control unit may extract, from the captured image, the projection position of the projection image and a target projection position, and control the projection position of the moving body so that it becomes the target projection position.
According to this configuration, a control device that projects an image such that the projection position of the moving body becomes the target projection position can be realized.
In the control device according to aspect 3 of the present invention, the projection control unit may determine a correction amount of the projection position by calculating the difference between the current projection position of the projection image and the target projection position.
According to this configuration, a control device that projects an image such that the projection position of the moving body becomes the target projection position can be realized.
In the control device according to aspect 4 of the present invention, the projection control unit may determine the correction amount by extracting, from the captured image, a first feature point indicating the current projection position of the projection image and a second feature point indicating the target projection position, and calculating the difference in position between the first feature point and the second feature point.
According to this configuration, by calculating the difference between the first feature point at the current projection position and the second feature point at the target projection position to obtain the correction amount, a control device that projects the image more accurately can be realized.
The control device according to aspect 5 of the present invention may control the projection position projected by the moving body with reference to a plurality of captured images captured at different times by the same imaging device.
According to this configuration, the correction amount of the projection position can be calculated from the change of the projection image over time.
The control device according to aspect 6 of the present invention may control the projection position projected by the moving body by controlling the position or attitude of the moving body.
According to this configuration, the control device can control the projection position by controlling the position or attitude of the moving body.
The control device according to aspect 7 of the present invention may control the projection position projected by the moving body by controlling the projection direction of the projection unit.
According to this configuration, the control device can control the projection position by controlling the projection direction of the projection unit.
In the control device according to aspect 8 of the present invention, the captured image acquisition unit may acquire, as the captured image, an image captured by an imaging device provided in the moving body.
According to this configuration, the control device can control the projection position using a captured image taken by an imaging device mounted on the moving body.
In the control device according to aspect 9 of the present invention, the captured image acquisition unit may acquire, as the captured image, an image captured by an imaging device provided outside the moving body.
According to this configuration, the control device can control the projection position using a captured image taken by an imaging device provided outside the moving body.
The control device according to aspect 10 of the present invention may control a flying body as the moving body.
According to this configuration, a control device that projects an image onto a target position using a flying body as the moving body can be realized.
In the control device according to aspect 11 of the present invention, the image projected by the projection unit may include information for assisting work by a worker.
According to this configuration, a control device that presents information for assisting a worker's work at an accurate position can be realized.
A flying body according to aspect 12 of the present invention is a flying body including a projection unit, a captured image acquisition unit that acquires a captured image including a projection image projected by the projection unit, and a projection control unit that refers to the captured image and controls at least one of the position of the flying body, the attitude of the flying body, and the direction of the projection unit.
According to this configuration, a flying body that projects an image onto a target position can be realized by controlling at least one of the position of the flying body, the attitude of the flying body, and the direction of the projection unit.
A control program according to aspect 13 of the present invention is a control program for causing a computer to function as the control device according to one aspect of the present invention, the control program causing the computer to function as the captured image acquisition unit and the projection control unit.
According to this configuration, the same effects as those of aspect 1 can be obtained.
According to one aspect of the present invention, a control device that suitably projects an image onto a target position can be realized.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. However, unless otherwise specifically stated, the configurations described in the embodiments are merely illustrative examples and are not intended to limit the scope of the present invention to them alone. For convenience of explanation, members having the same functions as members shown in each embodiment are given the same reference signs, and their description is omitted as appropriate.
[Embodiment 1]
(Configuration of the unmanned aerial vehicle 1)
FIG. 1 is a view showing an outline of an unmanned aerial vehicle 1 (moving body, flying body) according to Embodiment 1 of the present invention. Note that the moving body is not limited to the unmanned aerial vehicle 1; depending on the application, it may also be a non-flying moving body such as a self-propelled robot.
When projection is performed using a flying body such as the unmanned aerial vehicle 1, it is generally difficult to project onto a target position. However, the control device 10 according to the present embodiment makes it possible to project an image onto a target position even with a flying body.
As shown in FIG. 1, the unmanned aerial vehicle 1 includes blades 2a to 2d, a camera 7 (imaging device), and a projector 3 (projection unit). Although not shown in FIG. 1, motors for driving the blades 2a to 2d are present inside the unmanned aerial vehicle 1 and control the flight of the unmanned aerial vehicle 1.
The projector 3 projects an image 8. In the following, the term "image" by itself refers to the image projected by the projector 3. The image 8 may include information for assisting work by a worker. In the present embodiment, the information for assisting work is, for example, an image that helps a worker pick an object smoothly in a warehouse. By accurately projecting the image 8 onto the designated position, the projector 3 can precisely present the position of an object in the warehouse to the worker via the image 8.
Note that the projector 3 may project information including instructions not only to a human worker but also to other devices, work robots, and the like. In that case, the information projected by the projector 3 may be configured to include the instructions to the other devices or work robots as optical information.
The information projected by the projector 3 is not particularly limited as long as it is information using light, in other words, information transmitted by light. The wavelength of the light used is also not particularly limited; for example, the light may be outside the visible region, such as infrared or ultraviolet light. However, when instructions to a human worker are included in the information projected by the projector 3, the light preferably includes the visible region.
In the present embodiment, the camera 7 is an optical detection element and captures images in the visible light region, for example. However, the wavelength range captured by the camera 7 is not particularly limited; as long as the camera 7 can detect the light that the projector 3 uses to project information, it may detect light outside the visible region.
(Configuration of the unmanned aerial vehicle 1)
The configuration of the unmanned aerial vehicle 1 will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the configuration of the unmanned aerial vehicle 1. As shown in FIG. 2, the unmanned aerial vehicle 1 includes blades 2a to 2d, a projector 3 (projection unit), motors 4a to 4d, a control device 10, a projection information acquisition unit 14, and an imaging device 18. The control device 10 includes a projection information storage unit 13, a captured image acquisition unit 16, and a projection control unit 6. The projection control unit 6 includes a captured image processing unit 11, a correction amount calculation unit 17, a flying body control unit 12, and a projector control unit 25.
In the present embodiment, the unmanned aerial vehicle 1 uses the projection information acquisition unit 14 to acquire projection information, which is information about the image 8 to be projected, from an external device or the like. As one example, the projection information also includes projection position designation information that designates the projection position of the image 8; however, this does not limit the present embodiment, and the projection information acquisition unit 14 may acquire the projection position designation information separately from another device or the like. Here, the projection position designation information is information that designates the projection position at which the projection image is to be projected. The control device 10 controls the position and projection direction of the unmanned aerial vehicle 1 with reference to the projection position designation information.
The projection information may also be information that changes over time in accordance with the situation of the unmanned aerial vehicle 1.
The projection information acquisition unit 14 supplies the acquired projection information to the projection information storage unit 13. The projection information storage unit 13 stores the projection information.
The captured image processing unit 11 supplies the projection information acquired from the projection information storage unit 13 to the projector 3 (projection unit). The projector 3 projects the image 8.
The camera 7 (imaging device) acquires an image including the projection image of the image 8 as a captured image. Hereinafter, the term "captured image" refers to an image captured by the camera 7. Note that, depending on the imaging direction of the camera 7 and the projection direction of the projector 3, the captured image may not contain the projection image of the image projected by the projector 3 at all.
The imaging device 18 supplies the captured image to the captured image acquisition unit 16 of the control device 10. The captured image acquisition unit 16 supplies the captured image acquired from the camera 7 to the captured image processing unit 11 of the projection control unit 6. The captured image processing unit 11 refers to the projection information and extracts the current projection position and the target projection position of the image 8 from the captured image. Here, the target projection position is the projection position designated by the projection position designation information. The specific process by which the captured image processing unit 11 extracts the projection positions will be described later. In one example of the control, the unmanned aerial vehicle 1 is first moved to an approximate position with reference to a map of the warehouse, and the target projection position is then extracted from the captured image. Under such control, the target projection position is not included in the captured image before the vehicle moves to the approximate position, and is included in the captured image after the move.
The captured image processing unit 11 supplies the current projection position and the target projection position extracted from the captured image to the correction amount calculation unit 17. The correction amount calculation unit 17 calculates the difference between the current projection position and the target projection position, and from it calculates the correction amount of the projection position. The specific calculation of the correction amount will be described later. The "difference" between the current projection position and the target projection position refers to the magnitude and direction of the shift required to move the current projection position to the target projection position.
The correction amount calculation unit 17 supplies the calculated correction amount to the flying object control unit 12 and the projector control unit 25. The flying object control unit 12 refers to the correction amount and controls the motors 4a to 4d, thereby controlling the drive of the blades 2a to 2d and the position or attitude of the unmanned aerial vehicle 1. The projector control unit 25 controls the projection position of the image 8 by controlling the projection direction of the projector 3. Note that controlling the "attitude" of a moving body means orienting the moving body toward a target projection direction; this also includes orienting a self-propelled or similar robot that is not a flying body toward a target projection direction.
In this way, the projection direction of the projector 3 may be controlled by changing only the projection direction of the projector 3 itself, without moving the position or attitude of the unmanned aerial vehicle 1; conversely, it may be controlled by changing the position or attitude of the unmanned aerial vehicle 1 without changing the projection direction of the projector 3 relative to the unmanned aerial vehicle 1. In a configuration in which the relative projection direction of the projector 3 itself is not changed, the projector control unit 25 is not an essential component of the projection control unit 6.
The flying object control unit 12 and the projector control unit 25 determine how to control the unmanned aerial vehicle 1 and the projector 3 in consideration of the performance of the unmanned aerial vehicle, the projection position, the ease of holding the projection attitude, and the like.
The flow of the projection position correction process performed by the unmanned aerial vehicle 1 will be described with reference to FIG. 3. FIG. 3 is a flowchart showing the flow of the projection position correction process.
(Step S004)
First, in step S004, the projector 3 provided in the unmanned aerial vehicle 1 projects the image 8 based on the projection information supplied from the captured image processing unit 11.
(Step S006)
Subsequently, in step S006, the camera 7 acquires, as a captured image, an image including the projected image of the image 8 projected in step S004. The camera 7 supplies the captured image to the captured image acquisition unit 16.
(Step S008)
Subsequently, in step S008, the captured image acquisition unit 16 acquires the captured image acquired by the camera 7 and supplies it to the captured image processing unit 11.
(Step S010)
Subsequently, in step S010, the captured image processing unit 11 extracts the current projection position and the target projection position from the captured image and supplies them to the correction amount calculation unit 17. The specific extraction processing of the projection position by the captured image processing unit 11 will be described later.
(Step S012)
Subsequently, in step S012, the correction amount calculation unit 17 calculates the difference between the current projection position and the target projection position.
(Step S014)
Subsequently, in step S014, the correction amount calculation unit 17 determines whether the difference between the current projection position and the target projection position is within a predetermined value.
If the difference between the current projection position and the target projection position is within the predetermined value, the correction amount calculation unit 17 determines that projection onto the target projection position has been achieved, and the projection position correction process ends. If the difference is larger than the predetermined value, the process proceeds to step S016. The "predetermined value" may be set according to the application in which the control device 10 is used; as one example, the correction amount calculation unit 17 may calculate the correction amount using one pixel in the captured image as the predetermined value.
(Step S016)
Subsequently, in step S016, the correction amount calculation unit 17 calculates the correction amount of the projection position so as to minimize the difference between the current projection position and the target projection position. Note that "minimizing" the difference does not necessarily mean that the current projection position and the target projection position coincide exactly; if projection can be performed such that the difference falls within the "predetermined value" described for step S014, the projection control unit 6 determines that projection has been performed with sufficient accuracy.
(Step S018)
Subsequently, in step S018, the correction amount calculation unit 17 supplies the correction amount obtained in step S016 to the flying object control unit 12 and the projector control unit 25.
(Step S020)
Subsequently, in step S020, the flying object control unit 12 refers to the correction amount supplied from the correction amount calculation unit 17 and controls the motors 4a to 4d so as to bring the projection position closer to the target projection position. The projector control unit 25 refers to the correction amount supplied from the correction amount calculation unit 17 and controls the projection direction of the projector 3 so as to bring the projection position closer to the target projection position. The control of the unmanned aerial vehicle 1 and the projector 3 has been described above and is not repeated here.
The process then returns to step S004, and projection is performed again from the corrected position; steps S006 to S014 are then performed in order. In step S014, if the difference between the current projection position and the target projection position is within the predetermined value, the projection control unit 6 determines that projection onto the target projection position has been achieved and ends the projection position correction process. If the difference is larger than the predetermined value, the projection control unit 6 performs steps S016 to S020 in order and returns to step S004 once more.
The number of times the processing of steps S004 to S020 is repeated is not particularly limited; it can be repeated until the difference between the current projection position and the target projection position falls within the predetermined value. Even if the position of the unmanned aerial vehicle 1 shifts after position correction, the position can be corrected again by repeating steps S004 to S020.
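The loop just described lends itself to a compact sketch. The following Python function is a minimal illustration and not part of the patent; the callables passed in (project, capture, extract_positions, compute_correction, apply_correction) are assumed wrappers around the units described above.

```python
def correct_projection(project, capture, extract_positions,
                       compute_correction, apply_correction,
                       projection_info, tolerance=1.0):
    """Sketch of the S004-S020 correction loop (assumed helper callables)."""
    while True:
        project(projection_info)                          # S004: project image 8
        captured = capture()                              # S006-S008: get captured image
        current, target = extract_positions(captured)     # S010: extract both positions
        dx, dy = current[0] - target[0], current[1] - target[1]
        if (dx * dx + dy * dy) ** 0.5 <= tolerance:       # S012-S014: within the value?
            return                                        # target reached; loop ends
        correction = compute_correction(current, target)  # S016: correction amount
        apply_correction(correction)                      # S018-S020: move drone/projector
```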
The above description assumed that the camera 7 acquires a single captured image per loop iteration (step S006) and that the correction amount calculation unit 17 calculates the correction amount with reference to that single image; however, the camera 7 may acquire a plurality of captured images at mutually different times. In this case, the correction amount calculation unit 17 can calculate the correction amount by varying the projection position or projection direction, referring to the projection positions of the projected image in a plurality of images captured at different times by the same camera 7, and extracting the change in the projection position over time. More specifically, the correction amount calculation unit 17 can, for example, refer to a plurality of captured images taken while the projection direction or projection position is varied, and calculate the direction and magnitude of the correction by analyzing the relative positional relationship between the projection position of the projected image and the target projection position in those images and how it changes over time. In this way, by rotating or translating the unmanned aerial vehicle 1 experimentally and extracting the direction and magnitude of the resulting shift in the projection position, the direction and magnitude of the correction can be calculated.
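As a rough numerical sketch of this idea (all coordinates are hypothetical pixel values, not taken from the patent): compare the projection position across two captures that bracket a small test motion, then scale the next command by how well that motion moved the projection toward the target.

```python
import numpy as np

p_t0 = np.array([315.0, 248.0])    # projection position extracted at time t0
p_t1 = np.array([322.0, 243.0])    # projection position after a small test motion
target = np.array([400.0, 200.0])  # target projection position (from the marks)

shift = p_t1 - p_t0        # how the test motion moved the projected image
residual = target - p_t1   # remaining offset to the target

# Least-squares gain: how many repetitions of the test motion best
# reduce the residual along the observed shift direction.
gain = float(residual @ shift) / (float(shift @ shift) + 1e-9)
print(f"repeat the test motion about {gain:.2f} times")
```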
(Example 1 of projection position correction)
FIG. 4 is a diagram for explaining Example 1 of projection position correction by the projection control unit 6. In this example, the projector 3 projects a single point as the image. The camera 7 acquires a captured image that includes the current projection position 20 of that point. From the captured image supplied by the camera 7, the captured image processing unit 11 extracts the current projection position 20 as a first feature point, the target projection position 21 as a second feature point, and the projection marks 22a to 22d as further feature points. Here, the projection marks are one or more marks that are referred to in order to determine the value of the projection position 20 in the world coordinate system; as an example, they are arranged near the target projection position.
Here, to simplify the description, it is assumed that the projection marks 22a to 22d all lie on the same plane and that their coordinates are known in advance; this does not limit the correction amount calculation processing in the present embodiment. It is also assumed that the projection is performed onto a plane and that the current projection position 20 and the target projection position 21 lie on the same plane as the projection marks 22a to 22d; this, too, does not limit the correction amount calculation processing in the present embodiment.
The correction amount calculation unit 17 uses the coordinates of the projection marks 22a to 22d to determine the specific values of the actual coordinates (x, y, z) of the current projection position 20 of the point. The specific values of the actual coordinates (x', y', z') of the target projection position 21 are determined in advance by the projection control unit 6 with reference to the projection position designation information. After determining the specific values of (x, y, z), the correction amount calculation unit 17 calculates the correction amount, for example, according to the method of Calculation Example 1 or 2 described later.
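Under the plane assumption above, one plausible realization (a sketch using OpenCV; the pixel and plane coordinates below are hypothetical) is to fit a homography from the marks' image positions to their known positions on the projection plane, then map the detected projection point through it:

```python
import cv2
import numpy as np

# Marks 22a-22d: detected pixel positions paired with known plane coordinates.
img_pts   = np.array([[ 80,  60], [560,  70], [555, 420], [ 75, 415]], dtype=np.float32)
plane_pts = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 1.5], [0.0, 1.5]], dtype=np.float32)

H, _ = cv2.findHomography(img_pts, plane_pts)

# Map the detected projection point (pixel) to its coordinates on the plane.
p_img = np.array([[[310.0, 240.0]]], dtype=np.float32)
p_plane = cv2.perspectiveTransform(p_img, H)
print(p_plane.ravel())  # (x, y) of the current projection position on the plane
```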
In the present embodiment, to simplify the description, the projection position is corrected by translational or rotational motion of the unmanned aerial vehicle 1; this does not limit the present embodiment.
The method of projection position correction is not limited to the above example. As another method, the correction amount calculation unit 17 may, for example, calculate the correction amount by measuring it directly from image processing or from the measurement result of a 3D sensor.
(Calculation Example 1: obtaining only the translation amount ΔT, without considering rotational motion)
One example of how the correction amount calculation unit 17 calculates the correction amount under the conditions described with reference to FIG. 4 will now be described.
In the following description, it is assumed that the camera coordinates and the projector coordinates have been calibrated, and that the relationship between the two coordinate systems (how one is transformed into the other) is known. The camera coordinate system is a coordinate system whose origin is the position of the CCD or lens of the camera 7, whose Z axis is the direction of the optical axis of the camera 7, and whose X and Y axes are set as a right-handed or left-handed system. The projector coordinate system is a coordinate system whose origin is the position of the mirror or light source of the projector 3, whose Z axis is the direction of the optical axis of the projector 3, and whose X and Y axes are set as a right-handed or left-handed system. The world coordinate system is a coordinate system set in the space in which the unmanned aerial vehicle 1 moves; in the present embodiment, it is a three-dimensional coordinate system set in the warehouse in which the unmanned aerial vehicle 1 moves.
In the following, the transformation matrices from camera coordinates to projector coordinates are written Rpc and Tpc, where Rpc represents the rotation and Tpc the translation. Similarly, the transformation matrices from the world coordinate system to the camera coordinate system are written Rcw and Tcw, where Rcw represents the rotation and Tcw the translation.
Using this notation, the correction amount calculation unit 17 can convert world coordinates (xw, yw, zw) into projector coordinates (xp, yp, zp) by means of Equation 1 below.
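Given the definitions above, Equation 1 can be read as the composition of the two calibrated transforms; the explicit form below is a reconstruction consistent with the notation, not a verbatim quotation of the patent's equation:

$$\begin{pmatrix} x_p \\ y_p \\ z_p \end{pmatrix} = R_{pc}\left( R_{cw}\begin{pmatrix} x_w \\ y_w \\ z_w \end{pmatrix} + T_{cw} \right) + T_{pc} \tag{1}$$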
Using Equation 1, the correction amount calculation unit 17 converts the coordinates (x', y', z') of the target projection position and the coordinates (x, y, z) of the current projection position in the world coordinate system into the coordinates (xg, yg, zg) of the target projection position and the coordinates (xa, ya, za) of the current projection position in the projector coordinate system.
Next, the correction amount calculation unit 17 determines the transformation matrices R* and T* by substituting the coordinates (xg, yg, zg) of the target projection position and the coordinates (xa, ya, za) of the current projection position in the projector coordinate system into Equation 2 below, where R* represents a rotation and T* a translation.
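A form of Equation 2 consistent with this description (again a reconstruction rather than a verbatim quotation) relates the two projector-coordinate positions through the sought correction:

$$\begin{pmatrix} x_g \\ y_g \\ z_g \end{pmatrix} = R^{*}\begin{pmatrix} x_a \\ y_a \\ z_a \end{pmatrix} + T^{*} \tag{2}$$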
In general, multiple pairs of R* and T* satisfy Equation 2; in this calculation example, only the translation amount ΔT is determined, without considering rotational motion of the unmanned aerial vehicle 1. In other words, the correction amount calculation unit 17 sets R* to the identity matrix and calculates only the correction amount T* = ΔT. The calculated correction amount ΔT is supplied to the flying object control unit 12, which translates the unmanned aerial vehicle by the amount indicated by the correction amount, so that projection onto the target projection position can be performed.
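Under the reconstructed form of Equation 2 above, fixing R* to the identity reduces the calculation to a single subtraction:

$$\Delta T = \begin{pmatrix} x_g \\ y_g \\ z_g \end{pmatrix} - \begin{pmatrix} x_a \\ y_a \\ z_a \end{pmatrix}$$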
(Calculation Example 2: obtaining only the rotation amount R, without considering translational motion)
Next, as Calculation Example 2 of the correction amount, a method of determining only the rotation amount R, without considering translational motion of the unmanned aerial vehicle 1, will be described.
First, as in Calculation Example 1, the coordinates (x', y', z') of the target projection position and the coordinates (x, y, z) of the current projection position in the world coordinate system are converted into the coordinates (xg, yg, zg) of the target projection position and the coordinates (xa, ya, za) of the current projection position in the projector coordinate system, and substituted into Equation 2 as in Calculation Example 1.
In this calculation example, T* is set to 0 and translational motion of the unmanned aerial vehicle 1 is not considered. The angle between the two projection position vectors (xg, yg, zg) and (xa, ya, za) in the projector coordinate system, and the rotation axis of the rotational motion of the unmanned aerial vehicle 1, can be obtained using Equations 3 and 4 below.
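Consistent with this description, Equations 3 and 4 are presumably the standard angle and axis between the two vectors (a reconstruction, not a verbatim quotation):

$$\theta = \arccos\!\left(\frac{(x_g, y_g, z_g)\cdot(x_a, y_a, z_a)}{\lVert (x_g, y_g, z_g)\rVert\;\lVert (x_a, y_a, z_a)\rVert}\right) \tag{3}$$

$$\boldsymbol{n} = \frac{(x_a, y_a, z_a)\times(x_g, y_g, z_g)}{\lVert (x_a, y_a, z_a)\times(x_g, y_g, z_g)\rVert} \tag{4}$$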
The correction amount calculation unit 17 calculates the correction amount R* = ΔR by substituting the rotation axis and rotation angle obtained from Equations 3 and 4 into Rodrigues' rotation formula.
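A minimal NumPy sketch of this step, assuming the two projector-coordinate vectors are already available; it follows the reconstructed Equations 3 and 4 above and Rodrigues' rotation formula:

```python
import numpy as np

def rotation_correction(a, g):
    """Rotation matrix carrying the current direction a onto the target
    direction g (both expressed in projector coordinates)."""
    a = a / np.linalg.norm(a)
    g = g / np.linalg.norm(g)
    axis = np.cross(a, g)               # rotation axis (Eq. 4, unnormalized)
    s = np.linalg.norm(axis)            # sin(theta)
    c = float(np.dot(a, g))             # cos(theta) (Eq. 3)
    if s < 1e-12:                       # already aligned (exactly opposite
        return np.eye(3)                # vectors would need a special case)
    axis = axis / s
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])  # cross-product matrix of the axis
    return np.eye(3) + s * K + (1.0 - c) * (K @ K)  # Rodrigues' formula
```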
The correction amount ΔR calculated by the correction amount calculation unit 17 is supplied to the flying object control unit 12, which rotates the unmanned aerial vehicle by the amount indicated by the correction amount, so that projection onto the target projection position can be performed.
(Example 2 of projection position correction)
FIG. 5 is a diagram for explaining Example 2 of projection position correction. FIG. 5(a) shows an image 24 projected by the projector 3. FIG. 5(b) is an example of a captured image captured by the camera 7, which includes the current projection position 20 and the target projection position 21.
The camera 7 acquires a captured image including the current projection position 20. From the captured image supplied by the camera 7, the captured image processing unit 11 extracts the feature points 23a to 23d (first feature points) within the current projection position 20 of the image 24, and the projection marks 22a to 22d (second feature points).
The captured image processing unit 11 associates each of the feature points 23a to 23d with each of the marks 22a to 22d. More specifically, the correction amount calculation unit 17 obtains a homography matrix from the pairs of corresponding points and decomposes it into a rotation matrix R (= ΔR) and a translation matrix T (= ΔT) using Zhang's method or the like. The correction amount calculation unit 17 can thereby obtain the correction amounts ΔR and ΔT. The conversion from camera coordinates to projector coordinates may be performed using Equation 1 described above.
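The following OpenCV sketch illustrates the recipe; the point coordinates and the intrinsic matrix K are placeholders, and cv2.decomposeHomographyMat implements the Malis-Vargas analytical decomposition rather than Zhang's method, though it serves the same role of splitting H into candidate rotations and translations:

```python
import cv2
import numpy as np

# Feature points 23a-23d of the projected image and the corresponding
# marks 22a-22d, both as pixel coordinates in the captured image (hypothetical).
projected_pts = np.array([[100, 120], [300, 118], [302, 330], [ 98, 333]], dtype=np.float32)
mark_pts      = np.array([[140, 160], [340, 155], [345, 370], [138, 372]], dtype=np.float32)

H, _ = cv2.findHomography(projected_pts, mark_pts)

# Intrinsic matrix from prior calibration (placeholder values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Decompose H into candidate (R, T, plane normal) triples; in practice one
# candidate is selected using visibility/cheirality constraints.
n_solutions, rotations, translations, normals = cv2.decomposeHomographyMat(H, K)
```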
Second Embodiment
The second embodiment will be described in detail below with reference to FIG. 6. In the following description, members already described in the above embodiment are given the same reference signs and their description is omitted; only the differences from the above embodiment are described.
(Configuration of the control system)
FIG. 6 is a block diagram showing a schematic configuration of the projection system 5 according to the second embodiment. In the present embodiment, the projection system 5 includes the unmanned aerial vehicle 1 and a server 60.
The unmanned aerial vehicle 1 includes the blades 2a to 2d, the projector 3, the motors 4a to 4d, the control device 10, the projection information acquisition unit 14, and the correction amount reception unit 9. The control device 10 includes the projection information storage unit 13, the captured image processing unit 11a, the flying object control unit 12, and the projector control unit 25. Note that the projector 3 is not an essential component of the control device 10.
The server 60 includes the projection control unit 6a, the camera 7, the captured image acquisition unit 16, and the correction amount transmission unit 19. The projection control unit 6a includes the captured image processing unit 11b and the correction amount calculation unit 17.
In the present embodiment, the camera 7 provided in the server 60 captures the image projected by the projector 3 provided in the unmanned aerial vehicle 1; the server 60 performs the projection position extraction processing and the correction amount calculation processing described in the first embodiment, and the calculated correction amount is transmitted from the server 60 to the unmanned aerial vehicle 1.
The flow of the projection position correction process by the projection system 5 according to the present embodiment will be described with reference to FIG. 7. FIG. 7 is a sequence diagram showing the flow of the projection position correction process. The following describes an example in which steps S004 to S020 are performed only once, but the present embodiment is not limited to this. As described in the first embodiment, steps S004 to S020 may be repeated multiple times until the difference between the target projection position and the current projection position falls within a predetermined value.
(Step S004)
As in the first embodiment, the projector 3 provided in the unmanned aerial vehicle 1 projects the image 8.
(Step S006)
Subsequently, the camera 7 provided in the server 60 acquires, as a captured image, an image including the projected image of the image 8 projected in step S004. The camera 7 supplies the captured image to the captured image acquisition unit 16.
(Step S008)
Subsequently, in step S008, the captured image acquisition unit 16 provided in the server 60 acquires the captured image acquired by the camera 7 and supplies it to the captured image processing unit 11.
(Step S010)
Subsequently, in step S010, the captured image processing unit 11 extracts the current projection position and the target projection position from the captured image and supplies them to the correction amount calculation unit 17. The specific extraction processing of the projection position by the captured image processing unit 11 is the same as in the first embodiment.
(Step S012)
Subsequently, in step S012, the correction amount calculation unit 17 calculates the difference between the current projection position and the target projection position.
(Step S016)
Subsequently, in step S016, the correction amount calculation unit 17 calculates the correction amount of the projection position so as to minimize the difference between the current projection position and the target projection position. The specific content of step S016 is the same as in the first embodiment.
(Step S018)
Subsequently, in step S018, the correction amount calculation unit 17 sends the correction amount obtained in step S016 to the correction amount transmission unit 19, which transmits it to the correction amount reception unit 9 of the unmanned aerial vehicle 1.
(Step S019)
The correction amount reception unit 9 receives the correction amount from the correction amount transmission unit 19 and supplies it to the flying object control unit 12 and the projector control unit 25.
(Step S020)
Subsequently, in step S020, the flying object control unit 12 refers to the correction amount and controls the motors 4a to 4d so as to bring the projection position closer to the target projection position. The projector control unit 25 refers to the correction amount and controls the projector 3 so as to bring the projection position closer to the target projection position. The control of the flying body 1 and the projector 3 is the same as in the first embodiment.
In the present embodiment, the camera 7 is provided somewhere other than on the unmanned aerial vehicle 1; more specifically, it is provided on the server 60 side. It is therefore possible to provide one server 60 per area and have that single server 60 perform projection position correction for a plurality of unmanned aerial vehicles in the area. Furthermore, since the projection position extraction processing and the correction amount calculation processing are performed in the server 60, the control device 10 can be realized with a comparatively simple configuration.
[Example of software implementation]
The control blocks of the control device 10 may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software.
In the latter case, the control device 10 includes a computer that executes the instructions of a program, which is software realizing each function. The computer includes, for example, one or more processors and a computer-readable recording medium storing the program. The object of the present invention is achieved when the processor of the computer reads the program from the recording medium and executes it. As the processor, a CPU (Central Processing Unit) can be used, for example. As the recording medium, a "non-transitory tangible medium" such as a ROM (Read Only Memory), a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used. A RAM (Random Access Memory) or the like into which the program is loaded may also be provided. The program may be supplied to the computer via any transmission medium capable of transmitting it (a communication network, a broadcast wave, or the like). Note that one aspect of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
The present invention is not limited to the embodiments described above; various modifications are possible within the scope of the claims, and embodiments obtained by appropriately combining the technical means disclosed in different embodiments are also included in the technical scope of the present invention.
1 Unmanned aerial vehicle (moving body)
3 Projector (projection unit)
6, 6a Projection control unit
7 Camera (imaging device)
8 Image
10 Control device
11 Captured image processing unit
12 Flying object control unit
16 Captured image acquisition unit
17 Correction amount calculation unit
25 Projector control unit
Claims (13)
- A control device for controlling a moving body provided with a projection unit, the control device comprising:
a captured image acquisition unit that acquires a captured image including a projection image projected by the projection unit; and
a projection control unit that controls, with reference to the captured image, at least one of a position of the moving body, an attitude of the moving body, and a direction of the projection unit.
- The control device according to claim 1, wherein the projection control unit extracts a projection position of the projection image and a target projection position from the captured image, and controls the projection position at which the moving body projects so that the projection position becomes the target projection position.
- The control device according to claim 2, wherein the projection control unit determines a correction amount of the projection position by calculating a difference between the current projection position of the projection image and the target projection position.
- The control device according to claim 3, wherein the projection control unit extracts, from the captured image, a first feature point indicating the current projection position of the projection image and a second feature point indicating the target projection position, and determines the correction amount by calculating the difference in position between the first feature point and the second feature point.
- The control device according to any one of claims 1 to 3, wherein the projection position at which the moving body projects is controlled with reference to a plurality of captured images captured at mutually different times by the same imaging device.
- The control device according to any one of claims 1 to 5, wherein the projection position at which the moving body projects is controlled by controlling the position or attitude of the moving body.
- The control device according to any one of claims 1 to 6, wherein the projection position at which the moving body projects is controlled by controlling the projection direction of the projection unit.
- The control device according to any one of claims 1 to 7, wherein the captured image acquisition unit acquires, as the captured image, an image captured by an imaging device provided in the moving body.
- The control device according to any one of claims 1 to 7, wherein the captured image acquisition unit acquires, as the captured image, an image captured by an imaging device provided somewhere other than on the moving body.
- The control device according to any one of claims 1 to 6, wherein a flying body is controlled as the moving body.
- The control device according to any one of claims 1 to 10, wherein the image projected by the projection unit includes information for assisting work performed by a worker.
- A flying body provided with a projection unit, the flying body comprising a control device that comprises:
a captured image acquisition unit that acquires a captured image including a projection image projected by the projection unit; and
a projection control unit that controls, with reference to the captured image, at least one of a position of the flying body, an attitude of the flying body, and a direction of the projection unit.
- A control program for causing a computer to function as the control device according to any one of claims 1 to 11, the control program causing the computer to function as the captured image acquisition unit and the projection control unit.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017125606A JP6988197B2 (en) | 2017-06-27 | 2017-06-27 | Controls, flying objects, and control programs |
JP2017-125606 | 2017-06-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019003492A1 true WO2019003492A1 (en) | 2019-01-03 |
Family
ID=64742067
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/006001 WO2019003492A1 (en) | 2017-06-27 | 2018-02-20 | Control device, flying body, and control program |
Country Status (3)
Country | Link |
---|---|
JP (1) | JP6988197B2 (en) |
TW (1) | TWI693959B (en) |
WO (1) | WO2019003492A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021097294A (en) * | 2019-12-16 | 2021-06-24 | 日亜化学工業株式会社 | Remote-controlled mobile and method for cooling projection device mounted on remote-controlled mobile |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020152143A (en) * | 2019-03-18 | 2020-09-24 | 株式会社リコー | Flying body |
WO2020240918A1 (en) * | 2019-05-28 | 2020-12-03 | 三菱電機株式会社 | Work supporting system, work supporting method and program |
JP2021154808A (en) * | 2020-03-26 | 2021-10-07 | セイコーエプソン株式会社 | Unmanned aircraft |
CN112379680B (en) * | 2020-10-10 | 2022-12-13 | 中国运载火箭技术研究院 | Aircraft attitude angle control method, control device and storage medium |
US11443518B2 (en) | 2020-11-30 | 2022-09-13 | At&T Intellectual Property I, L.P. | Uncrewed aerial vehicle shared environment privacy and security |
US11797896B2 (en) | 2020-11-30 | 2023-10-24 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle assisted viewing location selection for event venue |
US11726475B2 (en) | 2020-11-30 | 2023-08-15 | At&T Intellectual Property I, L.P. | Autonomous aerial vehicle airspace claiming and announcing |
WO2023127500A1 (en) * | 2021-12-27 | 2023-07-06 | 富士フイルム株式会社 | Control device, control method, and control program |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005338114A (en) * | 2004-05-24 | 2005-12-08 | Seiko Epson Corp | Automatic movement type air floating image display device |
JP2015165622A (en) * | 2014-03-03 | 2015-09-17 | セイコーエプソン株式会社 | Image projection device and image projection method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014179698A (en) * | 2013-03-13 | 2014-09-25 | Ricoh Co Ltd | Projector and control method of projector, and program of control method and recording medium with program recorded thereon |
- 2017
  - 2017-06-27 JP JP2017125606A patent/JP6988197B2/en active Active
- 2018
  - 2018-02-20 WO PCT/JP2018/006001 patent/WO2019003492A1/en active Application Filing
  - 2018-02-26 TW TW107106440A patent/TWI693959B/en active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005338114A (en) * | 2004-05-24 | 2005-12-08 | Seiko Epson Corp | Automatic movement type air floating image display device |
JP2015165622A (en) * | 2014-03-03 | 2015-09-17 | セイコーエプソン株式会社 | Image projection device and image projection method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2021097294A (en) * | 2019-12-16 | 2021-06-24 | 日亜化学工業株式会社 | Remote-controlled mobile and method for cooling projection device mounted on remote-controlled mobile |
JP7406082B2 (en) | 2019-12-16 | 2023-12-27 | 日亜化学工業株式会社 | A method for cooling a remote-controlled moving object and a projection device mounted on the remote-controlled moving object |
Also Published As
Publication number | Publication date |
---|---|
TWI693959B (en) | 2020-05-21 |
TW201904643A (en) | 2019-02-01 |
JP2019008676A (en) | 2019-01-17 |
JP6988197B2 (en) | 2022-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6988197B2 (en) | Controls, flying objects, and control programs | |
US10234278B2 (en) | Aerial device having a three-dimensional measurement device | |
US11070793B2 (en) | Machine vision system calibration | |
CN104154875B (en) | Three-dimensional data acquisition system and acquisition method based on two-axis rotation platform | |
CN107784672B (en) | Method and device for acquiring external parameters of vehicle-mounted camera | |
CN111487043B (en) | Method for determining calibration parameters of speckle projector of monocular speckle structured light system | |
US9437005B2 (en) | Information processing apparatus and information processing method | |
WO2017042907A1 (en) | Navigation device and survey system | |
CN111612794A (en) | Multi-2D vision-based high-precision three-dimensional pose estimation method and system for parts | |
WO2019061064A1 (en) | Image processing method and device | |
JP7527546B2 (en) | Calibrating cameras on unmanned aerial vehicles using human joints | |
JP2019049467A (en) | Distance measurement system and distance measurement method | |
JP2006234703A (en) | Image processing device, three-dimensional measuring device, and program for image processing device | |
JP5198078B2 (en) | Measuring device and measuring method | |
US20210156710A1 (en) | Map processing method, device, and computer-readable storage medium | |
JP7008736B2 (en) | Image capture method and image capture device | |
CN116952229A (en) | Unmanned aerial vehicle positioning method, device, system and storage medium | |
CN113252066A (en) | Method and device for calibrating parameters of odometer equipment, storage medium and electronic device | |
US20230070281A1 (en) | Methods and systems of generating camera models for camera calibration | |
WO2020215296A1 (en) | Line inspection control method for movable platform, and line inspection control device, movable platform and system | |
EP3529977B1 (en) | A bundle adjustment system | |
CN115147495A (en) | Calibration method, device and system for vehicle-mounted system | |
EP3943979A1 (en) | Indoor device localization | |
EP3051494B1 (en) | Method for determining an image depth value depending on an image region, camera system and motor vehicle | |
JP2019117584A (en) | Mobile body |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18822636; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18822636; Country of ref document: EP; Kind code of ref document: A1 |