WO2015151685A1 - Data Transmission System, Data Transmission Device, and Data Transmission Method - Google Patents
- Publication number
- WO2015151685A1 (PCT/JP2015/055945)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- point cloud
- data transmission
- cell
- transfer target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/04—Programme control other than numerical control, i.e. in sequence controllers or logic controllers
- G05B19/048—Monitoring; Safety
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1679—Programme controls characterised by the tasks executed
- B25J9/1689—Teleoperation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/003—Transmission of data between radar, sonar or lidar systems and remote stations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/23—Pc programming
- G05B2219/23098—Manual control, via microprocessor instead of direct connection to actuators
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/37—Measurements
- G05B2219/37567—3-D vision, stereo vision, with two cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40169—Display of actual situation at the remote site
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/50—Receiving or transmitting feedback, e.g. replies, status updates, acknowledgements, from the controlled devices
- G08C2201/51—Remote controlling of devices based on replies, status thereof
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C2201/00—Transmission systems of control signals via wireless link
- G08C2201/90—Additional features
- G08C2201/91—Remote control based on location and proximity
Definitions
- the present invention relates to a data transmission system, a data transmission device, a data transmission method, and a data transmission program, and more particularly, to a data transmission device, a data transmission method, and a data transmission program for transmitting three-dimensional point cloud data to a remote device.
- a technique for measuring a three-dimensional shape with a three-dimensional sensor and acquiring point cloud data (also referred to as a point cloud) indicating three-dimensional coordinates is known.
- a laser scanner or a stereo camera is exemplified as a three-dimensional sensor that acquires point cloud data.
- the laser scanner measures the three-dimensional position coordinates (point cloud data) of the surface of the measurement object using laser irradiation light and its reflected light.
- the laser scanner acquires the three-dimensional position coordinates of the surface of the measurement object from the reciprocation time of the laser light between the measurement object and the sensor and the irradiation angle of the laser light.
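By way of illustration, this computation can be sketched as follows; the function name, axis conventions, and the use of the time-of-flight method are assumptions made for this example, not details taken from the patent:

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_to_point(round_trip_s, azimuth_rad, elevation_rad):
    """Convert one laser return (round-trip time plus irradiation angles)
    into a point in the sensor coordinate system (Xs, Ys, Zs).

    Hypothetical convention: Xs/Ys span the horizontal plane, Zs is up.
    """
    r = C * round_trip_s / 2.0  # one-way range to the object surface
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)
```

Repeating this over the scanner's full sweep of irradiation angles yields the point cloud data described above.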
- the color information acquired by a camera or the like provided separately from the laser scanner is combined with the point cloud data, thereby making it easy to visually recognize the three-dimensional shape of the measurement object.
- because the amount of point cloud data acquired by a 3D sensor is large, processing that reduces the amount of point cloud data is performed when modeling 3D shapes by analyzing the point cloud data, so as to reduce the amount of calculation. For example, when point cloud data is acquired while changing the scan position and the shape of a wide area is obtained by combining the point cloud data from each scan position, the obtained point cloud data must be aligned (matching processing). In this case, it is known to reduce the amount of point cloud data in order to reduce the amount of calculation in the matching processing.
- a technique for reducing the amount of point cloud data for the matching process is described, for example, in the literature on fast range-independent subsampling of 3D laser scanner point clouds (see Non-Patent Document 1).
- Non-Patent Document 1 describes a technique for reducing point cloud data so that the data interval is constant in a spherical coordinate system. In order to perform the matching process, it is necessary to reduce the point cloud data while retaining the shape information of the object. For this reason, in the method described in Non-Patent Document 1, the number of point cloud data is reduced so that the interval between the point cloud data after reduction is as constant as possible with respect to the entire data measurement range.
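A loose sketch of this idea keeps at most one point per angular bin, so that the retained points are roughly evenly spaced in the spherical coordinate system regardless of their range; the binning rule and the names below are illustrative assumptions, not the exact method of Non-Patent Document 1:

```python
import math

def subsample_spherical(points, az_step_rad, el_step_rad):
    """Keep at most one point per (azimuth, elevation) bin so the
    retained points have a roughly constant interval in spherical
    coordinates, independent of the measured range."""
    kept = {}
    for (x, y, z) in points:
        az = math.atan2(y, x)
        el = math.atan2(z, math.hypot(x, y))
        key = (int(az // az_step_rad), int(el // el_step_rad))
        kept.setdefault(key, (x, y, z))  # first point in a bin wins
    return list(kept.values())
```

Points that lie in the same direction at different ranges collapse into a single retained point, which preserves the object's shape outline while cutting the data count.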
- because the amount of point cloud data is large, it takes a long time to transfer all of the measured point cloud data to another device.
- the user who operates the remote control terminal grasps the situation around the mobile robot, such as the surrounding shape (for example, the peripheral terrain), from the shape image (for example, a terrain image) generated by processing the transferred point cloud data, and can instruct the next operation of the mobile robot.
- if this transfer takes a long time, the time required to instruct the next operation of the mobile robot becomes long, leading to an increase in the robot's mission execution time.
- An object of the present invention is to provide a data transmission system, a data transmission device, a data transmission method, and a data transmission program that reduce the amount of point cloud data transferred to the remote operation terminal side.
- the data transmission device includes an actuator, a three-dimensional sensor, an arithmetic device (data selection unit), and a communication unit.
- the operation of the actuator is controlled in accordance with a control signal from the remote operation terminal.
- the three-dimensional sensor acquires point cloud data indicating three-dimensional coordinates.
- the arithmetic device selects data to be transferred based on the point cloud data.
- the communication unit transmits the selected transfer target data to the remote operation terminal.
- the arithmetic device sets the upper limit of the data amount of the transfer target data belonging to the predetermined three-dimensional area.
- the data transmission method is a data transmission method performed by a data transmission device including an actuator whose operation is controlled according to a control signal from a remote operation terminal, and includes the following steps: a step of acquiring point cloud data indicating three-dimensional coordinates, a step of selecting transfer target data based on the point cloud data, and a step of transmitting the transfer target data to the remote operation terminal.
- the arithmetic unit of the data transmission apparatus sets an upper limit of the amount of data to be transferred belonging to a predetermined three-dimensional area.
- the amount of point cloud data transferred to the remote operation terminal can be reduced.
- FIG. 1 is a diagram illustrating an example of a configuration of a data transmission system according to an embodiment.
- FIG. 2 is a diagram illustrating an example of point cloud data acquired by the robot according to the embodiment.
- FIG. 3 is a schematic block diagram illustrating an example of a data transmission system according to the embodiment.
- FIG. 4 is a conceptual diagram of point cloud data and a measurement target acquired by the robot according to the embodiment.
- FIG. 5 is a diagram illustrating an example of a grid arranged for point cloud data acquired by the robot according to the embodiment.
- FIG. 6 is a diagram illustrating an example of a transfer data reduction method according to the embodiment.
- FIG. 7 is a diagram illustrating a transfer data reduction method according to the first embodiment.
- FIG. 8 is a diagram illustrating an example of point cloud data acquired by the robot according to the embodiment.
- FIG. 9 is a diagram illustrating a grid arrangement example for the point cloud data in the transfer data reduction process according to the embodiment.
- FIG. 10 is a diagram illustrating an example of a transfer data reduction method according to the first embodiment.
- FIG. 11 is a diagram illustrating another example of the transfer data reduction method according to the first embodiment.
- FIG. 12 is a diagram illustrating an example of point cloud data after the amount of transfer data in the first embodiment is reduced.
- FIG. 13 is a diagram illustrating still another example of the transfer data reduction method according to the first embodiment.
- FIG. 14 is a diagram illustrating another example of the point cloud data after reducing the data amount of the transfer data according to the first embodiment.
- FIG. 15 is a diagram illustrating an example of reducing the amount of transfer data when the point cloud data is distributed one-dimensionally in the transfer data reduction method according to the second embodiment.
- FIG. 16 is a diagram illustrating an example of reducing the amount of transfer data when the point cloud data is distributed two-dimensionally in the transfer data reduction method according to the second embodiment.
- FIG. 17 is a diagram illustrating a transfer data amount reduction example when the point cloud data is distributed three-dimensionally in the transfer data reduction method according to the second embodiment.
- the data transmission system selects data to be transferred from point cloud data acquired by a remotely operated robot.
- for point cloud data belonging to a predetermined three-dimensional area, an upper limit is set on the amount of data selected as transfer target data.
- the density of the transfer data can be controlled for the point cloud data in the area.
- the measurement target is virtually covered with a three-dimensional grid, and the point cloud data in the grid is reduced according to a predetermined algorithm.
- the robot transmits point cloud data (transfer target data) with a reduced amount of data to the remote operation terminal.
- the remote operation terminal creates a shape image around the robot based on the received point cloud data, and outputs it to a display device or the like so as to be visible.
- the user controls the operation of the robot by operating the remote operation terminal while viewing the shape image around the output robot.
- FIG. 1 is a diagram illustrating an example of the configuration of the data transmission system 100.
- FIG. 2 is a diagram illustrating an example of point cloud data acquired by the robot.
- the data transmission system 100 includes a remote operation terminal 101 and a robot 10.
- the robot 10 moves in response to an instruction (control signal) from the remote operation terminal 101.
- the operation of the arm 4 (manipulator) of the robot, which will be described later, is controlled in accordance with an instruction (control signal) from the remote operation terminal 101.
- the robot 10 executes an operation of “moving to the vicinity of the target 90 and moving the target 90 from the current position to another position” in response to an instruction from the remote operation terminal 101.
- the robot 10 transmits the point cloud data 20 of the peripheral area acquired by the three-dimensional sensor 2 (that is, a sensor that acquires a three-dimensional shape) to the remote operation terminal 101.
- the user instructs the robot 10 to perform the next operation by operating the remote operation terminal 101 while confirming the surface shape image around the robot 10 created based on the point cloud data 20.
- the remote operation terminal 101 is connected to the output device 102, the input device 103, and the transmission device 104.
- the remote operation terminal 101 is exemplified by a computer device, and includes a CPU and a storage device (not shown).
- the remote operation terminal 101 controls the operation of the robot 10, images the surface shape of the measurement target based on the point cloud data 20 transmitted from the robot 10, and outputs the surface shape to the output device 102 so as to be visible. Details of the configuration of the remote operation terminal 101 will be described later.
- the output device 102 is exemplified by a monitor and a printer, and outputs the image information output from the remote operation terminal 101 so as to be visible.
- the input device 103 is exemplified by a keyboard, a touch panel, a mouse, a joystick, and the like, and is an interface device that inputs various information (or various data) to the remote operation terminal 101 when operated by a user.
- the transmission device 104 is a communication interface device that controls transmission of data and signals between the remote operation terminal 101 and the robot 10 (transmission device 1). Specifically, the transmission device 104 constructs a transmission path to the transmission device 1 mounted on the robot 10 using a wireless line, a wired line, or both, and controls data transmission between the remote operation terminal 101 and the robot 10.
- the remote operation terminal 101, the output device 102, the input device 103, and the transmission device 104 may be provided as individual devices as shown in FIG. 1, may all be provided as a single integrated unit, or at least two of these devices (or elements) may be provided integrally.
- a form in which the output device 102 and the input device 103 are integrated can be realized by a touch panel.
- the form in which the remote operation terminal 101 and the transmission device 104 are integrated can be realized by a computer device with a communication function.
- examples of a form in which the remote control terminal 101, the output device 102, the input device 103, and the transmission device 104 are all integrated include a touch-panel mobile phone (commonly referred to as a smartphone) and a PDA (Personal Digital Assistant) with a communication function.
- the robot 10 includes a transmission device 1, a three-dimensional sensor 2, a leg portion 3, and an arm portion 4.
- the robot 10 functions as a data transmission device that reduces the data amount of the point cloud data 20 acquired by measurement by the three-dimensional sensor 2 according to a predetermined algorithm and then transfers the data to the remote operation terminal 101.
- the robot 10 is an aspect of the data transmission device.
- the transmission device 1 is an interface device that controls transmission of data and signals between the robot 10 and the remote operation terminal 101. Specifically, the transmission device 1 constructs a transmission path to the transmission device 104 connected to the remote operation terminal 101 using a wireless line, a wired line, or both, and controls data transmission between the robot 10 and the remote operation terminal 101.
- the three-dimensional sensor 2 is exemplified by a laser scanner and a stereo camera, and acquires the three-dimensional position coordinates of the surface of the measurement object around the robot 10 as point cloud data 20 (also referred to as a point cloud).
- a laser scanner that can be used as the three-dimensional sensor 2 measures (or acquires) the point cloud data 20 by any one of the triangulation method, the time-of-flight method, and the phase-difference (phase-shift) method.
- the measurement range (scanning range) of the point cloud data 20 by the three-dimensional sensor 2 will be described with reference to FIG.
- the measurement position (for example, the installation position) of the three-dimensional sensor 2 is defined as the origin Os, and the coordinate system of the measured point cloud data 20 is defined as (Xs, Ys, Zs).
- the three-dimensional sensor 2 scans the laser over a range of azimuth and elevation angles centered on the origin Os, and measures (or acquires) the surface three-dimensional coordinates as point cloud data 20 based on the reflected light from the measurement object within this range.
- the robot 10 moves by means of the leg 3, measures the point cloud data 20 at a plurality of positions (in other words, at a plurality of positions of the three-dimensional sensor), and can obtain the desired shape by matching and combining the measured point cloud data 20.
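A minimal sketch of expressing scans taken at different sensor positions in one common frame, assuming the robot's pose at each scan is known and simplifying to a yaw-only rotation; the matching synthesis itself (estimating the alignment between scans) is not shown:

```python
import math

def transform_scan(points, yaw_rad, translation):
    """Re-express a scan taken at one sensor pose in a common frame.

    Assumes the sensor pose is known as a yaw angle plus a translation;
    a full implementation would use a complete 3-D rotation.
    """
    tx, ty, tz = translation
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    out = []
    for (x, y, z) in points:
        out.append((c * x - s * y + tx, s * x + c * y + ty, z + tz))
    return out
```

Once all scans are expressed in the common frame, their point clouds can simply be concatenated to cover the wider area.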
- the leg 3 is a moving means that is driven by an actuator 16 described later and moves the robot 10 to an arbitrary position.
- a leg having a joint and a link will be described as an example of the leg 3.
- a rotating body (for example, a wheel) may also be used as the leg 3.
- the number of legs, the shape, and the number of joints (number of links) in the leg 3 are not limited to the illustrated number and shape, and can be arbitrarily set.
- the arm 4 is driven by an actuator 16 described later, and is exemplified by a manipulator (also referred to as an arm) having a joint, a link, and an end effector 401.
- the end effector 401 is preferably provided, for example, at the tip of the arm 4 and has a mechanism that applies a physical action (mechanical action, electromagnetic action, thermodynamic action) to the object.
- the end effector 401 may include a mechanism for gripping, painting, or welding an object.
- the end effector 401 may include an electromagnetic sensor, various measuring devices, and the like.
- the arm portion 4 is provided with a robot hand that grips (handles) an object as an end effector 401.
- the number of arms (arms), the shape, the number of joints (number of links), and the structure of the end effector 401 in the arm portion 4 are not limited to the numbers and shapes described in FIG. 1 and can be arbitrarily set.
- each function of the communication unit 201, the display unit 202, and the control unit 203 is realized by the CPU executing a program stored in a storage device (not shown).
- the functions of the communication unit 201, the display unit 202, and the control unit 203 may be realized only by hardware or by cooperation between software and hardware.
- the communication unit 201 includes a communication interface (hardware).
- the communication unit 201 controls the transmission device 104 shown in FIG. 1 to manage communication with the transmission device 1 in the robot 10. Specifically, the communication unit 201 transfers a control signal from the control unit 203 to the transmission device 1 in the robot 10 via the transmission device 104, and outputs the point cloud data 20 transferred from the robot 10 (or a signal corresponding to the point cloud data 20) to the display unit 202. The display unit 202 generates the image information to be displayed on the output device 102. More specifically, the display unit 202 creates image information (image data) for displaying the surface shape of the measurement target from the point cloud data 20 input from the communication unit 201, and outputs it to the output device 102.
- the display unit 202 calculates image information for displaying the surface shape of the measurement target object through processing such as edge detection, smoothing by noise removal, and normal line extraction on the point cloud data 20.
- the control unit 203 generates a control signal corresponding to the input signal from the input device 103 and outputs the control signal to the communication unit 201.
- the robot 10 controls, for example, the movement of the leg portion 3 and the arm portion 4 or the operation of acquiring the point cloud data 20 in accordance with the control signal output from the control unit 203.
- the robot 10 includes a computer device (not shown) (the computer device includes, for example, an arithmetic device including a CPU, a storage device, and the like).
- the functions of the point cloud coordinate calculation unit 11, the data selection unit 12, the recognition unit 13, the communication unit 14, and the controller 15 are realized by the CPU executing a program stored in a storage device (not shown).
- the functions of the point group coordinate calculation unit 11, the data selection unit 12, the recognition unit 13, the communication unit 14, and the controller 15 may be realized by hardware alone or by cooperation between software and hardware.
- processing such as point cloud data acquisition processing, point cloud data selection processing, and surface shape calculation processing described later is realized.
- the point cloud coordinate calculation unit 11 calculates the three-dimensional position coordinates (X, Y, Z) of each measurement point as the point cloud data 20, using the distance between the measurement object and the sensor and the irradiation angle (reflection angle) measured by the three-dimensional sensor 2 (in other words, the point cloud coordinate calculation unit 11 executes point cloud data acquisition processing). In addition, the point cloud coordinate calculation unit 11 may match a plurality of point cloud data 20 obtained by the three-dimensional sensor 2 at a plurality of positions (a plurality of positions of the three-dimensional sensor) and extract the matched result as the point cloud data 20.
- the point cloud data 20 calculated by the point cloud coordinate calculation unit 11 is output to the data selection unit 12.
- the robot 10 may include a CCD camera that acquires color information (RGB) for enhancing the visibility of the terrain around the robot and the shape of the object.
- the point group coordinate calculation unit 11 may combine (color matching) the point group data 20 and the color information.
- the point cloud data 20 and the color information may be transmitted from the robot 10 to the remote operation terminal 101 at different timings, with the color matching then performed at the remote operation terminal 101.
- the data selection unit 12 executes a point cloud data selection process for selecting the point cloud data 20 to be transferred to the remote operation terminal 101 from the point cloud data 20 obtained by the point cloud coordinate calculation unit 11. At this time, it is preferable that the data selection unit 12 sets a predetermined area and determines the upper limit of the data amount of transfer data in the area.
- the data selection unit 12 arranges a grid 30 (a predetermined three-dimensional region) in the virtual space in which the point cloud data 20 acquired from the point cloud coordinate calculation unit 11 is distributed, and reduces the number of point cloud data 20 belonging to the grid 30 according to a predetermined algorithm (point cloud data selection processing).
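The cell-based selection can be sketched as follows; this is a minimal illustration assuming cubic cells and a first-come retention rule, while the patent leaves the concrete algorithm and cell geometry open:

```python
def select_transfer_points(points, cell_size, max_per_cell):
    """Overlay a virtual grid of cubic cells on the point cloud and keep
    at most `max_per_cell` points per cell as transfer target data.

    The first-come retention order stands in for the patent's
    unspecified selection algorithm.
    """
    cells = {}
    selected = []
    for p in points:
        key = tuple(int(c // cell_size) for c in p)  # cell index per axis
        bucket = cells.setdefault(key, [])
        if len(bucket) < max_per_cell:
            bucket.append(p)
            selected.append(p)
    return selected
```

Because the cap applies per cell, dense regions are thinned heavily while sparse regions keep most of their points, which is the density control described above.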
- the data selection unit 12 outputs the point cloud data 20 to be transferred belonging to the grid 30 to the communication unit 14.
- the data selection unit 12 may give the selected point cloud data 20 (the transfer target point cloud data 20 registered in the grid 30) priority over the other point cloud data 20 and transfer it first.
- the point cloud data 20 not selected in the selection process may be output to the communication unit 14 as data having a low transfer priority.
- the point cloud data 20 selected by the data selection unit 12 and all the point cloud data 20 before selection are preferably recorded in a storage device (not shown). Details of the point cloud data selection processing operation in the data selection unit 12 will be described later.
- the data selection unit 12 may analyze the point cloud data 20 in a predetermined area and select data obtained based on the analysis result as transfer target data. Details of a method of acquiring data obtained based on the analysis result of the point cloud data 20 will be described later.
- the data selection unit 12 preferably outputs all of the point cloud data 20 acquired from the point cloud coordinate calculation unit 11 (point cloud data 20 before selection) to the recognition unit 13. However, the data selection unit 12 may output the point cloud data 20 selected as the transfer target data to the recognition unit 13.
- the recognition unit 13 analyzes the point cloud data 20 and performs a surface shape calculation process for calculating the surface shape of the measurement object in the region measured by the three-dimensional sensor 2 (the region in which the point cloud data 20 to be analyzed is distributed). Execute. Then, information (data) indicating the calculated surface shape is output to the controller 15. This information is preferably recorded in a storage device (not shown).
- the surface shape information obtained here includes, for example, information (data) indicating the peripheral topography in the measurement region and the detailed position coordinates of the target 90.
- the controller 15 controls the operation of the actuator 16 by an operation command signal based on a control signal input from the remote operation terminal 101 via the communication unit 14. Specifically, the controller 15 receives a control signal (for example, information indicating the target position and target posture) for moving the leg 3, arm 4, and the like from the remote operation terminal 101. Based on the control signal, the controller 15 controls the actuator 16 so that the leg 3, the arm 4, and the like are in the position and posture instructed from the remote operation terminal 101.
- the operation amount and the operation direction of the actuator 16 may be corrected.
- the controller 15 may autonomously determine the operation amount and operation direction of the actuator 16 using the surface coordinates of the measurement object output from the recognition unit 13 and the position coordinates of the links or end effectors 401 and 402 of the leg 3 or the arm 4, and thereby control the operation of the robot 10.
- to improve operation accuracy and perform a detailed analysis of the movement path, the controller 15 may use the detailed surface shape information calculated by the recognition unit 13 instead of the point cloud data 20 selected as the transfer target.
- the actuator 16 is exemplified by a servo motor, a power cylinder, a linear actuator, a rubber actuator, and the like, and controls the mechanical behavior of the leg portion 3 and the arm portion 4 in accordance with an operation command signal from the controller 15.
- the actuator 16 may drive the leg 3, the arm 4, and the like either indirectly or directly. That is, the actuator 16 may be provided separately from the leg portion 3 or the arm portion 4, or may be mounted as a part (for example, a joint portion) of the leg portion 3, the arm portion 4, or the like.
- a motor or an engine may be used as the actuator 16.
- FIG. 4 is a conceptual diagram of the point cloud data 20 acquired by the robot 10 and the measurement object.
- the robot 10 acquires point cloud data 20 of the measurement object by the three-dimensional sensor 2.
- the measurement object is, for example, an element that reflects laser light within the scanning range of the three-dimensional sensor 2 and includes the surrounding landform and the target 90 within the scanning range.
- the point cloud data 20 is preferably represented in an orthogonal coordinate system (Xs, Ys, Zs), for example as shown in FIG. 4.
- the scan coordinate system (Xs, Ys, Zs) to which the point cloud data 20 belongs is preferably the same absolute coordinate system as the coordinate system in which the position coordinates of the robot 10, the leg 3 and the arm 4 are expressed.
- FIG. 5 is a diagram illustrating an example of a grid 30 (predetermined three-dimensional region) arranged with respect to the point cloud data 20 acquired by the robot 10.
- the grid 30 is formed by a plurality of cells 31 (1, 1, 1) to (Xl, Ym, Zn) defined by straight lines parallel to the virtual visual line direction Yg (hereinafter referred to as the visual line direction Yg), straight lines parallel to the direction Xg orthogonal to the visual line direction Yg, and straight lines parallel to the direction Zg orthogonal to both the visual line direction Yg and the direction Xg (where l, m, and n are natural numbers of 2 or more).
- the line-of-sight direction Yg of the grid 30 can be arbitrarily set independently of the scan coordinate system (Xs, Ys, Zs) of the point cloud data 20.
- the line-of-sight direction Yg is specified based on an instruction from the remote operation terminal, for example.
- the line-of-sight direction Yg can be set regardless of the orientation of the robot head.
- the orientation of the grid 30 (the line-of-sight direction Yg), the size, number, and position of the cells 31, and the overall size of the grid 30 are preferably set by the remote operation terminal 101, though they may be set otherwise.
- the robot 10 reduces the data amount of the point cloud data 20 to be transferred by filtering using the grid 30, and transfers the point cloud data 20 after the data amount reduction to the remote operation terminal 101.
- this selection is performed by the data selection unit 12 of the robot 10.
- FIG. 6 is a diagram illustrating an example of a transfer data reduction method using the grid 30. With reference to FIG. 6, the transfer data reduction method when the upper limit of the number of point cloud data selected (registered) as transfer data in the cell 31 is one will be described.
- the data selection unit 12 selects (registers) only the point cloud data 20-1 as the point cloud data to be transferred, and excludes the other point cloud data 20-2 and 20-3 from the transfer target (does not register them) (where 1 ≤ i ≤ l, 1 ≤ j ≤ m, 1 ≤ k ≤ n).
- when the size of the cell 31 is 1 cm³, for example, one of the plurality of point cloud data in a cube with 1 cm sides is set as the transfer target, and the other point cloud data can be excluded from the transfer target.
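- as an illustrative sketch (not part of the original disclosure), the per-cell selection described above can be written as follows, assuming the point cloud is an N×3 array in the scan coordinate system and that array order equals scan order; the function and parameter names are hypothetical:

```python
import numpy as np

def select_transfer_points(points, cell_size=0.01, per_cell_limit=1):
    """Keep at most `per_cell_limit` points per grid cell (earliest first).

    Points are assumed to be in scan order, so points upstream in the
    scanning direction are preferentially registered as transfer targets,
    as described in the text.
    """
    selected = []
    counts = {}
    for idx, p in enumerate(points):
        cell = tuple(np.floor(p / cell_size).astype(int))  # cell index (i, j, k)
        n = counts.get(cell, 0)
        if n < per_cell_limit:
            counts[cell] = n + 1
            selected.append(idx)
    return selected  # indices of transfer-target points
```

- with `cell_size=0.01` (1 cm cells) and `per_cell_limit=1`, only the first-measured point in each 1 cm cube is registered, matching the example above.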
- the point cloud data 20-2 and 20-3 excluded from the transfer target may be selected (registered) as data to be transferred to the remote operation terminal 101 after the previously selected point cloud data 20-1 has been transferred. In this case, the point cloud data 20, which has a large amount of data, can be divided into predetermined data amounts and transferred to the remote operation terminal 101.
- the order selected (registered) as a transfer target in the cell 31 can be arbitrarily set. For example, they are selected in the scanning order of the three-dimensional sensor 2.
- the point group data 20 on the upstream side in the scanning direction of the three-dimensional sensor 2 in the cell 31 is preferentially selected as a transfer target.
- the point cloud data 20-1, 20-2, 20-3 are measured in this order, and when the upper limit of the transfer target is 2, the point cloud data 20-1, 20-2 is selected as the transfer target.
- the shape of the grid 30 (cell 31) is not limited to a cube or a rectangular parallelepiped, and may be a polyhedron.
- the size of the cell 31 may not be uniform within the grid 30 but may be different depending on the location.
- the size of the cell 31 can be changed by applying an octree method to the cells 31 in a predetermined area. In this case, the cell size in the vicinity of the edge to be measured can be set small, and the cell size in the region away from the edge can be set large.
- the grid 30 is preferably arranged with reference to the viewing direction Yg, but may be arranged with reference to other directions.
- the transfer rate of point cloud data in the cell 31 is changed according to the position of the cell 31. That is, according to the position of the cell 31, an area where the point cloud data 20 is greatly thinned out and an area where it is only slightly thinned out are set. As a result, the important area that affects remote operation can be displayed in detail while the data transfer amount is reduced.
- the transfer rate of the point cloud data 20 in the cell 31 indicates the ratio, per unit volume, of the data amount selected as the transfer target to the data amount of all the point cloud data in the cell 31. In other words, the transfer rate is the ratio of the transfer data amount to the total point cloud data amount in the cell 31, normalized by the cell size (cell volume).
- FIG. 7 is a diagram illustrating an example of a transfer data reduction method according to the first embodiment.
- the transfer rate of transfer data in an area 33 (also referred to as a first area), defined by a predetermined distance from an important point 32, is set so as to be larger than that in the other area 34 (also referred to as a second area).
- the important point 32 that determines the area 33 where the transfer rate of the transfer data is large is set to an arbitrary point (or a nearby point) in the end effector 401 (hand) or end effector 402 (foot tip) in order to improve operation accuracy.
- the region 33 may have any shape as long as it is determined based on the important point 32; for example, a range within a constant distance from the important point 32 is preferable.
- the area 34 for reducing the transfer rate of transfer data is preferably set to an area other than the area 33 in the area where the point cloud data 20 is distributed. Further, in the area 34, the transfer rate of the transfer data may be reduced stepwise according to the distance from the important point 32. For example, the area where the point cloud data 20 is distributed may be divided into a plurality of areas, and the transfer rate of the transfer data may be reduced according to the distance from the important point 32 (for example, in proportion). Furthermore, a plurality of areas 33 and 34 may be set.
- the upper limit of the number of point cloud data registered as transfer data (transfer target point cloud data) and the transfer rate can be arbitrarily set for the cells 31 included in each of the plurality of regions.
- the conditions for determining the areas 33 and 34 are not limited to the above-described method, and can be set arbitrarily.
- a region defined by a plurality of cells 31 that satisfy a predetermined condition may be defined as a region 33, and a region defined by a plurality of other cells 31 that do not satisfy a predetermined condition may be defined as a region 34.
- the predetermined condition may be selected based on, for example, the position coordinates of the cell 31 or the cell arrangement.
- Each of the important point 32, the region 33, and the region 34 can be designated from the remote operation terminal 101. Further, the robot 10 may automatically calculate the area 33 and the area 34 based on the important point 32 designated by the remote operation terminal 101. In this case, it is preferable that parameters such as the distance from the important point 32 for determining the regions 33 and 34 are set in the robot 10 in advance.
- the transfer rate of the point cloud data 20 set for each of the areas 33 and 34 can be changed by changing the size of the cells 31 or by changing the upper limit of the number of point cloud data registered as a transfer target in each cell 31. An example of changing the transfer rate of transfer data using the grid 30 will be described with reference to FIGS. Note that although the point cloud data 20 represented by three-dimensional coordinates is actually what is selected for or excluded from the transfer target, the point cloud data 20 and the grid 30 are depicted two-dimensionally below to simplify the explanation.
- FIG. 8 is a diagram illustrating an example of the point cloud data 20 measured by the robot 10.
- a grid 30 is arranged in a virtual space in which the point cloud data 20 is distributed.
- the virtual viewpoint 35, the arrangement position or shape of the grid 30, the line-of-sight direction Yg, the size of the cell 31 (grid division size), and the like are preferably specified by the remote operation terminal 101.
- alternatively, any of the virtual viewpoint 35, the arrangement position or shape of the grid 30, the line-of-sight direction Yg, and the size (grid division size), number, and arrangement of the cells 31 may be set in advance in the robot 10, and the grid 30 may be arranged using those settings.
- the grid division size (the size of the cell 31) in the region 33 around the important point 32 is smaller than the grid division size (the size of the cell 31) in the other region 34.
- the size of the cell 31-1 in the region 33 with the radius r1 centered on the important point 32 is set to half the size of the cell 31-2 in the other region 34.
- as a result, the transfer rate of the point cloud data 20 in the region 33, which has the small cell size, becomes larger than that in the other region 34.
- conversely, more point cloud data 20 are excluded from the transfer target in the area 34, which has the large cell size, than in the area 33.
- the data density of the point cloud data 20 to be transferred in the area 33 is higher than the data density of the transfer target data in the other areas 34.
- alternatively, the size of the cells 31 in the region 33 and the region 34 may be kept uniform, and the upper limit of the number of data per cell 31 changed according to the region (location).
- the transfer rate of the point cloud data 20 in the cell 31-1 in the region 33 with the radius r1 centered on the important point 32 is set to 100% (no limit on the number of point cloud data to be transferred).
- the upper limit of the number of point cloud data per cell 31-2 in the region 34 can be set to 1.
- as a result, the transfer rate of the point cloud data 20 in the area 33 can be made larger than that in the other area 34, and the data density of the transfer-target point cloud data 20 in the area 33 can be made higher than that in the other area 34.
- since the upper limit of the number of point cloud data per cell 31 is determined for each of the regions 33 and 34, the amount of data communication can be reduced while the density of the point cloud data 20 in a predetermined region (for example, the region 33 or the region 34) is changed arbitrarily.
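- a minimal sketch of this region-dependent upper limit, assuming the important point 32 is given in the same coordinate system as the points; the function and parameter names are illustrative, with no per-cell limit inside the radius r1 (area 33, transfer rate 100 %) and at most one point per cell elsewhere (area 34):

```python
import numpy as np

def select_with_important_point(points, important_point, r1,
                                cell_size=0.01, limit_outside=1):
    """Region-dependent thinning: keep every point within radius r1 of the
    important point (area 33); elsewhere (area 34) keep at most
    `limit_outside` points per grid cell."""
    selected = []
    counts = {}
    for idx, p in enumerate(points):
        if np.linalg.norm(p - important_point) <= r1:
            selected.append(idx)            # area 33: no upper limit
            continue
        cell = tuple(np.floor(p / cell_size).astype(int))
        if counts.get(cell, 0) < limit_outside:
            counts[cell] = counts.get(cell, 0) + 1
            selected.append(idx)            # area 34: heavily thinned
    return selected
```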
- FIG. 12 is a diagram showing the point cloud data 20 selected as a transfer target by the method shown in FIG. As shown in FIG. 12, since there is a density difference in the transfer data in the areas 33 and 34, the remote control terminal 101 can obtain a detailed image in the area around the end effector and a simplified image in the other areas. It becomes possible.
- the method of increasing the data transfer rate of the area 33 determined by the important point 32 has been described.
- the method is not limited to this; the upper limit of the number of transfer-target point cloud data and the transfer rate in each cell 31 may instead be determined according to the virtual viewpoint 35 and the line-of-sight direction Yg.
- a grid 30 is arranged in a virtual space in which the point cloud data 20 is distributed.
- the remote operation terminal 101 specifies the virtual viewpoint 35, the arrangement position and shape of the grid 30, the line-of-sight direction Yg, the size (grid division size), the number, or the position of the cell 31.
- the point cloud data 20 in the cells 31-3 (also referred to as a first region) that are visible when viewed from the virtual viewpoint 35 in the line-of-sight direction Yg is registered as a transfer target, and the point cloud data 20 in the other cells 31-4 (also referred to as a second region) is excluded from the transfer target.
- in each cell column in the line-of-sight direction Yg (cells 31(i, 1, k) to 31(i, Ym, k)), the point cloud data 20 in the cell 31-3(i, h, k) closest to the virtual viewpoint 35 among the cells 31 containing point cloud data 20 is registered as the transfer target, and the point cloud data 20 in the remaining cells 31-4(i, h+1, k) to 31-4(i, Ym, k) is excluded from the transfer target (where 1 ≤ h ≤ Ym − 1).
- if the cell 31-3 closest to the virtual viewpoint 35 among the cells 31 containing point cloud data 20 is the Ym-th cell 31 in the cell column (cells 31(i, 1, k) to 31(i, Ym, k)), the point cloud data 20 of the cell 31-3(i, Ym, k) is registered as the transfer target.
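- the visibility-based selection can be sketched as follows, assuming for simplicity that the line-of-sight direction Yg coincides with the +Y axis of the point coordinates and that all points in the front cell of each column are registered; all names are illustrative:

```python
import numpy as np

def visible_cells_only(points, cell_size=0.01):
    """Keep only points in the first occupied cell along the line-of-sight
    axis (taken here as +Y) for each (i, k) cell column; points in deeper
    cells of the same column are hidden and excluded from transfer."""
    cells = np.floor(points / cell_size).astype(int)
    # smallest occupied j-index per (i, k) column = cell nearest the viewpoint
    front = {}
    for (i, j, k) in cells:
        key = (i, k)
        if key not in front or j < front[key]:
            front[key] = j
    return [idx for idx, (i, j, k) in enumerate(cells)
            if front[(i, k)] == j]
```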
- FIG. 14 is a diagram showing the point cloud data 20 selected as a transfer target after the amount of transfer data is reduced by the method shown in FIG. As shown in FIG. 14, only the surface shape that is visible when viewing the line-of-sight direction Yg from the virtual viewpoint 35 is transferred to the remote operation terminal 101, and the point cloud data 20 on the back side as viewed in the line-of-sight direction Yg is excluded from the transfer target.
- in the present embodiment, only the surface shape that is visible when viewing the line-of-sight direction Yg from the virtual viewpoint 35 side is transferred to the remote operation terminal 101. Therefore, as shown in the figure, an easy-to-view image that omits the point cloud data 20 overlapping in the depth direction can be displayed. Further, since all of the point cloud data 20 overlapping in the depth direction is excluded from the transfer data, the amount of data communication can be further reduced compared with the methods shown in FIGS. In the present embodiment, all of the point cloud data 20 in the cells 31-4 on the back side as viewed from the virtual viewpoint 35 is excluded from the transfer target, but this is not limiting; a predetermined number of the point cloud data 20 in the cells 31-4 may be registered as transfer targets.
- the upper limit of the number of point cloud data to be transferred in a predetermined area (cell 31-4) is set to 0, but this upper limit can be arbitrarily set.
- in that case, the transfer data may be selected by the above-described method; the transfer data for the cells 31-3 may likewise be selected by that method.
- the upper limit of the number of point cloud data 20 to be transferred in the cell 31-3 is set to be larger than the upper limit of the number of point cloud data 20 to be transferred in the cell 31-4.
- the method of setting the area 34 or the cell 31-4 for decreasing the transfer rate of the point cloud data and the area 33 or the cell 31-3 for increasing the transfer rate is not limited to the above-described example.
- the cells 31 with a large transfer rate and the cells 31 with a small transfer rate may be set according to conditions indicating the cell position (coordinates).
- for example, even-numbered cells 31 in the Xg, Yg, or Zg coordinate direction may be set as the cells 31-3 having a large transfer rate, and odd-numbered cells 31 in the Xg, Yg, or Zg coordinate direction as the cells 31-4 having a small transfer rate.
- the predetermined distance for determining the regions 33 and 34, or the conditions for determining the cells 31-3 and 31-4, may be set in advance in the robot 10 or specified by the remote operation terminal 101.
- the robot 10 in the second embodiment determines the data (shape reproduction data, described later) to be transferred to the remote operation terminal 101 according to the shape of the measurement object predicted from the point cloud data 20.
- the display unit 202 of the remote operation terminal 101 according to the second embodiment generates point cloud data based on the data transferred from the robot 10 and displays the surface shape of the measurement object using the point cloud data.
- a second embodiment of the transfer data reduction method in the data transmission system 100 will be described.
- the data transmission system 100 in the present embodiment changes the transfer data reduction rate in accordance with the “local shape of the measurement target”.
- the “local shape of the measurement object” can be classified by the size of three eigenvalues obtained by principal component analysis for point cloud data within a predetermined range. If the eigenvalues are d1, d2, and d3 in descending order, the local shape of the measurement object can be classified as pattern 1 to pattern 5 below.
- the data selection unit 12 of the robot 10 sets one of the measured point cloud data 20 as a reference point 51, and sets a range corresponding to the reference point 51 as the analysis region 52.
- the analysis area 52 is, for example, a spherical area having a radius r2 with the reference point 51 as the center.
- the reference point 51 may be determined randomly.
- the distance (for example, radius r2) from the reference point 51 that determines the analysis region 52 is preferably set based on a fixed value.
- alternatively, the distance (for example, the radius r2) from the reference point 51 that determines the analysis region 52 may be set to 1/2 of the largest eigenvalue, for example.
- the interval between the adjacent reference points 51 is a length that does not overlap the analysis region 52 (for example, a radius r2 or more).
- the data selection unit 12 performs principal component analysis on the point cloud data 20 (the position coordinates indicated by the point cloud data 20) in the analysis region 52 to obtain the eigenvalues d1, d2, and d3 and the corresponding eigenvectors e1, e2, and e3. Specifically, eigenvalue decomposition is performed on the covariance matrix obtained from the position coordinates indicated by the point cloud data 20 in the analysis region 52, yielding the eigenvalues d1, d2, and d3 and the corresponding eigenvectors e1, e2, and e3.
- the data selection unit 12 classifies the shape in the analysis region 52 into any one of the patterns 1 to 5 based on the magnitudes of the eigenvalues d1, d2, and d3.
- the data selection unit 12 selects data to be transferred according to the classified pattern.
- the data selection unit 12 replaces the point cloud data 20 in the analysis region 52 with the analysis result for the analysis region 52 as shape reproduction data, and transfers it to the remote operation terminal 101.
- the remote operation terminal 101 arranges point cloud data distributed at predetermined intervals within the shape range indicated by the shape reproduction data, and generates and displays a measurement target shape image.
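- the classification step can be sketched as follows; note that this is a simplified, hypothetical reading in which the first to sixth thresholds described below are collapsed into two illustrative parameters (`t_zero` for "approximately 0" and `t_large` for "clearly non-zero"):

```python
import numpy as np

def classify_local_shape(points, t_zero=0.2, t_large=0.2):
    """Classify the local shape in an analysis region by principal
    component analysis.

    The eigenvalues of the covariance matrix of the point coordinates,
    sorted so that d1 >= d2 >= d3, are compared with the thresholds to
    decide patterns 1 to 5.
    """
    cov = np.cov(np.asarray(points).T)          # 3x3 covariance matrix
    vals, vecs = np.linalg.eigh(cov)            # ascending eigenvalues
    order = np.argsort(vals)[::-1]              # sort descending
    d1, d2, d3 = vals[order]
    e1, e2, e3 = vecs[:, order].T               # matching eigenvectors
    if d1 < t_zero:
        pattern = 1                 # d1 ~ d2 ~ d3 ~ 0
    elif d2 < t_zero and d1 >= t_large:
        pattern = 2                 # d1 >> d2 ~ d3 ~ 0 (linear)
    elif d3 < t_zero and d2 >= t_large:
        pattern = 3                 # d1 > d2 >> d3 ~ 0 (planar)
    elif d3 >= t_large:
        pattern = 4                 # d1 > d2 > d3 >> 0 (three-dimensional)
    else:
        pattern = 5                 # none of the above
    return pattern, (d1, d2, d3), (e1, e2, e3)
```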
- when the eigenvalues satisfy d1 ≈ d2 ≈ d3 ≈ 0, the shape is classified as pattern 1. That is, when all of the eigenvalues d1, d2, and d3 are smaller than a predetermined first threshold value (in other words, when all of them can be approximated by 0, including the case of exactly 0), the shape is classified as pattern 1.
- the data selection unit 12 excludes all of the point cloud data 20 in the analysis region 52 classified as the pattern 1 from the transfer target (transfer rate is 0%).
- when the eigenvalues satisfy d1 ≫ d2 ≈ d3 ≈ 0, the shape is classified as pattern 2. That is, when the eigenvalue d2 and the eigenvalue d3 are smaller than a predetermined second threshold (i.e., can be approximated by 0, including 0) and the eigenvalue d1 is equal to or larger than a third threshold (which is equal to or larger than the second threshold), the shape is classified as pattern 2.
- the data selection unit 12 selects the shape reproduction data as a transfer target for the remote operation terminal 101 instead of the point cloud data 20 in the analysis region 52 classified as the pattern 2.
- specifically, the average coordinates 60 (three-dimensional coordinates Aa) of the point cloud data 20 (three-dimensional coordinates A1 to Ai) in the analysis region 52 classified as pattern 2, the eigenvector e1 corresponding to the eigenvalue d1, and the eigenvalue d1 are transmitted to the remote operation terminal 101 as shape reproduction data.
- the average coordinate 60 means the center coordinate of the distribution area of the point cloud data 20.
- the eigenvector e1 means the direction in which the distribution range of the point cloud data 20 is widened
- the eigenvalue d1 means the size of the distribution range of the point cloud data 20 in the eigenvector e1 direction.
- since the shape reproduction data, which has a small data amount, is transmitted instead of the point cloud data, the data communication amount can be greatly reduced (the transfer rate is small).
- the display unit 202 of the remote operation terminal 101 sets, as the point cloud data distribution region, a region extending around the average coordinate 60 in the direction of the eigenvector e1 by a size based on the eigenvalue d1, and generates and displays point cloud data arranged at a predetermined interval in the distribution region.
- specifically, the display unit 202 sets a linear region in the range of ±3 times the eigenvalue d1 from the average coordinate 60 in the direction of the eigenvector e1 (±3d1 × eigenvector e1) as the point cloud data distribution region, and arranges and displays point cloud data at a predetermined interval in that region.
- the interval of the point cloud data to be reproduced may be set in advance or specified by the input device 103 operated by the user.
- the display unit 202 may generate and display the surface shape of the measurement target based on the generated point cloud data.
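- regenerating display points from pattern-2 shape reproduction data (average coordinate 60, eigenvector e1, eigenvalue d1) can be sketched as follows, using the ±3d1 linear range described above; the function name and the spacing default are illustrative:

```python
import numpy as np

def reproduce_linear_points(mean, e1, d1, spacing=0.01):
    """Regenerate display points for a pattern-2 (linear) region from shape
    reproduction data: evenly spaced points on the segment mean ± 3*d1*e1,
    matching the ±3d1 distribution range described in the text."""
    half = 3.0 * d1
    n = int(round(2.0 * half / spacing)) + 1   # number of reproduced points
    offsets = np.linspace(-half, half, n)
    return np.asarray(mean) + offsets[:, None] * np.asarray(e1)
```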
- when the eigenvalues satisfy d1 > d2 ≫ d3 ≈ 0, the shape is classified as pattern 3. That is, when only the eigenvalue d3 is smaller than a predetermined fourth threshold (i.e., can be approximated by 0, including 0), the eigenvalues d1 and d2 are equal to or larger than a fifth threshold (which is equal to or larger than the fourth threshold), and the eigenvalue d1 is larger than d2, the shape is classified as pattern 3.
- the data selection unit 12 transmits shape reproduction data to the remote operation terminal 101 instead of the point cloud data 20 in the analysis region 52 classified as the pattern 3.
- specifically, the average coordinates 60 (three-dimensional coordinates Aa) of the point cloud data 20 (three-dimensional coordinates A1 to Ai) in the analysis region 52 classified as pattern 3, the eigenvectors e1 and e2 corresponding to the eigenvalues d1 and d2, and the eigenvalues d1 and d2 are transmitted to the remote operation terminal 101 as shape reproduction data.
- the average coordinate 60 means the center coordinate of the distribution area of the point cloud data 20.
- the eigenvectors e1 and e2 mean the directions in which the distribution range of the point cloud data 20 is widened, the eigenvalue d1 means the size of the distribution range of the point cloud data 20 in the eigenvector e1 direction, and the eigenvalue d2 means the size of the distribution range of the point cloud data 20 in the eigenvector e2 direction.
- the data communication amount can be greatly reduced (low transfer rate).
- the display unit 202 of the remote operation terminal 101 sets, as the point cloud data distribution region, a region extending around the average coordinate 60 by a size based on the eigenvalue d1 in the direction of the eigenvector e1 and by a size based on the eigenvalue d2 in the direction of the eigenvector e2, and generates and displays point cloud data arranged at a predetermined interval in the distribution region.
- specifically, the display unit 202 sets, as the point cloud data distribution region, a planar region bounded by the range of ±3 times the eigenvalue d1 from the average coordinate 60 in the eigenvector e1 direction (±3d1 × eigenvector e1) and the range of ±3 times the eigenvalue d2 from the average coordinate 60 in the eigenvector e2 direction (±3d2 × eigenvector e2), and arranges and displays the point cloud data at a predetermined interval in that region.
- the interval between the point cloud data may be set in advance or specified by the input device 103 operated by the user.
- the display unit 202 may generate and display the surface shape of the measurement target based on the generated point cloud data.
- when the eigenvalues satisfy d1 > d2 > d3 ≫ 0, the shape is classified as pattern 4. That is, when all of the eigenvalues d1, d2, and d3 are larger than a predetermined sixth threshold value, the eigenvalue d2 is larger than the eigenvalue d3, and the eigenvalue d1 is larger than the eigenvalue d2, the shape is classified as pattern 4.
- for the point cloud data 20 in an analysis region 52 classified as pattern 4, there is a strong demand for the user operating the remote operation terminal 101 to confirm the three-dimensional shape in detail, so it is preferable to select all of the point cloud data 20 as the transfer target (transfer rate 100%).
- the data selection method using the grid 30 described above may be adopted for the point cloud data 20 in the analysis region 52 classified into the pattern 4.
- when the eigenvalues d1, d2, and d3 indicate values that do not correspond to any of the patterns 1 to 4, the shape is classified as pattern 5. In this case, the transfer data is selected by the data selection method using the grid 30 described above.
- all point cloud data 20 in the area classified as pattern 5 may be excluded from the transfer target (transfer rate 0%).
- the first to sixth threshold values, which are the reference values for comparing the magnitudes of the eigenvalues in pattern determination, can be set arbitrarily according to the purpose for which the point cloud data is acquired and the measurement accuracy of the sensor.
- for example, the first to sixth threshold values used for pattern determination are set according to the measurement accuracy of the sensor. Specifically, when the measurement variation is ±1 cm, three times the standard deviation (3σ) corresponds to ±1 cm, so the eigenvalue d3 (σ squared) produced by noise alone is 1/9. In that case, the criterion (fourth threshold value) for determining whether the eigenvalue d3 can be approximated by 0 needs to be set to a value larger than 1/9. For example, in a sensor with a measurement variation of ±1 cm, by setting the fourth threshold value to 1/5, it can be determined that the eigenvalue d3 can be approximated by 0, and hence that the shape is planar, when d3 is smaller than 1/5.
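- the threshold arithmetic in this example can be checked directly (interpreting the ±1 cm variation as 3σ = 1 cm, as above; variable names are illustrative):

```python
# Measurement variation of ±1 cm is read as 3*sigma = 1 cm, so noise alone
# yields an eigenvalue (variance) of sigma^2 = (1/3)^2 = 1/9; the fourth
# threshold must exceed that so noise-level d3 is judged as approximately 0.
sigma = 1.0 / 3.0                 # cm, from 3*sigma = 1 cm
noise_eigenvalue = sigma ** 2     # = 1/9 cm^2
fourth_threshold = 1.0 / 5.0      # the example value from the text
assert fourth_threshold > noise_eigenvalue
```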
- arbitrary measurement accuracy can be realized by arbitrarily setting the first to sixth threshold values used for pattern determination.
- for example, the reference value that determines whether an eigenvalue can be approximated by 0 is set larger in the case of measurement in units of m (non-precision measurement) than in the case of measuring unevenness on a plane (a three-dimensional object) in units of mm (precision measurement).
- the shape reproduction data format is not limited to the above.
- for example, at least two point cloud data 20 that can define a linear shape (for example, two points separated by the eigenvalue d1 in the eigenvector e1 direction) may be selected as the transfer target and transferred.
- similarly, at least three point cloud data 20 that can define a planar shape (for example, two points separated by the eigenvalue d1 in the eigenvector e1 direction and, for at least one of those two points, one point separated by the eigenvalue d2 in the eigenvector e2 direction) may be selected as the transfer target. Also in this case, the amount of communication between the robot 10 and the remote operation terminal 101 can be greatly reduced.
- transfer data in the area is selected and the upper limit of the data amount is determined according to the pattern classified for each area. For example, when the shape reproduction data is set as transfer target data for a certain area, the data amount for the area is determined by the data amount of the shape reproduction data.
- the shape reproduction data capable of reproducing the surface shape of the measurement object in the remote operation terminal 101 is selected according to the expected shape of the measurement object.
- the shape reproduction data has a data amount smaller than that of the point cloud data 20
- the data communication amount can be reduced as compared with the case where the point cloud data 20 is transmitted.
- the situation around the robot 10 can therefore be grasped within a range that does not affect the operability of the robot 10.
- the robot 10 may transmit not only the data selected as the transfer target by the selection method described above but also the data excluded from the selection target to the remote operation terminal 101. In this case, it is preferable that the robot 10 transmits the data selected as the transfer target before the data excluded from the selection target. That is, it is preferable that the transmission order of data to be transmitted to the remote operation terminal 101 is set by the method for selecting the transfer target described above. Specifically, first, the data selected as the transfer target from the measured point cloud data 20 is transferred with the highest priority, and other data (data excluded from the transfer target) is transferred thereafter. The transmission order is given as follows. Further, the above-described selection process may be further performed on the point cloud data excluded from the transfer target, and the transmission order may be determined.
- the point cloud data 20 or the shape reproduction data with high importance is transmitted first to the remote operation terminal 101, and the data with low importance is sequentially transmitted.
- a user who operates the remote operation terminal 101 can obtain important information (for example, the situation around the hand) for operating the robot 10 at an early stage after data transfer from the robot 10 is started. It is possible to grasp the entire image of the measurement object from information with low importance.
- for example, the robot 10 may transmit the transfer-target point cloud data 20 in the area 33 or the cells 31-3, where the transfer rate of the point cloud data 20 is large, with the highest priority, and transmit the transfer-target point cloud data 20 in the other area 34 or the cells 31-4 thereafter. That is, it is preferable that the transmission order to the remote operation terminal 101 is set according to the transfer rate of the point cloud data 20.
- the point cloud data 20 excluded from the transfer target is transferred after the point cloud data 20 set as the transfer target in the regions 33 and 34 or the cells 31-3 and 31-4.
- the user of the remote operation terminal 101 can visually recognize the surface shape image in an important region (for example, the hand side or the near side in the line-of-sight direction) at an early stage. I can grasp it. Note that when the transmission order of the point cloud data 20 in the area 33 and the area 34 is set, all the point cloud data 20 in the area 33 and the area 34 may be transmitted.
- the robot 10 may set the transmission order of the point cloud data 20 according to the condition indicating the cell position (for example, cell coordinates).
- for example, the point cloud data 20 in cells whose index is a multiple of 4 in the Xg, Yg, or Zg coordinate direction is transmitted with the highest priority; the point cloud data 20 in cells whose index is a multiple of 2 (excluding multiples of 4) in the Xg, Yg, or Zg coordinate direction is transmitted next; and the point cloud data 20 in the remaining cells is transmitted last.
- that is, the robot 10 first transmits the point cloud data 20 in cells at every predetermined interval, so that an image with coarse spatial resolution can be formed.
- subsequently, the point cloud data 20 for forming an image with fine spatial resolution (the point cloud data 20 in the cells 31 located between the cells 31 whose point cloud data has already been transmitted) is transmitted.
- the user of the remote operation terminal 101 can check the rough shape of the measurement object upon receiving the data with coarse spatial resolution, and can grasp the detailed situation over time as the sequentially transmitted data is received.
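The staged, coarse-to-fine transmission order can be sketched as below. Reading the "multiple of 4 / multiple of 2" rule as applying to every axis is one interpretation of the text; the function names and the fixed 4/2 strides are illustrative.

```python
# Sketch of the staged transmission order: cells at a coarse stride are sent
# first so the terminal can form a coarse image, then intermediate cells refine
# it. Treating the multiple-of-4/2 rule as per-axis is an assumption.
from itertools import product

def tx_pass(idx):
    """Return 0, 1 or 2: the pass in which cell index (ix, iy, iz) is sent."""
    if all(i % 4 == 0 for i in idx):
        return 0          # coarse skeleton, highest priority
    if all(i % 2 == 0 for i in idx):
        return 1          # intermediate cells (multiples of 2, not of 4)
    return 2              # all remaining cells, transmitted last

cells = list(product(range(8), repeat=3))
order = sorted(cells, key=tx_pass)   # stable sort: pass 0, then 1, then 2

assert tx_pass((0, 4, 8)) == 0
assert tx_pass((2, 4, 6)) == 1
assert tx_pass((1, 2, 3)) == 2
assert order[0] == (0, 0, 0)  # a pass-0 cell leads the queue
```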
- in this way, the minimum data necessary for operating the robot 10 is transmitted preferentially, and an image can be formed at the remote operation terminal 101 based on that data.
- the user can therefore grasp the situation around the robot in a short time even when the communication environment is poor or a transmission path with a small communication capacity is used, so the time required for the robot operation can be shortened. In addition, a more detailed status can be grasped over time from the data transmitted in stages.
- the robot 10 preferably can use, for autonomous operation, the high-density point cloud data 20 measured by the three-dimensional sensor 2 (hereinafter referred to as high-density data) in addition to the thinned point cloud data 20 (hereinafter referred to as low-density data). That is, the robot 10 preferably uses low-density data and the high-density point cloud data 20 depending on the application. A human can operate the robot 10 by referring to map information and a surface shape created from low-density data (for example, point cloud data with a minimum point interval of about 1 cm).
- the robot 10 transmits low-density data for remote operation and uses map information generated based on the high-density data for autonomous movement.
- by using coarse and fine data differently as described above, it is possible to reduce the data transfer amount while maintaining the accuracy of the autonomous control of the robot 10.
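One way to derive the low-density data from the full measurement is voxel thinning, keeping one point per ~1 cm cube. The text gives only the ~1 cm minimum interval; keeping the first point seen in each voxel is an assumption for illustration.

```python
# Sketch of producing "low-density data" from the measured point cloud by
# keeping at most one point per ~1 cm voxel. Representative-point choice
# (first point seen) is an assumption.

def thin_points(points, min_interval=0.01):
    """Keep at most one (x, y, z) point per cube of side min_interval metres."""
    seen, out = set(), []
    for x, y, z in points:
        key = (int(x // min_interval),
               int(y // min_interval),
               int(z // min_interval))
        if key not in seen:
            seen.add(key)
            out.append((x, y, z))
    return out

dense = [(0.000, 0.0, 0.0), (0.002, 0.0, 0.0), (0.015, 0.0, 0.0)]
sparse = thin_points(dense)
assert len(sparse) == 2  # the first two points fall in the same 1 cm voxel
```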
- the robot 10 preferably transmits to the remote operation terminal 101 the point cloud data 20 (Xs, Ys, Zs, R, G, B) to which color information (RGB) has been added, or the color information (RGB) together with the point cloud data (Xs, Ys, Zs).
- on the other hand, for control of the robot 10 such as autonomous movement, the point cloud data (Xs, Ys, Zs) without color information (RGB) is preferably used.
- that is, the robot 10 transmits colored data for remote operation, and uses map information generated based on colorless data for autonomous movement.
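The split between colored data for the operator and colorless data for control can be sketched as follows; the flat tuple layout (Xs, Ys, Zs, R, G, B) is an illustrative assumption.

```python
# Sketch: keep the measurement in a colored form (Xs, Ys, Zs, R, G, B) for
# transmission to the operator, and derive the colorless (Xs, Ys, Zs) form
# used for autonomous control. Tuple layout is an assumption.

def strip_color(colored):
    """Drop the RGB fields, keeping only the coordinates used for control."""
    return [(x, y, z) for (x, y, z, r, g, b) in colored]

colored = [(1.0, 2.0, 3.0, 255, 0, 0), (4.0, 5.0, 6.0, 0, 255, 0)]
assert strip_color(colored) == [(1.0, 2.0, 3.0), (4.0, 5.0, 6.0)]
```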
- the robot 10 controls the transfer data reduction rate according to the communication quality or communication capacity with the remote operation terminal 101.
- the robot 10 sets a large reduction amount of the transfer data when the communication speed is low, and decreases the reduction amount when the communication speed is high.
- similarly, when the communication quality is poor, the transfer data reduction amount is set to be large.
- the communication quality indicates a communication speed or a propagation environment (for example, reception intensity) on a transmission path between the robot 10 and the remote operation terminal 101, and the communication quality is measured by the robot 10 or the remote operation terminal 101.
- the robot 10 itself may measure the communication quality and set or change the transfer rate accordingly.
- alternatively, the communication quality measurement and the setting or change of the transfer rate for the robot 10 may be controlled by the remote operation terminal 101.
- the robot 10 can be remotely operated with a small amount of data communication even in a situation where the communication speed is low, the upper limit of the communication capacity is small, or the communication quality is poor.
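The adaptive reduction described above can be sketched as a mapping from measured link speed to a reduction ratio. The thresholds and ratios below are hypothetical; the text states only the monotone relationship (slower link, larger reduction amount).

```python
# Sketch of adapting the transfer-data reduction amount to the measured link
# speed: the slower the link, the larger the reduction. Thresholds and ratios
# are hypothetical.

def reduction_ratio(link_speed_bps):
    """Fraction of the point cloud to drop before transmission."""
    if link_speed_bps >= 10_000_000:
        return 0.1   # fast link: keep almost everything
    if link_speed_bps >= 1_000_000:
        return 0.5
    return 0.9       # slow or poor-quality link: reduce aggressively

# A faster link should always reduce no more than a slower one.
assert reduction_ratio(20_000_000) < reduction_ratio(500_000)
```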
- since the data regarding shapes that have an important influence on the remote operation is selected and transmitted at an early stage, the user can make quick decisions, and work using the robot 10 can be completed in a short time.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Electromagnetism (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Length Measuring Devices With Unspecified Measuring Means (AREA)
- Arrangements For Transmission Of Measured Signals (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/123,477 US20170075330A1 (en) | 2014-03-31 | 2015-02-27 | Data transmission system, data transmission apparatus and data transmission method |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014-074378 | 2014-03-31 | ||
| JP2014074378A JP2015197329A (ja) | 2014-03-31 | 2014-03-31 | データ伝送システム、データ伝送装置、データ伝送方法、及びデータ伝送プログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015151685A1 true WO2015151685A1 (ja) | 2015-10-08 |
Family
ID=54240018
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/055945 Ceased WO2015151685A1 (ja) | 2014-03-31 | 2015-02-27 | データ伝送システム、データ伝送装置、及びデータ伝送方法 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20170075330A1 (en) |
| JP (1) | JP2015197329A (en) |
| WO (1) | WO2015151685A1 (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105291114A (zh) * | 2015-12-04 | 2016-02-03 | 北京建筑大学 | 基于移动互联网的家居服务型机器人系统 |
| JP2020136882A (ja) * | 2019-02-19 | 2020-08-31 | 株式会社メディア工房 | 点群データ通信システム、点群データ送信装置および点群データ送信方法 |
Families Citing this family (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6637855B2 (ja) | 2016-08-22 | 2020-01-29 | 株式会社ソニー・インタラクティブエンタテインメント | データ処理装置、データ処理方法およびコンピュータプログラム |
| WO2018191970A1 (zh) | 2017-04-21 | 2018-10-25 | 深圳前海达闼云端智能科技有限公司 | 一种机器人控制方法、机器人装置及机器人设备 |
| WO2019035363A1 (ja) * | 2017-08-18 | 2019-02-21 | 株式会社小糸製作所 | 認識センサおよびその制御方法、自動車、車両用灯具、オブジェクト識別システム、オブジェクトの識別方法 |
| JP2019113553A (ja) * | 2017-12-25 | 2019-07-11 | シナノケンシ株式会社 | 三次元レーザー光走査装置 |
| JP2021043475A (ja) * | 2017-12-25 | 2021-03-18 | 住友電気工業株式会社 | 送信装置、点群データ収集システムおよびコンピュータプログラム |
| US11604284B2 (en) | 2019-05-06 | 2023-03-14 | Waymo Llc | Methods and systems to determine a strategy for a drop process associated with a light detection and ranging (LIDAR) device |
| JP7616401B2 (ja) * | 2021-09-01 | 2025-01-17 | 日本電気株式会社 | 映像送信システム、端末装置および映像送信方法 |
| JP7739868B2 (ja) * | 2021-09-03 | 2025-09-17 | 株式会社Ihi | 監視システム、監視方法、及び、監視プログラム |
| JP2023140094A (ja) * | 2022-03-22 | 2023-10-04 | 日本電気通信システム株式会社 | 通信方法 |
| CN116660926B (zh) * | 2023-05-06 | 2025-11-14 | 安徽图联科技有限公司 | 基于人工智能的测绘激光雷达扫描策略控制方法与系统 |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2002040941A1 (en) * | 2000-11-15 | 2002-05-23 | National Institute Of Advanced Industrial Science And Technology | Footprint information distributing system |
| JP2011167815A (ja) * | 2010-02-19 | 2011-09-01 | Ihi Corp | 物体認識ロボットシステム |
| JP2012146262A (ja) * | 2011-01-14 | 2012-08-02 | Toshiba Corp | 構造物計測システム |
| US20130028482A1 (en) * | 2011-07-29 | 2013-01-31 | Raytheon Company | Method and System for Thinning a Point Cloud |
Family Cites Families (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7376279B2 (en) * | 2000-12-14 | 2008-05-20 | Idx Investment Corporation | Three-dimensional image streaming system and method for medical images |
| US8192501B2 (en) * | 2008-09-29 | 2012-06-05 | Shriners Hospital For Children | Prosthetic knee with gravity-activated lock |
| US20110310088A1 (en) * | 2010-06-17 | 2011-12-22 | Microsoft Corporation | Personalized navigation through virtual 3d environments |
| US9463574B2 (en) * | 2012-03-01 | 2016-10-11 | Irobot Corporation | Mobile inspection robot |
- 2014
- 2014-03-31 JP JP2014074378A patent/JP2015197329A/ja active Pending
- 2015
- 2015-02-27 US US15/123,477 patent/US20170075330A1/en not_active Abandoned
- 2015-02-27 WO PCT/JP2015/055945 patent/WO2015151685A1/ja not_active Ceased
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2002040941A1 (en) * | 2000-11-15 | 2002-05-23 | National Institute Of Advanced Industrial Science And Technology | Footprint information distributing system |
| JP2011167815A (ja) * | 2010-02-19 | 2011-09-01 | Ihi Corp | 物体認識ロボットシステム |
| JP2012146262A (ja) * | 2011-01-14 | 2012-08-02 | Toshiba Corp | 構造物計測システム |
| US20130028482A1 (en) * | 2011-07-29 | 2013-01-31 | Raytheon Company | Method and System for Thinning a Point Cloud |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN105291114A (zh) * | 2015-12-04 | 2016-02-03 | 北京建筑大学 | 基于移动互联网的家居服务型机器人系统 |
| JP2020136882A (ja) * | 2019-02-19 | 2020-08-31 | 株式会社メディア工房 | 点群データ通信システム、点群データ送信装置および点群データ送信方法 |
| US11146773B2 (en) | 2019-02-19 | 2021-10-12 | Media Kobo, Inc. | Point cloud data communication system, point cloud data transmitting apparatus, and point cloud data transmission method |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2015197329A (ja) | 2015-11-09 |
| US20170075330A1 (en) | 2017-03-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2015151685A1 (ja) | データ伝送システム、データ伝送装置、及びデータ伝送方法 | |
| US11729367B2 (en) | Wide viewing angle stereo camera apparatus and depth image processing method using the same | |
| US11691729B2 (en) | Methods and associated systems for communicating with/controlling moveable devices by gestures | |
| CN108283021B (zh) | 机器人和用于机器人定位的方法 | |
| KR100703692B1 (ko) | 공간상에 존재하는 오브젝트들을 구별하기 위한 시스템,장치 및 방법 | |
| JP5548482B2 (ja) | 位置姿勢計測装置、位置姿勢計測方法、プログラム及び記憶媒体 | |
| US9704248B2 (en) | Position and orientation measuring apparatus, information processing apparatus and information processing method | |
| US9317735B2 (en) | Information processing apparatus, information processing method, and program to calculate position and posture of an object having a three-dimensional shape | |
| JP5736622B1 (ja) | 検出装置およびこの装置を具えたマニプレータの動作制御 | |
| KR102432370B1 (ko) | 피킹 로봇을 위한 비젼 분석 장치 | |
| US20240061436A1 (en) | Moving apparatus and moving apparatus control method | |
| JP2012021958A (ja) | 位置姿勢計測装置、その計測処理方法及びプログラム | |
| JP2012247364A (ja) | ステレオカメラ装置、ステレオカメラシステム、プログラム | |
| CN103170973A (zh) | 基于Kinect摄像机的人机协作装置及方法 | |
| JP2014013146A5 (en) | |
| JP2015043488A (ja) | 遠隔操作装置及びそれを用いた遠隔施工方法 | |
| JP2015049776A (ja) | 画像処理装置、画像処理方法及び画像処理プログラム | |
| JP2013210339A (ja) | 接触状態推定装置 | |
| US20200380727A1 (en) | Control method and device for mobile device, and storage device | |
| JP6188345B2 (ja) | 情報処理装置、情報処理方法 | |
| Liu et al. | Navigation on point-cloud—A Riemannian metric approach | |
| JP5198078B2 (ja) | 計測装置および計測方法 | |
| US20240033930A1 (en) | Remote control manipulator system and remote control assistance system | |
| JP6368503B2 (ja) | 障害物監視システム及びプログラム | |
| Gan et al. | Robust binocular pose estimation based on pigeon-inspired optimization |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15773205 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 15123477 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15773205 Country of ref document: EP Kind code of ref document: A1 |