WO2020071001A1 - Information presentation device, information presentation method, and program - Google Patents

Information presentation device, information presentation method, and program


Publication number
WO2020071001A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
abnormality
predetermined area
presentation device
unit
Application number
PCT/JP2019/032763
Other languages
French (fr)
Japanese (ja)
Inventor
亘 小久保
篤之 坂本
弘樹 西條
Original Assignee
ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2020071001A1

Classifications

    • G - PHYSICS
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 15/00 - Special procedures for taking photographs; Apparatus therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 23/00 - Alarms responsive to unspecified undesired or abnormal conditions
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators

Definitions

  • the present disclosure relates to an information presentation device, an information presentation method, and a program.
  • Patent Literature 1 describes a mobile robot that includes a sensor device for detecting an abnormality of an inspection target such as a pipe, converts data detected by the sensor device into image data, and causes a display device to display the image data.
  • However, the data relating to the abnormality detected by the sensor device is only displayed on the display device and is not presented directly on the source of the abnormality. Therefore, when the user handles the abnormality at the site where it occurred, the user may not be able to intuitively understand the position of the abnormality source, the situation around it, and the like.
  • the present disclosure proposes a new and improved information presenting device, information presenting method, and program capable of presenting information about an abnormality so that a user can easily and intuitively understand the information.
  • According to the present disclosure, there is provided an information presentation device including: an acquisition unit that acquires information about a predetermined area; a presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and a control unit that causes the presentation unit to present the information about the abnormality at the position where the abnormality is detected.
  • Further, according to the present disclosure, there is provided an information presentation method, performed by a processor, including: acquiring information about a predetermined area; presenting information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and presenting the information about the abnormality at the position where the abnormality is detected.
  • Further, according to the present disclosure, there is provided a program for causing a computer to function as: an acquisition unit that acquires information about a predetermined area; a presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and a control unit that causes the presentation unit to present the information about the abnormality at the position where the abnormality is detected.
  • FIG. 1 is a diagram illustrating an example of a physical configuration of an information presentation device according to each embodiment of the present disclosure.
  • FIG. 2 is a diagram illustrating a specific example of a predetermined area according to each embodiment. FIG. 3 is a block diagram illustrating an example of a functional configuration of an information presentation device according to each embodiment.
  • FIG. 4 is a flowchart illustrating the flow of a main process according to the first embodiment. FIG. 5 is a flowchart illustrating the flow of a normal scan process according to the same embodiment.
  • FIG. 6 is a diagram illustrating a first example according to the same embodiment.
  • FIG. 7 is a diagram illustrating a second example according to the same embodiment. FIG. 8 is a diagram illustrating a third example according to the same embodiment.
  • FIG. 9 is a flowchart illustrating the flow of a main process according to the second embodiment. Further drawings illustrate the flow of a detailed scan process according to the same embodiment, an example according to the same embodiment, and first, second, and third modifications according to each embodiment.
  • A further block diagram illustrates a hardware configuration example of the information presentation device according to each embodiment.
  • In the present specification and the drawings, a plurality of components having substantially the same function and configuration may be distinguished by adding different letters after the same reference numeral, for example, the joint 102a and the joint 102b.
  • However, when a plurality of components having substantially the same functional configuration do not need to be particularly distinguished, only the same reference numeral is used; for example, when the joint 102a and the joint 102b do not need to be distinguished, they are simply referred to as the joint 102.
  • a device has been developed that includes a sensor device for detecting an abnormality of an inspection target such as a pipe, converts data detected by the sensor device into image data, and displays the image data on a display device.
  • However, the device only causes the display device to display the data relating to the abnormality detected by the sensor device, and does not present the data directly on the source of the abnormality. Therefore, when the user handles the abnormality at the site where it occurred, the user may not be able to intuitively understand the position of the abnormality source, the situation around it, and the like.
  • The information presentation device 10 according to each embodiment of the present disclosure was created focusing on the above circumstances.
  • The information presentation device 10 acquires information about a predetermined area and presents information about an abnormality in the predetermined area, which is detected based on the information about the predetermined area. Further, the information presentation device 10 causes the information relating to the abnormality to be presented at the position where the abnormality is detected. For this reason, the information presentation device 10 can present the information relating to the abnormality so that the user can intuitively understand it.
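  • As a rough illustration of this overall structure, the following sketch (hypothetical Python; all class names, method names, and the threshold value are invented for illustration and do not appear in the disclosure) shows how an acquisition unit, a presentation unit, and a control unit could be wired together so that information about a detected abnormality is presented at the position where it was detected.

```python
from dataclasses import dataclass
from typing import List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class Abnormality:
    position: Point3D      # position where the abnormality was detected
    sensed_value: float    # raw sensing data (e.g. temperature)
    label: str             # e.g. "high temperature", "ammonia"

class AcquisitionUnit:
    """Stands in for the sensor devices; returns (position, reading) pairs."""
    def scan(self, area_points: List[Point3D]) -> List[Tuple[Point3D, float]]:
        # A real device would drive sensors over the area; placeholder readings here.
        return [(p, 0.0) for p in area_points]

class PresentationUnit:
    """Stands in for the projector / laser device."""
    def project(self, position: Point3D, content: str) -> None:
        print(f"projecting {content!r} at {position}")

class ControlUnit:
    """Detects abnormalities and has them presented at the detected position."""
    def __init__(self, acquisition: AcquisitionUnit, presentation: PresentationUnit,
                 threshold: float = 50.0) -> None:
        self.acquisition = acquisition
        self.presentation = presentation
        self.threshold = threshold  # hypothetical abnormality threshold

    def run(self, area_points: List[Point3D]) -> List[Abnormality]:
        found: List[Abnormality] = []
        for position, value in self.acquisition.scan(area_points):
            if value >= self.threshold:
                abnormality = Abnormality(position, value, "abnormal reading")
                # Present the information at the position where it was detected.
                self.presentation.project(abnormality.position, abnormality.label)
                found.append(abnormality)
        return found
```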
  • each embodiment will be sequentially described in detail.
  • FIG. 1 is a diagram illustrating an example of a physical configuration of the information presentation device 10-1 according to the first embodiment.
  • the information presenting device 10-1 according to the present embodiment includes a robot arm 12 and a movable cart 14.
  • By including the movable cart 14, the information presentation device 10-1 according to the present embodiment can be a device (machine) that moves autonomously using electric and/or magnetic action.
  • the information presentation device 10-1 may be, for example, a robot that can autonomously move on the ground or in the air.
  • However, the present embodiment is not limited to such an example, and the information presentation device 10-1 may be any machine (device) that can operate autonomously using electric and/or magnetic action, or any other general mobile object.
  • For example, the information presentation device 10-1 may be another type of robot (for example, a humanoid robot or a drone), a vehicle (for example, an automobile, a ship, or a flying body), various industrial machines, or a toy.
  • Further, the information presentation device 10-1 may be a device (machine) that can perform a rotational motion and a translational motion simultaneously (hereinafter also referred to as a rotational translational motion).
  • For example, the information presentation device 10-1 may be an omnidirectional mobile body.
  • Examples of the rotational translational motion include the information presentation device 10-1 moving while keeping its gaze on a certain point, moving while continuously rotating to sense the surrounding environment, and tracking in real time another moving body (for example, a person or a car) moving in an arbitrary direction.
  • the robot arm 12 includes, for example, a robot arm tip 100, a plurality of joints 102, and a plurality of links 104.
  • the robot arm 12 has, for example, one robot arm tip 100, four joints 102a to 102d, and three links 104a to 104c.
  • the four joints 102a to 102d connect the robot arm distal end 100, the three links 104a to 104c, and the movable carriage 14.
  • the joint 102a connects the robot arm tip 100 and the link 104a.
  • the joint 102b connects the link 104a and the link 104b.
  • the joint 102c connects the link 104b and the link 104c.
  • the joint 102d connects the link 104c and the movable carriage 14.
  • the shapes of the robot arm 12, the robot arm tip portion 100, the joint portion 102, and the link 104 are not limited to the shapes shown in FIG. 1, but may be any shapes.
  • the robot arm 12 may include, for example, a motion sensor for measuring the motion of the joint 102 at a position corresponding to each of the joints 102.
  • An example of the motion sensor is an encoder.
  • the encoder can acquire each position of the joint 102 as three-dimensional coordinates.
  • the information presentation device 10-1 can calculate the position of the robot arm tip 100 as three-dimensional coordinates based on the three-dimensional coordinates of the joint 102.
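  • As an illustration of this kind of calculation, the following sketch (hypothetical Python, simplified to a planar arm; the function name and the example link lengths are assumptions, not values from the disclosure) chains encoder-measured joint angles and known link lengths to obtain the position of the robot arm tip relative to the base.

```python
import math
from typing import List, Tuple

def arm_tip_position(joint_angles: List[float],
                     link_lengths: List[float],
                     base: Tuple[float, float] = (0.0, 0.0)) -> Tuple[float, float]:
    """Planar forward kinematics: accumulate joint angles (encoder readings,
    in radians) and link lengths (in meters) to locate the arm tip."""
    x, y = base
    heading = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        heading += angle                  # each joint rotates the remaining chain
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y

# Example: three 0.3 m links, each joint bent by 30 degrees.
print(arm_tip_position([math.radians(30.0)] * 3, [0.3] * 3))
```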
  • the robot arm 12 may include, for example, a driving mechanism for driving the joint unit 102 at a position corresponding to each of the joint units 102.
  • the driving mechanism includes, for example, a motor and a driver. Such a driving mechanism can be controlled by a control unit 120-1 described below.
  • the joint 102 may have a function of rotatably connecting the robot arm tip 100 and the plurality of links 104 to each other.
  • the joints 102a to 102d rotatably connect the robot arm tip 100 and the links 104a to 104c to each other.
  • the operation of the robot arm 12 is controlled by rotating and operating the joints 102a to 102d.
  • Here, the position of each component of the robot arm 12 means a position (coordinates) in the space defined for the operation control, and the posture of each component means the direction (angle) with respect to an arbitrary axis in the space defined for the operation control.
  • Further, the operation (or operation control) of the robot arm 12 refers to driving the joints 102a to 102d (or controlling their driving) so that the position and posture of each component of the robot arm 12 are changed (or the change is controlled).
  • each of the robot arm tip portion 100, the joint portion 102, and the link 104 may have a movable portion that rotates about a predetermined rotation axis.
  • each of the robot arm tip portion 100, the joint portion 102, and the link 104 has an actuator (for example, a motor), and rotates around a predetermined rotation axis by driving the actuator.
  • The rotation of each of the robot arm tip 100, the joints 102, and the links 104 is controlled, whereby operations of the robot arm 12 such as extending, contracting (folding), and twisting are controlled.
  • The robot arm 12 has movable portions at each of the one robot arm tip 100, the four joints 102a to 102d, and the three links 104a to 104c, and thus has eight degrees of freedom (DoF: Degrees of Freedom). Since this exceeds the six degrees of freedom (three for position plus three for posture) generally required to control the position and posture of a robot arm, the robot arm 12 is a robot arm with redundant degrees of freedom.
  • In the example described above, the robot arm 12 has four joints 102, three links 104, and eight movable portions, but the numbers of the joints 102, the links 104, and the movable portions are not particularly limited.
  • the number of movable parts of the robot arm 12 may be six or less.
  • For example, the numbers of the joints 102, the links 104, and the movable portions included in the robot arm 12 may be set appropriately so that a desired degree of freedom is realized in consideration of the degrees of freedom of the position and posture of the robot arm tip 100.
  • Further, the shapes of the joints 102 and the links 104, the directions of the rotation axes of the joints 102, and the like are not limited to the example illustrated in FIG. 1, and may be set appropriately so that a desired degree of freedom is realized in consideration of the degrees of freedom of the position and posture of the robot arm tip 100.
  • the robot arm 12 may be moved in the same direction as the moving direction of the movable cart 14 by the operation of the movable cart 14. For example, when the mobile trolley 14 moves, the robot arm 12 may be moved in the same direction as the mobile trolley 14 in accordance with the movement of the mobile trolley 14. Further, for example, when the movable cart 14 rotates, the robot arm 12 may be rotated in the same direction as the rotating direction of the movable cart 14 in accordance with the rotation of the movable cart 14.
  • a sensor device is provided at the distal end portion 100 of the robot arm.
  • the sensor device may include various sensors.
  • For example, the sensor device may include a camera, a thermo camera, a distance measuring sensor, a depth sensor, a microphone, a chemical sensor, an odor sensor, a metal sensor, a temperature sensor, and a sensor capable of capturing a fluoroscopic image.
  • the number of sensor devices provided at the robot arm tip 100 is not particularly limited.
  • the sensor device provided on the robot arm tip portion 100 may include one or more of the above-described sensor devices in combination, or may include a plurality of devices of the same type.
  • a camera is an imaging device that has a lens system, a drive system, and an imaging element, such as an RGB camera, and captures an image (still image or moving image).
  • a thermo camera is an imaging device that captures a captured image including information indicating the temperature of an imaging target using infrared light or the like.
  • the depth sensor is a device that acquires depth information such as a ToF (Time of Flight) sensor, an infrared distance measuring device, an ultrasonic distance measuring device, a LiDAR (Laser Imaging, Detection and Ranging), or a stereo camera.
  • the microphone is a device that collects surrounding sounds and outputs audio data converted into digital signals via an amplifier and an ADC (Analog Digital Converter).
  • a chemical sensor is a sensor that detects a chemical substance.
  • the odor sensor is a sensor that detects odor and converts it into a numerical value.
  • the metal sensor is a sensor that detects a metal.
  • the temperature sensor is a sensor that measures a temperature.
  • the sensor capable of capturing a fluoroscopic image is, for example, a sensor that captures an image that visualizes the inside of a target by infrared rays, X-rays, or the like.
  • the robot arm tip portion 100 may include a transmitter and a receiver for infrared rays or X-rays or the like, and may have a structure capable of positioning an imaging target between the transmitter and the receiver.
  • a presentation device may be provided at the robot arm tip portion 100 in addition to the sensor device.
  • the presentation device can include various presentation devices.
  • the presentation device may include a projector and a laser device.
  • the number of presentation devices provided on the robot arm tip portion 100 is not particularly limited.
  • the presentation device provided on the robot arm tip portion 100 may include one or more of the above-described presentation devices in combination, or may include a plurality of devices of the same type.
  • a projector is a projection device that projects an image at an arbitrary place in space.
  • the projector may be, for example, a fixed-type wide-angle projector, or a so-called moving projector having a movable unit capable of changing the projection direction, such as a Pan / Tilt drive type.
  • the sensor device and the presentation device described above can be provided in various ways.
  • the sensor device and the presentation device are provided at the robot arm tip 100 of the robot arm 12 as described above.
  • the sensor device and the presentation device may be provided in a main body 106 of the mobile trolley 14 described later.
  • the movable trolley 14 includes, for example, a main body 106 and a plurality of wheels 108. At least one processing circuit such as a CPU (Central Processing Unit) and a GPU (Graphics Processing Unit) may be arranged in the main body 106. Further, at least one memory such as a ROM (Read Only Memory) and a RAM (Random Access Memory) can be arranged in the main body 106. In addition, a power supply device that supplies power for driving each component of the information presentation device 10-1 may be arranged in the main body 106.
  • the mobile trolley 14 has, for example, four wheels 108a to 108d. Any type of wheel may be used as the wheel 108.
  • the wheels 108 may be omni wheels, mecanum wheels, or the like.
  • the types of the plurality of wheels 108 included in the mobile trolley 14 may be basically the same. However, the present disclosure is not limited to such an example, and a plurality of types of wheels may be mixed in the plurality of wheels 108.
  • the plurality of wheels 108 may have a movable part.
  • each of the plurality of wheels 108 has an actuator (for example, a motor or the like) and rotates by driving the actuator.
  • the mobile trolley 14 can perform a rotation motion and a translation motion.
  • Specifically, the movable cart 14 can output a velocity in an arbitrary direction by controlling the driving of the actuators and changing the rotation speed of each of the plurality of wheels 108. Thereby, the movable cart 14 can be movable in all directions.
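  • As one concrete illustration of this kind of omnidirectional drive, the sketch below (hypothetical Python) applies a commonly used inverse-kinematics model for a four-mecanum-wheel base; the wheel radius and the geometry parameters lx and ly are assumed values, not taken from the disclosure.

```python
from typing import Tuple

def mecanum_wheel_speeds(vx: float, vy: float, wz: float,
                         wheel_radius: float = 0.05,
                         lx: float = 0.20, ly: float = 0.15) -> Tuple[float, float, float, float]:
    """Convert a desired cart velocity into wheel angular velocities.

    vx, vy -- forward / sideways velocity of the cart [m/s]
    wz     -- rotation rate about the vertical axis [rad/s]
    lx, ly -- half wheelbase / half track width [m] (assumed geometry)
    Returns (front-left, front-right, rear-left, rear-right) wheel speeds [rad/s].
    """
    k = lx + ly
    w_fl = (vx - vy - k * wz) / wheel_radius
    w_fr = (vx + vy + k * wz) / wheel_radius
    w_rl = (vx + vy - k * wz) / wheel_radius
    w_rr = (vx - vy + k * wz) / wheel_radius
    return w_fl, w_fr, w_rl, w_rr

# Pure sideways translation: the wheels spin in an alternating pattern.
print(mecanum_wheel_speeds(vx=0.0, vy=0.3, wz=0.0))
```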
  • FIG. 2 is a diagram illustrating a specific example of the predetermined area according to the first embodiment.
  • the information presentation device 10-1 according to the present embodiment senses (hereinafter, also referred to as "scanning") a predetermined area using a sensor device provided at the robot arm tip portion 100 described above.
  • the predetermined area is an area in which the information presentation device 10-1 checks for an abnormality.
  • the predetermined area may include a surface of the object specified by the user.
  • For example, the predetermined area may be the area 30 that is the surface of the object 20 specified by the user.
  • the object 20 specified by the user is, for example, an unknown object that has a risk of explosion or the like.
  • the target specified by the user is not limited to the unknown object described above, and may be an arbitrary object.
  • the target specified by the user may be a known object.
  • the predetermined area may include an area specified by the user.
  • the regions specified by the user are partial regions 32a and 32b of a wall of a predetermined space (for example, a room) and a partial region 32c of a floor.
  • the predetermined area may be the area 32 including the area 32a, the area 32b, and the area 32c.
  • the predetermined area may be a part of the surface of the target specified by the user.
  • the predetermined area may be an area 34a that is a part of the surface of the tank 22a.
  • The tanks 22 shown in the specific example c of FIG. 2 are, for example, tanks containing gas, chemicals, or the like.
  • Further, the predetermined area may be a combination of the surface of the target specified by the user and the area specified by the user. Specifically, as shown in the specific example c of FIG. 2, suppose the user designates the areas 34a to 34c, which are parts of the surfaces of the three tanks 22a to 22c, and the area 34d, which is a part of the floor. In this case, the predetermined area may be an area including the areas 34a to 34d.
  • the predetermined area is not limited to the area specified by the user.
  • For example, the predetermined area may be an area determined based on information scanned by the sensor device. Further, for example, the predetermined area may be an area determined based on information obtained from an external device. Further, for example, the predetermined area may be an area determined in advance.
  • the information presentation device 10-1 obtains information on a predetermined area by scanning the above-described predetermined area.
  • the information on the predetermined area is, for example, spatial information on the predetermined area.
  • the space information is, for example, sensing data such as the distance between the information presentation device 10-1 and an object forming the space, which is acquired by a camera or a depth sensor.
  • the space information is, for example, sensing data such as the distance between the information presenting apparatus 10-1 and a predetermined area acquired by a camera or a depth sensor.
  • the acquired distance between the information presenting apparatus 10-1 and the predetermined area can be used for detecting the positional relationship between the information presenting apparatus 10-1 and the predetermined area.
  • the detected positional relationship between the information presenting apparatus 10-1 and the predetermined area can be used when the information presenting apparatus 10-1 moves to the vicinity of the predetermined area.
  • the space information may be, for example, sensing data such as the distance between the robot arm tip 100 and a predetermined area, which is acquired by a camera or a depth sensor.
  • the acquired distance between the robot arm tip 100 and the predetermined area can be used for detecting the positional relationship between the robot arm tip 100 and the predetermined area.
  • the detected positional relationship between the robot arm tip 100 and the predetermined area can be used for controlling the distance between the robot arm tip 100 and the predetermined area.
  • the information on the predetermined area is, for example, state information of the predetermined area.
  • the state information is, for example, sensing data such as the temperature of a predetermined area acquired by a thermo camera, a temperature sensor, or the like.
  • Further, the state information may be, for example, sensing data such as the loudness of sound emitted from the predetermined area acquired by a microphone, information on substances present in the predetermined area acquired by a chemical sensor, and the intensity of odor in the predetermined area acquired by an odor sensor.
  • the state information may be sensing information such as metal information existing in a predetermined area acquired by a metal sensor and internal information of a target acquired by a sensor capable of capturing a fluoroscopic image. The state information acquired by the various sensors described above can be used for detecting an abnormality in a predetermined area.
  • FIG. 3 is a block diagram illustrating an example of a functional configuration of the information presentation device 10-1 according to the first embodiment.
  • the information presentation device 10-1 according to the present embodiment includes an acquisition unit 110, a control unit 120-1, a storage unit 130, a presentation unit 140, a robot arm drive unit 150, and a mobile trolley drive unit 160. Is provided.
  • the acquisition unit 110 has a function of acquiring information on a predetermined area.
  • the acquisition unit 110 has at least one sensor device, and acquires information on a predetermined area by using the sensor device. Specifically, the acquiring unit 110 acquires information on a predetermined area by scanning a predetermined area with the sensor device. Then, the obtaining unit 110 outputs the obtained information on the predetermined area to the spatial processing unit 122-1 of the control unit 120-1.
  • the above-described function can be realized by, for example, a sensor device provided at the robot arm tip portion 100.
  • the method by which the acquisition unit 110 acquires information on a predetermined area is not limited to a method of acquiring information using a sensor device.
  • the acquisition unit 110 may acquire information on a predetermined area from an external device.
  • For example, the acquisition unit 110 may include a communication unit (not shown) capable of communicating with an external device, and may acquire information on the predetermined area acquired by the external device via that communication.
  • Control unit 120-1 has a function of controlling the operation of the information presentation device 10-1.
  • the control unit 120-1 includes a spatial processing unit 122-1, a projection mapping control unit 124, and a robot control unit 126, as shown in FIG.
  • the function of each component of the control unit 120-1 can be realized by, for example, a CPU, a ROM, and a RAM arranged in the main unit 106.
  • the spatial processing unit 122-1 has a function of performing processing based on input information.
  • Specifically, the spatial processing unit 122-1 performs a space recognition process, a process of detecting the predetermined area, a process of determining a scan trajectory, a process of generating drive information, and a process of detecting an abnormality, based on the information about the predetermined area input from the acquisition unit 110.
  • In the space recognition process, the spatial processing unit 122-1 recognizes the space around the information presentation device 10-1. For example, the spatial processing unit 122-1 detects the space around the information presentation device 10-1 by the space recognition process based on the spatial information input from the acquisition unit 110. Specifically, the spatial processing unit 122-1 recognizes the space by detecting, as three-dimensional coordinates, the shapes and positions of objects forming the space, such as wall surfaces, a ceiling, a floor, doors, furniture, household articles, and unknown objects. Then, the spatial processing unit 122-1 outputs the space recognition information, which is information on the space recognized in the space recognition process, to the storage unit 130 and causes the storage unit 130 to store it.
  • the position of the object forming the shape of the space detected by the space processing unit 122-1 is not limited to the three-dimensional coordinates.
  • the space recognition information stored in the storage unit 130 by the space processing unit 122-1 may include space information.
  • the spatial processing unit 122-1 may generate map information obtained by mapping information on the recognized space, output the generated map information to the storage unit 130, and store the generated map information in the storage unit 130.
  • In the present embodiment, the spatial processing unit 122-1 performs the space recognition process indoors; however, the place where the space recognition process is performed is not limited to indoors and may be outdoors.
  • In the process of detecting the predetermined area, the spatial processing unit 122-1 detects the predetermined area to be inspected for an abnormality by the information presentation device 10-1. For example, when the user specifies the predetermined area by a gesture or the like, the spatial processing unit 122-1 may detect the predetermined area based on the user's operation information and the spatial information acquired by a camera or the like. Further, for example, when the user specifies the predetermined area by voice, the spatial processing unit 122-1 may detect the predetermined area based on audio acquired by a microphone or the like and the spatial information. Further, for example, when the user does not specify the predetermined area, the spatial processing unit 122-1 may use an area determined in advance as the predetermined area, or may detect the predetermined area based on predetermined information.
  • the spatial processing unit 122-1 performs the above-described space recognition processing on the detected predetermined area to recognize the shape of the predetermined area, the shape of the object forming the predetermined area, and the like.
  • the spatial processing unit 122-1 performs a process of determining a scan trajectory that is a trajectory when the information presentation device 10-1 scans a predetermined area in the scan trajectory determination process.
  • the spatial processing unit 122-1 determines a scan trajectory based on the shape of a predetermined area, the shape of an object forming the predetermined area, and the like. For example, the spatial processing unit 122-1 determines a scan trajectory so that the information presentation device 10-1 scans a predetermined area as a whole.
  • Scanning the predetermined area as a whole by the information presentation device 10-1 is hereinafter also referred to as a normal scan.
  • the spatial processing unit 122-1 may determine the scan trajectory so that the information presentation device 10-1 partially scans a predetermined area. Further, the path, length, shape, and the like of the scan trajectory are not particularly limited.
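  • One simple way to realize such a trajectory, sketched below in hypothetical Python (the raster pattern and all parameter values are assumptions, not taken from the disclosure), is a back-and-forth raster over a rectangular region so that the area is covered as a whole.

```python
from typing import List, Tuple

def raster_scan_trajectory(width: float, height: float,
                           line_spacing: float, step: float) -> List[Tuple[float, float]]:
    """Generate a back-and-forth (boustrophedon) trajectory covering a
    width x height rectangle, sampled every `step` meters along each line."""
    points: List[Tuple[float, float]] = []
    columns = int(width / step) + 1
    rows = int(height / line_spacing) + 1
    for row in range(rows):
        y = row * line_spacing
        xs = [i * step for i in range(columns)]
        if row % 2:              # reverse every other line for a continuous path
            xs = xs[::-1]
        points.extend((x, y) for x in xs)
    return points

# Cover a 1.0 m x 0.5 m area with 10 cm line spacing and 5 cm sampling.
trajectory = raster_scan_trajectory(width=1.0, height=0.5, line_spacing=0.1, step=0.05)
print(len(trajectory), trajectory[:3])
```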
  • the spatial processing unit 122-1 performs a process of generating drive information used for controlling the drive of the information presentation device 10-1 in the drive information generation process.
  • the space processing unit 122-1 calculates the positions of the information presentation device 10-1 and the robot arm tip 100 based on the space recognition information, and generates drive information using the positions.
  • the space processing unit 122-1 generates drive information for driving the mobile trolley 14 based on the space recognition information and the calculated position of the information presentation device 10-1.
  • the spatial processing unit 122-1 outputs the generated drive information to the robot control unit 126.
  • the information presentation device 10-1 can move in a space by driving the movable cart 14 based on the drive information.
  • the space processing unit 122-1 generates drive information for driving the robot arm 12 based on the space recognition information and the position of the robot arm tip 100.
  • the information presentation device 10-1 can scan a predetermined area.
  • the space processing unit 122-1 generates drive information for driving the robot arm 12 based on the scan trajectory.
  • the information presentation device 10-1 can appropriately scan a predetermined area along the scan trajectory.
  • Further, the spatial processing unit 122-1 may generate drive information for driving the robot arm 12, based on the space recognition information and the position of the robot arm tip 100, such that the distance between the robot arm tip 100 and the predetermined area is kept constant.
  • the spatial processing unit 122-1 generates drive information such that the distance between the robot arm tip 100 and a predetermined area is kept constant at a distance set in advance by a user or the like.
  • the spatial processing unit 122-1 first calculates three-dimensional coordinates indicating the position of the robot arm tip 100 based on the information acquired by the encoder.
  • Next, the spatial processing unit 122-1 calculates the distance between the robot arm tip 100 and the predetermined area based on the three-dimensional coordinates indicating the position of the robot arm tip 100 and the three-dimensional coordinates indicating the position of the predetermined area. Then, the spatial processing unit 122-1 generates drive information for driving the robot arm 12 such that the calculated distance between the robot arm tip 100 and the predetermined area matches the distance set in advance by the user or the like.
  • Thus, the information presentation device 10-1 can scan the predetermined area while keeping the distance between the robot arm tip 100 and the predetermined area constant. Further, for example, the information presentation device 10-1 may additionally use the depth sensor provided on the robot arm tip 100 to maintain the distance between the robot arm tip 100 and the predetermined area constant more accurately.
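  • A minimal sketch of such distance keeping, assuming a simple proportional correction driven by the depth-sensor reading (the gain, speed limit, and function name are invented for illustration), is shown below in hypothetical Python.

```python
def distance_keeping_command(measured_distance: float,
                             target_distance: float,
                             gain: float = 1.5,
                             max_speed: float = 0.1) -> float:
    """Proportional correction that moves the arm tip toward the preset
    stand-off distance from the scanned surface.

    measured_distance -- depth-sensor reading from tip to surface [m]
    target_distance   -- distance preset by the user [m]
    Returns a velocity command along the tip's approach axis [m/s]
    (positive = move toward the surface).
    """
    error = measured_distance - target_distance
    command = gain * error
    # Clamp so the arm never lunges toward or away from the surface.
    return max(-max_speed, min(max_speed, command))

# The tip is 2 cm too far away, so move gently toward the surface.
print(distance_keeping_command(measured_distance=0.12, target_distance=0.10))
```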
  • the spatial processing unit 122-1 performs a process of detecting an abnormality in a predetermined area in the abnormality detection process. For example, the spatial processing unit 122-1 detects an abnormality in a predetermined area based on the state information. Specifically, when the value indicated by the state information is equal to or greater than a predetermined threshold, the space processing unit 122-1 determines that the value of the state information is abnormal, and detects an abnormality corresponding to the state information.
  • When the spatial processing unit 122-1 detects an abnormality, it associates the position information indicating the position where the abnormality is detected with the sensing data of the sensor device, outputs them to the storage unit 130, and causes the storage unit 130 to store them. Further, the spatial processing unit 122-1 outputs the position information and the sensing data of the sensor device to the projection mapping control unit 124.
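  • The threshold comparison and the recording of position plus sensing data could look like the following sketch (hypothetical Python; the sensor names and threshold values are invented for illustration).

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class AbnormalityStore:
    """Plays the role of the storage unit: keeps position and sensing data together."""
    records: List[Dict] = field(default_factory=list)

    def add(self, position: Point3D, sensor_name: str, value: float) -> None:
        self.records.append({"position": position, "sensor": sensor_name, "value": value})

def detect_abnormality(position: Point3D, sensor_name: str, value: float,
                       thresholds: Dict[str, float], store: AbnormalityStore) -> bool:
    """Flag a reading as abnormal when it reaches the sensor's threshold,
    and record the position and sensing data for later presentation."""
    threshold = thresholds.get(sensor_name)
    if threshold is not None and value >= threshold:
        store.add(position, sensor_name, value)
        return True
    return False

store = AbnormalityStore()
thresholds = {"thermo": 60.0, "ammonia_ppm": 25.0}   # hypothetical limits
detect_abnormality((0.4, 0.1, 0.8), "thermo", 72.3, thresholds, store)
print(store.records)
```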
  • the projection mapping control unit 124 has a function of performing processing related to information presentation.
  • the projection mapping control unit 124 has a function of generating projection mapping information that is information presented by the presentation unit 140.
  • the projection mapping control unit 124 generates information regarding an abnormality as projection mapping information based on the position information input from the spatial processing unit 122-1 and the sensing data of the sensor device.
  • the information on the abnormality is, for example, information indicating a range in which the abnormality is detected.
  • the information on the abnormality may be, for example, the name of the substance detected as the abnormality (for example, ammonia). Note that the information regarding the abnormality is not limited to the above example.
  • the projection mapping control unit 124 has a function of controlling the operation of the presentation unit 140.
  • the projection mapping control unit 124 controls a process in which the presenting unit 140 presents information about an abnormality.
  • the projection mapping control unit 124 causes the presentation unit 140 to present information on the abnormality on the source of the abnormality.
  • The source of the abnormality is what the spatial processing unit 122-1 has detected as abnormal, such as a heat source generating abnormal heat or a substance generating an abnormal odor.
  • the projection mapping control unit 124 may cause the presentation unit 140 to present information indicating the range in which the abnormality is detected on the source of the abnormality. This allows the user to intuitively grasp the location where the abnormality has occurred.
  • the projection mapping control unit 124 may cause the presentation unit 140 to further present information regarding the abnormality near the source of the abnormality.
  • the projection mapping control unit 124 may cause the presentation unit 140 to present the substance name of the substance detected as the source of the abnormality near the source of the abnormality.
  • the user can intuitively grasp the additional information regarding the abnormality.
  • the projection mapping control unit 124 may cause the presentation unit 140 to present a plurality of different pieces of information.
  • the acquisition unit 110 has a plurality of sensor devices, and information about a plurality of different abnormalities is detected based on information about a plurality of predetermined regions acquired by the plurality of sensor devices.
  • the projection mapping control unit 124 may cause the presentation unit 140 to present each of the pieces of information relating to the detected different abnormalities. Thereby, the user can grasp the abnormality in more detail.
  • the projection mapping control unit 124 can cause the information relating to the abnormality to be presented in various ways by controlling the presentation unit 140.
  • the projection mapping control unit 124 causes the presenting unit 140 to present information regarding the abnormality in text.
  • the projection mapping control unit 124 may cause the presentation unit 140 to present information regarding the abnormality using an icon.
  • the projection mapping control unit 124 may cause the presentation unit 140 to present information about an abnormality whose color has been changed in accordance with the detected abnormality.
  • Further, the projection mapping control unit 124 may cause the presentation unit 140 to change the intensity of light projected from a projector or the like in accordance with the information about the abnormality, or to blink the presented information about the abnormality. Thereby, the user can more intuitively grasp the information regarding the abnormality.
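  • As a rough illustration of how such projection mapping information might be assembled, the sketch below (hypothetical Python; the styling table, severity field, and color choices are invented for illustration) pairs each detected range with a color, a label, and a blink flag in the spirit of the variations described above.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class ProjectionItem:
    outline: List[Tuple[float, float]]  # polygon around the detected range (surface coordinates)
    color: Tuple[int, int, int]         # RGB chosen per abnormality type
    label: str                          # text or icon name shown near the source
    blink: bool                         # emphasize severe abnormalities

# Hypothetical per-type styling, analogous to using different colors per abnormality.
STYLE: Dict[str, Tuple[Tuple[int, int, int], bool]] = {
    "heat":     ((255, 0, 0), False),
    "ammonia":  ((0, 0, 255), False),
    "chemical": ((255, 165, 0), True),
}

def build_projection_items(detections: List[Tuple[str, List[Tuple[float, float]], float]]
                           ) -> List[ProjectionItem]:
    """Turn (type, outline, severity) detections into projection mapping items."""
    items: List[ProjectionItem] = []
    for kind, outline, severity in detections:
        color, blink_default = STYLE.get(kind, ((255, 255, 255), False))
        items.append(ProjectionItem(
            outline=outline,
            color=color,
            label=kind,
            blink=blink_default or severity > 0.8,  # blink when severity is high
        ))
    return items

items = build_projection_items([("heat", [(0.0, 0.0), (0.2, 0.0), (0.2, 0.1), (0.0, 0.1)], 0.9)])
print(items[0])
```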
  • Robot control unit 126 has a function of controlling the operation of the information presentation device 10-1. Specifically, the robot control unit 126 controls the operations of the robot arm 12 and the mobile trolley 14 of the information presentation device 10-1.
  • the robot control unit 126 can control the distance between the acquisition unit 110 and a predetermined area by operating at least one of the robot arm 12 and the movable trolley 14. For example, the robot control unit 126 controls the position of the acquisition unit 110 by operating the robot arm 12 such that the distance between the acquisition unit 110 and a predetermined area is constant. When controlling the position of the acquisition unit 110 so that the distance between the acquisition unit 110 and the predetermined area is constant, the robot control unit 126 may operate both the robot arm 12 and the mobile trolley 14. Alternatively, only the movable carriage 14 may be operated.
  • Keeping the distance between the acquisition unit 110 and the predetermined area constant means that the distance between the acquisition unit 110 and the target is kept constant.
  • the information presentation device 10-1 can scan the target without contacting the target. Further, since the information presentation apparatus 10-1 does not contact the target, the target can be scanned without affecting the target. Further, since the information presentation device 10-1 does not contact the target, the robot arm 12 can be kept clean.
  • the robot controller 126 includes a robot arm controller 1260 and a movable trolley controller 1262 for controlling operations of the robot arm 12 and the movable trolley 14 as shown in FIG.
  • Robot arm controller 1260 has a function of controlling the operation of the robot arm 12.
  • the robot arm controller 1260 controls the driving of the robot arm driving unit 150 that drives each component of the robot arm 12.
  • the robot arm controller 1260 generates a driving instruction for driving the robot arm driving unit 150 based on the driving information input from the space processing unit 122-1.
  • the robot arm controller 1260 outputs the generated driving instruction to the robot arm driving unit 150.
  • the robot arm controller 1260 can control the operation of the robot arm 12 while controlling the driving of the robot arm drive unit 150.
  • the mobile trolley controller 1262 has a function of controlling the operation of the mobile trolley 14.
  • the mobile trolley controller 1262 controls the driving of the mobile trolley drive unit 160 that drives each component of the mobile trolley 14.
  • the mobile trolley controller 1262 generates a driving instruction for driving the mobile trolley driving unit 160 based on the driving information input from the space processing unit 122-1.
  • the mobile trolley controller 1262 outputs the generated drive instruction to the mobile trolley drive unit 160.
  • the mobile trolley controller 1262 can control the operation of the mobile trolley 14 while controlling the driving of the mobile trolley drive unit 160.
  • Storage unit 130 has a function of storing data obtained by the processing in the control unit 120-1.
  • the storage unit 130 stores the space recognition information obtained in the space recognition processing by the space processing unit 122-1.
  • map information is generated by the space recognition processing by the spatial processing unit 122-1
  • the storage unit 130 may also store the map information.
  • the storage unit 130 stores the position where the abnormality is detected and the sensing data of the sensor device in association with each other.
  • the information stored in the storage unit 130 is not limited to the above example.
  • the storage unit 130 may store programs such as various applications, data, and the like.
  • the presentation unit 140 has a function of presenting information.
  • the function of the presenting unit 140 can be realized by, for example, a presenting device such as a projector or a laser device provided on the robot arm tip 100.
  • the presentation unit 140 presents the presentation information based on the projection mapping information input from the projection mapping control unit 124.
  • the presentation unit 140 presents information about an abnormality as presentation information based on the projection mapping information.
  • the presentation unit 140 may present, as the presentation information, information other than the information about the abnormality (hereinafter, also referred to as information about a non-abnormality) based on the projection mapping information.
  • the information on the non-abnormality is, for example, the material of the target. Note that the information presented by the presentation unit 140 is not limited to the above example, and any information may be presented.
  • Robot arm drive unit 150 has a function of driving the robot arm 12.
  • the robot arm driving unit 150 drives the robot arm 12 by driving based on a driving instruction input from the robot arm controller 1260.
  • the robot arm driving section 150 can be realized by, for example, an actuator included in each of the robot arm tip section 100, the joint section 102, and the link 104.
  • Mobile trolley drive unit 160 has a function of driving the moving trolley 14.
  • the mobile trolley drive unit 160 drives the mobile trolley 14 by driving based on a drive instruction input from the mobile trolley controller 1262.
  • the mobile trolley driving unit 160 can be realized by, for example, an actuator included in each of the plurality of wheels 108.
  • FIG. 4 is a flowchart illustrating an example of the flow of the main process according to the first embodiment.
  • FIG. 5 is a flowchart illustrating an example of the flow of the normal scan process according to the first embodiment.
  • the information presentation device 10-1 determines a predetermined area (S1000). After the determination of the predetermined area, the information presentation device 10-1 recognizes the three-dimensional shape of the predetermined area (S1002). Next, the information presentation device 10-1 determines a scan trajectory in a predetermined area (S1004). Next, the information presentation device 10-1 performs a normal scan process in a predetermined area according to the determined scan trajectory (S1006). The detailed process of the normal scan process will be described later.
  • the information presentation device 10-1 checks whether or not there is a record of abnormality detection (S1008). When there is no record of the abnormality detection (S1008 / NO), the information presentation device 10-1 ends the process. When there is a record of abnormality detection (S1008 / YES), the information presentation device 10-1 generates projection mapping information (S1010). Then, the information presentation device 10-1 performs a presentation process for presenting the generated projection mapping information, and ends the process (S1012).
  • the information presentation device 10-1 starts scanning a predetermined area (S2000).
  • the information presentation device 10-1 checks whether an abnormality has been detected while performing the scan (S2002). If no abnormality is detected (S2002 / NO), the information presenting device 10-1 continues scanning without performing any particular processing. If an abnormality is detected (S2002 / YES), the information presentation device 10-1 records the sensing data of the sensor device and the detection position of the abnormality in the storage unit 130 (S2004), and continues scanning.
  • the information presentation device 10-1 checks whether or not the scanning of the predetermined area has been completed (S2006). When the scanning of the predetermined area is completed (S2006 / YES), the information presentation device 10-1 ends the normal scanning process. If the scanning of the predetermined area has not been completed (S2006 / NO), the information presentation device 10-1 repeats the processing of S2002 to S2006.
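  • Read together, the two flowcharts could be summarized by the following sketch (hypothetical Python; the device object and its method names are invented stand-ins for the units described above, and the step numbers appear only as comments).

```python
def main_process(device, area_spec):
    """Mirrors the main-process flow: determine the area, recognize its shape,
    determine a scan trajectory, run the normal scan, then present results."""
    area = device.determine_area(area_spec)                   # S1000
    shape = device.recognize_shape(area)                      # S1002
    trajectory = device.determine_trajectory(shape)           # S1004
    records = normal_scan(device, trajectory)                 # S1006
    if not records:                                           # S1008 / NO
        return
    mapping = device.generate_projection_mapping(records)     # S1010
    device.present(mapping)                                   # S1012

def normal_scan(device, trajectory):
    """Mirrors the normal-scan flow: scan along the trajectory and record the
    sensing data and detection position whenever an abnormality is found."""
    records = []
    for position in trajectory:                               # S2000, until S2006
        reading = device.sense_at(position)
        if device.is_abnormal(reading):                       # S2002
            records.append((position, reading))               # S2004
    return records
```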
  • Examples: The flow of the process according to the first embodiment has been described above. Next, examples according to the first embodiment will be described with reference to FIGS. 6 to 8.
  • FIG. 6 is a diagram illustrating a first example according to the first embodiment.
  • the surface of the unknown object 20 is designated as a predetermined area (not shown) by the user.
  • a temperature sensor is provided as a sensor device.
  • The information presentation device 10-1 scans the unknown object 20 by scanning the predetermined area along the scan trajectory 40. It is assumed that the scan has detected, as shown in the middle part of FIG. 6, an abnormality in which the position 50 of the unknown object 20 is at a high temperature. In this case, as shown in the lower part of FIG. 6, the information presentation device 10-1 presents, on the unknown object 20, the presentation information 60 indicating the position and range where the temperature is high as the information regarding the abnormality. This allows the user to intuitively and easily grasp the position and range where the abnormality has occurred.
  • the example of the presentation information 60 described above is not limited to an example applied to an abnormality related to the temperature of an unknown object, and may be applied to an arbitrary abnormality.
  • FIG. 7 is a diagram illustrating a second example according to the first embodiment.
  • a user designates a part of a wall and a floor of a room as a predetermined area (not shown).
  • a chemical sensor is provided as a sensor device.
  • The information presentation device 10-1 scans the predetermined area along the scan trajectory 44 as shown in the middle part of FIG. 7. It is assumed that the scan has detected the vomit of the pet 25a and the excrement of the pet 25b as abnormalities. In this case, as shown in the lower part of FIG. 7, the information presentation device 10-1 presents the presentation information 62 on the vomit and the presentation information 64 on the excrement as the information on the abnormality.
  • the information regarding the abnormality may be information indicating a position and a range where the abnormality is detected, such as the presentation information 62a, the presentation information 64a, and the presentation information 64b.
  • the presentation information 62a indicates the position and range where the vomit exists
  • the presentation information 64a and the presentation information 64b indicate the position and range where the excrement exists.
  • the information presentation device 10-1 may express that the content of the detected abnormality is different by changing the color of the presentation information.
  • the information presentation device 10-1 may express that the content of the abnormality differs depending on the color.
  • the information presentation device 10-1 may classify the vomit into red and the excreta into blue, and express the presentation information 62a in red and the presentation information 64a and the presentation information 64b in blue.
  • Further, the information presentation device 10-1 may express the difference in the content of the abnormality by a difference in color density. More specifically, as shown in the lower part of FIG. 7, the information presentation device 10-1 makes the color of the presentation information 64a lighter than the color of the presentation information 64b, and the color of the presentation information 64b darker than the color of the presentation information 64a.
  • Thereby, the information presentation device 10-1 can express that the excrement concentration in the position and range of the presentation information 64a is lower than that in the position and range of the presentation information 64b. As described above, by changing the color of the presentation information, the information presentation device 10-1 enables the user to easily distinguish differences in the content of the abnormality based on differences in color. Note that the information presentation device 10-1 may also express the difference in color by gradation.
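  • A simple way to realize this kind of density-dependent coloring, sketched below in hypothetical Python (the base color, alpha range, and function name are assumptions), is to scale the opacity of the projected color with the measured concentration.

```python
from typing import Tuple

def concentration_to_rgba(concentration: float, max_concentration: float,
                          base_rgb: Tuple[int, int, int] = (0, 0, 255)) -> Tuple[int, int, int, int]:
    """Map a measured concentration to a projected color whose opacity grows
    with concentration, so denser abnormalities appear darker."""
    ratio = max(0.0, min(1.0, concentration / max_concentration))
    alpha = int(64 + ratio * 191)   # keep even faint detections visible
    r, g, b = base_rgb
    return (r, g, b, alpha)

print(concentration_to_rgba(5.0, 20.0))    # faint blue
print(concentration_to_rgba(18.0, 20.0))   # nearly opaque blue
```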
  • the information relating to the abnormality may be information indicating the content of the abnormality indicated by the icon, for example, the presentation information 62b. Specifically, the presentation information 62b indicates that the pet has vomited.
  • the information presentation device 10-1 may change the size of the icon. For example, the information presentation device 10-1 may increase the size of the icon in a position and range where the density of the vomit is high, and may decrease the size of the icon in a place where the density of the vomit is low. As described above, the information presentation device 10-1 presents the presentation information by the icons, so that the user can intuitively and easily grasp what kind of abnormality has occurred.
  • the information relating to the abnormality may be information indicating the content of the abnormality indicated by text, such as the presentation information 64c.
  • the presentation information 64c indicates that ammonia has been detected from the excrement.
  • the information presentation device 10-1 may change the size of the text. For example, the information presentation device 10-1 may increase the size of the text in a position and range where the concentration of ammonia is high, and may decrease the size of the text in a position where the concentration of ammonia is low.
  • the information presenting apparatus 10-1 presents the presented information by text, so that the user can intuitively and easily grasp detailed information on the abnormality.
  • As described above, by changing the display method of the presentation information according to the information regarding the abnormality, the information presentation device 10-1 can present additional information such as the range in which the abnormality is detected, the type of the abnormality, and the degree of the abnormality in various ways.
  • the user can examine a processing method for the abnormality based on the presentation content.
  • the example of the presentation information 62 and the presentation information 64 described above is not limited to the example applied to the abnormality related to the vomit and excrement of the pet, and may be applied to any abnormality.
  • FIG. 8 is a diagram illustrating a third example according to the first embodiment.
  • In this example, parts of the surfaces of the plurality of tanks 22 and a part of the floor are designated as predetermined areas (not shown) by the user.
  • a chemical sensor is provided as a sensor device.
  • the surface of the tank 22b has a crack 24a
  • the surface of the tank 22c has a crack 24b.
  • The information presentation device 10-1 scans the predetermined area, which consists of parts of the surfaces of the tanks 22a to 22c and a part of the floor, along the scan trajectories 42a to 42d. It is assumed that, by this scan, the information presentation device 10-1 has detected as an abnormality that the medicine leaks from the cracks 24a and 24b at the positions 52a and 52b shown in the middle part of FIG. 8, and has also detected as an abnormality the presence of the leaked medicine at the position 52c.
  • In this case, as shown in the lower part of FIG. 8, the information presentation device 10-1 presents, on the medicine, the presentation information 66a to 66c indicating the positions and ranges of the medicine leakage as information relating to the abnormality.
  • the information presenting device 10-1 presents the detected position and range of the abnormality, so that the user can easily and intuitively grasp the position and range where the abnormality has occurred.
  • the information on the abnormality may be information indicating the contents of the abnormality indicated by text, such as the presentation information 67a and 67b.
  • the presentation information 67a indicates that the medicine leaked from the crack 24a is the medicine B.
  • the presentation information 67b indicates that the medicine leaked from the crack 24b is the medicine C.
  • the information presenting apparatus 10-1 presents the presented information by text, so that the user can intuitively and easily grasp detailed information on the abnormality.
  • the information presentation device 10-1 may express that the content of the detected abnormality is different by changing the color of the presentation information.
  • the information presentation device 10-1 may express that the content of the abnormality differs depending on the color.
  • Specifically, the information presentation device 10-1 may assign red to the medicine B and blue to the medicine C, expressing the presentation information 67a in red and the presentation information 67b in blue.
  • the information presentation device 10-1 changes the color of the presentation information, so that the user can easily distinguish the difference in the content of the abnormality based on the difference in the color.
  • Note that the examples of the presentation information 66 and the presentation information 67 described above are not limited to an abnormality related to leakage of a medicine from a tank, and may be applied to any abnormality.
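As one way to picture the display-method changes described in the examples above (text size scaled to the detected concentration, color keyed to the detected substance), the following minimal Python sketch maps hypothetical sensing results to projection parameters. The substance names, the full-scale concentration, and the color values are illustrative assumptions, not values taken from the disclosure.

```python
# Minimal sketch: map hypothetical sensing results to projection parameters.
# Substance names, the full-scale concentration, and colors are illustrative assumptions.

SUBSTANCE_COLORS = {
    "medicine B": (255, 0, 0),   # red
    "medicine C": (0, 0, 255),   # blue
}

def text_size_from_concentration(concentration_ppm: float,
                                 min_size: int = 12,
                                 max_size: int = 48,
                                 full_scale_ppm: float = 100.0) -> int:
    """Use larger text where the detected concentration is higher."""
    ratio = max(0.0, min(concentration_ppm / full_scale_ppm, 1.0))
    return int(min_size + ratio * (max_size - min_size))

def presentation_style(substance: str, concentration_ppm: float) -> dict:
    """Return the color and text size used when projecting onto the detection position."""
    return {
        "color_rgb": SUBSTANCE_COLORS.get(substance, (255, 255, 0)),  # fallback: yellow
        "text_size_pt": text_size_from_concentration(concentration_ppm),
        "label": substance,
    }

if __name__ == "__main__":
    print(presentation_style("medicine C", concentration_ppm=80.0))
```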
  • <<3. Second Embodiment>> The first embodiment has been described above. Next, a second embodiment according to the present disclosure will be described. According to the second embodiment, when an abnormality is detected in a predetermined area by the normal scan, the information presentation device 10-2 according to the second embodiment scans the vicinity of the position of the detected abnormality in more detail. Hereinafter, this more detailed scan of the vicinity of the detected abnormality by the information presentation device 10-2 is also referred to as a detailed scan. By performing the detailed scan, the information presentation device 10-2 can acquire more detailed information about the abnormality and can present more detailed information about the abnormality.
  • the second embodiment will be described in detail, but the description of the same contents as the first embodiment will be omitted.
  • Unlike the information presentation device 10-1 according to the first embodiment, the information presentation device 10-2 according to the second embodiment has a function of performing the detailed scan. Therefore, the information presentation device 10-2 includes a control unit 120-2 whose function partially differs from that of the control unit 120-1 of the information presentation device 10-1. Specifically, the information presentation device 10-2 includes a spatial processing unit 122-2 whose function partially differs from that of the spatial processing unit 122-1 of the control unit 120-1.
  • (Control unit 120-2) When an abnormality is detected in the predetermined area, the control unit 120-2 causes the acquisition unit 110 to acquire information about the predetermined area near the position where the abnormality was detected in more detail than before the abnormality was detected. First, when an abnormality is detected in the predetermined area by the normal scan, the control unit 120-2 determines a predetermined area in which the detailed scan is to be performed. Next, the control unit 120-2 determines a scan trajectory in that predetermined area so that the acquisition unit 110 can scan in more detail than during the normal scan. Then, the control unit 120-2 causes the acquisition unit 110 to scan along the determined scan trajectory, thereby causing the acquisition unit 110 to acquire more detailed information about the predetermined area than before the detection of the abnormality.
  • Thereby, the information presentation device 10-2 can acquire more detailed information about the predetermined area than when only the normal scan is performed, and can present more detailed information about the abnormality based on that information.
  • As described above, the control unit 120-2 includes a spatial processing unit 122-2 having a function partially different from that of the spatial processing unit 122-1 of the control unit 120-1.
  • In order for the information presentation device 10-2 to perform the detailed scan, the spatial processing unit 122-2 performs processing different from that of the spatial processing unit 122-1 in the process of detecting the predetermined area and the process of determining the scan trajectory.
  • (Predetermined area detection processing) When an abnormality is detected in the predetermined area by the normal scan, the spatial processing unit 122-2 detects the predetermined area in which the detailed scan is to be performed, based on the position information stored in the storage unit 130 that indicates the position where the abnormality was detected. Note that the spatial processing unit 122-2 may set the predetermined area in which the detailed scan is performed to a position designated by the user. Further, the spatial processing unit 122-2 may additionally perform a drive information generation process described later.
  • In addition, the spatial processing unit 122-2 may set the predetermined area subjected to the normal scan as the target of the detailed scan. In that case, the acquisition unit 110 may be caused to perform the detailed scan not only of the vicinity of the position where the abnormality was detected but of the entire predetermined region.
  • the spatial processing unit 122-2 determines a scan trajectory so that the acquisition unit 110 can scan in more detail than during a normal scan. For example, the spatial processing unit 122-2 makes the interval between the scan trajectories during the detailed scan narrower than the interval between the scan trajectories during the normal scan. Accordingly, the acquisition unit 110 can scan the predetermined area more finely, and can acquire more detailed information on the predetermined area.
  • Further, the spatial processing unit 122-2 may generate drive information for controlling the position of the acquisition unit 110 so that the acquisition unit 110 can perform more detailed scanning than during the normal scan. For example, the spatial processing unit 122-2 generates the drive information such that, during the detailed scan, the acquisition unit 110 is positioned closer to the predetermined area than it is during the normal scan.
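To make the trajectory-determination step concrete, the sketch below generates a simple raster (boustrophedon) path over a rectangular region; the only differences between the normal and detailed scans are the line spacing and the standoff distance of the acquisition unit, mirroring the behavior described above. The function name, the region bounds, and the numeric values are assumptions made for illustration only.

```python
from typing import List, Tuple

def raster_scan_trajectory(x_min: float, x_max: float,
                           y_min: float, y_max: float,
                           line_spacing: float) -> List[Tuple[float, float]]:
    """Waypoints of a back-and-forth (boustrophedon) path covering a rectangle."""
    waypoints = []
    y = y_min
    left_to_right = True
    while y <= y_max:
        x_start, x_end = (x_min, x_max) if left_to_right else (x_max, x_min)
        waypoints.append((x_start, y))
        waypoints.append((x_end, y))
        left_to_right = not left_to_right
        y += line_spacing
    return waypoints

# Normal scan: coarse spacing and a larger standoff from the surface (illustrative values).
normal_path = raster_scan_trajectory(0.0, 1.0, 0.0, 1.0, line_spacing=0.10)
normal_standoff_m = 0.30

# Detailed scan around a detected abnormality: narrower spacing and a closer standoff.
detailed_path = raster_scan_trajectory(0.40, 0.60, 0.40, 0.60, line_spacing=0.02)
detailed_standoff_m = 0.10
```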
  • FIG. 9 is a flowchart illustrating an example of the flow of a main process according to the second embodiment.
  • FIG. 10 is a flowchart illustrating an example of the flow of the detailed scan process according to the second embodiment.
  • The main processing according to the second embodiment will be described with reference to FIG. 9. As shown in FIG. 9, the main processing according to the second embodiment differs from the main processing according to the first embodiment in that the detailed scan processing (S1014) is performed when the detection of an abnormality is recorded (S1008 / YES). Processing other than the detailed scan processing is as described in <2-4. Flow of processing>, and its description is therefore omitted.
  • the information presentation device 10-2 determines a scan trajectory in the vicinity of the abnormality detection position (S3000). Next, the information presentation device 10-2 starts scanning near the abnormality detection position (S3002). The information presentation device 10-2 checks whether an abnormality has been detected while performing the scan (S3004). If no abnormality is detected (S3004 / NO), the information presenting device 10-2 continues scanning without performing any processing. When an abnormality is detected (S3004 / YES), the information presentation device 10-2 records the sensing data of the sensor device and the detection position of the abnormality in the storage unit 130 (S3006), and continues scanning.
  • the information presentation device 10-2 checks whether the scan near the abnormality detection position has been completed (S3008). When the scan near the abnormality detection position is completed (S3008 / YES), the information presentation device 10-2 ends the detailed scan process. If the scan near the abnormality detection position has not been completed (S3008 / NO), the information presentation device 10-2 repeats the processing of S3004 to S3008.
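Read as pseudocode, the detailed scan flow of FIG. 10 amounts to the loop sketched below. The sensor, storage, and trajectory objects and the is_abnormal predicate are hypothetical stand-ins for the acquisition unit 110, the storage unit 130, the trajectory determined in S3000, and the abnormality decision; they are not interfaces defined in the disclosure.

```python
def detailed_scan(sensor, storage, trajectory, is_abnormal):
    """Sketch of the detailed scan flow (S3000-S3008), under assumed interfaces:
    sensor.read_at(pos) returns sensing data, storage.record(...) persists it,
    and is_abnormal(data) implements the abnormality decision."""
    for position in trajectory:        # S3002: scan along the trajectory near the detection position
        data = sensor.read_at(position)
        if is_abnormal(data):          # S3004: abnormality detected?
            storage.record(position=position, sensing_data=data)  # S3006: record data and position
        # otherwise continue scanning without any processing
    # S3008: trajectory exhausted, so the detailed scan process ends
```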
  • FIG. 11 is a diagram illustrating an example according to the second embodiment.
  • In this example, the information presentation device 10-2 performs the normal scan on the tanks 22a to 22c, an abnormality is detected as a result at the same positions as the positions 52a to 52c shown in FIG. 8, and a detailed scan is then performed in the vicinity of those positions.
  • it is assumed that a chemical sensor is provided as a sensor device.
  • the information presentation device 10-2 determines the predetermined area 36a and the predetermined area 36b based on the position information of the position where the abnormality is detected. Next, the information presentation device 10-2 determines the scan trajectory in the predetermined area 36a as the scan trajectory 46a and the scan trajectory in the predetermined area 36b as the scan trajectory 46b.
  • After determining the scan trajectories, the information presentation device 10-2 performs the detailed scan of the predetermined area 36a and the predetermined area 36b along the scan trajectory 46a and the scan trajectory 46b. It is assumed that, by this scan, the information presentation device 10-2 detects that the medicine leaked from the cracks 24a and 24b is present at the positions 54a and 54b, as shown in the middle part of FIG. 11. In this case, as shown in the lower diagram of FIG. 11, the information presentation device 10-2 presents, on the leaked medicine, the presentation information 68a and 68b indicating the positions and ranges where the leaked medicine exists, as information relating to the abnormality. Thereby, the user can grasp the position and range where the abnormality has occurred in more detail than when only the normal scan is performed.
  • FIG. 12 is an explanatory diagram illustrating a first modified example according to each embodiment.
  • In the first modification, a case where the presentation information presented by the information presentation device 10 cannot be presented normally is considered.
  • For example, if an object is present on the straight line between the output unit of the presentation unit 140 and the detection position, the presentation information output from the output unit is blocked by the object, and the information about the abnormality is not presented at the detection position.
  • For example, while the user is performing a process for the abnormality, the user may move onto that straight line as the work proceeds, and a moving object other than the user, such as a person, an animal, or a robot, may also move onto that straight line.
  • If the presentation information is not presented normally while the user is performing a process for the abnormality based on it, the user may be unable to grasp the detection position of the abnormality and may be harmed.
  • the information presentation device 10 may control the operation so that the presentation information is not blocked by the object.
  • the control unit 120 of the information presenting apparatus 10 causes the presenting unit 140 to present information about the abnormality so as to avoid the user.
  • In the example of FIG. 12, the worker 28 is performing a processing operation for the abnormality at the position indicated by the presentation information 60 presented on the unknown object 20.
  • The information presentation device 10 detects the movement of the worker 28 during the processing work and, when it determines that the presentation information 60 would be blocked by the worker 28, moves so as to avoid the worker 28, as illustrated in the lower diagram of FIG. 12, so that the presentation information 60 is not obstructed.
  • the information presentation device 10 keeps presenting the presentation information 60 even while moving, so that the worker 28 does not lose sight of the abnormality detection position.
  • the information presenting apparatus 10 can reduce the possibility of harm to the user because the user does not lose sight of the detection position of the abnormality during the processing operation for the abnormality.
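One way to realize this avoidance behavior is to test whether a tracked person lies close to the straight line from the output unit to the detection position and, if so, to select another projection origin. The geometry below is a hedged sketch in Python; person tracking and the generation of candidate projector poses are assumed to exist elsewhere in the system and are not described in the disclosure in this form.

```python
import math

def point_to_segment_distance(p, a, b):
    """Distance from point p to the segment a-b, all given as (x, y) in the horizontal plane."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    abx, aby = bx - ax, by - ay
    t = ((px - ax) * abx + (py - ay) * aby) / max(abx * abx + aby * aby, 1e-9)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def is_projection_blocked(projector_xy, target_xy, person_xy, clearance=0.4):
    """True if the person stands within `clearance` meters of the projection line."""
    return point_to_segment_distance(person_xy, projector_xy, target_xy) < clearance

def choose_unblocked_pose(candidate_poses, target_xy, person_xy):
    """Pick the first candidate projector position whose line of sight to the target is clear."""
    for pose in candidate_poses:
        if not is_projection_blocked(pose, target_xy, person_xy):
            return pose
    return None  # no clear pose found; keep the current pose and wait
```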
  • FIG. 13 is an explanatory diagram illustrating a second modified example according to each embodiment.
  • the information presenting apparatus 10 may further have a function of simultaneously presenting information regarding non-abnormalities.
  • the function can be realized by providing the acquisition unit 110 with a plurality of sensor devices and acquiring information on a plurality of predetermined regions by the plurality of sensor devices.
  • the acquisition unit 110 is provided with a temperature sensor and a metal sensor.
  • The information presentation device 10 scans the unknown object 20 along the scan trajectory 40, as shown in the upper part of FIG. 13.
  • The information presentation device 10 then presents the presentation information 60 indicating the position and range of heat generation on the unknown object 20 as information relating to the abnormality, as shown in the lower diagram of FIG. 13.
  • the information presentation apparatus 10 may present "heat generation" in the vicinity of the presentation information 60 by text as in the case of the presentation information 72a.
  • the information presenting apparatus 10 presents the presentation information 70 indicating the region where the material of the unknown object 20 is metal on the unknown object 20 as information relating to non-abnormality.
  • the information presentation device 10 may present “metal” by text near the presentation information 70 like the presentation information 72b.
  • the user can comprehensively determine the state of the predetermined area based on the information regarding the abnormality and the information regarding the non-abnormality.
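A possible way to organize this simultaneous presentation is to keep the output of each sensor device as a separate layer and to render the abnormal and non-abnormal layers together onto the target. The data structure below is only an assumed sketch; the field names and the two example sensors are illustrative and do not appear in the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PresentationLayer:
    label: str                               # e.g. "heat generation" or "metal"
    region: List[Tuple[float, float]]        # polygon outline on the target surface
    is_abnormal: bool                        # True for abnormality info, False otherwise

def build_layers(thermo_hotspots, metal_regions) -> List[PresentationLayer]:
    """Combine temperature-sensor and metal-sensor results into one list of layers."""
    layers = [PresentationLayer("heat generation", region, True) for region in thermo_hotspots]
    layers += [PresentationLayer("metal", region, False) for region in metal_regions]
    return layers
```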
  • FIG. 14 is an explanatory diagram illustrating a third modification example according to each embodiment. In the third modification, it is assumed that a chemical sensor is provided as a sensor device.
  • the information on the abnormality presented by the information presentation device 10 may be, for example, information indicating a warning to the user.
  • the information presentation device 10 may present presentation information 74 that is a warning line as information indicating a warning.
  • As in the example described in the second embodiment, the information presentation device 10 presents the presentation information 68a and the presentation information 68b based on the result of scanning the tanks 22a to 22c.
  • In the third modification, the information presentation device 10 presents the presentation information 74 indicating the warning line in addition to the presentation information 68a and 68b.
  • the information presenting device 10 may further present the reason why the warning line is presented near the warning line by text. For example, like the presentation information 76 shown in FIG. 14, the information presentation device 10 presents “drugs: entry warning” by text. Thus, the user can understand that the warning is issued so as not to enter the warning line due to the leakage of the powerful drug from the tank 22b and the tank 22c.
  • Further, the information presentation device 10 may present, by text, that the medicine leaking from the tank 22b is the medicine B and that the medicine leaking from the tank 22c is the medicine C. At this time, the information presentation device 10 may express that the properties of the detected medicines differ by presenting the respective pieces of presentation information with different text sizes. For example, by making the text size of the presentation information 74b larger than that of the presentation information 74a, it may be expressed that the medicine C is more dangerous than the medicine B.
  • FIG. 15 is a block diagram illustrating a hardware configuration example of the information presentation device 10 according to each embodiment. Note that the information presentation device 10 illustrated in FIG. 15 can realize, for example, the information presentation device 10 illustrated in FIGS. 1 and 3. Information processing by the information presentation device according to the present embodiment is realized by cooperation between software and hardware described below.
  • the information presentation device 10 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903.
  • the information presentation device 10 includes a host bus 904a, a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913.
  • the hardware configuration shown here is an example, and some of the components may be omitted. Further, the hardware configuration may further include components other than the components shown here.
  • the CPU 901 functions as, for example, an arithmetic processing device or a control device, and controls the overall operation of each component or a part thereof based on various programs recorded in the ROM 902, the RAM 903, or the storage device 908.
  • the ROM 902 is a unit that stores a program read by the CPU 901, data used for calculation, and the like.
  • the RAM 903 temporarily or permanently stores, for example, a program read by the CPU 901 and various parameters that appropriately change when the program is executed. These are interconnected by a host bus 904a composed of a CPU bus and the like.
  • the CPU 901, the ROM 902, and the RAM 903 can realize the function of the control unit 120 described with reference to FIG. 3 in cooperation with software, for example.
  • the CPU 901, the ROM 902, and the RAM 903 are mutually connected via, for example, a host bus 904a capable of high-speed data transmission.
  • the host bus 904a is connected, for example, via a bridge 904 to an external bus 904b having a relatively low data transmission speed.
  • the external bus 904b is connected to various components via an interface 905.
  • the input device 906 is realized by a device to which information is input by a user, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever.
  • the input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an externally connected device, such as a mobile phone or a PDA, that supports the operation of the information presentation device 10.
  • the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by a user using the above-described input unit and outputs the input signal to the CPU 901. By operating the input device 906, the user of the information presentation device 10 can input various data to the information presentation device 10 or instruct a processing operation.
  • the input device 906 may be formed by a device that detects information about the user.
  • The input device 906 may include various sensors, such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measurement sensor (for example, a ToF (Time of Flight) sensor), and a force sensor.
  • The input device 906 may acquire information on the state of the information presentation device 10 itself, such as the posture and moving speed of the information presentation device 10, and information on the surrounding environment of the information presentation device 10, such as the brightness and noise around the information presentation device 10.
  • The input device 906 may also include a GNSS (Global Navigation Satellite System) module that receives a GNSS signal from a GNSS satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the device.
  • the input device 906 may be a device that detects a position by Wi-Fi (registered trademark), transmission / reception with a mobile phone / PHS / smartphone, or the like, or short-range communication.
  • the input device 906 can realize, for example, the function of the acquisition unit 110 described with reference to FIG.
  • the output device 907 is formed of a device that can visually or audibly notify the user of the acquired information. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps, audio output devices such as speakers and headphones, and printer devices.
  • the output device 907 outputs, for example, results obtained by various processes performed by the information presentation device 10. Specifically, the display device visually displays the results obtained by various processes performed by the information presentation device 10 in various formats such as text, images, tables, and graphs.
  • the audio output device converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it audibly.
  • the output device 907 can realize, for example, the function of the presentation unit 140 described with reference to FIG.
  • the storage device 908 is a data storage device formed as an example of a storage unit of the information presentation device 10.
  • the storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
  • the storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like.
  • the storage device 908 stores programs executed by the CPU 901 and various data, various data acquired from the outside, and the like.
  • the storage device 908 can realize, for example, the function of the storage unit 130 described with reference to FIG.
  • the drive 909 is a reader / writer for a storage medium, and is built in or external to the information presentation device 10.
  • the drive 909 reads information recorded on a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903.
  • the drive 909 can also write information on a removable storage medium.
  • connection port 911 is, for example, a port for connecting an external connection device such as a USB (Universal Serial Bus) port, an IEEE1394 port, a SCSI (Small Computer System Interface), an RS-232C port, or an optical audio terminal.
  • the communication device 913 is, for example, a communication interface formed by a communication device or the like for connecting to the network 920.
  • the communication device 913 is, for example, a communication card for a wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication.
  • the communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices in accordance with a predetermined protocol such as TCP / IP.
  • the network 920 is a wired or wireless transmission path for information transmitted from a device connected to the network 920.
  • the network 920 may include a public network such as the Internet, a telephone network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), and the like.
  • the network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
  • As described above, the information presentation device 10 acquires information about a predetermined area and presents information about an abnormality in the predetermined area, which is detected based on the information about the predetermined area, at the position where the abnormality was detected. Thereby, the information presentation device 10 can present the information about the abnormality on the source of the abnormality.
  • each device described in this specification may be realized as a single device, or some or all of the devices may be realized as separate devices.
  • the information presentation device 10 may be realized as a single device including the robot arm 12, the moving trolley 14, the control unit 120, and the storage unit 130 illustrated in FIGS. 1 and 3.
  • the control unit 120 illustrated in FIG. 3 may be realized as a server device connected to the information presentation device 10 via a network or the like.
  • the server device may include the storage unit 130 illustrated in FIG. 3.
  • a series of processes by each device described in this specification may be realized using any of software, hardware, and a combination of software and hardware.
  • a program constituting the software is stored in advance in a recording medium (non-transitory medium) provided inside or outside each device, for example.
  • Each program is read into the RAM at the time of execution by a computer, for example, and executed by a processor such as a CPU.
  • (1) An information presentation device including: an acquisition unit that acquires information about a predetermined area; a presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and a control unit that causes the presentation unit to present the information about the abnormality at a position where the abnormality is detected.
  • (2) The information presentation device according to (1), wherein the control unit causes the presentation unit to present the information about the abnormality on a source of the abnormality.
  • the information presentation device according to any one of (1) to (3), wherein the acquisition unit includes at least one sensor device, and acquires information on the predetermined area by the sensor device.
  • The information presentation device according to any one of (1) to (4), wherein the acquisition unit has a plurality of sensor devices, and, when information about a plurality of different abnormalities is detected based on information about a plurality of predetermined regions acquired by the plurality of sensor devices, the control unit causes the presentation unit to present each piece of the information about the plurality of different abnormalities.
  • the control unit controls a distance between the acquisition unit and the predetermined area.
  • the information presentation device controls a position of the acquisition unit such that a distance between the acquisition unit and the predetermined area is constant.
  • the predetermined area includes a target surface designated by a user.
  • the predetermined area includes an area specified by a user.
  • The information presentation device according to any one of (1) to (9), wherein, when the abnormality is detected in the predetermined area, the control unit causes the acquisition unit to acquire information about the predetermined area in the vicinity of the position where the abnormality was detected in more detail than before the detection of the abnormality.
  • the information presentation device according to any one of (1) to (15), wherein the information regarding the abnormality is information indicating a range in which the abnormality is detected.
  • the information presentation device according to any one of (1) to (16), wherein the information on the abnormality is a name of a substance detected as the abnormality.
  • the information regarding the abnormality is information indicating a warning to a user.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

An information presentation device (10) comprises: an acquisition unit (110) that acquires information pertaining to a prescribed region; a presenting unit (140) that presents information pertaining to an abnormality in the prescribed region, the abnormality detected on the basis of the information pertaining to the prescribed region; and a control unit (120) that causes the presenting unit (140) to present the information pertaining to the abnormality to the position where the abnormality was detected.

Description

Information presentation device, information presentation method, and program
The present disclosure relates to an information presentation device, an information presentation method, and a program.

Conventionally, various devices have been developed that detect surrounding abnormalities and present information on the detected abnormalities.

For example, Patent Literature 1 below describes a mobile robot that includes a sensor device for detecting an abnormality of an inspection target such as a pipe, converts data detected by the sensor device into image data, and causes a display device to display the image data.

JP-A-6-198586

However, according to the technology described in Patent Literature 1, the data related to the abnormality detected by the sensor device is only displayed on the display device and is not displayed directly on the source of the abnormality. Therefore, when the user handles the abnormality at the site where it has occurred, the user may not be able to intuitively understand the position of the abnormality source, the situation around the abnormality source, and the like.

Therefore, the present disclosure proposes a new and improved information presentation device, information presentation method, and program capable of presenting information about an abnormality so that the user can intuitively and easily understand it.

According to the present disclosure, there is provided an information presentation device including: an acquisition unit that acquires information about a predetermined area; a presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and a control unit that causes the presentation unit to present the information about the abnormality at a position where the abnormality is detected.

According to the present disclosure, there is also provided an information presentation method executed by a processor, the method including: acquiring information about a predetermined area; presenting information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and causing the information about the abnormality to be presented at a position where the abnormality is detected.

Further, according to the present disclosure, there is provided a program for causing a computer to function as: an acquisition unit that acquires information about a predetermined area; a presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area; and a control unit that causes the presentation unit to present the information about the abnormality at a position where the abnormality is detected.

As described above, according to the present disclosure, it is possible to present information about an abnormality so that the user can intuitively and easily understand it. Note that the above effect is not necessarily limiting, and any of the effects shown in this specification, or other effects that can be grasped from this specification, may be exhibited together with or in place of the above effect.
FIG. 1 is a diagram illustrating an example of a physical configuration of an information presentation device according to each embodiment of the present disclosure.
FIG. 2 is a diagram illustrating a specific example of a predetermined area according to each embodiment.
FIG. 3 is a block diagram illustrating an example of a functional configuration of the information presentation device according to each embodiment.
FIG. 4 is a flowchart illustrating the flow of main processing according to the first embodiment.
FIG. 5 is a flowchart illustrating the flow of normal scan processing according to the same embodiment.
FIG. 6 is a diagram illustrating a first example according to the same embodiment.
FIG. 7 is a diagram illustrating a second example according to the same embodiment.
FIG. 8 is a diagram illustrating a third example according to the same embodiment.
FIG. 9 is a flowchart illustrating the flow of main processing according to the second embodiment.
FIG. 10 is a flowchart illustrating the flow of detailed scan processing according to the same embodiment.
FIG. 11 is a diagram illustrating an example according to the same embodiment.
FIG. 12 is a diagram illustrating a first modification according to each embodiment.
FIG. 13 is a diagram illustrating a second modification according to each embodiment.
FIG. 14 is a diagram illustrating a third modification according to each embodiment.
FIG. 15 is a block diagram illustrating a hardware configuration example of the information presentation device according to each embodiment.
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the accompanying drawings. In this specification and the drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.

In this specification and the drawings, a plurality of components having substantially the same functional configuration may also be distinguished by appending different letters to the same reference numeral. For example, a plurality of components having substantially the same functional configuration are distinguished as necessary, such as the joint 102a and the joint 102b. However, when it is not necessary to particularly distinguish each of a plurality of components having substantially the same functional configuration, only the same reference numeral is used. For example, when it is not necessary to particularly distinguish the joint 102a and the joint 102b, they are simply referred to as the joint 102.

The description will be made in the following order.
1. Overview
2. First embodiment
3. Second embodiment
4. Modification
5. Hardware configuration example
6. Conclusion
<<1. Overview>>
The present disclosure may be implemented in various forms, as described in detail in "2. First Embodiment" and "3. Second Embodiment" as examples.

Conventionally, various devices have been developed that detect surrounding abnormalities and present information on the detected abnormalities. For example, a device has been developed that includes a sensor device for detecting an abnormality of an inspection target such as a pipe, converts data detected by the sensor device into image data, and displays the image data on a display device. However, such a device only causes the display device to display the data relating to the abnormality detected by the sensor device, and does not display the data directly on the source of the abnormality. Therefore, when the user handles the abnormality at the site where it has occurred, the user may not be able to intuitively understand the position of the abnormality source, the situation around the abnormality source, and the like.

In view of the above circumstances, the information presentation device 10 according to each embodiment has been created. The information presentation device 10 according to each embodiment acquires information about a predetermined area and presents information about an abnormality in the predetermined area, which is detected based on the information about the predetermined area. Furthermore, the information presentation device 10 presents the information about the abnormality at the position where the abnormality was detected. Therefore, the information presentation device 10 can present the information about the abnormality so that the user can intuitively and easily understand it. Hereinafter, each embodiment will be described in detail in order.
<<2. First Embodiment>>
<2-1. Physical configuration of the information presentation device>
First, the first embodiment according to the present disclosure will be described, beginning with the physical configuration of the information presentation device 10-1 according to the first embodiment, with reference to FIG. 1. FIG. 1 is a diagram illustrating an example of the physical configuration of the information presentation device 10-1 according to the first embodiment. As shown in FIG. 1, the information presentation device 10-1 according to the present embodiment includes a robot arm 12 and a movable cart 14.

In the present embodiment, the information presentation device 10-1, by including the movable cart 14, may be a device (machine) that can move autonomously using an electric and/or magnetic action. The information presentation device 10-1 may be, for example, a robot that can autonomously move on the ground or in the air. However, the present embodiment is not limited to such an example, and the information presentation device 10-1 may be any machine (device) that can operate autonomously using an electric and/or magnetic action, or another general mobile device. For example, the information presentation device 10-1 may be another type of robot (for example, a humanoid robot or a drone), a vehicle (for example, a car, a ship, or a flying object), any of various industrial machines, or a toy.

The information presentation device 10-1 may also be a device (machine) that can simultaneously perform rotational motion and translational motion (hereinafter sometimes referred to as rotational translational motion). For example, the information presentation device 10-1 may be an omnidirectionally movable body. Specific examples of rotational translational motion include the information presentation device 10-1 moving while staring at a certain point, the information presentation device 10-1 moving while continuously rotating and sensing the surrounding environment, and the information presentation device 10-1 tracking in real time another moving body (for example, a person or a car) moving in an arbitrary direction.
(1) Robot arm 12
The robot arm 12 includes, for example, a robot arm tip 100, a plurality of joints 102, and a plurality of links 104. As shown in FIG. 1, the robot arm 12 has, for example, one robot arm tip 100, four joints 102a to 102d, and three links 104a to 104c. The four joints 102a to 102d connect the robot arm tip 100, the three links 104a to 104c, and the movable cart 14. Specifically, as shown in FIG. 1, the joint 102a connects the robot arm tip 100 and the link 104a, the joint 102b connects the link 104a and the link 104b, the joint 102c connects the link 104b and the link 104c, and the joint 102d connects the link 104c and the movable cart 14. Note that the shapes of the robot arm 12, the robot arm tip 100, the joints 102, and the links 104 are not limited to the shapes shown in FIG. 1 and may be arbitrary shapes.
The robot arm 12 may include, for example, a motion sensor for measuring the motion of each joint 102 at a position corresponding to that joint 102. An example of such a motion sensor is an encoder. The encoder can acquire the position of each joint 102 as three-dimensional coordinates, and the information presentation device 10-1 can calculate the position of the robot arm tip 100 as three-dimensional coordinates based on the three-dimensional coordinates of the joints 102. The robot arm 12 may also include, for example, a drive mechanism for driving each joint 102 at a position corresponding to that joint 102. The drive mechanism includes, for example, a motor and a driver, and can be controlled by the control unit 120-1 described later.
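As a concrete picture of how the tip position can be obtained from joint measurements, the following is a simplified planar forward-kinematics sketch in Python: joint angles are accumulated and the links are walked to produce every joint position and the tip position. The actual arm described here is spatial and has eight degrees of freedom, so this two-dimensional version, with assumed link lengths and angles, only illustrates the idea.

```python
import math

def planar_forward_kinematics(joint_angles_rad, link_lengths_m, base_xy=(0.0, 0.0)):
    """Simplified planar forward kinematics: accumulate each joint angle and walk
    along the links to obtain every joint position and the arm-tip position."""
    x, y = base_xy
    theta = 0.0
    positions = [(x, y)]
    for angle, length in zip(joint_angles_rad, link_lengths_m):
        theta += angle
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        positions.append((x, y))
    return positions  # the last element approximates the robot arm tip position

# Example with assumed joint angles (rad) and link lengths (m).
tip_position = planar_forward_kinematics([0.3, -0.4, 0.2, 0.1],
                                          [0.25, 0.25, 0.20, 0.10])[-1]
```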
The joints 102 may have a function of rotatably connecting the robot arm tip 100 and the plurality of links 104 to each other. For example, the joints 102a to 102d rotatably connect the robot arm tip 100 and the links 104a to 104c to each other, and the operation of the robot arm 12 is controlled by rotating the joints 102a to 102d. Here, in the following description, the position of each component of the robot arm 12 means its position (coordinates) in the space defined for operation control, and the posture of each component means its orientation (angle) with respect to an arbitrary axis in that space. In the following description, the operation (or operation control) of the robot arm 12 means that the positions and postures of the components of the robot arm 12 are changed (the changes are controlled) by operating (or controlling the operation of) the joints 102a to 102d.

Each of the robot arm tip 100, the joints 102, and the links 104 may have a movable part that rotates about a predetermined rotation axis. For example, each of them has an actuator (for example, a motor) and rotates about a predetermined rotation axis when the actuator is driven. By controlling the rotation of each of the robot arm tip 100, the joints 102, and the links 104, operations of the robot arm 12 such as extending, contracting (folding), and twisting are controlled. Note that the robot arm 12 according to the present embodiment has movable parts in each of the one robot arm tip 100, the four joints 102a to 102d, and the three links 104a to 104c, and therefore has eight degrees of freedom (DoF: Degrees of Freedom). Since this exceeds the six degrees of freedom (three for position plus three for posture) generally required to control the position and posture of a robot arm, the robot arm 12 is a robot arm with redundant degrees of freedom.

In the above example, the robot arm 12 has four joints 102, four links 104, and eight movable parts, but the numbers of joints 102, links 104, and movable parts are not particularly limited. For example, the number of movable parts of the robot arm 12 may be six or less. The numbers of joints 102, links 104, and movable parts of the robot arm 12 may be set as appropriate so as to realize a desired degree of freedom in consideration of the degrees of freedom of the position and posture of the robot arm tip 100. The shapes of the joints 102 and the links 104 and the directions of the rotation axes of the joints 102 are also not limited to the example shown in FIG. 1, and may be set as appropriate so as to realize a desired degree of freedom. Note that the robot arm 12 may be moved in the same direction as the movable cart 14 by the operation of the movable cart 14. For example, when the movable cart 14 moves, the robot arm 12 may be moved in the same direction as the movement of the movable cart 14, and when the movable cart 14 rotates, the robot arm 12 may be rotated in the same direction as the rotation of the movable cart 14.
Various devices may be provided at the robot arm tip 100 of the robot arm 12. In the present embodiment, for example, a sensor device is provided at the robot arm tip 100. The sensor device may include various sensors. For example, the sensor device may include a camera, a thermo camera, a distance measurement sensor, a depth sensor, a microphone (hereinafter also referred to as a mic), a chemical sensor, an odor sensor, a metal sensor, a temperature sensor, and a sensor capable of capturing a fluoroscopic image. The number of sensor devices provided at the robot arm tip 100 is not particularly limited. For example, the sensor devices provided at the robot arm tip 100 may include one or a combination of the above-described sensor devices, or may include a plurality of devices of the same type.

The camera is an imaging device, such as an RGB camera, that has a lens system, a drive system, and an imaging element and captures an image (a still image or a moving image). The thermo camera is an imaging device that captures an image including information indicating the temperature of the imaging target using infrared light or the like. The depth sensor is a device that acquires depth information, such as a ToF (Time of Flight) sensor, an infrared distance measuring device, an ultrasonic distance measuring device, LiDAR (Laser Imaging Detection and Ranging), or a stereo camera. The microphone is a device that collects surrounding sounds and outputs audio data converted into a digital signal via an amplifier and an ADC (Analog Digital Converter). The chemical sensor is a sensor that detects chemical substances. The odor sensor is a sensor that detects odors and converts them into numerical values. The metal sensor is a sensor that detects metal. The temperature sensor is a sensor that measures temperature. The sensor capable of capturing a fluoroscopic image is, for example, a sensor that captures an image visualizing the inside of a target using infrared rays, X-rays, or the like. When such a sensor is used, a structure in which the imaging target can be positioned between a transmitter and a receiver of infrared rays, X-rays, or the like may be required; therefore, the robot arm tip 100 may include such a transmitter and receiver and may have a structure that allows the imaging target to be positioned between them.

In the present embodiment, a presentation device may be provided at the robot arm tip 100 in addition to the sensor device. The presentation device may include various devices, for example a projector and a laser device. The number of presentation devices provided at the robot arm tip 100 is not particularly limited. For example, the presentation devices provided at the robot arm tip 100 may include one or a combination of the above-described presentation devices, or may include a plurality of devices of the same type.

The projector is a projection device that projects an image onto an arbitrary place in space. The projector may be, for example, a fixed wide-angle projector, or a so-called moving projector having a movable part capable of changing the projection direction, such as a pan/tilt drive type.

The sensor device and the presentation device described above may be arranged in various ways. For example, the sensor device and the presentation device are provided at the robot arm tip 100 of the robot arm 12 as described above. The sensor device and the presentation device may also be provided in the main body 106 of the movable cart 14 described later.
(2) Movable cart 14
The movable cart 14 includes, for example, a main body 106 and a plurality of wheels 108. At least one processing circuit, such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit), may be arranged in the main body 106. Further, at least one memory, such as a ROM (Read Only Memory) or a RAM (Random Access Memory), may be arranged in the main body 106. In addition, a power supply device that supplies power for driving each component of the information presentation device 10-1 may be arranged in the main body 106.

As shown in FIG. 1, the movable cart 14 has, for example, four wheels 108a to 108d. Any type of wheel may be used as the wheels 108; for example, the wheels 108 may be omni wheels or mecanum wheels. The types of the plurality of wheels 108 of the movable cart 14 may basically all be the same. However, the present disclosure is not limited to such an example, and a plurality of types of wheels may be mixed among the plurality of wheels 108.
The plurality of wheels 108 may have movable parts. For example, each of the plurality of wheels 108 has an actuator (for example, a motor) and rotates when the actuator is driven. By controlling the driving of the actuator of each of the plurality of wheels 108, at least one wheel 108 can be rotated, and the movable cart 14 can thereby perform rotational motion and translational motion. For example, when a mecanum wheel is used for each of the plurality of wheels 108, the movable cart 14 can produce a velocity in an arbitrary direction by changing the rotation speed of each of the plurality of wheels 108 through control of the actuator drive. Thereby, the movable cart 14 can move in all directions.
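For an omnidirectional cart on mecanum wheels, the per-wheel rotation speeds are commonly obtained from the desired body velocity with the inverse-kinematics relation sketched below. Sign conventions and wheel ordering differ between platforms, so this Python sketch, with assumed half-track dimensions and wheel radius, is one common form rather than the configuration defined in the disclosure.

```python
def mecanum_wheel_speeds(vx, vy, omega, lx, ly, wheel_radius):
    """Convert a desired body velocity (vx forward, vy leftward, omega yaw rate, SI units)
    into angular speeds of the four mecanum wheels (front-left, front-right, rear-left, rear-right).
    lx and ly are the half-distances from the cart center to the wheels along each axis."""
    k = lx + ly
    w_fl = (vx - vy - k * omega) / wheel_radius
    w_fr = (vx + vy + k * omega) / wheel_radius
    w_rl = (vx + vy - k * omega) / wheel_radius
    w_rr = (vx - vy + k * omega) / wheel_radius
    return w_fl, w_fr, w_rl, w_rr

# Example: pure sideways motion at 0.2 m/s combined with a slow rotation (illustrative values).
speeds = mecanum_wheel_speeds(vx=0.0, vy=0.2, omega=0.1, lx=0.20, ly=0.15, wheel_radius=0.05)
```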
 <2-2. About the predetermined area>
 The physical configuration of the information presentation device 10-1 according to the first embodiment has been described above. Next, the predetermined area will be described with reference to FIG. 2. FIG. 2 is a diagram illustrating specific examples of the predetermined area according to the first embodiment. The information presentation device 10-1 according to the present embodiment senses (hereinafter also referred to as "scans") a predetermined area with the sensor device provided on the robot arm tip 100 described above.
 The predetermined area according to the present embodiment is an area in which the information presentation device 10-1 inspects for an abnormality. For example, the predetermined area may include the surface of a target specified by the user. Specifically, when the target specified by the user is the object 20, as in specific example a shown in FIG. 2, the predetermined area may be the area 30, which is the surface of the object 20 specified by the user. The object 20 specified by the user is, for example, an unknown object that may pose a danger such as an explosion. The target specified by the user is not limited to such an unknown object and may be any object; for example, it may be a known object.
 The predetermined area may also include an area specified by the user. Specifically, as in specific example b shown in FIG. 2, when the areas specified by the user are the partial wall areas 32a and 32b and the partial floor area 32c of a given space (for example, a room), the predetermined area may be the area 32 including the areas 32a, 32b, and 32c.
 The predetermined area may also be a part of the surface of a target specified by the user. Specifically, when the target specified by the user is the tank 22a, as in specific example c shown in FIG. 2, the predetermined area may be the area 34a, which is a part of the surface of the tank 22a. The tanks 22 shown in specific example c of FIG. 2 are, for example, tanks containing gas, chemical solution, or the like.
 The predetermined area may also be a combination of the surface of a target specified by the user and an area specified by the user. Specifically, as shown in specific example c of FIG. 2, suppose that the user specifies the areas 34a to 34c, each of which is a part of the surface of one of the three tanks 22a to 22c, and the area 34d, which is a part of the floor. In this case, the predetermined area may be an area including the areas 34a to 34d.
 The predetermined area is not limited to an area specified by the user. For example, the predetermined area may be an area determined based on information scanned by the sensor device, an area determined based on information acquired from an external device, or an area determined in advance.
 The information presentation device 10-1 according to the present embodiment acquires information on the predetermined area by scanning the predetermined area described above. The information on the predetermined area is, for example, spatial information on the predetermined area. The spatial information is, for example, sensing data such as the distance between the information presentation device 10-1 and an object forming the space, acquired by a camera, a depth sensor, or the like.
 The spatial information may also be sensing data such as the distance between the information presentation device 10-1 and the predetermined area, acquired by a camera, a depth sensor, or the like. The acquired distance between the information presentation device 10-1 and the predetermined area can be used to detect the positional relationship between the information presentation device 10-1 and the predetermined area. The detected positional relationship can be used when the information presentation device 10-1 moves to the vicinity of the predetermined area.
 The spatial information may also be sensing data such as the distance between the robot arm tip 100 and the predetermined area, acquired by a camera, a depth sensor, or the like. The acquired distance between the robot arm tip 100 and the predetermined area can be used to detect the positional relationship between the robot arm tip 100 and the predetermined area. The detected positional relationship can be used to control the distance between the robot arm tip 100 and the predetermined area.
 The information on the predetermined area may also be state information of the predetermined area. The state information is, for example, sensing data such as the temperature of the predetermined area acquired by a thermal camera, a temperature sensor, or the like. The state information may also be sensing data such as the loudness of sound output from the predetermined area acquired by a microphone, information on substances present in the predetermined area acquired by a chemical sensor, and the intensity of odor in the predetermined area acquired by an odor sensor. The state information may further be sensing data such as information on metal present in the predetermined area acquired by a metal sensor, and internal information on a target acquired by a sensor capable of capturing a fluoroscopic image. The state information acquired by the various sensors described above can be used to detect an abnormality in the predetermined area.
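 As a purely illustrative aside, heterogeneous state information of this kind could be organized as uniform records that keep the sensed value together with the position at which it was sensed, so that a later-detected abnormality can be projected back onto that position. The record layout and names below are assumptions introduced for illustration, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical record layout for state information gathered during a scan.
@dataclass
class StateReading:
    position: Tuple[float, float, float]  # 3D coordinates of the sensed point
    sensor_type: str                      # e.g. "temperature", "chemical", "odor"
    value: float                          # sensed value in the sensor's own unit

# Example readings as they might be produced along a scan trajectory.
readings = [
    StateReading((0.42, 1.10, 0.75), "temperature", 68.5),  # degrees Celsius
    StateReading((0.45, 1.10, 0.75), "odor", 0.12),         # relative intensity
]
```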
 <2-3. Example of functional configuration of the information presentation device>
 The predetermined area according to the first embodiment has been described above. Next, the functional configuration of the information presentation device 10-1 according to the first embodiment will be described with reference to FIG. 3. FIG. 3 is a block diagram illustrating an example of the functional configuration of the information presentation device 10-1 according to the first embodiment. As shown in FIG. 3, the information presentation device 10-1 according to the present embodiment includes an acquisition unit 110, a control unit 120-1, a storage unit 130, a presentation unit 140, a robot arm drive unit 150, and a mobile cart drive unit 160.
 (1) Acquisition unit 110
 The acquisition unit 110 has a function of acquiring information on the predetermined area. For example, the acquisition unit 110 has at least one sensor device and acquires information on the predetermined area with the sensor device. Specifically, the acquisition unit 110 acquires information on the predetermined area by scanning the predetermined area with the sensor device, and outputs the acquired information to the spatial processing unit 122-1 of the control unit 120-1. This function can be realized by, for example, a sensor device provided on the robot arm tip 100.
 The method by which the acquisition unit 110 acquires information on the predetermined area is not limited to acquisition by a sensor device. For example, the acquisition unit 110 may acquire information on the predetermined area from an external device. Specifically, the acquisition unit 110 may have a communication unit (not shown) capable of communicating with an external device, and may acquire information on the predetermined area acquired by the external device via communication with that device.
 (2) Control unit 120-1
 The control unit 120-1 has a function of controlling the operation of the information presentation device 10-1. To realize this function, the control unit 120-1 according to the present embodiment includes a spatial processing unit 122-1, a projection mapping control unit 124, and a robot control unit 126, as shown in FIG. 3. The function of each component of the control unit 120-1 can be realized by, for example, the CPU, ROM, and RAM arranged in the main body 106.
 (2-1) Spatial processing unit 122-1
 The spatial processing unit 122-1 has a function of performing processing based on input information. For example, based on the information on the predetermined area input from the acquisition unit 110, the spatial processing unit 122-1 performs space recognition processing, detection processing of the predetermined area, scan trajectory determination processing, drive information generation processing, and abnormality detection processing.
 (Space recognition processing)
 In the space recognition processing, the spatial processing unit 122-1 recognizes the space around the information presentation device 10-1. For example, the spatial processing unit 122-1 detects the space around the information presentation device 10-1 based on the spatial information input from the acquisition unit 110. Specifically, based on the spatial information, the spatial processing unit 122-1 recognizes the space by detecting, as three-dimensional coordinates, the shapes and positions of objects forming the space, such as walls, the ceiling, the floor, doors, furniture, household items, and unknown objects. The spatial processing unit 122-1 then outputs space recognition information, which is information on the recognized space, to the storage unit 130 and stores it there.
 The positions of the objects forming the shape of the space detected by the spatial processing unit 122-1 are not limited to three-dimensional coordinates. The space recognition information stored in the storage unit 130 by the spatial processing unit 122-1 may also include the spatial information itself. The spatial processing unit 122-1 may also generate map information in which the information on the recognized space is mapped, output the generated map information to the storage unit 130, and store it there.
 The above example of the space recognition processing assumes that the spatial processing unit 122-1 performs the processing indoors, but the place where the space recognition processing is performed is not limited to indoors and may be outdoors.
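 A minimal sketch of one building block that such space recognition processing could use, assuming a pinhole camera model for the depth sensor, is shown below: back-projecting a depth pixel into camera-frame three-dimensional coordinates. The intrinsic parameters are placeholder values, and the disclosure does not prescribe this particular method.

```python
# Illustrative sketch: convert a depth-sensor pixel into a 3D point under a
# pinhole camera model. fx, fy, cx, cy are example intrinsics, not measured values.

def depth_pixel_to_point(u, v, depth_m,
                         fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Convert pixel (u, v) with depth in meters to camera-frame (x, y, z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example: the point seen at pixel (400, 260), 1.2 m from the sensor.
print(depth_pixel_to_point(400, 260, 1.2))
```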
 (Detection processing of the predetermined area)
 In the detection processing of the predetermined area, the spatial processing unit 122-1 detects the predetermined area that the information presentation device 10-1 is to inspect for an abnormality. For example, when the user specifies the predetermined area by a motion such as a gesture, the spatial processing unit 122-1 may detect the predetermined area based on the user's motion information acquired by a camera or the like and the spatial information. When the user specifies the predetermined area by voice, the spatial processing unit 122-1 may detect the predetermined area based on the voice acquired by a microphone or the like and the spatial information. When the user does not specify a predetermined area, the spatial processing unit 122-1 may use a preset area as the predetermined area, or may detect the predetermined area based on preset information.
 The spatial processing unit 122-1 also performs the space recognition processing described above on the detected predetermined area, thereby recognizing the shape of the predetermined area, the shape of objects forming the predetermined area, and the like.
 (Scan trajectory determination processing)
 In the scan trajectory determination processing, the spatial processing unit 122-1 determines the scan trajectory, that is, the trajectory along which the information presentation device 10-1 scans the predetermined area. The spatial processing unit 122-1 determines the scan trajectory based on the shape of the predetermined area, the shape of objects forming the predetermined area, and the like. For example, the spatial processing unit 122-1 determines the scan trajectory so that the information presentation device 10-1 scans the entire predetermined area. Scanning the entire predetermined area is hereinafter also referred to as a normal scan. The spatial processing unit 122-1 may instead determine the scan trajectory so that the information presentation device 10-1 scans the predetermined area only partially. The path, length, shape, and the like of the scan trajectory are not particularly limited.
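 Since the disclosure leaves the trajectory shape open, the following is only one possible sketch: a raster (boustrophedon) pattern of waypoints that covers a flat rectangular area with a given line spacing. The dimensions and spacing are arbitrary example values.

```python
# Illustrative sketch of a normal-scan trajectory over a flat rectangular area.

def raster_scan_waypoints(width, height, spacing):
    """Return a list of (x, y) waypoints sweeping a width x height area."""
    waypoints = []
    y = 0.0
    row = 0
    while y <= height + 1e-9:
        xs = (0.0, width) if row % 2 == 0 else (width, 0.0)  # alternate direction
        waypoints.append((xs[0], y))
        waypoints.append((xs[1], y))
        y += spacing
        row += 1
    return waypoints

# Example: scan a 0.6 m x 0.4 m area with scan lines every 0.1 m.
for point in raster_scan_waypoints(0.6, 0.4, 0.1):
    print(point)
```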
 (Drive information generation processing)
 In the drive information generation processing, the spatial processing unit 122-1 generates drive information used for controlling the driving of the information presentation device 10-1. First, for example, the spatial processing unit 122-1 calculates the positions of the information presentation device 10-1 and the robot arm tip 100 based on the space recognition information, and generates drive information using those positions. The spatial processing unit 122-1 then generates drive information for driving the mobile cart 14 based on the space recognition information and the calculated position of the information presentation device 10-1, and outputs the generated drive information to the robot control unit 126. By driving the mobile cart 14 based on this drive information, the information presentation device 10-1 can move through the space.
 The spatial processing unit 122-1 also generates, for example, drive information for driving the robot arm 12 based on the space recognition information and the position of the robot arm tip 100. By driving the robot arm 12 based on this drive information, the information presentation device 10-1 can scan the predetermined area.
 The spatial processing unit 122-1 also generates, for example, drive information for driving the robot arm 12 based on the scan trajectory. By driving the robot arm 12 based on this drive information, the information presentation device 10-1 can appropriately scan the predetermined area along the scan trajectory.
 The spatial processing unit 122-1 may also generate drive information for driving the robot arm 12 so that the distance between the robot arm tip 100 and the predetermined area is kept constant, based on the space recognition information and the position of the robot arm tip 100. For example, the spatial processing unit 122-1 generates the drive information so that the distance between the robot arm tip 100 and the predetermined area is kept at a distance set in advance by the user or the like. Specifically, the spatial processing unit 122-1 first calculates the three-dimensional coordinates indicating the position of the robot arm tip 100 based on information acquired by the encoders. Next, the spatial processing unit 122-1 calculates the distance between the robot arm tip 100 and the predetermined area based on the three-dimensional coordinates indicating the position of the robot arm tip 100 and the three-dimensional coordinates indicating the position of the predetermined area. The spatial processing unit 122-1 then generates drive information for driving the robot arm 12 so that the calculated distance matches the distance set in advance by the user or the like. By driving the robot arm 12 based on this drive information, the information presentation device 10-1 can scan the predetermined area while keeping the distance between the robot arm tip 100 and the predetermined area constant. In addition, the information presentation device 10-1 may keep this distance constant more accurately by additionally using a depth sensor provided on the robot arm tip 100.
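 A minimal sketch of this constant-distance behavior, under assumed names and a simple proportional correction scheme (the disclosure does not limit the drive information generation to this scheme), is shown below.

```python
import math

# Illustrative sketch: keep the robot arm tip 100 at a preset standoff
# distance from the predetermined area.

def distance(tip_xyz, surface_xyz):
    return math.dist(tip_xyz, surface_xyz)  # Euclidean distance (Python 3.8+)

def standoff_correction(tip_xyz, surface_xyz, target_distance, gain=0.5):
    """Return a signed correction [m] along the approach axis:
    positive means retract (too close), negative means approach (too far)."""
    error = target_distance - distance(tip_xyz, surface_xyz)
    return gain * error

# Example: the tip is 0.28 m from the surface point, but 0.25 m was requested,
# so a small approach correction (negative value) is returned.
print(standoff_correction((0.0, 0.0, 0.28), (0.0, 0.0, 0.0), 0.25))
```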
 (Abnormality detection processing)
 In the abnormality detection processing, the spatial processing unit 122-1 detects an abnormality in the predetermined area. For example, the spatial processing unit 122-1 detects an abnormality in the predetermined area based on the state information. Specifically, when the value indicated by the state information is equal to or greater than a predetermined threshold, the spatial processing unit 122-1 determines that the value of the state information is abnormal and detects an abnormality corresponding to that state information.
 When the spatial processing unit 122-1 detects an abnormality, it associates position information indicating the position where the abnormality was detected with the sensing data of the sensor device, outputs them to the storage unit 130, and stores them there. The spatial processing unit 122-1 also outputs the position information and the sensing data of the sensor device to the projection mapping control unit 124.
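 The threshold comparison described above can be sketched as follows; the per-sensor thresholds and field names are example assumptions, not values taken from the disclosure.

```python
# Illustrative sketch: flag readings at or above an assumed per-sensor threshold
# as abnormalities, keeping the position at which each was sensed.

THRESHOLDS = {              # example values only
    "temperature": 60.0,    # degrees Celsius
    "odor": 0.8,            # relative intensity
    "ammonia": 25.0,        # ppm
}

def detect_abnormalities(readings):
    """readings: iterable of (position, sensor_type, value) tuples."""
    records = []
    for position, sensor_type, value in readings:
        threshold = THRESHOLDS.get(sensor_type)
        if threshold is not None and value >= threshold:
            records.append({"position": position,
                            "sensor_type": sensor_type,
                            "value": value})
    return records

print(detect_abnormalities([((0.4, 1.1, 0.8), "temperature", 68.5),
                            ((0.5, 1.1, 0.8), "odor", 0.2)]))
```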
 (2-2) Projection mapping control unit 124
 The projection mapping control unit 124 has a function of performing processing related to the presentation of information. For example, the projection mapping control unit 124 has a function of generating projection mapping information, which is the information presented by the presentation unit 140. Specifically, the projection mapping control unit 124 generates information on the abnormality as projection mapping information based on the position information and the sensing data of the sensor device input from the spatial processing unit 122-1.
 The information on the abnormality is, for example, information indicating the range in which the abnormality was detected. The information on the abnormality may also be, for example, the name of the substance detected as the abnormality (for example, ammonia). The information on the abnormality is not limited to these examples.
 The projection mapping control unit 124 also has a function of controlling the operation of the presentation unit 140. For example, the projection mapping control unit 124 controls the processing by which the presentation unit 140 presents the information on the abnormality. Specifically, the projection mapping control unit 124 causes the presentation unit 140 to present the information on the abnormality on the source of the abnormality. The source of the abnormality is a substance detected as abnormal by the spatial processing unit 122-1, such as a heat source emitting abnormal heat or a substance emitting an abnormal odor. The projection mapping control unit 124 may, for example, cause the presentation unit 140 to present information indicating the range in which the abnormality was detected on the source of the abnormality. This allows the user to intuitively grasp where the abnormality has occurred.
 The projection mapping control unit 124 may also cause the presentation unit 140 to further present the information on the abnormality in the vicinity of the source of the abnormality. For example, the projection mapping control unit 124 may cause the presentation unit 140 to present the name of the substance detected as the source of the abnormality near the source of the abnormality. This allows the user to intuitively grasp additional information on the abnormality.
 The projection mapping control unit 124 may also cause the presentation unit 140 to present a plurality of different pieces of information. For example, suppose that the acquisition unit 110 has a plurality of sensor devices and that a plurality of different abnormalities are detected based on the information on the predetermined area acquired by the plurality of sensor devices. In this case, the projection mapping control unit 124 may cause the presentation unit 140 to present each piece of information on the plurality of different abnormalities. This allows the user to grasp the abnormalities in more detail.
 By controlling the presentation unit 140, the projection mapping control unit 124 can have the information on the abnormality presented in various ways. For example, the projection mapping control unit 124 causes the presentation unit 140 to present the information on the abnormality as text or as an icon. The projection mapping control unit 124 may also cause the presentation unit 140 to present the information on the abnormality in a color that changes according to the detected abnormality. The projection mapping control unit 124 may further cause the presentation unit 140 to change the intensity of the light projected from the projector or the like according to the information on the abnormality, or to present the information on the abnormality blinking. This allows the user to grasp the information on the abnormality more intuitively.
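 In the spirit of these presentation variations, one could imagine an abnormality record being turned into a projection overlay specification (projection position, label text, color, blink flag) roughly as sketched below. The color table, severity rule, and field names are illustrative assumptions only.

```python
# Illustrative sketch: derive an overlay specification from an abnormality record.

COLOR_BY_TYPE = {"temperature": "red", "chemical": "blue", "odor": "green"}

def make_overlay(record, threshold):
    """record: dict with 'position', 'sensor_type', 'value' keys."""
    severity = record["value"] / threshold           # >= 1.0 once abnormal
    return {
        "position": record["position"],              # where to project
        "label": f'{record["sensor_type"]}: {record["value"]:.1f}',
        "color": COLOR_BY_TYPE.get(record["sensor_type"], "yellow"),
        "blink": severity >= 2.0,                    # blink for strong anomalies
    }

print(make_overlay({"position": (0.4, 1.1, 0.8),
                    "sensor_type": "temperature",
                    "value": 68.5}, threshold=60.0))
```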
 (2-3) Robot control unit 126
 The robot control unit 126 has a function of controlling the operation of the information presentation device 10-1. Specifically, the robot control unit 126 controls the operation of the robot arm 12 and the mobile cart 14 of the information presentation device 10-1.
 The robot control unit 126 can control the distance between the acquisition unit 110 and the predetermined area by operating at least one of the robot arm 12 and the mobile cart 14. For example, the robot control unit 126 controls the position of the acquisition unit 110 by operating the robot arm 12 so that the distance between the acquisition unit 110 and the predetermined area is constant. When controlling the position of the acquisition unit 110 so that this distance is constant, the robot control unit 126 may operate both the robot arm 12 and the mobile cart 14, or may operate only the mobile cart 14.
 Keeping the distance between the acquisition unit 110 and the predetermined area constant also means keeping the distance between the acquisition unit 110 and the target constant. By keeping the distance between the acquisition unit 110 and the target constant, the information presentation device 10-1 can scan the target without coming into contact with it. Because the information presentation device 10-1 does not contact the target, it can scan the target without affecting it, and the robot arm 12 can be kept clean.
 As shown in FIG. 3, the robot control unit 126 according to the embodiment of the present disclosure includes a robot arm controller 1260 and a mobile cart controller 1262 for controlling the operation of the robot arm 12 and the mobile cart 14.
 (2-3-1) Robot arm controller 1260
 The robot arm controller 1260 has a function of controlling the operation of the robot arm 12. For example, the robot arm controller 1260 controls the driving of the robot arm drive unit 150, which drives each component of the robot arm 12. Specifically, the robot arm controller 1260 generates a drive instruction for driving the robot arm drive unit 150 based on the drive information input from the spatial processing unit 122-1, and outputs the generated drive instruction to the robot arm drive unit 150. In this way, the robot arm controller 1260 controls the driving of the robot arm drive unit 150 and thereby the operation of the robot arm 12.
 (2-3-2) Mobile cart controller 1262
 The mobile cart controller 1262 has a function of controlling the operation of the mobile cart 14. For example, the mobile cart controller 1262 controls the driving of the mobile cart drive unit 160, which drives each component of the mobile cart 14. Specifically, the mobile cart controller 1262 generates a drive instruction for driving the mobile cart drive unit 160 based on the drive information input from the spatial processing unit 122-1, and outputs the generated drive instruction to the mobile cart drive unit 160. In this way, the mobile cart controller 1262 controls the driving of the mobile cart drive unit 160 and thereby the operation of the mobile cart 14.
 (3) Storage unit 130
 The storage unit 130 has a function of storing data acquired by the processing in the control unit 120-1. For example, the storage unit 130 stores the space recognition information acquired in the space recognition processing by the spatial processing unit 122-1. When map information is generated in the space recognition processing, the storage unit 130 may also store the map information. When an abnormality is detected in the abnormality detection processing by the spatial processing unit 122-1, the storage unit 130 stores the position where the abnormality was detected and the sensing data of the sensor device in association with each other. The information stored in the storage unit 130 is not limited to these examples; for example, the storage unit 130 may store programs such as various applications, data, and the like.
 (4) Presentation unit 140
 The presentation unit 140 has a function of presenting information. The function of the presentation unit 140 can be realized by, for example, a presentation device such as a projector or a laser device provided on the robot arm tip 100.
 For example, the presentation unit 140 presents presentation information based on the projection mapping information input from the projection mapping control unit 124. For example, the presentation unit 140 presents the information on the abnormality as presentation information based on the projection mapping information. The presentation unit 140 may also present, as presentation information, information other than the information on the abnormality (hereinafter also referred to as non-abnormality information) based on the projection mapping information. The non-abnormality information is, for example, the material of the target. The information presented by the presentation unit 140 is not limited to these examples, and any information may be presented.
 (5) Robot arm drive unit 150
 The robot arm drive unit 150 has a function of driving the robot arm 12. For example, the robot arm drive unit 150 drives the robot arm 12 by being driven based on a drive instruction input from the robot arm controller 1260. The robot arm drive unit 150 can be realized by, for example, the actuators of the robot arm tip 100, the joints 102, and the links 104.
 (6) Mobile cart drive unit 160
 The mobile cart drive unit 160 has a function of driving the mobile cart 14. For example, the mobile cart drive unit 160 drives the mobile cart 14 by being driven based on a drive instruction input from the mobile cart controller 1262. The mobile cart drive unit 160 can be realized by, for example, the actuators of the plurality of wheels 108.
 <2-4. Processing flow>
 The functional configuration of the information presentation device 10-1 according to the first embodiment has been described above. Next, the flow of processing according to the first embodiment will be described with reference to FIGS. 4 and 5. FIG. 4 is a flowchart illustrating an example of the flow of the main processing according to the first embodiment. FIG. 5 is a flowchart illustrating an example of the flow of the normal scan processing according to the first embodiment.
 (1) Main processing
 First, the main processing according to the first embodiment will be described with reference to FIG. 4. As shown in FIG. 4, the information presentation device 10-1 first determines the predetermined area (S1000). After determining the predetermined area, the information presentation device 10-1 recognizes the three-dimensional shape of the predetermined area (S1002). Next, the information presentation device 10-1 determines the scan trajectory in the predetermined area (S1004). The information presentation device 10-1 then performs the normal scan processing within the predetermined area according to the determined scan trajectory (S1006). The details of the normal scan processing are described later.
 After the normal scan processing, the information presentation device 10-1 checks whether there is a record of abnormality detection (S1008). When there is no record of abnormality detection (S1008/NO), the information presentation device 10-1 ends the processing. When there is a record of abnormality detection (S1008/YES), the information presentation device 10-1 generates projection mapping information (S1010). The information presentation device 10-1 then performs presentation processing for presenting the generated projection mapping information, and ends the processing (S1012).
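 For readability, the flow of FIG. 4 can be restated schematically as follows; the functions called here are placeholders for the processing of the units described above and are not an actual device API.

```python
# Schematic restatement of the main processing (steps S1000-S1012).

def main_processing(device):
    area = device.determine_predetermined_area()          # S1000
    device.recognize_shape(area)                          # S1002
    trajectory = device.determine_scan_trajectory(area)   # S1004
    device.normal_scan(area, trajectory)                  # S1006 (see FIG. 5)

    records = device.abnormality_records()                # S1008
    if not records:                                       # S1008/NO: end
        return
    mapping_info = device.generate_projection_mapping(records)  # S1010
    device.present(mapping_info)                          # S1012
```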
 (2) Normal scan processing
 Next, the normal scan processing according to the first embodiment will be described with reference to FIG. 5. As shown in FIG. 5, the information presentation device 10-1 first starts scanning the predetermined area (S2000). While scanning, the information presentation device 10-1 checks whether an abnormality has been detected (S2002). When no abnormality is detected (S2002/NO), the information presentation device 10-1 continues the scan without performing any particular processing. When an abnormality is detected (S2002/YES), the information presentation device 10-1 records the sensing data of the sensor device and the detection position of the abnormality in the storage unit 130 (S2004) and continues the scan.
 Next, the information presentation device 10-1 checks whether the scan of the predetermined area has been completed (S2006). When the scan of the predetermined area has been completed (S2006/YES), the information presentation device 10-1 ends the normal scan processing. When the scan of the predetermined area has not been completed (S2006/NO), the information presentation device 10-1 repeats the processing of S2002 to S2006.
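 The loop of FIG. 5 can likewise be restated schematically; again, the called functions are placeholders for the behavior described in the text, not an actual device API.

```python
# Schematic restatement of the normal scan processing (steps S2000-S2006).

def normal_scan(device, area, trajectory):
    device.start_scan(area, trajectory)                         # S2000
    while not device.scan_finished():                           # S2006
        reading = device.next_reading()
        if device.is_abnormal(reading):                         # S2002/YES
            device.storage.record(reading.position, reading.data)  # S2004
        # S2002/NO: keep scanning without further processing
```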
 <2-5. Examples>
 The flow of processing according to the first embodiment has been described above. Next, examples according to the first embodiment will be described with reference to FIGS. 6 to 8.
 (1) First example
 In the first example, an example relating to the investigation of an unknown object or the like will be described with reference to FIG. 6. FIG. 6 is a diagram illustrating the first example according to the first embodiment. In the first example, the surface of an unknown object 20 is specified by the user as the predetermined area (not shown), and a temperature sensor is provided as the sensor device.
 As shown in the upper part of FIG. 6, the information presentation device 10-1 scans the unknown object 20 by scanning the predetermined area along the scan trajectory 40. Suppose that this scan detects, as shown in the middle part of FIG. 6, an abnormality in which the position 50 of the unknown object 20 is at a high temperature. In that case, as shown in the lower part of FIG. 6, the information presentation device 10-1 presents, on the unknown object 20, presentation information 60 indicating the position and range of the high temperature as information on the abnormality. This allows the user to intuitively and easily grasp the position and range in which the abnormality has occurred. The example of the presentation information 60 described above is not limited to abnormalities relating to the temperature of an unknown object and may be applied to any abnormality.
 (2) Second example
 In the second example, an example relating to a pet (for example, a dog) kept indoors will be described with reference to FIG. 7. FIG. 7 is a diagram illustrating the second example according to the first embodiment. In the second example, part of the walls and floor of a room is specified by the user as the predetermined area (not shown), and a chemical sensor is provided as the sensor device.
 As shown in the upper part of FIG. 7, suppose that a pet 25a has vomited at some position in the room and a pet 25b has excreted at some position. The information presentation device 10-1 scans the predetermined area along the scan trajectory 44 as shown in the middle part of FIG. 7. Suppose that this scan detects the vomit of the pet 25a and the excrement of the pet 25b as abnormalities. In that case, as shown in the lower part of FIG. 7, the information presentation device 10-1 presents presentation information 62 relating to the vomit and presentation information 64 relating to the excrement as information on the abnormalities.
 The information on the abnormality may be, for example, information indicating the position and range in which the abnormality was detected, such as the presentation information 62a, 64a, and 64b. Specifically, the presentation information 62a indicates the position and range in which the vomit is present, and the presentation information 64a and 64b indicate the positions and ranges in which the excrement is present.
 The information presentation device 10-1 may express that the contents of the detected abnormalities differ by changing the color of the presentation information. For example, the information presentation device 10-1 may express differences in the content of the abnormality by differences in color. Specifically, the information presentation device 10-1 may classify vomit as red and excrement as blue, and express the presentation information 62a in red and the presentation information 64a and 64b in blue. The information presentation device 10-1 may also express differences in the content of the abnormality by differences in color density. Specifically, as shown in the lower part of FIG. 7, the information presentation device 10-1 makes the color of the presentation information 64a lighter than the color of the presentation information 64b. In this way, the information presentation device 10-1 may express that the concentration of the excrement in the position and range of the presentation information 64a is lower than that in the position and range of the presentation information 64b. By changing the color of the presentation information in this way, the information presentation device 10-1 allows the user to easily distinguish differences in the content of the abnormalities by the differences in color. The information presentation device 10-1 may also express the differences in color by gradation.
 The information on the abnormality may also be information indicating the content of the abnormality by an icon, such as the presentation information 62b. Specifically, the presentation information 62b indicates that the pet has vomited. The information presentation device 10-1 may change the size of the icon. For example, the information presentation device 10-1 may enlarge the icon in positions and ranges where the concentration of the vomit is high, and shrink it where the concentration of the vomit is low. By presenting the presentation information with icons in this way, the information presentation device 10-1 allows the user to intuitively and easily grasp what kind of abnormality has occurred.
 The information on the abnormality may also be information indicating the content of the abnormality by text, such as the presentation information 64c. Specifically, the presentation information 64c indicates that ammonia was detected from the excrement. The information presentation device 10-1 may change the size of the text. For example, the information presentation device 10-1 may enlarge the text in positions and ranges where the concentration of ammonia is high, and shrink it where the concentration of ammonia is low. By presenting the presentation information as text in this way, the information presentation device 10-1 allows the user to intuitively and easily grasp detailed information on the abnormality.
 As described above, by changing the display method of the presentation information according to the information on the abnormality, the information presentation device 10-1 can present additional information such as the range in which the abnormality was detected, the type of abnormality, and the degree of abnormality in various ways. Because the detailed information on the abnormality is presented in various ways by the information presentation device 10-1, the user can consider how to deal with the abnormality based on the presented contents. The examples of the presentation information 62 and 64 described above are not limited to abnormalities relating to the vomit and excrement of pets and may be applied to any abnormality.
 (3) Third example
 In the third example, an example relating to the investigation of a leak of a chemical or the like that may be toxic will be described with reference to FIG. 8. FIG. 8 is a diagram illustrating the third example according to the first embodiment. In the third example, parts of the surfaces of a plurality of tanks 22 and part of the floor are specified by the user as the predetermined area (not shown), and a chemical sensor is provided as the sensor device. The surface of the tank 22b has a crack 24a, and the surface of the tank 22c has a crack 24b.
 As shown in FIG. 8, the information presentation device 10-1 scans the predetermined area, consisting of parts of the surfaces of the tanks 22a to 22c and part of the floor, along the scan trajectories 42a to 42d. Suppose that this scan detects, as abnormalities, chemicals leaking from the cracks 24a and 24b at the positions 52a and 52b shown in the middle part of FIG. 8, and also detects, as an abnormality, the presence of leaked chemical at the position 52c. In that case, as shown in the lower part of FIG. 8, the information presentation device 10-1 presents the presentation information 66a to 66c, indicating the positions and ranges in which the chemicals are leaking, on the chemicals as information on the abnormalities. By presenting the detected positions and ranges of the abnormalities in this way, the information presentation device 10-1 allows the user to intuitively and easily grasp the positions and ranges in which the abnormalities have occurred.
 The information on the abnormality may also be information indicating the content of the abnormality by text, such as the presentation information 67a and 67b. Specifically, the presentation information 67a indicates that the chemical leaking from the crack 24a is chemical B, and the presentation information 67b indicates that the chemical leaking from the crack 24b is chemical C. By presenting the presentation information as text in this way, the information presentation device 10-1 allows the user to intuitively and easily grasp detailed information on the abnormality.
 The information presentation device 10-1 may also express that the contents of the detected abnormalities differ by changing the color of the presentation information. For example, the information presentation device 10-1 may express differences in the content of the abnormality by differences in color. Specifically, the information presentation device 10-1 may classify chemical B as red and chemical C as blue, and express the presentation information 67a in red and the presentation information 67b in blue. By changing the color of the presentation information in this way, the information presentation device 10-1 allows the user to easily distinguish differences in the content of the abnormalities by the differences in color. The examples of the presentation information 66 and 67 described above are not limited to abnormalities relating to chemicals leaking from tanks and may be applied to any abnormality.
<<3. Second Embodiment>>
 The first embodiment has been described above. Next, a second embodiment according to the present disclosure will be described. As will be described later, according to the second embodiment, when an abnormality is detected in the predetermined area by a normal scan, the information presentation device 10-2 according to the second embodiment scans the vicinity of the position of the detected abnormality in more detail. Scanning the vicinity of the position of the detected abnormality in more detail is hereinafter also referred to as a detailed scan. By performing a detailed scan, the information presentation device 10-2 can acquire more detailed information on the abnormality and can present more detailed information on the abnormality. The second embodiment will be described in detail below, but description of the same contents as in the first embodiment described above will be omitted.
 <3-1.情報提示装置の物理構成>
 第2の実施形態に係る情報提示装置10-2の物理構成は、第1の実施形態の<2-1.情報提示装置の物理構成>にて説明した物理構成と同様であり得る。そのため、第2の実施形態に係る情報提示装置10-2の物理構成についての説明は省略する。
<3-1. Physical configuration of information presentation device>
The physical configuration of the information presentation device 10-2 according to the second embodiment can be the same as the physical configuration described in <2-1. Physical configuration of the information presentation device> of the first embodiment. Therefore, description of the physical configuration of the information presentation device 10-2 according to the second embodiment is omitted.
 <3-2.情報提示装置の機能構成例>
 第2の実施形態に係る情報提示装置10-2は、図3に示した第1の実施形態に係る情報提示装置10-1と異なり、詳細スキャンを行う機能を有する。そのため、情報提示装置10-2は、情報提示装置10-1の制御部120-1とは一部機能が異なる制御部120-2を有する。具体的に、情報提示装置10-2は、制御部120-1の空間処理部122-1とは一部機能が異なる空間処理部122-2を有する。
<3-2. Example of functional configuration of information presentation device>
The information presentation device 10-2 according to the second embodiment has a function of performing a detailed scan, unlike the information presentation device 10-1 according to the first embodiment shown in FIG. Therefore, the information presentation device 10-2 includes a control unit 120-2 having a function partially different from that of the control unit 120-1 of the information presentation device 10-1. Specifically, the information presentation device 10-2 includes a spatial processing unit 122-2 having a function partially different from that of the spatial processing unit 122-1 of the control unit 120-1.
 (1)制御部120-2
 制御部120-2は、所定の領域にて異常が検出された場合、異常が検出された位置の近傍における所定の領域に関する情報を、異常の検出前よりも詳細に取得部110に取得させる。まず、制御部120-2は、通常スキャンにより所定の領域にて異常が検出された場合、詳細スキャンを行う所定の領域を決定する。次いで、制御部120-2は、取得部110が通常スキャンよりも詳細にスキャンできるように、所定の領域におけるスキャン軌道を決定する。そして、制御部120-2は、決定したスキャン軌道に沿って取得部110にスキャンさせることで、異常の検出前よりも詳細な所定の領域に関する情報を取得部110に取得させる。
(1) Control unit 120-2
When the abnormality is detected in the predetermined area, the control unit 120-2 causes the acquisition unit 110 to acquire information about the predetermined area near the position where the abnormality is detected in more detail than before the abnormality is detected. First, when an abnormality is detected in a predetermined area by the normal scan, the control unit 120-2 determines a predetermined area for performing a detailed scan. Next, the control unit 120-2 determines a scan trajectory in a predetermined area so that the acquisition unit 110 can scan in more detail than the normal scan. Then, the control unit 120-2 causes the acquiring unit 110 to scan along the determined scan trajectory, thereby causing the acquiring unit 110 to acquire more detailed information on the predetermined area than before the detection of the abnormality.
　これにより、情報提示装置10-2は、通常スキャンのみを行う場合よりも詳細な所定の領域に関する情報を取得でき、より詳細な所定の領域に関する情報に基づきより詳細な異常に関する情報を提示することができる。情報提示装置10-2が詳細スキャンを行うことを実現するために、制御部120-2は、制御部120-1の空間処理部122-1とは一部機能が異なる空間処理部122-2を有する。 Thereby, the information presentation device 10-2 can acquire more detailed information on the predetermined area than when only the normal scan is performed, and can present more detailed information on the abnormality based on that more detailed information. In order for the information presentation device 10-2 to perform the detailed scan, the control unit 120-2 includes a spatial processing unit 122-2 whose functions partially differ from those of the spatial processing unit 122-1 of the control unit 120-1.
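The control flow described above (a normal scan followed by a detailed scan around each recorded abnormality) can be sketched in Python as follows; the object interfaces (plan_trajectory, area_around, scan, record) and the pitch values are assumptions made only for illustration.

def run_inspection(acquirer, spatial_processor, storage, predetermined_area):
    """Normal scan first; if abnormalities were recorded, rescan their neighborhoods in detail."""
    # Normal scan over the predetermined area with a coarse trajectory pitch (assumed 0.10 m).
    normal_trajectory = spatial_processor.plan_trajectory(area=predetermined_area, pitch=0.10)
    acquirer.scan(normal_trajectory, on_abnormality=storage.record)

    # Detailed scan only around the positions where an abnormality was recorded.
    for detection in storage.recorded_abnormalities():
        detail_area = spatial_processor.area_around(detection.position, margin=0.2)   # assumed margin [m]
        detail_trajectory = spatial_processor.plan_trajectory(area=detail_area, pitch=0.02)  # finer pitch
        acquirer.scan(detail_trajectory, on_abnormality=storage.record)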
 (1-1)空間処理部122-2
 空間処理部122-2は、情報提示装置10-2が詳細スキャンを行うことを実現するために、所定の領域の検出処理及びスキャン軌道の決定処理にて、空間処理部122-1とは一部異なる処理を行う。
(1-1) Spatial processing unit 122-2
In order for the information presentation device 10-2 to perform the detailed scan, the spatial processing unit 122-2 performs processing that partially differs from that of the spatial processing unit 122-1 in the process of detecting the predetermined area and the process of determining the scan trajectory.
 (所定の領域の検出処理)
 空間処理部122-2は、通常スキャンにより所定の領域にて異常が検出された場合、記憶部130に記憶されている異常を検出した位置を示す位置情報に基づき、詳細スキャンを行う所定の領域を検出する。なお、空間処理部122-2は、詳細スキャンを行う所定の領域をユーザにより指定された位置としてもよい。また、空間処理部122-2は、後述する駆動情報の生成処理を追加して行ってもよい。
(Predetermined area detection processing)
When an abnormality is detected in the predetermined area by the normal scan, the spatial processing unit 122-2 detects the predetermined area in which the detailed scan is to be performed, based on the position information stored in the storage unit 130 that indicates the position where the abnormality was detected. Note that the spatial processing unit 122-2 may set the predetermined area for the detailed scan to a position designated by the user. Further, the spatial processing unit 122-2 may additionally perform a drive information generation process described later.
　なお、空間処理部122-2は、通常スキャンの対象となる所定の領域を詳細スキャンの対象としてもよい。異常が検出された位置の近傍だけでなく、所定の領域の全体を取得部110に詳細スキャンさせてもよい。 Note that the spatial processing unit 122-2 may set the predetermined area targeted by the normal scan as the target of the detailed scan. That is, the spatial processing unit 122-2 may cause the acquisition unit 110 to perform a detailed scan of the entire predetermined area, not only of the vicinity of the position where the abnormality was detected.
 (スキャン軌道の決定処理)
 空間処理部122-2は、取得部110が通常スキャン時よりも詳細にスキャンできるようにスキャン軌道を決定する。例えば、空間処理部122-2は、詳細スキャン時のスキャン軌道の間隔を通常スキャン時のスキャン軌道の間隔よりも狭くする。これにより、取得部110は、所定の領域をより細かくスキャンでき、より詳細な所定の領域に関する情報を取得することができる。
(Scan trajectory determination processing)
The spatial processing unit 122-2 determines a scan trajectory so that the acquisition unit 110 can scan in more detail than during a normal scan. For example, the spatial processing unit 122-2 makes the interval between the scan trajectories during the detailed scan narrower than the interval between the scan trajectories during the normal scan. Accordingly, the acquisition unit 110 can scan the predetermined area more finely, and can acquire more detailed information on the predetermined area.
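One simple way to realize such trajectories is a raster (boustrophedon) path whose line spacing is a parameter, as in the following Python sketch; the function name and the pitch values are illustrative assumptions.

def raster_trajectory(x_min, x_max, y_min, y_max, pitch):
    """Return (x, y) waypoints covering the rectangle with scan lines spaced by `pitch`."""
    n_lines = int((y_max - y_min) / pitch) + 1
    waypoints = []
    for i in range(n_lines):
        y = y_min + i * pitch
        if i % 2 == 0:                                  # alternate direction on each line
            waypoints += [(x_min, y), (x_max, y)]
        else:
            waypoints += [(x_max, y), (x_min, y)]
    return waypoints

# The detailed scan simply reuses the same area with a smaller pitch than the normal scan.
normal_path = raster_trajectory(0.0, 1.0, 0.0, 0.5, pitch=0.10)
detailed_path = raster_trajectory(0.0, 1.0, 0.0, 0.5, pitch=0.02)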
 (駆動情報の生成処理)
 空間処理部122-2は、取得部110が通常スキャン時よりも詳細にスキャンできるように取得部110の位置を制御させる駆動情報を生成してもよい。例えば、空間処理部122-2は、詳細スキャン時の所定の領域に対する取得部110の位置が、通常スキャン時の所定の領域に対する取得部110の位置よりも近くなるように、駆動情報を生成する。
(Drive information generation processing)
The spatial processing unit 122-2 may generate drive information for controlling the position of the acquisition unit 110 so that the acquisition unit 110 can scan in more detail than during the normal scan. For example, the spatial processing unit 122-2 generates the drive information such that the acquisition unit 110 is positioned closer to the predetermined area during the detailed scan than during the normal scan.
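As one illustration of such drive information, the sensor position can be offset from the surface along its normal by a smaller standoff distance during the detailed scan; the distances in the following Python sketch are assumed values.

NORMAL_STANDOFF = 0.30    # sensor-to-surface distance during the normal scan [m], assumed value
DETAILED_STANDOFF = 0.10  # closer standoff used during the detailed scan [m], assumed value

def drive_target(surface_point, surface_normal, detailed):
    """Compute the sensor position by offsetting the surface point along its outward normal."""
    d = DETAILED_STANDOFF if detailed else NORMAL_STANDOFF
    return tuple(p + d * n for p, n in zip(surface_point, surface_normal))

# Example: a point on a vertical tank wall whose outward normal is +x.
target = drive_target((1.0, 0.5, 0.8), (1.0, 0.0, 0.0), detailed=True)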
 <3-3.処理の流れ>
 第2の実施形態に係る処理の流れは、図4に示した第1の実施形態に係る処理の流れと異なる。以下では、図9及び図10を参照しながら、第2の実施形態に係る処理の流れについて説明する。図9は、第2の実施形態に係るメイン処理の流れの一例を示すフローチャートである。図10は、第2の実施形態に係る詳細スキャン処理の流れの一例を示すフローチャートである。
<3-3. Processing Flow>
The processing flow according to the second embodiment differs from the processing flow according to the first embodiment illustrated in FIG. 4. Hereinafter, the flow of processing according to the second embodiment will be described with reference to FIGS. 9 and 10. FIG. 9 is a flowchart illustrating an example of the flow of the main processing according to the second embodiment. FIG. 10 is a flowchart illustrating an example of the flow of the detailed scan processing according to the second embodiment.
 (1)メイン処理
 最初に、図9を参照しながら、第2の実施形態に係るメイン処理について説明する。図9に示すように、第2の実施形態に係るメイン処理は、異常検出の記録があった場合に(S1008/YES)、詳細スキャン処理(S1014)を行う点が第1の実施形態に係るメイン処理と異なる。詳細スキャン処理以外の処理は、<2-4.処理の流れ>にて説明した処理と同様で有り得る。そのため、詳細スキャン処理以外の処理についての説明は省略する。
(1) Main Processing First, the main processing according to the second embodiment will be described with reference to FIG. 9. As shown in FIG. 9, the main processing according to the second embodiment differs from the main processing according to the first embodiment in that the detailed scan processing (S1014) is performed when an abnormality detection has been recorded (S1008/YES). Processing other than the detailed scan processing can be the same as the processing described in <2-4. Processing flow>. Therefore, description of the processing other than the detailed scan processing is omitted.
 (2)通常スキャン処理
 第2の実施形態に係るスキャン処理は、<2-4.処理の流れ>にて説明した第1の実施形態に係る通常スキャン処理と同様で有り得る。そのため、通常スキャン処理についての説明は省略する。
(2) Normal Scan Process The normal scan process according to the second embodiment can be the same as the normal scan process according to the first embodiment described in <2-4. Processing flow>. Therefore, description of the normal scan process is omitted.
 (3)詳細スキャン処理
 次に、図10を参照しながら、第2の実施形態に係る詳細スキャン処理について説明する。図10に示すように、まず、情報提示装置10-2は、異常の検出位置の近傍におけるスキャン軌道を決定する(S3000)。次いで、情報提示装置10-2は、異常の検出位置の近傍のスキャンを開始する(S3002)。情報提示装置10-2は、スキャンを行いながら、異常が検出されたか否かを確認する(S3004)。異常が検出されない場合(S3004/NO)、情報提示装置10-2は、特に処理を行わず、スキャンを継続する。異常が検出された場合(S3004/YES)、情報提示装置10-2は、センサ装置のセンシングデータ及び異常の検出位置を記憶部130に記録し(S3006)、スキャンを継続する。
(3) Detailed Scan Process Next, a detailed scan process according to the second embodiment will be described with reference to FIG. As shown in FIG. 10, first, the information presentation device 10-2 determines a scan trajectory in the vicinity of the abnormality detection position (S3000). Next, the information presentation device 10-2 starts scanning near the abnormality detection position (S3002). The information presentation device 10-2 checks whether an abnormality has been detected while performing the scan (S3004). If no abnormality is detected (S3004 / NO), the information presenting device 10-2 continues scanning without performing any processing. When an abnormality is detected (S3004 / YES), the information presentation device 10-2 records the sensing data of the sensor device and the detection position of the abnormality in the storage unit 130 (S3006), and continues scanning.
 次いで、情報提示装置10-2は、異常の検出位置の近傍のスキャンが終了したか否かを確認する(S3008)。異常の検出位置の近傍のスキャンが終了した場合(S3008/YES)、情報提示装置10-2は、詳細スキャン処理を終了する。異常の検出位置の近傍のスキャンが終了していない場合(S3008/NO)、情報提示装置10-2は、S3004~S3008の処理を繰り返す。 Next, the information presentation device 10-2 checks whether the scan near the abnormality detection position has been completed (S3008). When the scan near the abnormality detection position is completed (S3008 / YES), the information presentation device 10-2 ends the detailed scan process. If the scan near the abnormality detection position has not been completed (S3008 / NO), the information presentation device 10-2 repeats the processing of S3004 to S3008.
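The loop of S3002 to S3008 can be written as the following Python sketch; the sensor and storage interfaces are hypothetical, and the trajectory is assumed to have been determined in S3000.

def detailed_scan(acquirer, sensor, storage, trajectory):
    """Follow the fine trajectory decided near the abnormality (S3000) and log any detections."""
    for waypoint in trajectory:                          # S3002: scan along the trajectory
        acquirer.move_to(waypoint)
        reading = sensor.sense()
        if reading.is_abnormal:                          # S3004: abnormality check at this waypoint
            storage.record(reading.data, waypoint)       # S3006: record sensing data and position
    # Reaching the end of the trajectory corresponds to S3008/YES: the detailed scan ends.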
 <3-4.実施例>
 以上、第2の実施形態に係る処理の流れについて説明した。次に、第2の実施形態に係る実施例について説明する。第2の実施形態に係る実施例では、図11を参照しながら、有毒性を有する可能性がある薬品等の漏れの調査にて詳細スキャンを行う実施例について説明する。図11は、第2の実施形態に係る実施例を示す図である。当該実施例では、情報提示装置10-2がタンク22a~22cに対して通常スキャンを行った結果、図8に示した位置52a~52cと同一の位置にて異常が検出され、位置52a~52cの近傍にて詳細スキャンが行われる例について説明する。また、当該実施例では、センサ装置として化学センサが設けられているとする。
<3-4. Example>
The flow of the processing according to the second embodiment has been described above. Next, an example according to the second embodiment will be described. In this example, a case in which a detailed scan is performed in an investigation of a leak of a potentially toxic chemical or the like will be described with reference to FIG. 11. FIG. 11 is a diagram illustrating the example according to the second embodiment. In this example, the information presentation device 10-2 performs a normal scan on the tanks 22a to 22c, and as a result, abnormalities are detected at the same positions as the positions 52a to 52c shown in FIG. 8, and a detailed scan is then performed in the vicinity of the positions 52a to 52c. In this example, it is also assumed that a chemical sensor is provided as the sensor device.
 まず、情報提示装置10-2は、図11の上部の図に示すように、異常が検出された位置の位置情報に基づき、所定の領域36a及び所定の領域36bを決定する。次いで、情報提示装置10-2は、所定の領域36aにおけるスキャン軌道をスキャン軌道46a、所定の領域36bにおけるスキャン軌道をスキャン軌道46bと決定する。 First, as shown in the upper part of FIG. 11, the information presentation device 10-2 determines the predetermined area 36a and the predetermined area 36b based on the position information of the position where the abnormality is detected. Next, the information presentation device 10-2 determines the scan trajectory in the predetermined area 36a as the scan trajectory 46a and the scan trajectory in the predetermined area 36b as the scan trajectory 46b.
　スキャン軌道の決定後、情報提示装置10-2は、スキャン軌道46a及びスキャン軌道46bに沿って所定の領域36a及び所定の領域36bを詳細スキャンする。情報提示装置10-2は、当該スキャンにより、図11の中部の図に示すように、ひび24a及びひび24bから漏れた薬品が位置54a及び位置54bに存在することを検出したとする。その場合、情報提示装置10-2は、図11の下部の図に示すように、漏れた薬品が存在する位置及び範囲を示す提示情報68a及び68bを漏れた薬品上に異常に関する情報として提示する。これにより、ユーザは、通常スキャンのみを行う場合よりも異常が発生している位置及び範囲を詳細に把握することができる。 After determining the scan trajectories, the information presentation device 10-2 performs a detailed scan of the predetermined area 36a and the predetermined area 36b along the scan trajectory 46a and the scan trajectory 46b. It is assumed that, by this scan, the information presentation device 10-2 detects that the chemical leaked from the cracks 24a and 24b is present at the positions 54a and 54b, as shown in the middle part of FIG. 11. In this case, as shown in the lower part of FIG. 11, the information presentation device 10-2 presents, on the leaked chemical, presentation information 68a and 68b indicating the position and range where the leaked chemical exists, as information relating to the abnormality. This allows the user to grasp the position and range where the abnormality has occurred in more detail than when only the normal scan is performed.
<<4.変形例>>
 以上、本開示に係る各実施形態について説明した。次に、各実施形態の変形例について説明する。なお、以下に説明する変形例は、単独で各実施形態に適用されてもよいし、組み合わせで各実施形態に適用されてもよい。また、変形例は、各実施形態で説明した構成に代えて適用されてもよいし、各実施形態で説明した構成に対して追加的に適用されてもよい。
<< 4. Modifications >>
The embodiments of the present disclosure have been described above. Next, a modified example of each embodiment will be described. Note that the modifications described below may be applied to each embodiment independently, or may be applied to each embodiment in combination. Further, the modified example may be applied instead of the configuration described in each embodiment, or may be additionally applied to the configuration described in each embodiment.
 <4-1.第1の変形例>
 まず、図12を参照しながら、各実施形態に係る第1の変形例について説明する。図12は、各実施形態に係る第1の変形例を示す説明図である。
<4-1. First Modification>
First, a first modified example according to each embodiment will be described with reference to FIG. FIG. 12 is an explanatory diagram illustrating a first modified example according to each embodiment.
　情報提示装置10の周辺状況によっては、情報提示装置10により提示される提示情報が正常に提示されないことが有り得る。例えば、提示部140の出力部の位置と異常の検出位置とを結ぶ直線上に物体が存在する場合、当該物体により提示部140の出力部から出力される提示情報が遮られてしまい、異常の検出位置に異常に関する情報が提示されなくなる。具体的に、ユーザが異常に対する処理を行っている際に、作業に伴う移動によりユーザが当該直線上に移動すること、並びにユーザ以外の人、動物、及びロボット等の移動体が当該直線上に移動することが考えられる。仮に、ユーザが提示情報に基づき異常に対する処理を行っている際に、提示情報が正常に提示されなくなると、ユーザが異常の検出位置を把握できず、ユーザに危害が及ぶ可能性がある。 Depending on the surroundings of the information presentation device 10, the presentation information presented by the information presentation device 10 may not be presented normally. For example, when an object is present on the straight line connecting the position of the output unit of the presentation unit 140 and the detection position of the abnormality, the presentation information output from the output unit of the presentation unit 140 is blocked by the object, and the information on the abnormality is no longer presented at the detection position. Specifically, while the user is handling the abnormality, the user may move onto the straight line as the work progresses, or a moving body other than the user, such as another person, an animal, or a robot, may move onto the straight line. If the presentation information stops being presented normally while the user is handling the abnormality based on the presentation information, the user cannot grasp the detection position of the abnormality, and the user may be harmed.
 そこで、提示情報が物体により遮られた場合、または提示情報が物体により遮られると判定した場合、情報提示装置10は、当該物体により提示情報が遮られないように動作を制御してもよい。例えば、情報提示装置10の制御部120は、提示部140に、ユーザを避けるように異常に関する情報を提示させる。具体的に、図12の上部の図に示すように、未知の物体20に提示されている提示情報60が示す位置の異常に対して、作業者28が処理作業を行っているとする。情報提示装置10は、当該作業者28による処理作業中に作業者28の移動を検出し、作業者28により提示情報60が遮られると判定した場合、図12の下部の図に示すように、ユーザを避けるように移動することで、提示情報60が遮られないようにする。なお、情報提示装置10は、移動中も提示情報60を提示し続けることで、作業者28が異常の検出位置を見失わないようにする。これにより、ユーザが異常に対する処理作業中に異常の検出位置を見失わないため、情報提示装置10は、ユーザに危害が及ぶ可能性を低くすることができる。 Therefore, when the presentation information is blocked by the object or when it is determined that the presentation information is blocked by the object, the information presentation device 10 may control the operation so that the presentation information is not blocked by the object. For example, the control unit 120 of the information presenting apparatus 10 causes the presenting unit 140 to present information about the abnormality so as to avoid the user. Specifically, as shown in the upper part of FIG. 12, it is assumed that the worker 28 is performing a processing operation for a position abnormality indicated by the presentation information 60 presented on the unknown object 20. The information presentation device 10 detects the movement of the worker 28 during the processing work by the worker 28, and when it is determined that the presentation information 60 is blocked by the worker 28, as illustrated in the lower diagram of FIG. By moving so as to avoid the user, the presentation information 60 is not obstructed. The information presentation device 10 keeps presenting the presentation information 60 even while moving, so that the worker 28 does not lose sight of the abnormality detection position. Thus, the information presenting apparatus 10 can reduce the possibility of harm to the user because the user does not lose sight of the detection position of the abnormality during the processing operation for the abnormality.
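The occlusion judgment described above can be sketched as a simple geometric test of whether a person or other object lies close to the straight line between the projector output and the detection position; the clearance threshold in the following Python sketch is an assumed value.

import math

def point_to_segment_distance(p, a, b):
    """Shortest distance from point p to the segment a-b (all 3-D coordinate tuples)."""
    ab = [bi - ai for ai, bi in zip(a, b)]
    ap = [pi - ai for ai, pi in zip(a, p)]
    denom = sum(c * c for c in ab)
    t = 0.0 if denom == 0 else max(0.0, min(1.0, sum(x * y for x, y in zip(ap, ab)) / denom))
    closest = [ai + t * c for ai, c in zip(a, ab)]
    return math.dist(p, closest)

def projection_blocked(obstacle_pos, projector_pos, target_pos, clearance=0.3):
    """True if the obstacle is within `clearance` meters of the projection line (assumed threshold)."""
    return point_to_segment_distance(obstacle_pos, projector_pos, target_pos) < clearance

When projection_blocked becomes true, the device would plan a new pose from which the detection position is still visible while continuing to project, so that the worker does not lose sight of it.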
 <4-2.第2の変形例>
 以上、各実施形態に係る第1の変形例について説明した。次に、図13を参照しながら、各実施形態に係る第2の変形例について説明する。図13は、各実施形態に係る第2の変形例を示す説明図である。
<4-2. Second Modification>
The first modification according to each embodiment has been described above. Next, a second modified example according to each embodiment will be described with reference to FIG. FIG. 13 is an explanatory diagram illustrating a second modified example according to each embodiment.
 上述の各実施形態では、情報提示装置10が異常に関する情報のみを提示する例について説明したが、情報提示装置10は、さらに非異常に関する情報を同時に提示する機能を有してもよい。当該機能は、取得部110に複数のセンサ装置が設けられ、複数のセンサ装置により複数の所定の領域に関する情報が取得されることで実現し得る。例えば、取得部110に温度センサと金属センサが設けられているとする。まず、情報提示装置10は、図13の上部の図に示すように、未知の物体20をスキャン軌道40に沿ってスキャンする。 In each of the above-described embodiments, an example has been described in which the information presenting apparatus 10 presents only information regarding abnormalities. However, the information presenting apparatus 10 may further have a function of simultaneously presenting information regarding non-abnormalities. The function can be realized by providing the acquisition unit 110 with a plurality of sensor devices and acquiring information on a plurality of predetermined regions by the plurality of sensor devices. For example, it is assumed that the acquisition unit 110 is provided with a temperature sensor and a metal sensor. First, the information presentation device 10 scans the unknown object 20 along the scan trajectory 40 as shown in the upper part of FIG.
　スキャンの結果、温度センサが取得した所定の領域に関する情報に基づき物体20の一部に発熱の異常が検出され、金属センサのセンシングデータの領域に関する情報に基づき未知の物体20の一部が金属であることが検出されたとする。その場合、情報提示装置10は、図13の下部の図に示すように、発熱している位置及び範囲を示す提示情報60を異常に関する情報として未知の物体20上に提示する。なお、情報提示装置10は、提示情報72aのように、提示情報60の近傍にテキストにより「発熱」と提示してもよい。 As a result of the scan, suppose that an abnormal heat generation is detected in a part of the object 20 based on the information on the predetermined area acquired by the temperature sensor, and that a part of the unknown object 20 is detected to be metal based on the information on the area in the sensing data of the metal sensor. In this case, as shown in the lower part of FIG. 13, the information presentation device 10 presents the presentation information 60 indicating the position and range of the heat generation on the unknown object 20 as information relating to the abnormality. Note that the information presentation device 10 may also present the text "heat generation" near the presentation information 60, as with the presentation information 72a.
　さらに、情報提示装置10は、未知の物体20の材質が金属である領域を示す提示情報70を非異常に関する情報として未知の物体20上に提示する。なお、情報提示装置10は、提示情報72bのように、提示情報70の近傍にテキストにより「金属」と提示してもよい。これにより、ユーザは、異常に関する情報と非異常に関する情報に基づき、所定の領域の状態を総合的に判断することができる。 Furthermore, the information presentation device 10 presents the presentation information 70 indicating the region where the material of the unknown object 20 is metal on the unknown object 20 as information relating to a non-abnormality. Note that the information presentation device 10 may also present the text "metal" near the presentation information 70, as with the presentation information 72b. This allows the user to comprehensively judge the state of the predetermined area based on the information regarding the abnormality and the information regarding the non-abnormality.
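Merging the readings of the two sensors into a single overlay can be sketched as follows in Python; the data layout and the heat threshold are assumptions for illustration only.

def build_overlay(temperature_readings, metal_readings, heat_threshold=60.0):
    """Merge two sensors' area information into items projected on the object.

    Each reading is assumed to be a dict like {"position": (x, y), "value": ...};
    heat_threshold is an assumed temperature [deg C] above which heat generation is abnormal.
    """
    items = []
    for r in temperature_readings:
        if r["value"] >= heat_threshold:
            items.append({"position": r["position"], "label": "heat generation", "abnormal": True})
    for r in metal_readings:
        if r["value"]:  # True where the surface material was classified as metal
            items.append({"position": r["position"], "label": "metal", "abnormal": False})
    return items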
 <4-3.第3の変形例>
 以上、各実施形態に係る第2の変形例について説明した。最後に、図14を参照しながら、各実施形態に係る第3の変形例について説明する。図14は、各実施形態に係る第3の変形例を示す説明図である。なお、第3の変形例では、センサ装置として化学センサが設けられているとする。
<4-3. Third Modification>
The second modification of each embodiment has been described above. Finally, a third modification according to each embodiment will be described with reference to FIG. FIG. 14 is an explanatory diagram illustrating a third modification example according to each embodiment. In the third modification, it is assumed that a chemical sensor is provided as a sensor device.
　情報提示装置10により提示される異常に関する情報は、例えば、ユーザに対する警告を示す情報であってもよい。例えば、図14に示すように、情報提示装置10は、警告を示す情報として警告ラインである提示情報74を提示してもよい。具体的に、まず、情報提示装置10は、タンク22a~22cをスキャンした結果に基づき、第2の実施形態の実施例で述べた例と同様に、提示情報68a、提示情報68bを提示する。検出された薬品が劇薬であり、薬品が検出された位置にユーザ等が近づくことが危険である場合、情報提示装置10は、提示情報68a、及び68bに加え、警告ラインを示す提示情報74を提示する。 The information on the abnormality presented by the information presentation device 10 may be, for example, information indicating a warning to the user. For example, as shown in FIG. 14, the information presentation device 10 may present presentation information 74, which is a warning line, as the information indicating a warning. Specifically, the information presentation device 10 first presents the presentation information 68a and the presentation information 68b based on the result of scanning the tanks 22a to 22c, as in the example described for the second embodiment. When the detected chemical is a highly toxic chemical and it is dangerous for the user or others to approach the position where the chemical was detected, the information presentation device 10 presents the presentation information 74 indicating the warning line in addition to the presentation information 68a and 68b.
　しかしながら、提示情報74のような警告ラインだけでは、警告ラインが提示されている理由がユーザに伝わらず、ユーザ等が警告ライン内に立ち入る可能性がある。そこで、情報提示装置10は、警告ラインが提示されている理由をテキストにより警告ラインの近傍にさらに提示してもよい。例えば、図14に示す提示情報76のように、情報提示装置10は、テキストにより「劇薬:立ち入り警告」と提示する。これにより、ユーザは、タンク22bとタンク22cから劇薬が漏れていることにより、警告ライン内へ立ち入らないように警告されていると理解することができる。 However, with the warning line of the presentation information 74 alone, the reason why the warning line is presented is not conveyed to the user, and the user or others may step inside the warning line. Therefore, the information presentation device 10 may further present, by text near the warning line, the reason why the warning line is presented. For example, as with the presentation information 76 shown in FIG. 14, the information presentation device 10 presents the text "Toxic chemical: keep out". This allows the user to understand that the warning not to enter the area inside the warning line is given because the toxic chemical is leaking from the tank 22b and the tank 22c.
　また、情報提示装置10は、タンク22bから漏れている薬品が薬品Bであり、タンク22cから漏れている薬品が薬品Cであることをテキストにより提示してもよい。この時、情報提示装置10は、各テキストの大きさが異なるようにそれぞれの提示情報を提示することで、検出された薬品の性質が異なることを表現してもよい。例えば、提示情報74aと提示情報74bのように、提示情報74bのテキストの大きさを提示情報74aのテキストの大きさよりも大きくすることで、薬品Cが薬品Bよりも危険性が高いことを表現してもよい。 The information presentation device 10 may also present, by text, that the chemical leaking from the tank 22b is the chemical B and the chemical leaking from the tank 22c is the chemical C. At this time, the information presentation device 10 may express that the properties of the detected chemicals differ by presenting the respective pieces of presentation information with different text sizes. For example, as with the presentation information 74a and the presentation information 74b, making the text of the presentation information 74b larger than the text of the presentation information 74a may express that the chemical C is more dangerous than the chemical B.
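Choosing the projected text size from a hazard ranking can be sketched as follows in Python; the ranking, the base size, and the per-rank increment are assumed values used only to illustrate the size difference between the labels for the chemical B and the chemical C.

# Hypothetical hazard ranking: a larger rank means a more dangerous chemical.
HAZARD_RANK = {"chemical_B": 1, "chemical_C": 3}

BASE_TEXT_SIZE = 24   # projected text size for rank 1 (points), assumed value
SIZE_PER_RANK = 12    # additional size per hazard rank, assumed value

def warning_text(chemical):
    """Return the label text and its projected size for one detected chemical."""
    rank = HAZARD_RANK.get(chemical, 1)
    size = BASE_TEXT_SIZE + SIZE_PER_RANK * (rank - 1)
    return f"{chemical}: keep out", size

print(warning_text("chemical_B"))  # ('chemical_B: keep out', 24)
print(warning_text("chemical_C"))  # ('chemical_C: keep out', 48)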
<<5.ハードウェア構成例>>
 以上、本開示の各実施形態に係る変形例について説明した。最後に、図15を参照しながら、各実施形態に係る情報提示装置10のハードウェア構成例について説明する。図15は、各実施形態に係る情報提示装置10のハードウェア構成例を示すブロック図である。なお、図15に示す情報提示装置10は、例えば、図1及び図3に示した情報提示装置10を実現し得る。本実施形態に係る情報提示装置による情報処理は、ソフトウェアと、以下に説明するハードウェアとの協働により実現される。
<< 5. Hardware configuration example >>
As above, the modified examples according to each embodiment of the present disclosure have been described. Finally, an example of a hardware configuration of the information presentation device 10 according to each embodiment will be described with reference to FIG. FIG. 15 is a block diagram illustrating a hardware configuration example of the information presentation device 10 according to each embodiment. Note that the information presentation device 10 illustrated in FIG. 15 can realize, for example, the information presentation device 10 illustrated in FIGS. 1 and 3. Information processing by the information presentation device according to the present embodiment is realized by cooperation between software and hardware described below.
　図15に示すように、情報提示装置10は、CPU(Central Processing Unit)901、ROM(Read Only Memory)902、及びRAM(Random Access Memory)903を備える。また、情報提示装置10は、ホストバス904a、ブリッジ904、外部バス904b、インタフェース905、入力装置906、出力装置907、ストレージ装置908、ドライブ909、接続ポート911、及び通信装置913を備える。なお、ここで示すハードウェア構成は一例であり、構成要素の一部が省略されてもよい。また、ハードウェア構成は、ここで示される構成要素以外の構成要素をさらに含んでもよい。 As shown in FIG. 15, the information presentation device 10 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 902, and a RAM (Random Access Memory) 903. The information presentation device 10 also includes a host bus 904a, a bridge 904, an external bus 904b, an interface 905, an input device 906, an output device 907, a storage device 908, a drive 909, a connection port 911, and a communication device 913. Note that the hardware configuration shown here is an example, and some of the components may be omitted. Further, the hardware configuration may include components other than the components shown here.
 (CPU901、ROM902、RAM903)
 CPU901は、例えば、演算処理装置又は制御装置として機能し、ROM902、RAM903、又はストレージ装置908に記録された各種プログラムに基づいて各構成要素の動作全般又はその一部を制御する。ROM902は、CPU901に読み込まれるプログラムや演算に用いるデータ等を格納する手段である。RAM903には、例えば、CPU901に読み込まれるプログラムや、そのプログラムを実行する際に適宜変化する各種パラメータ等が一時的又は永続的に格納される。これらはCPUバスなどから構成されるホストバス904aにより相互に接続されている。CPU901、ROM902及びRAM903は、例えば、ソフトウェアとの協働により、図3を参照して説明した制御部120の機能を実現し得る。
(CPU 901, ROM 902, RAM 903)
The CPU 901 functions as, for example, an arithmetic processing device or a control device, and controls the overall operation of each component or a part thereof based on various programs recorded in the ROM 902, the RAM 903, or the storage device 908. The ROM 902 is a unit that stores a program read by the CPU 901, data used for calculation, and the like. The RAM 903 temporarily or permanently stores, for example, a program read by the CPU 901 and various parameters that appropriately change when the program is executed. These are interconnected by a host bus 904a composed of a CPU bus and the like. The CPU 901, the ROM 902, and the RAM 903 can realize the function of the control unit 120 described with reference to FIG. 3 in cooperation with software, for example.
 (ホストバス904a、ブリッジ904、外部バス904b、インタフェース905)
 CPU901、ROM902、及びRAM903は、例えば、高速なデータ伝送が可能なホストバス904aを介して相互に接続される。一方、ホストバス904aは、例えば、ブリッジ904を介して比較的データ伝送速度が低速な外部バス904bに接続される。また、外部バス904bは、インタフェース905を介して種々の構成要素と接続される。
(Host bus 904a, bridge 904, external bus 904b, interface 905)
The CPU 901, the ROM 902, and the RAM 903 are mutually connected via, for example, a host bus 904a capable of high-speed data transmission. On the other hand, the host bus 904a is connected, for example, via a bridge 904 to an external bus 904b having a relatively low data transmission speed. The external bus 904b is connected to various components via an interface 905.
 (入力装置906)
 入力装置906は、例えば、マウス、キーボード、タッチパネル、ボタン、マイクロフォン、スイッチ及びレバー等、ユーザによって情報が入力される装置によって実現される。また、入力装置906は、例えば、赤外線やその他の電波を利用したリモートコントロール装置であってもよいし、情報提示装置10の操作に対応した携帯電話やPDA等の外部接続機器であってもよい。さらに、入力装置906は、例えば、上記の入力手段を用いてユーザにより入力された情報に基づいて入力信号を生成し、CPU901に出力する入力制御回路などを含んでいてもよい。情報提示装置10のユーザは、この入力装置906を操作することにより、情報提示装置10に対して各種のデータを入力したり処理動作を指示したりすることができる。
(Input device 906)
The input device 906 is realized by a device to which information is input by a user, such as a mouse, a keyboard, a touch panel, a button, a microphone, a switch, and a lever. The input device 906 may be, for example, a remote control device using infrared rays or other radio waves, or may be an externally connected device such as a mobile phone or a PDA that supports the operation of the information presentation device 10. . Further, the input device 906 may include, for example, an input control circuit that generates an input signal based on information input by a user using the above-described input unit and outputs the input signal to the CPU 901. By operating the input device 906, the user of the information presentation device 10 can input various data to the information presentation device 10 or instruct a processing operation.
　他にも、入力装置906は、ユーザに関する情報を検知する装置により形成され得る。例えば、入力装置906は、画像センサ(例えば、カメラ)、深度センサ(例えば、ステレオカメラ)、加速度センサ、ジャイロセンサ、地磁気センサ、光センサ、音センサ、測距センサ(例えば、ToF(Time of Flight)センサ)、力センサ等の各種のセンサを含み得る。また、入力装置906は、情報提示装置10の姿勢、移動速度等、情報提示装置10自身の状態に関する情報や、情報提示装置10の周辺の明るさや騒音等、情報提示装置10の周辺環境に関する情報を取得してもよい。また、入力装置906は、GNSS(Global Navigation Satellite System)衛星からのGNSS信号(例えば、GPS(Global Positioning System)衛星からのGPS信号)を受信して装置の緯度、経度及び高度を含む位置情報を測定するGNSSモジュールを含んでもよい。また、位置情報に関しては、入力装置906は、Wi-Fi(登録商標)、携帯電話・PHS・スマートフォン等との送受信、または近距離通信等により位置を検知するものであってもよい。入力装置906は、例えば、図3を参照して説明した取得部110の機能を実現し得る。 Alternatively, the input device 906 may be formed by a device that detects information about the user. For example, the input device 906 may include various sensors such as an image sensor (for example, a camera), a depth sensor (for example, a stereo camera), an acceleration sensor, a gyro sensor, a geomagnetic sensor, an optical sensor, a sound sensor, a distance measurement sensor (for example, a ToF (Time of Flight) sensor), and a force sensor. The input device 906 may also acquire information on the state of the information presentation device 10 itself, such as the posture and moving speed of the information presentation device 10, and information on the surrounding environment of the information presentation device 10, such as the brightness and noise around the information presentation device 10. Further, the input device 906 may include a GNSS module that receives a GNSS signal from a GNSS (Global Navigation Satellite System) satellite (for example, a GPS signal from a GPS (Global Positioning System) satellite) and measures position information including the latitude, longitude, and altitude of the device. Regarding the position information, the input device 906 may detect the position by Wi-Fi (registered trademark), transmission and reception with a mobile phone, PHS, or smartphone, or short-range communication. The input device 906 can realize, for example, the function of the acquisition unit 110 described with reference to FIG. 3.
 (出力装置907)
 出力装置907は、取得した情報をユーザに対して視覚的又は聴覚的に通知することが可能な装置で形成される。このような装置として、CRTディスプレイ装置、液晶ディスプレイ装置、プラズマディスプレイ装置、ELディスプレイ装置、レーザープロジェクタ、LEDプロジェクタ及びランプ等の表示装置や、スピーカ及びヘッドホン等の音声出力装置や、プリンタ装置等がある。出力装置907は、例えば、情報提示装置10が行った各種処理により得られた結果を出力する。具体的には、表示装置は、情報提示装置10が行った各種処理により得られた結果を、テキスト、イメージ、表、グラフ等、様々な形式で視覚的に表示する。他方、音声出力装置は、再生された音声データや音響データ等からなるオーディオ信号をアナログ信号に変換して聴覚的に出力する。出力装置907は、例えば、図3を参照して説明した提示部140の機能を実現し得る。
(Output device 907)
The output device 907 is formed of a device that can visually or audibly notify the user of the acquired information. Examples of such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, laser projectors, LED projectors, and lamps; audio output devices such as speakers and headphones; and printer devices. The output device 907 outputs, for example, results obtained by the various processes performed by the information presentation device 10. Specifically, the display device visually displays the results obtained by the various processes performed by the information presentation device 10 in various formats such as text, images, tables, and graphs. The audio output device, on the other hand, converts an audio signal composed of reproduced audio data, acoustic data, and the like into an analog signal and outputs it audibly. The output device 907 can realize, for example, the function of the presentation unit 140 described with reference to FIG. 3.
 (ストレージ装置908)
 ストレージ装置908は、情報提示装置10の記憶部の一例として形成されたデータ格納用の装置である。ストレージ装置908は、例えば、HDD等の磁気記憶部デバイス、半導体記憶デバイス、光記憶デバイス又は光磁気記憶デバイス等により実現される。ストレージ装置908は、記憶媒体、記憶媒体にデータを記録する記録装置、記憶媒体からデータを読み出す読出し装置及び記憶媒体に記録されたデータを削除する削除装置などを含んでもよい。このストレージ装置908は、CPU901が実行するプログラムや各種データ及び外部から取得した各種のデータ等を格納する。ストレージ装置908は、例えば、図3を参照して説明した記憶部130の機能を実現し得る。
(Storage device 908)
The storage device 908 is a data storage device formed as an example of a storage unit of the information presentation device 10. The storage device 908 is realized by, for example, a magnetic storage device such as an HDD, a semiconductor storage device, an optical storage device, or a magneto-optical storage device. The storage device 908 may include a storage medium, a recording device that records data on the storage medium, a reading device that reads data from the storage medium, a deletion device that deletes data recorded on the storage medium, and the like. The storage device 908 stores programs executed by the CPU 901 and various data, various data acquired from the outside, and the like. The storage device 908 can realize, for example, the function of the storage unit 130 described with reference to FIG.
 (ドライブ909)
 ドライブ909は、記憶媒体用リーダライタであり、情報提示装置10に内蔵、あるいは外付けされる。ドライブ909は、装着されている磁気ディスク、光ディスク、光磁気ディスク、または半導体メモリ等のリムーバブル記憶媒体に記録されている情報を読み出して、RAM903に出力する。また、ドライブ909は、リムーバブル記憶媒体に情報を書き込むこともできる。
(Drive 909)
The drive 909 is a reader / writer for a storage medium, and is built in or external to the information presentation device 10. The drive 909 reads information recorded on a removable storage medium such as a mounted magnetic disk, optical disk, magneto-optical disk, or semiconductor memory, and outputs the information to the RAM 903. The drive 909 can also write information on a removable storage medium.
 (接続ポート911)
 接続ポート911は、例えば、USB(Universal Serial Bus)ポート、IEEE1394ポート、SCSI(Small Computer System Interface)、RS-232Cポート、又は光オーディオ端子等のような外部接続機器を接続するためのポートである。
(Connection port 911)
The connection port 911 is, for example, a port for connecting an external device, such as a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, an RS-232C port, or an optical audio terminal.
 (通信装置913)
 通信装置913は、例えば、ネットワーク920に接続するための通信デバイス等で形成された通信インタフェースである。通信装置913は、例えば、有線若しくは無線LAN(Local Area Network)、LTE(Long Term Evolution)、Bluetooth(登録商標)又はWUSB(Wireless USB)用の通信カード等である。また、通信装置913は、光通信用のルータ、ADSL(Asymmetric Digital Subscriber Line)用のルータ又は各種通信用のモデム等であってもよい。この通信装置913は、例えば、インターネットや他の通信機器との間で、例えばTCP/IP等の所定のプロトコルに則して信号等を送受信することができる。
(Communication device 913)
The communication device 913 is, for example, a communication interface formed by a communication device or the like for connecting to the network 920. The communication device 913 is, for example, a communication card for a wired or wireless LAN (Local Area Network), LTE (Long Term Evolution), Bluetooth (registered trademark), or WUSB (Wireless USB). The communication device 913 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various kinds of communication. The communication device 913 can transmit and receive signals and the like to and from the Internet and other communication devices in accordance with a predetermined protocol such as TCP/IP.
　なお、ネットワーク920は、ネットワーク920に接続されている装置から送信される情報の有線、または無線の伝送路である。例えば、ネットワーク920は、インターネット、電話回線網、衛星通信網などの公衆回線網や、Ethernet(登録商標)を含む各種のLAN(Local Area Network)、WAN(Wide Area Network)などを含んでもよい。また、ネットワーク920は、IP-VPN(Internet Protocol-Virtual Private Network)などの専用回線網を含んでもよい。 The network 920 is a wired or wireless transmission path for information transmitted from devices connected to the network 920. For example, the network 920 may include a public network such as the Internet, a telephone network, or a satellite communication network, various LANs (Local Area Networks) including Ethernet (registered trademark), a WAN (Wide Area Network), and the like. The network 920 may also include a dedicated line network such as an IP-VPN (Internet Protocol-Virtual Private Network).
 以上、図15を参照しながら、本実施形態に係る情報提示装置のハードウェア構成例について説明した。上記の各構成要素は、汎用的な部材を用いて実現されていてもよいし、各構成要素の機能に特化したハードウェアにより実現されていてもよい。従って、本実施形態を実施する時々の技術レベルに応じて、適宜、利用するハードウェア構成を変更することが可能である。 In the above, the example of the hardware configuration of the information presentation apparatus according to the present embodiment has been described with reference to FIG. Each of the above components may be realized using a general-purpose member, or may be realized by hardware specialized for the function of each component. Therefore, it is possible to appropriately change the hardware configuration to be used according to the technical level at the time of implementing the present embodiment.
 <<6.まとめ>>
 以上説明したように、本開示の各実施形態に係る情報提示装置10は、所定の領域に関する情報を取得し、所定の領域に関する情報に基づき検出される、所定の領域における異常に関する情報を異常が検出された位置へ提示する。これにより、情報提示装置10は、異常に関する情報を異常の発生源上に提示することができる。
<< 6. Summary >>
As described above, the information presentation device 10 according to each embodiment of the present disclosure acquires information about a predetermined area, and presents information about an abnormality in the predetermined area, which is detected based on the information about the predetermined area, at the position where the abnormality was detected. Thereby, the information presentation device 10 can present the information on the abnormality on the source of the abnormality.
 よって、ユーザが直感的に理解しやすいように異常に関する情報を提示することが可能な、新規かつ改良された情報提示装置、情報提示方法、及びプログラムを提供することが可能である。 Therefore, it is possible to provide a new and improved information presenting device, information presenting method, and program capable of presenting information about an abnormality so that a user can easily and intuitively understand.
　以上、添付図面を参照しながら本開示の好適な実施形態について詳細に説明したが、本開示の技術的範囲はかかる例に限定されない。本開示の技術分野における通常の知識を有する者であれば、請求の範囲に記載された技術的思想の範疇内において、各種の変更例または修正例に想到し得ることは明らかであり、これらについても、当然に本開示の技術的範囲に属するものと了解される。 Although the preferred embodiments of the present disclosure have been described above in detail with reference to the accompanying drawings, the technical scope of the present disclosure is not limited to such examples. It is apparent that a person having ordinary knowledge in the technical field of the present disclosure can conceive of various changes or modifications within the scope of the technical idea described in the claims, and it is understood that these also naturally belong to the technical scope of the present disclosure.
　例えば、本明細書において説明した各装置は、単独の装置として実現されてもよく、一部または全部が別々の装置として実現されても良い。例えば、情報提示装置10は、図3に示したロボットアーム12、移動台車14、制御部120、及び記憶部130を備える単独の装置として実現されてもよい。また、例えば、図3に示した制御部120は、情報提示装置10とネットワーク等で接続されたサーバ装置として実現されてもよい。また、当該サーバ装置には、図3に示した記憶部130も備えられてもよい。 For example, each device described in this specification may be realized as a single device, or some or all of the devices may be realized as separate devices. For example, the information presentation device 10 may be realized as a single device including the robot arm 12, the moving trolley 14, the control unit 120, and the storage unit 130 illustrated in FIG. 3. Further, for example, the control unit 120 illustrated in FIG. 3 may be realized as a server device connected to the information presentation device 10 via a network or the like. In this case, the server device may also include the storage unit 130 illustrated in FIG. 3.
　また、本明細書において説明した各装置による一連の処理は、ソフトウェア、ハードウェア、及びソフトウェアとハードウェアとの組合せのいずれを用いて実現されてもよい。ソフトウェアを構成するプログラムは、例えば、各装置の内部又は外部に設けられる記録媒体(非一時的な媒体:non-transitory media)に予め格納される。そして、各プログラムは、例えば、コンピュータによる実行時にRAMに読み込まれ、CPUなどのプロセッサにより実行される。 A series of processes performed by each device described in this specification may be realized using any of software, hardware, and a combination of software and hardware. A program constituting the software is stored in advance in, for example, a recording medium (non-transitory media) provided inside or outside each device. Each program is then read into the RAM at the time of execution by a computer, for example, and executed by a processor such as a CPU.
 また、本明細書においてフローチャートを用いて説明した処理は、必ずしも図示された順序で実行されなくてもよい。いくつかの処理ステップは、並列的に実行されてもよい。また、追加的な処理ステップが採用されてもよく、一部の処理ステップが省略されてもよい。 The processes described with reference to the flowcharts in this specification do not necessarily have to be executed in the illustrated order. Some processing steps may be performed in parallel. Further, additional processing steps may be employed, and some processing steps may be omitted.
　また、本明細書に記載された効果は、あくまで説明的または例示的なものであって限定的ではない。つまり、本開示に係る技術は、上記の効果とともに、または上記の効果に代えて、本明細書の記載から当業者には明らかな他の効果を奏しうる。 In addition, the effects described in this specification are merely illustrative or exemplary, and are not restrictive. That is, the technology according to the present disclosure can exhibit other effects that are obvious to those skilled in the art from the description in this specification, in addition to or instead of the above effects.
 なお、以下のような構成も本開示の技術的範囲に属する。
(1)
 所定の領域に関する情報を取得する取得部と、
 前記所定の領域に関する情報に基づき検出される、前記所定の領域における異常に関する情報を提示する提示部と、
 前記提示部に、前記異常が検出された位置へ前記異常に関する情報を提示させる制御部と、
を備える、情報提示装置。
(2)
 前記制御部は、前記提示部に、前記異常に関する情報を前記異常の発生源上に提示させる、前記(1)に記載の情報提示装置。
(3)
 前記制御部は、前記提示部に、前記異常に関する情報を前記異常の発生源の近傍にさらに提示させる、前記(1)または(2)に記載の情報提示装置。
(4)
 前記取得部は、少なくとも1つのセンサ装置を有し、前記センサ装置により前記所定の領域に関する情報を取得する、前記(1)~(3)のいずれか一項に記載の情報提示装置。
(5)
 前記取得部が複数のセンサ装置を有し、前記複数のセンサ装置により取得される複数の所定の領域に関する情報に基づき、複数の異なる異常に関する情報が検出された場合、
 前記制御部は、前記提示部に、前記複数の異なる異常に関する情報の各々を提示させる、前記(1)~(4)のいずれか一項に記載の情報提示装置。
(6)
 前記制御部は、前記取得部と前記所定の領域との距離を制御する、前記(1)~(5)のいずれか一項に記載の情報提示装置。
(7)
 前記制御部は、前記取得部と前記所定の領域との距離が一定となるように、前記取得部の位置を制御する、前記(6)に記載の情報提示装置。
(8)
 前記所定の領域は、ユーザにより指定される対象の表面を含む、前記(1)~(7)のいずれか一項に記載の情報提示装置。
(9)
 前記所定の領域は、ユーザにより指定される領域を含む、前記(1)~(8)のいずれか一項に記載の情報提示装置。
(10)
 前記制御部は、前記所定の領域にて前記異常が検出された場合、前記異常が検出された位置の近傍における前記所定の領域に関する情報を、前記異常の検出前よりも詳細に前記取得部に取得させる、前記(1)~(9)のいずれか一項に記載の情報提示装置。
(11)
 前記制御部は、前記提示部に、ユーザを避けるように前記異常に関する情報を提示させる、前記(1)~(10)のいずれか一項に記載の情報提示装置。
(12)
 前記制御部は、前記提示部に、前記異常に関する情報をテキストで提示させる、前記(1)~(11)のいずれか一項に記載の情報提示装置。
(13)
 前記制御部は、前記提示部に、前記異常に関する情報をアイコンで提示させる、前記(1)~(12)のいずれか一項に記載の情報提示装置。
(14)
 前記制御部は、前記提示部に、検出された前記異常に応じて色を変化させた前記異常に関する情報を提示させる、前記(1)~(13)のいずれか一項に記載の情報提示装置。
(15)
 前記取得部は、前記所定の領域に関する情報を外部装置から取得する、前記(1)~(14)のいずれか一項に記載の情報提示装置。
(16)
 前記異常に関する情報は、前記異常が検出された範囲を示す情報である、前記(1)~(15)のいずれか一項に記載の情報提示装置。
(17)
 前記異常に関する情報は、前記異常として検出された物質の名称である、前記(1)~(16)のいずれか一項に記載の情報提示装置。
(18)
 前記異常に関する情報は、ユーザに対する警告を示す情報である、前記(1)~(17)のいずれか一項に記載の情報提示装置。
(19)
 所定の領域に関する情報を取得することと、
 前記所定の領域に関する情報に基づき検出される、前記所定の領域における異常に関する情報を提示することと、
 前記異常が検出された位置へ前記異常に関する情報を提示させることと、
を含む、プロセッサにより実行される情報提示方法。
(20)
 コンピュータを、
 所定の領域に関する情報を取得する取得部と、
 前記所定の領域に関する情報に基づき検出される、前記所定の領域における異常に関する情報を提示する提示部と、
 前記提示部に、前記異常が検出された位置へ前記異常に関する情報を提示させる制御部と、
として機能させるためのプログラム。
Note that the following configuration also belongs to the technical scope of the present disclosure.
(1)
An acquisition unit that acquires information about a predetermined area,
A presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area, and
A control unit that causes the presentation unit to present the information on the abnormality at the position where the abnormality was detected,
An information presentation device comprising:
(2)
The information presentation device according to (1), wherein the control unit causes the presentation unit to present information on the abnormality on a source of the abnormality.
(3)
The information presentation device according to (1) or (2), wherein the control unit causes the presentation unit to further present information on the abnormality near a source of the abnormality.
(4)
The information presentation device according to any one of (1) to (3), wherein the acquisition unit includes at least one sensor device, and acquires information on the predetermined area by the sensor device.
(5)
When the acquisition unit has a plurality of sensor devices and information about a plurality of different abnormalities is detected based on information about a plurality of predetermined areas acquired by the plurality of sensor devices,
The information presentation device according to any one of (1) to (4), wherein the control unit causes the presentation unit to present each of the information on the plurality of different abnormalities.
(6)
The information presentation device according to any one of (1) to (5), wherein the control unit controls a distance between the acquisition unit and the predetermined area.
(7)
The information presentation device according to (6), wherein the control unit controls a position of the acquisition unit such that a distance between the acquisition unit and the predetermined area is constant.
(8)
The information presentation device according to any one of (1) to (7), wherein the predetermined area includes a target surface designated by a user.
(9)
The information presentation device according to any one of (1) to (8), wherein the predetermined area includes an area specified by a user.
(10)
The information presentation device according to any one of (1) to (9), wherein, when the abnormality is detected in the predetermined area, the control unit causes the acquisition unit to acquire information about the predetermined area in the vicinity of the position where the abnormality was detected in more detail than before the detection of the abnormality.
(11)
The information presentation device according to any one of (1) to (10), wherein the control unit causes the presentation unit to present information about the abnormality so as to avoid a user.
(12)
The information presentation device according to any one of (1) to (11), wherein the control unit causes the presentation unit to present information on the abnormality in text.
(13)
The information presentation device according to any one of (1) to (12), wherein the control unit causes the presentation unit to present the information on the abnormality as an icon.
(14)
The information presentation device according to any one of (1) to (13), wherein the control unit causes the presentation unit to present information on the abnormality whose color has been changed according to the detected abnormality. .
(15)
The information presentation device according to any one of (1) to (14), wherein the acquisition unit acquires information on the predetermined area from an external device.
(16)
The information presentation device according to any one of (1) to (15), wherein the information regarding the abnormality is information indicating a range in which the abnormality is detected.
(17)
The information presentation device according to any one of (1) to (16), wherein the information on the abnormality is a name of a substance detected as the abnormality.
(18)
The information presentation device according to any one of (1) to (17), wherein the information regarding the abnormality is information indicating a warning to a user.
(19)
Obtaining information about a predetermined area;
Presenting information on an abnormality in the predetermined area, which is detected based on the information on the predetermined area,
Causing the information on the abnormality to be presented at the position where the abnormality is detected;
An information presentation method executed by a processor, including:
(20)
Computer
An acquisition unit that acquires information about a predetermined area,
A presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area, and
A control unit that causes the presentation unit to present the information on the abnormality at the position where the abnormality was detected,
A program for causing the computer to function as the above units.
 10 情報提示装置
 12 ロボットアーム
 14 移動台車
 100 ロボットアーム先端部
 110 取得部
 120 制御部
 122 空間処理部
 124 プロジェクションマッピング制御部
 126 ロボット制御部
 130 記憶部
 140 提示部
 150 ロボットアーム駆動部
 160 移動台車駆動部
 1260 ロボットアームコントローラ
 1262 移動台車コントローラ
DESCRIPTION OF SYMBOLS
10 Information presentation device
12 Robot arm
14 Moving trolley
100 Robot arm tip part
110 Acquisition unit
120 Control unit
122 Spatial processing unit
124 Projection mapping control unit
126 Robot control unit
130 Storage unit
140 Presentation unit
150 Robot arm driving unit
160 Moving trolley driving unit
1260 Robot arm controller
1262 Moving trolley controller

Claims (20)

  1.  所定の領域に関する情報を取得する取得部と、
     前記所定の領域に関する情報に基づき検出される、前記所定の領域における異常に関する情報を提示する提示部と、
     前記提示部に、前記異常が検出された位置へ前記異常に関する情報を提示させる制御部と、
    を備える、情報提示装置。
    An acquisition unit that acquires information about a predetermined area,
A presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area, and
A control unit that causes the presentation unit to present the information on the abnormality at the position where the abnormality was detected,
    An information presentation device comprising:
  2.  前記制御部は、前記提示部に、前記異常に関する情報を前記異常の発生源上に提示させる、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the control unit causes the presentation unit to present information on the abnormality on a source of the abnormality.
  3.  前記制御部は、前記提示部に、前記異常に関する情報を前記異常の発生源の近傍にさらに提示させる、請求項1に記載の情報提示装置。 2. The information presentation device according to claim 1, wherein the control unit causes the presentation unit to further present information on the abnormality near a source of the abnormality.
  4.  前記取得部は、少なくとも1つのセンサ装置を有し、前記センサ装置により前記所定の領域に関する情報を取得する、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the acquisition unit has at least one sensor device, and acquires information on the predetermined area by the sensor device.
  5.  前記取得部が複数のセンサ装置を有し、前記複数のセンサ装置により取得される複数の所定の領域に関する情報に基づき、複数の異なる異常に関する情報が検出された場合、
     前記制御部は、前記提示部に、前記複数の異なる異常に関する情報の各々を提示させる、請求項1に記載の情報提示装置。
    When the acquisition unit has a plurality of sensor devices and information about a plurality of different abnormalities is detected based on information about a plurality of predetermined areas acquired by the plurality of sensor devices,
    The information presentation device according to claim 1, wherein the control unit causes the presentation unit to present information on each of the plurality of different abnormalities.
  6.  前記制御部は、前記取得部と前記所定の領域との距離を制御する、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the control unit controls a distance between the acquisition unit and the predetermined area.
  7.  前記制御部は、前記取得部と前記所定の領域との距離が一定となるように、前記取得部の位置を制御する、請求項6に記載の情報提示装置。 The information presentation device according to claim 6, wherein the control unit controls a position of the acquisition unit such that a distance between the acquisition unit and the predetermined area is constant.
  8.  前記所定の領域は、ユーザにより指定される対象の表面を含む、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the predetermined area includes a surface of a target designated by a user.
  9.  前記所定の領域は、ユーザにより指定される領域を含む、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the predetermined area includes an area specified by a user.
  10.  前記制御部は、前記所定の領域にて前記異常が検出された場合、前記異常が検出された位置の近傍における前記所定の領域に関する情報を、前記異常の検出前よりも詳細に前記取得部に取得させる、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein, when the abnormality is detected in the predetermined area, the control unit causes the acquisition unit to acquire information about the predetermined area in the vicinity of the position where the abnormality was detected in more detail than before the detection of the abnormality.
  11.  前記制御部は、前記提示部に、ユーザを避けるように前記異常に関する情報を提示させる、請求項1に記載の情報提示装置。 The information presenting device according to claim 1, wherein the control unit causes the presenting unit to present information on the abnormality so as to avoid a user.
  12.  前記制御部は、前記提示部に、前記異常に関する情報をテキストで提示させる、請求項1に記載の情報提示装置。 2. The information presentation device according to claim 1, wherein the control unit causes the presentation unit to present information on the abnormality in text.
  13.  前記制御部は、前記提示部に、前記異常に関する情報をアイコンで提示させる、請求項1に記載の情報提示装置。 2. The information presentation device according to claim 1, wherein the control unit causes the presentation unit to present information on the abnormality with an icon.
  14.  前記制御部は、前記提示部に、検出された前記異常に応じて色を変化させた前記異常に関する情報を提示させる、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the control unit causes the presentation unit to present information on the abnormality whose color has been changed according to the detected abnormality.
  15.  前記取得部は、前記所定の領域に関する情報を外部装置から取得する、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the acquisition unit acquires information on the predetermined area from an external device.
  16.  前記異常に関する情報は、前記異常が検出された範囲を示す情報である、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the information on the abnormality is information indicating a range in which the abnormality is detected.
  17.  前記異常に関する情報は、前記異常として検出された物質の名称である、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the information on the abnormality is a name of a substance detected as the abnormality.
  18.  前記異常に関する情報は、ユーザに対する警告を示す情報である、請求項1に記載の情報提示装置。 The information presentation device according to claim 1, wherein the information on the abnormality is information indicating a warning to a user.
  19.  所定の領域に関する情報を取得することと、
     前記所定の領域に関する情報に基づき検出される、前記所定の領域における異常に関する情報を提示することと、
     前記異常が検出された位置へ前記異常に関する情報を提示させることと、
    を含む、プロセッサにより実行される情報提示方法。
    Obtaining information about a predetermined area;
    Presenting information on an abnormality in the predetermined area, which is detected based on the information on the predetermined area,
    Causing the information on the abnormality to be presented at the position where the abnormality is detected;
    An information presentation method executed by a processor, including:
  20.  コンピュータを、
     所定の領域に関する情報を取得する取得部と、
     前記所定の領域に関する情報に基づき検出される、前記所定の領域における異常に関する情報を提示する提示部と、
     前記提示部に、前記異常が検出された位置へ前記異常に関する情報を提示させる制御部と、
    として機能させるためのプログラム。
    Computer
    An acquisition unit that acquires information about a predetermined area,
    A presentation unit that presents information about an abnormality in the predetermined area, the abnormality being detected based on the information about the predetermined area, and
    A control unit that causes the presentation unit to present the information on the abnormality at the position where the abnormality was detected,
    A program for causing the computer to function as the above units.
PCT/JP2019/032763 2018-10-04 2019-08-22 Information presentation device, information presentation method, and program WO2020071001A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-189235 2018-10-04
JP2018189235 2018-10-04

Publications (1)

Publication Number Publication Date
WO2020071001A1 true WO2020071001A1 (en) 2020-04-09

Family

ID=70055762

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/032763 WO2020071001A1 (en) 2018-10-04 2019-08-22 Information presentation device, information presentation method, and program

Country Status (1)

Country Link
WO (1) WO2020071001A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06198586A (en) * 1993-01-07 1994-07-19 Toshiba Corp Moving robot
JP2007102488A (en) * 2005-10-04 2007-04-19 Toyota Motor Corp Autonomous moving device
JP2015001468A (en) * 2013-06-17 2015-01-05 株式会社松井製作所 Molding inspection device
JP2017101992A (en) * 2015-12-01 2017-06-08 エーエルティー株式会社 Disaster situation monitoring/warning/evacuation guidance system
JP2018056908A (en) * 2016-09-30 2018-04-05 キヤノン株式会社 Information processing device, and information processing method and program

Similar Documents

Publication Publication Date Title
JP6949107B2 (en) Systems and methods for training robots to drive autonomously on the route
JP6705465B2 (en) Observability grid-based autonomous environment search
US10394327B2 (en) Integration of auxiliary sensors with point cloud-based haptic rendering and virtual fixtures
US20210260773A1 (en) Systems and methods to control an autonomous mobile robot
JP7165259B2 (en) Multiple autonomous mobile robots
US9789612B2 (en) Remotely operating a mobile robot
US10654172B2 (en) Robot, control device, and robot system
US9168656B1 (en) Interfacing with a mobile telepresence robot
JP7025532B2 (en) Collision detection, estimation, and avoidance
CN112020688B (en) Apparatus, system and method for autonomous robot navigation using depth assessment
KR20210056893A (en) Apparatus and method for data visualization in 3D digital twin for facility safety inspection
US20190220043A1 (en) Methods and apparatus to facilitate autonomous navigation of robotic devices
WO2011074165A1 (en) Autonomous mobile device
TW201824794A (en) Method for operating an automatically moving robot
CN114800535B (en) Robot control method, mechanical arm control method, robot and control terminal
Tuvshinjargal et al. Hybrid motion planning method for autonomous robots using kinect based sensor fusion and virtual plane approach in dynamic environments
WO2020071001A1 (en) Information presentation device, information presentation method, and program
JP2011062786A (en) Laser sensor control device and laser sensor control method
WO2021049147A1 (en) Information processing device, information processing method, information processing program, and control device
CN109968366A (en) A kind of multifunction domestic appliance people
CN114603557B (en) Robot projection method and robot
JP2021027884A (en) Autonomous travel type vacuum cleaner, method for controlling autonomous travel type vacuum cleaner, and program
US20220016773A1 (en) Control apparatus, control method, and program
CN113125471A (en) Scanning system and scanning control method
CN110823775A (en) Robot for detecting air cleanliness

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19869848

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19869848

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP